MySQL is one of the most widely used open-source Relational Database Management Systems (RDBMS) in the world. It is built to store and manage relational data in a tabular, normalized format. However, relational databases can be hard to scale, as growing workloads demand ever more memory and compute power.

Non-relational databases, on the other hand, are developed to store hundreds of petabytes of data and run millions of queries per second. MongoDB supports a flexible schema approach and is capable of storing large quantities of unstructured and semi-structured data. This article will provide you with a step-by-step guide to easily migrate data from MySQL to MongoDB.

Replicate MySQL to MongoDB with Python

The representation of data in MongoDB is completely different from MySQL. Hence, migrating from MySQL to MongoDB can be a challenging and tedious task. Fortunately, Python’s strong connectivity and data handling capabilities come to the rescue. This article uses a simple Python script to migrate your MySQL table data to MongoDB collections. The script works with any Python 3 version on any platform.

Follow the below-mentioned steps to achieve a MySQL to MongoDB migration successfully.

Step 1: Install the Required Modules

The first step requires you to install the modules needed to connect to the MySQL and MongoDB database instances. You can use the official mysql.connector module (distributed as the mysql-connector-python package) to connect to the MySQL database. Similarly, you can use the pymongo module to connect to MongoDB from Python.

Run the pip commands shown below to install the required modules for migrating data from MySQL to MongoDB.

pip install mysql-connector-python
pip install pymongo

PIP is a package manager for Python modules/packages.
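To confirm that both drivers installed correctly, you can run a quick, minimal check like the one below; it simply imports both modules and prints their versions where available.

import mysql.connector
import pymongo

# Print the driver versions if the attributes are available
print("mysql.connector:", getattr(mysql.connector, "__version__", "installed"))
print("pymongo:", getattr(pymongo, "version", "installed"))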

Step 2: Read Data from MySQL Table

The next step of the migration requires you to read data from the source MySQL table. This data is then prepared in a format supported by MongoDB. MongoDB is a NoSQL database that stores data as JSON-like documents, so it is recommended to convert the MySQL rows into dictionaries (JSON-style key-value records) before loading them into the target MongoDB database.

Python’s strong data handling capabilities make it easy to read the MySQL data in this format. Execute the following script to fetch the source data as a list of dictionaries.

import mysql.connector

# Connect to the source MySQL database
mysqldb = mysql.connector.connect(
    host="localhost",
    database="employees",
    user="root",
    password=""
)

# A dictionary cursor returns each row as a column-name/value dictionary
mycursor = mysqldb.cursor(dictionary=True)
mycursor.execute("SELECT * FROM categories;")
myresult = mycursor.fetchall()

print(myresult)

Running the script without any errors will return output similar to the following.

[
  {
     "id":4,
     "name":"Medicine",
     "description":"<p>Medicine<br></p>",
     "created_at":"",
     "updated_at":""
  },
  {
     "id":6,
     "name":"Food",
     "description":"<p>Food</p>",
     "created_at":"",
     "updated_at":""
  },
  {
     "id":8,
     "name":"Groceries",
     "description":"<p>Groceries<br></p>",
     "created_at":"",
     "updated_at":""
  },
  {
     "id":9,
     "name":"Cakes & Bakes",
     "description":"<p>Cakes & Bakes<br></p>",
     "created_at":d"",
     "updated_at":""
  }
]

The result is a list of dictionaries, shown here as a JSON array, because the dictionary=True argument was passed to the cursor. Now that you have the MySQL source data in a JSON-like format, let’s go ahead and write it to MongoDB collections.

It is feasible to transfer your data directly from MySQL to MongoDB, but using a document database in a relational way is rarely the most efficient approach for your queries. You should transform the data into a shape better suited to MongoDB, using arrays and embedded objects to reduce the number of queries and lookups. This transformation can be done with some custom code on the data exported from MySQL, or with an ETL (Extract, Transform, Load) tool, as sketched below.
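For illustration, here is a minimal sketch of how such custom code might embed child rows into their parent documents. It assumes hypothetical orders and order_items tables linked by an order_id foreign key; adjust the names and queries to your own schema.

import mysql.connector

# Hypothetical example: embed order_items rows inside their parent order document
mysqldb = mysql.connector.connect(
    host="localhost", database="mydatabase", user="myuser", password="********"
)
cursor = mysqldb.cursor(dictionary=True)

cursor.execute("SELECT * FROM orders;")
orders = cursor.fetchall()

order_docs = []
for order in orders:
    # Fetch the child rows for this order (assumed order_id foreign key)
    cursor.execute("SELECT * FROM order_items WHERE order_id = %s;", (order["id"],))
    order["items"] = cursor.fetchall()  # embed the child rows as an array
    order_docs.append(order)

# order_docs can now be passed to insert_many() instead of the flat rows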

Step 3: Write to MongoDB Collections

The next step requires you to load the JSON format source data into a MongoDB Collection. A MongoDB Collection is basically the NoSQL equivalent of a MySQL Table and it comprises a set of documents.

Plan your new data schemas in advance. MongoDB gives you a flexible schema, but a little upfront design goes a long way. Keep the following guidelines in mind to increase efficiency:

  • Merge one-to-one relationships into the same document.
  • Embed one-to-many relationships into a single document when the number of child elements is known and bounded.
  • Use some of the advanced patterns available in MongoDB for many-to-many relationships (see the sketch after this list).
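As a simple illustration of one such pattern, the sketch below models a hypothetical many-to-many relationship between students and courses by storing an array of course references on each student document; the collection and field names are assumptions for illustration only.

import pymongo

myclient = pymongo.MongoClient("mongodb://localhost:27017/")
mydb = myclient["mymongodb"]

# Hypothetical many-to-many: a student references the courses they attend
course_ids = mydb["courses"].insert_many([
    {"title": "Databases"},
    {"title": "Python"},
]).inserted_ids

mydb["students"].insert_one({
    "name": "Asha",
    "course_ids": course_ids,  # array of references instead of a join table
})

# Resolve the references when needed
student = mydb["students"].find_one({"name": "Asha"})
courses = list(mydb["courses"].find({"_id": {"$in": student["course_ids"]}}))
print([c["title"] for c in courses])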

You can use the insert_many() method of the collection class; it returns the list of Object IDs of the inserted documents. However, check the length of the list before the call, as insert_many() throws an exception when an empty list is passed as the argument. Execute the following script to load data into a MongoDB collection.

import pymongo

mongodb_host = "mongodb://localhost:27017/"
mongodb_dbname = "mymongodb"

# Connect to MongoDB and select the target database and collection
myclient = pymongo.MongoClient(mongodb_host)
mydb = myclient[mongodb_dbname]
mycol = mydb["categories"]

# myresult comes from the MySQL cursor in the previous step
if len(myresult) > 0:
    x = mycol.insert_many(myresult)
    print(len(x.inserted_ids))

After successful execution of this script, you can check your MongoDB Database Instance to verify if the documents have been inserted into the collection.

Since MongoDB is schema-less, you don’t need to pre-define a schema to insert documents. The database and collection are created automatically by MongoDB on the first insert.
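As a quick way to verify the load from Python (reusing the mycol collection object from the script above), you can count the migrated documents and inspect one of them:

# Count and peek at the migrated documents
print(mycol.count_documents({}))  # total number of documents in 'categories'
print(mycol.find_one({"id": 4}))  # fetch one migrated row by its original MySQL id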

Step 4: Complete Script

Just combine the above scripts to read a MySQL Table and insert it into a MongoDB Collection.

import mysql.connector
import pymongo

# Configuration
delete_existing_documents = True
mysql_host = "localhost"
mysql_database = "mydatabase"
mysql_schema = "myschema"
mysql_user = "myuser"
mysql_password = "********"
mongodb_host = "mongodb://localhost:27017/"
mongodb_dbname = "mymongodb"

# Connect to the source MySQL database
mysqldb = mysql.connector.connect(
    host=mysql_host,
    database=mysql_database,
    user=mysql_user,
    password=mysql_password
)

# Read the source table as a list of dictionaries
mycursor = mysqldb.cursor(dictionary=True)
mycursor.execute("SELECT * FROM categories;")
myresult = mycursor.fetchall()

# Connect to the target MongoDB database and collection
myclient = pymongo.MongoClient(mongodb_host)
mydb = myclient[mongodb_dbname]
mycol = mydb["categories"]

# Insert the MySQL rows as MongoDB documents
if len(myresult) > 0:
    x = mycol.insert_many(myresult)  # myresult comes from the MySQL cursor
    print(len(x.inserted_ids))

Step 5: Enhance the Script to Load All Tables in a MySQL Schema

The next step of the MySQL to MongoDB replication requires you to iterate through the list of all tables in the MySQL database and insert each result set into a new collection. You can get this list from the information_schema.tables metadata view, which returns all tables in a given schema. You can then iterate through the tables and run the above logic for each one.

#Iterate through the list of tables in the schema
table_list_cursor = mysqldb.cursor()
table_list_cursor.execute(
    "SELECT table_name FROM information_schema.tables WHERE table_schema = %s ORDER BY table_name;",
    (mysql_schema,)
)
tables = table_list_cursor.fetchall()

for table in tables:
    #Execute the migration logic for 'table' (migrate_table is defined below)
    migrate_table(mysqldb, table[0])

This can be accomplished by abstracting the migration logic into a function.

#Function migrate_table
def migrate_table(db, col_name):
    mycursor = db.cursor(dictionary=True)
    mycursor.execute("SELECT * FROM " + col_name + ";")
    myresult = mycursor.fetchall()

    mycol = mydb[col_name]

    if delete_existing_documents:
        #delete all documents in the collection
        mycol.delete_many({})

    #insert the documents
    if len(myresult) > 0:
        x = mycol.insert_many(myresult)
        return len(x.inserted_ids)
    else:
        return 0

You can add print statements to track the progress of the script as each table is migrated, as shown below. You can refer to the complete script for a better understanding.
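For example, the loop above could be extended with progress output like this (a minimal sketch that reuses the tables list and migrate_table function defined earlier):

for table in tables:
    table_name = table[0]
    print("Migrating table:", table_name)
    count = migrate_table(mysqldb, table_name)
    print("  ->", count, "documents inserted")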

That’s it: with some basic knowledge of Python programming, you can easily migrate data from MySQL to MongoDB. However, this method is best suited for professionals with technical experience. You can opt for a third-party solution if you don’t want to spend a lot of time writing custom scripts and resolving data issues.

Limitations of Using Manual Method to Migrate MySQL to MongoDB:

  • Complexities in MongoDB: MongoDB operations are atomic at the level of a single document, and multi-document ACID (Atomicity, Consistency, Isolation, and Durability) transactions add complexity and overhead. The majority of applications do not require such transactions, but the few that need to update multiple documents and collections together must use them carefully, as partially applied updates can otherwise leave the data inconsistent.

Also, MongoDB limits a single document to 16 MB, and document nesting is limited to 100 levels.

  • Time-Consuming Process: Writing custom code is a time-consuming process, as debugging and customizing the scripts require significant effort.
  • Scalability: As already mentioned, MongoDB limits the size of a single document. When dealing with large volumes of data, the manual method creates additional friction, as it becomes cumbersome to customize and re-run the code for large datasets.

Before wrapping up, let’s cover some basics.

Why Migrate from MySQL to MongoDB?

MySQL to MongoDB: Database structure

MySQL came into existence in 1995 and has been powering demanding web, e-commerce, and enterprise data management applications ever since. But in today’s world, processing data, building applications, and analyzing results have all become considerably more complex.

With data coming in from a variety of sources, organizations are finding it difficult to manage increasingly complicated workloads with traditional relational databases such as MySQL. This is primarily why many companies choose to migrate from MySQL to MongoDB.

Modern applications demand the persistence of complex and dynamic data, matching the highly flexible and powerful languages used for software development today. MongoDB, a leading NoSQL database, tackles these issues and provides users with a flexible and scalable solution to match the needs of modern applications.

The schema-less, flexible design of MongoDB allows you to store documents with differing structures, in contrast to a relational database. For example, industry leaders such as Cisco, Verizon, and others have successfully transitioned from relational databases like MySQL to MongoDB. Take a look at how to migrate data from relational databases to MongoDB in this blog.

Coming back to the topic at hand, here’s what you’ll be getting after migrating from MySQL to MongoDB.

  • The Document Data Model allows you to work with data in the most efficient way possible.
  • A Distributed Systems Design allows you to intelligently put data wherever you want to.
  • A unified MongoDB experience frees you from vendor lock-in and gives you the freedom to future-proof your work.

Here are some of the use cases for converting a MySQL database to MongoDB.

  • Handling Large Data: MongoDB can be used to integrate large volumes of data, helping you create a single unified view that is hard to achieve with many other databases.
  • Dealing with Complex Data Structures: Migrating from MySQL to MongoDB makes it easier to handle complex data structures. MongoDB allows you to embed documents to describe nested structures and easily tolerates variation in structure from one document to the next. It also supports specialized data formats such as geospatial data.
  • Supporting Multi-cloud Applications: MongoDB can run on a desktop, on a massive cluster of computers in a data center, or in a public cloud, either as installed software or through MongoDB Atlas, a database-as-a-service product.

Conclusion

The way applications are built and run in today’s world is pushing relational databases like MySQL to their limits. This is why NoSQL databases like MongoDB have exploded in popularity in recent years. MongoDB stores data as collections of self-contained, JSON-like documents and has gained widespread acceptance as a leading database owing to its dynamic schema, high scalability, and fast access.

This article introduced you to MySQL and MongoDB and then provided a step-by-step guide to migrating from MySQL to MongoDB. However, it is important to consolidate data from MySQL, MongoDB, and other data sources into a single storage system to get a complete overview of your business performance. Connecting MySQL and MongoDB to a data warehouse such as Snowflake, Redshift, or BigQuery using a data integration tool like Hevo can save you a lot of time and effort.


Hevo Data, with its strong integration with 150+ data sources such as MySQL and MongoDB as well as BI tools, allows you to not only export data from sources and load it into destinations, but also transform and enrich your data and make it analysis-ready, so that you can focus on your key business needs and perform insightful analysis using BI tools.

Give Hevo Data a try and sign up for a 14-day free trial today. Check out Hevo’s plans and pricing for different use cases and business needs!

Share your experience of migrating from MySQL to MongoDB in the comments section below.

Raj Verma
Business Analyst, Hevo Data

Raj, a data analyst with a knack for storytelling, empowers businesses with actionable insights. His experience, from Research Analyst at Hevo to Senior Executive at Disney+ Hotstar, translates complex marketing data into strategies that drive growth. Raj's Master's degree in Design Engineering fuels his problem-solving approach to data analysis.
