If you’ve landed on this blog, chances are you’re curious about AWS Database Migration Service (DMS) and how it can help you move your databases to the cloud. You’re in the right place! We’re going to walk through the AWS DMS architecture, break down how it works, and even discuss some limitations and considerations you should keep in mind. So, let’s dive right in!

What is AWS DMS?

Before jumping into the architecture, let’s quickly recap AWS DMS. AWS Database Migration Service is a managed service that migrates your databases from a source to a target with minimal downtime. Whether you’re moving from an on-premises database to AWS or between different types of databases (say, from Oracle to MySQL), DMS makes the whole process smoother and more manageable.

One of DMS’s key features is that it supports both homogeneous migrations (where the source and target databases are the same, like MySQL to MySQL) and heterogeneous migrations (where you’re moving from one database type to another, like Oracle to PostgreSQL). You can keep your source database operational during the migration using DMS’s Change Data Capture (CDC) capability. To read about it, check out our blog on AWS DMS CDC.

Accomplish seamless Data Migration with Hevo!

Looking for the best ETL tools to connect your data sources? Rest assured, Hevo’s no-code platform helps streamline your ETL process. Try Hevo and equip your team to: 

  • Integrate data from 150+ sources (60+ free sources).
  • Utilize drag-and-drop and custom Python script features to transform your data.
  • Rely on a risk management and security framework built for cloud-based systems, backed by SOC 2 compliance.

Try Hevo and discover why 2000+ customers, such as Postman and ThoughtSpot, have chosen Hevo over tools like AWS DMS to upgrade to a modern data stack.

Get Started with Hevo for Free

The Architecture of AWS DMS

Now that we have an introduction to what AWS DMS does, let’s look at the architecture that makes it all possible.

AWS DMS Architecture diagram

The Key Components

The diagram above shows the data flow from your source database to your target database, with AWS DMS acting as the bridge that moves the data between them.

AWS DMS has a few core components that work together to handle your database migration:

  1. Source Database
  2. Source Endpoint
  3. Replication Instance
  4. Replication Task
  5. Target Endpoint
  6. Target Database

Let’s break each of these down.

  1. Source Database
    This is where your data currently lives. It could be an on-premises database, a database running on an EC2 instance, or even another cloud-based database.
    AWS DMS supports a wide range of database engines for the source, including:
    • Amazon RDS
    • Amazon S3
    • Oracle
    • SQL Server
    • MySQL
    • PostgreSQL
    • MongoDB
    • MariaDB
    • SAP Adaptive Server Enterprise (ASE)
    • IBM Db2
    • Amazon Aurora
    • Microsoft Azure SQL Database
    • Google Cloud SQL for MySQL and PostgreSQL
  2. Source Endpoint
    This is the connection point between your source database and AWS DMS. The Source Endpoint is responsible for securely connecting to your source database so that the Replication Instance can start pulling data.
    When configuring a Source Endpoint, specify the database type, connection details, and additional settings needed to access your database.
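    If you script your setup, a source endpoint can be registered with boto3 (the AWS SDK for Python). The sketch below is illustrative only: the endpoint identifier, hostname, and credentials are placeholders you would replace with your own values.

```python
import boto3

dms = boto3.client("dms", region_name="us-east-1")

# Register an on-premises MySQL server as a DMS source endpoint.
# Identifier, hostname, and credentials below are placeholders.
source = dms.create_endpoint(
    EndpointIdentifier="my-source-endpoint",
    EndpointType="source",
    EngineName="mysql",
    ServerName="source-db.example.com",
    Port=3306,
    Username="dms_user",
    Password="replace-with-a-secret",
)
print(source["Endpoint"]["EndpointArn"])  # used later when creating the task
```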
  3. Replication Instance
    The Replication Instance is the engine that drives the migration process. Think of it as the brain of the operation. When you set up a migration, the Replication Instance runs the tasks that move data from your source to your target database.

    When you create a Replication Instance, you choose the instance class that determines the compute and memory capacity. This is important because the size and performance of the Replication Instance directly impact how fast and efficiently your data gets moved.

    The Replication Instance does all the heavy lifting, including data transformation (in the case of heterogeneous migrations), monitoring, and logging. It’s also the component connecting the source and target databases through endpoints.
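    As a rough sketch, provisioning a replication instance with boto3 looks like the following; the instance class and storage size are assumptions you would tune for your own workload.

```python
import boto3

dms = boto3.client("dms", region_name="us-east-1")

# Provision the replication instance that will run the migration tasks.
# Class and storage are illustrative; larger migrations need larger instances.
instance = dms.create_replication_instance(
    ReplicationInstanceIdentifier="my-dms-instance",
    ReplicationInstanceClass="dms.t3.medium",
    AllocatedStorage=50,        # GB used for logs and cached changes
    MultiAZ=False,
    PubliclyAccessible=False,
)
print(instance["ReplicationInstance"]["ReplicationInstanceArn"])
```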
  4. Replication Task
    A replication task defines what data gets migrated and how. You can set up a replication task to perform a full load (migrating all data), ongoing replication using Change Data Capture (CDC), or both.
    Each replication task is configured with a set of rules that define the scope of the migration, such as which schemas, tables, or specific rows/columns to include or exclude. You can also define transformation rules if you need to modify the data during migration.
    • Full Load: Copies all the data from the source database to the target.
    • CDC (Change Data Capture): Captures ongoing changes in the source database and replicates them to the target in near real-time.
    • Full Load + CDC: First copies all data and then continues to capture changes in real time.
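    Putting it together, a task ties the two endpoints and the replication instance to a set of table-mapping rules. The sketch below is a minimal illustration: the ARNs, schema name, and identifiers are placeholders for the resources created in the earlier steps.

```python
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

# ARNs returned by the earlier create_endpoint / create_replication_instance calls.
SOURCE_ARN = "arn:aws:dms:us-east-1:123456789012:endpoint:SOURCE-EXAMPLE"
TARGET_ARN = "arn:aws:dms:us-east-1:123456789012:endpoint:TARGET-EXAMPLE"
INSTANCE_ARN = "arn:aws:dms:us-east-1:123456789012:rep:INSTANCE-EXAMPLE"

# Selection rule: migrate every table in the "sales" schema.
# Transformation rule: rename the schema to "sales_new" on the target.
table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-sales",
            "object-locator": {"schema-name": "sales", "table-name": "%"},
            "rule-action": "include",
        },
        {
            "rule-type": "transformation",
            "rule-id": "2",
            "rule-name": "rename-sales",
            "rule-target": "schema",
            "object-locator": {"schema-name": "sales"},
            "rule-action": "rename",
            "value": "sales_new",
        },
    ]
}

task = dms.create_replication_task(
    ReplicationTaskIdentifier="sales-migration",
    SourceEndpointArn=SOURCE_ARN,
    TargetEndpointArn=TARGET_ARN,
    ReplicationInstanceArn=INSTANCE_ARN,
    MigrationType="full-load-and-cdc",   # or "full-load" / "cdc"
    TableMappings=json.dumps(table_mappings),
)

dms.start_replication_task(
    ReplicationTaskArn=task["ReplicationTask"]["ReplicationTaskArn"],
    StartReplicationTaskType="start-replication",
)
```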
  5. Target Endpoint
    The Target Endpoint is where your data is headed after the migration. It could be another database engine, an S3 bucket (for data lakes), or a Redshift cluster, among other things.

    During the migration, the Replication Instance pushes data to the Target Endpoint, and depending on your needs, it can transform the data to match the target database schema.
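    For example, registering an S3 bucket as the target looks roughly like this; the bucket name and the IAM role that allows DMS to write into it are placeholders.

```python
import boto3

dms = boto3.client("dms", region_name="us-east-1")

# Register an S3 bucket as the target endpoint (bucket and role are placeholders).
target = dms.create_endpoint(
    EndpointIdentifier="my-target-endpoint",
    EndpointType="target",
    EngineName="s3",
    S3Settings={
        "ServiceAccessRoleArn": "arn:aws:iam::123456789012:role/dms-s3-access",
        "BucketName": "my-dms-target-bucket",
        "DataFormat": "parquet",   # or "csv"
    },
)
print(target["Endpoint"]["EndpointArn"])
```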
  6. Target Database
    After the migration, your data will live in the target database. Depending on your use case, this might be an identical copy of your source database, or you might use the migration as an opportunity to upgrade or reorganize your data.

    AWS DMS supports a wide range of target databases, including the same engines as the source (Oracle, SQL Server, MySQL, MariaDB, PostgreSQL, etc.) and Amazon-specific services like Amazon RDS, Amazon Aurora, and Amazon Redshift.

Limitations and Considerations

Like any tool, AWS DMS isn’t without its limitations. Let’s talk about some of the key considerations you should keep in mind when planning your migration.

  1. Limited Data Transformations
    While AWS DMS does offer some data transformation capabilities, it’s not a full-fledged ETL (Extract, Transform, Load) tool. If you have complex transformation requirements, you might need to pair DMS with other services like AWS Glue or an ETL tool to handle the heavy lifting.
  2. Performance Considerations
    The performance of your migration largely depends on the size and configuration of your Replication Instance and the network bandwidth between your source and target databases. For large datasets, migrations can take longer than expected, especially if you’re dealing with heterogeneous migrations that involve data transformation.

    Additionally, if your source database is highly transactional, the CDC process can generate a lot of changes to replicate, which may impact performance. In such cases, it’s important to carefully monitor the migration and optimize your Replication Instance and network settings.
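    One practical way to keep an eye on a running task is the per-table statistics that DMS exposes; CloudWatch also publishes task metrics such as CDCLatencySource and CDCLatencyTarget. A minimal boto3 sketch (the task ARN is a placeholder):

```python
import boto3

dms = boto3.client("dms", region_name="us-east-1")

# Per-table progress and state for a running replication task.
stats = dms.describe_table_statistics(
    ReplicationTaskArn="arn:aws:dms:us-east-1:123456789012:task:TASK-EXAMPLE"
)
for t in stats["TableStatistics"]:
    print(t["SchemaName"], t["TableName"], t["TableState"], t["FullLoadRows"])
```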
  3. Database Compatibility
    AWS DMS supports many database engines but is not a one-size-fits-all solution. It requires that either your source or your target database be hosted within the AWS ecosystem. Some database features or custom configurations might not be fully supported, so additional manual intervention may be required during the migration. Reviewing the AWS DMS documentation and testing your migration in a staging environment before going live is always a good idea.
  4. Schema Conversion
    For heterogeneous migrations, AWS DMS requires that you use the AWS Schema Conversion Tool (SCT) to convert the schema of your source database to match the target database. While SCT is a powerful tool, it might not always produce a perfect conversion, especially for complex schemas with custom functions, stored procedures, or triggers. You may need to manually adjust the converted schema to ensure compatibility.
  5. Costs
    AWS DMS is a managed service, which means you pay for the resources you consume. This includes the hourly cost of the Replication Instance and any additional storage or data transfer costs. Depending on the scale of your migration, these costs can add up, so it’s important to factor them into your budget.
  6. Security Considerations
    Security is paramount when migrating sensitive data. While AWS DMS supports encrypted connections to both source and target databases, you must ensure that your security groups, VPC configurations, and IAM roles are set up correctly to prevent unauthorized access. Additionally, if you’re migrating data across regions, consider the impact of data sovereignty and compliance requirements.
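    As an illustration of encrypted connections, an endpoint can enforce TLS by importing the database’s CA certificate and setting an SSL mode; the hostnames, file names, and credentials below are placeholders.

```python
import boto3

dms = boto3.client("dms", region_name="us-east-1")

# Import the CA certificate used by the source database (file path is a placeholder).
cert = dms.import_certificate(
    CertificateIdentifier="source-db-ca",
    CertificatePem=open("source-db-ca.pem").read(),
)

# Create the endpoint with TLS enforced and the server certificate verified.
endpoint = dms.create_endpoint(
    EndpointIdentifier="secure-source-endpoint",
    EndpointType="source",
    EngineName="postgres",
    ServerName="source-db.example.com",
    Port=5432,
    DatabaseName="appdb",
    Username="dms_user",
    Password="replace-with-a-secret",
    SslMode="verify-full",
    CertificateArn=cert["Certificate"]["CertificateArn"],
)
```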

Final Thoughts: Is AWS DMS Right for You?

AWS DMS is a powerful tool that can significantly simplify the process of migrating databases to the cloud. Its flexibility, support for a wide range of database engines, and minimal downtime make it an attractive option for many organizations.

However, while AWS DMS simplifies many aspects of your database migration, it’s not without its challenges. As we discussed earlier, limitations like performance considerations, limited transformation capabilities, and potential schema conversion difficulties mean that DMS might not cover all your needs out of the box. Planning is key; in some cases, you may need to supplement DMS with additional tools or manual steps to ensure a successful migration.

But what if you’re looking for something that takes simplicity to the next level? What if you want a tool that doesn’t just help with database migrations but also streamlines the entire process of moving data from various sources to your data warehouse? That’s where Hevo comes into play.

Introducing Hevo: A No-Code Data Pipeline Solution

Hevo is a no-code data pipeline platform that can revolutionize data migration and integration. If you’re looking for a solution that’s easy to set up and manage yet powerful enough to handle complex data flows, Hevo might be the perfect fit.

Why Choose Hevo?

  1. No-Code Pipeline Setup in Just 2 Steps
    One of Hevo’s biggest selling points is its simplicity. You don’t need to be a data engineering wizard to set up your data pipelines. With Hevo, you can get up and running in just 2 steps. This ease of use makes it accessible to a wide range of users, from data analysts to business managers.
  2. Support for 150+ Data Sources (Including 60+ Free Sources)
    Hevo boasts a vast library of over 150 data sources, including 60+ that are completely free. Whether you’re pulling data from cloud storage, databases, SaaS applications, or custom sources, Hevo has you covered. This extensive support ensures you can connect to virtually any data source in your ecosystem.
  3. Comprehensive Pre and Post-Load Transformations
    Data transformation is often a critical step in data integration, and Hevo excels in this area. Whether you need to clean, aggregate, or enrich your data, Hevo provides both pre-load and post-load transformation capabilities. You can write custom Python scripts or use Hevo’s intuitive drag-and-drop interface to define your transformations, making it easy to customize your data flows to meet your specific needs.
  4. Support for Both ETL and ELT Processes
    Hevo offers the flexibility to choose between ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes. Depending on your use case, you can decide whether to transform your data before loading it into the target destination or after it’s been loaded. This flexibility allows you to optimize your data pipelines for performance and cost.
  5. Real-Time Data Ingestion and Batch Streaming
    Whether you need to ingest data in real-time or prefer batch processing, Hevo has you covered. The platform supports real-time data ingestion and batch streaming, allowing you to choose the approach that best fits your business requirements. Real-time data ingestion is particularly useful for scenarios where timely data is crucial, such as real-time analytics or monitoring.

In conclusion, while AWS DMS is a great tool for specific database migration tasks, Hevo offers a broader solution for organizations to streamline their entire data integration process. If you’re ready to take your data pipelines to the next level, try Hevo—you won’t be disappointed!

FAQ on AWS DMS Architecture

What is DMS used for in AWS?

AWS Database Migration Service (DMS) is used to migrate databases from one platform to another with minimal downtime. It supports homogenous migrations (e.g., Oracle to Oracle) and heterogeneous migrations (e.g., Oracle to MySQL). DMS can also replicate ongoing changes to keep source and target databases in sync.

Is AWS DMS an ETL tool?

AWS DMS is primarily a data migration tool rather than a full ETL (Extract, Transform, Load) tool. While it can perform some transformations during migration, it’s not designed for complex data transformations like traditional ETL tools. DMS focuses on data extraction and loading with minimal transformation capabilities.

Is AWS DMS fully managed?

Yes, AWS DMS is fully managed. AWS handles the underlying infrastructure, including provisioning, scaling, monitoring, and maintenance, so you can focus on the migration process without worrying about server management.

Chirag Agarwal
Principal CX Engineer, Hevo Data

Chirag is a seasoned support engineer with over 7 years of experience, including over 4 years at Hevo Data, where he's been pivotal in crafting core CX components. As a team leader, he has driven innovation through recruitment, training, process optimization, and collaboration with multiple technologies. His expertise in lean solutions and tech exploration has enabled him to tackle complex challenges and build successful services.