In the modern data-driven business landscape, extracting actionable insights from available data helps improve performance and growth. Amazon RDS (Relational Database Service) is a popular choice for a fully-managed cloud environment to manage MySQL databases. Connecting MySQL on Amazon RDS to Firebolt will provide you with advanced data processing and real-time analytical capabilities.

Let’s look at the different methods to connect these two platforms. 

Methods to Connect MySQL on Amazon RDS to Firebolt


Before getting started with either method, ensure the following prerequisites are in place:

  • A running instance of Amazon RDS MySQL.
  • A database user granted SELECT and REPLICATION privileges.
  • A Firebolt account and its credentials (username and password).
  • A Firebolt database with a General Purpose engine attached to it.
  • An S3 bucket in the same region as the Firebolt database.
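As a sketch of the privileges prerequisite, the required grants can be issued from any SQL client connected to the RDS instance (the user name firebolt_etl and the password below are hypothetical placeholders; substitute your own):

```sql
-- Create a dedicated user for the migration (hypothetical name/password)
CREATE USER 'firebolt_etl'@'%' IDENTIFIED BY 'use_a_strong_password';

-- Grant read access plus the replication privileges needed for log-based export
GRANT SELECT, REPLICATION SLAVE, REPLICATION CLIENT ON *.* TO 'firebolt_etl'@'%';
FLUSH PRIVILEGES;
```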

Method 1: Load Data from MySQL on Amazon RDS to Firebolt Using CSV Files

Step 1: Export Data from Amazon RDS MySQL as CSV Files

  • Use any SQL client to connect to your Amazon RDS instance. You can use the MySQL command-line client or MySQL Workbench.
  • Run the SELECT command to retrieve data from the table you want to export.
SELECT * FROM users;

This SQL statement retrieves all rows from a table called users.

  • Use the INTO OUTFILE S3 clause in the SELECT statement to export the result to a CSV file in your S3 bucket. Note that INTO OUTFILE S3 is an Aurora MySQL feature; on a standard RDS MySQL instance, run the query from a client, save the result as a CSV file locally, and upload it to S3 with the AWS CLI.
SELECT * FROM users
INTO OUTFILE S3 's3://bucket-name/path/to/users.csv'
FORMAT CSV HEADER;

This SQL statement exports the data from the users table into Amazon S3 as users.csv, including a header row of column names.

Step 2: Upload the Data from S3 to Firebolt

Here are the steps that will help move your data from the S3 bucket to a Firebolt database:

  • Create an External Table: Firebolt uses an external table, a special virtual table, to connect to your data source. It acts as the connector between your S3 bucket and Firebolt. Create one by running the CREATE EXTERNAL TABLE SQL command. Here’s a sample SQL statement to create an external table:
CREATE EXTERNAL TABLE IF NOT EXISTS ex_levels (
 Name TEXT,
 LevelType TEXT,
 MinPointsToPass INTEGER,
 MaxPoints INTEGER,
 NumberOfLaps INTEGER,
 MaxPlayers INTEGER,
 MinPlayers INTEGER,
 PointsPerLap REAL,
 MusicTrack TEXT,
 SceneDetails TEXT,
 MaxPlayTimeSeconds INTEGER,
 LevelIcon TEXT
)
URL = 's3://firebolt-publishing-public/help_center_assets/firebolt_sample_dataset/'
-- CREDENTIALS = ( AWS_KEY_ID = '******' AWS_SECRET_KEY = '******' )
OBJECT_PATTERN = 'help_center_assets/firebolt_sample_dataset/levels.csv'
TYPE = (CSV SKIP_HEADER_ROWS = 1);
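Once the external table exists, you can sanity-check that Firebolt can read the underlying S3 files before ingesting anything:

```sql
-- Preview a few rows read directly from the S3 files through the external table
SELECT * FROM ex_levels LIMIT 10;
```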

This SQL query creates an external table named ex_levels if it doesn’t already exist. The column definitions specify the structure of the table and map to the fields in the source data file; they are later referenced as sources in the INSERT INTO statement during ingestion.

  1. The URL specifies the data source in S3.
  2. The CREDENTIALS are optional and specify a role or AWS key details for permission to read from the S3 location.
  3. The OBJECT_PATTERN defines the object pattern within the S3 bucket. All files that match the OBJECT_PATTERN in the location specified in the URL will be processed during ingestion.
  4. TYPE specifies the format of the external data source.
  • Create a Fact Table: A fact table is required to store the data in Firebolt for querying. Run the CREATE FACT TABLE command to create one.
  • Run the INSERT INTO Command: Use a general-purpose engine to run the INSERT INTO command. This will load the data from the external table into the fact table.
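Continuing the sample above, the fact table and ingestion steps might look like the following sketch (the levels table name and the PRIMARY INDEX choice are illustrative; match the column list to your own external table):

```sql
-- Fact table that will physically store the data inside Firebolt
CREATE FACT TABLE IF NOT EXISTS levels (
 Name TEXT,
 LevelType TEXT,
 MinPointsToPass INTEGER,
 MaxPoints INTEGER,
 NumberOfLaps INTEGER,
 MaxPlayers INTEGER,
 MinPlayers INTEGER,
 PointsPerLap REAL,
 MusicTrack TEXT,
 SceneDetails TEXT,
 MaxPlayTimeSeconds INTEGER,
 LevelIcon TEXT
) PRIMARY INDEX Name;

-- Load the data from the external table into the fact table
INSERT INTO levels SELECT * FROM ex_levels;
```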

Here are a few advantages of using CSV files to replicate data from MySQL on Amazon RDS to Firebolt:

  • By moving data from Amazon RDS MySQL in CSV files, you can perform data verification and validation before loading it into Firebolt. This helps ensure data accuracy and integrity during the data loading process.
  • Since the data is within your infrastructure during this process, it ensures data privacy and security. This method is beneficial for handling sensitive data instead of handing it over to third-party integration tool providers.
  • This method is useful for exporting small datasets or for one-time data transfers from Amazon RDS MySQL to Firebolt.

Here are some limitations associated with this method:

  • This method requires technical expertise, especially SQL. Exporting data from Amazon RDS MySQL and creating external and fact tables in Firebolt both need SQL knowledge.
  • Manually exporting MySQL on Amazon RDS data as CSV files and then loading the data into Firebolt involves latency. Because of this latency, data updates cannot happen in real time. If your business relies on real-time analytics for decision-making, this might not be a feasible solution.

Method 2: Use a No-Code Tool to Automate the MySQL on Amazon RDS Firebolt Migration Process

No-code ETL tools can help overcome the limitations of the previous method. Such tools are also associated with multiple benefits, including:

  • Simplified Process: No-code ETL tools simplify the process of setting up a data migration pipeline. With an intuitive interface and no need for complex code or scripts, even non-technical users can operate these tools.
  • Scalability: No-code tools are designed for scalability and are suitable for both large-scale and small-scale data migrations. As data volumes increase, the data pipelines adapt by scaling horizontally, ensuring no data is lost in the migration process.
  • Real-time Data Migration: No-code ETL tools are typically designed for real-time data integration. This ensures data accuracy across systems and offers you a competitive advantage of real-time analytics for impactful decision-making.
  • Pre-Built Integrations and Connectors: Most no-code tools have pre-built integrations and connectors for popular data sources and destinations. These connectors simplify the process of connecting two platforms for data migration and eliminate the need for excessive manual efforts.

While there is a range of no-code tools that you can use, Hevo Data makes an excellent choice. It is a fully-automated data migration solution that can replicate data from 150+ sources to 15+ destinations. Whether you want to integrate multiple data sources or prepare your data for transformation, Hevo can simplify your tasks.

To set up a MySQL on Amazon RDS to Firebolt ETL using Hevo, here’s the process to follow.

Before you start replicating your data, here are a few prerequisites to keep in mind while using Hevo:

  • Hevo is authorized to connect to your database port via SSH tunnel or IP whitelisting.
  • You have the hostname and port number of the source instance.
  • If the Pipeline mode is BinLog:
    • Enable Binary Log (Binlog) replication.
    • The Amazon RDS MySQL database is running MySQL v5.6 or higher.
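These prerequisites can be verified from any SQL client connected to the instance. Note that on RDS, the binary log settings are changed through the instance’s parameter group rather than with SET statements:

```sql
-- Binary logging should be enabled (expect ON)
SHOW VARIABLES LIKE 'log_bin';

-- Log-based replication tools generally expect row-based logging (expect ROW)
SHOW VARIABLES LIKE 'binlog_format';

-- Confirm the server runs MySQL v5.6 or higher
SELECT VERSION();
```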

Once these prerequisites are taken care of, here are the steps to set up the data migration pipeline:

Step 1: Configure Amazon RDS MySQL as a Source

In your Hevo dashboard, configure your MySQL on Amazon RDS instance as the source for the Pipeline.

Step 2: Configure Firebolt as the Destination

Then, configure Firebolt as the destination for the Pipeline.

What Can You Achieve by Migrating Data from MySQL on Amazon RDS to Firebolt?

A MySQL on Amazon RDS to Firebolt integration provides businesses with the following capabilities:

  • Aggregating individual product-interaction data for any event.
  • Finding the customer journey within the product (website/application).
  • Integrating transactional data from different functional groups (sales, marketing, product, human resources) and finding answers to questions such as:
    • Which development features were responsible for an app outage in a given duration?
    • Which product categories on your website were most profitable?
    • How does the failure rate in individual assembly units affect inventory turnover?


Migrating data from MySQL on Amazon RDS to Firebolt will allow you to harness the true potential of your data. By leveraging Firebolt’s blazing-fast analytical capabilities, you can gain actionable insights to promote business growth.

There are two methods that you can use to move data from MySQL on Amazon RDS to Firebolt. Using CSV files is one way to do this. However, this method has limitations, like being time-consuming, requiring technical expertise, and lacking automation or real-time integration. The other method is to use a no-code ETL tool like Hevo Data.

Hevo provides a hassle-free solution and helps you directly transfer data between the two platforms effortlessly. This fully-managed platform completely automates the ETL process, from extracting data from the source to transforming it and loading it into the destination.

If you don’t want SaaS tools with unclear pricing that burn a hole in your pocket, opt for a tool that offers a simple, transparent pricing model. Hevo has 3 usage-based pricing plans starting with a free tier, where you can ingest up to 1 million records.

Schedule a demo today to see if Hevo would be a good fit for you!

Freelance Technical Content Writer, Hevo Data

Suchitra's profound enthusiasm for data science and passion for writing drive her to produce high-quality content on software architecture and data integration.
