Key takeaways
  • You can connect MySQL to BigQuery in three main ways: automated no-code pipelines, manual ETL scripts, or Google Cloud’s native BigQuery Data Transfer Service (BQ DTS).
  • Automated tools like Hevo Data offer the simplest and most reliable option, with real-time sync, zero maintenance, and fast setup.
  • Manual ETL requires exporting data, transforming it, uploading files to GCS, and loading them into BigQuery; it is best suited for one-time or low-frequency migrations.
  • Google Cloud’s BQ DTS provides scheduled batch transfers for teams already using GCP, but it doesn’t support real-time sync or complex transformations.

Migrating data from MySQL to BigQuery is a common requirement for teams that need faster analytics and reporting. MySQL is built for transactional workloads, but it does not perform well for large analytical queries. 

Moving your data into BigQuery helps you run queries at scale, work with bigger datasets, and reduce load on your MySQL database.

This guide explains how to set up MySQL to BigQuery integration using three easy methods.

  • The first method uses an automated pipeline with almost no manual work.
  • The second method walks through a manual ETL process.
  • The third method covers Google Cloud’s native BQ DTS service for scheduled batch transfers.

Here is a quick comparison to help you choose the right method:

| Category | Hevo Data (Automated Pipeline) | Manual ETL (Custom Scripts + GCS + Load Jobs) | Google Cloud Native (BQ DTS) |
| --- | --- | --- | --- |
| Best For | No-code, reliable MySQL to BigQuery integration | One-time or low-frequency transfers | Google Cloud users needing scheduled batch transfers |
| Setup Effort | Very Low | High | Moderate |
| Data Sync Frequency | Continuous or batch, as needed | Manual / Scheduled | Scheduled (Daily/Hourly/Custom) |
| Maintenance | Fully Managed | High | Moderate |
| Best Use Case | Teams needing simple, automated, and scalable pipelines without engineering effort | Engineering teams that can manage extraction, transformation, and loading manually | Teams that prefer Google-native tools |
Looking for the easiest way to connect MySQL and BigQuery?

Hevo lets you build a fully automated pipeline in minutes, no scripts, no maintenance, no complexity. Just set it once and let Hevo keep your data flowing reliably.

Why Hevo stands out:

  • Loved by 2000+ customers (4.7/5 on Capterra)
  • Easy-to-use interface
  • Automated scaling and monitoring
  • 24/7 expert support
Get Started with Hevo for Free

What Data Can You Migrate to BigQuery from MySQL?

MySQL stores data in a relational, table-based format, and all of this structured data can be moved into BigQuery. Since both platforms use tables and SQL, most MySQL datasets transfer smoothly, including:

  • Transactional tables
  • Analytics tables
  • Lookup and dimension tables
  • Historical or archived datasets
  • Any structured data stored in rows and columns

BigQuery supports almost all MySQL data types, although some require conversion. Fields like integers, strings, timestamps, and numeric types map directly, while types such as ENUM or SET must be transformed into compatible formats.

Once migrated, the data is stored as managed BigQuery tables, secured inside Google’s warehouse, and ready for advanced analytics, BI, or machine learning workloads.

What are the Methods to Connect MySQL to BigQuery?

There are multiple ways to move data from MySQL into Google BigQuery, depending on how frequently your data changes, your team’s technical skills, and how much maintenance you want to handle. Choosing the right method depends on balancing setup effort, reliability, and long-term scalability.

Let us look at three practical approaches:

Method 1: Automated Pipeline (Hevo Data)

A no-code, fully managed solution that syncs MySQL data to BigQuery in real time without scripting or infrastructure maintenance. This works best for teams that want automation, continuous updates, and predictable pricing.

Method 2: Manual ETL Scripts

A hands-on process that involves writing custom extraction and transformation scripts, uploading data to Google Cloud Storage, and loading it into BigQuery. This is suitable for one-time migrations or infrequent updates where automation is not required.

Method 3: Google Cloud Native (BQ DTS)

A built-in Google Cloud option that schedules batch transfers from MySQL to BigQuery using BigQuery Data Transfer Service.

Let’s explore each method in detail so you can choose the approach that aligns best with your data workflow and migration needs.

Method 1: Using Automated No-Code Pipeline (Hevo Data) 

Hevo provides a fully managed, no-code pipeline that moves data from MySQL to BigQuery in just a few clicks. There is no scripting, no infrastructure to manage, and no need to worry about schema changes or pipeline failures. You simply configure your source and destination once, and Hevo handles extraction, loading, retries, and monitoring automatically.

Step 1: Connect and Configure Your MySQL Database

  • Click Pipelines in the Navigation Bar
  • Click + Create
  • Select MySQL as the source
  • Enter your connection settings (host, port, database, credentials)
MySQL source configuration

Step 2: Choose BigQuery as the Destination

  • Click Destinations
  • Click + Create
  • Select Google BigQuery
  • Enter your BigQuery project, dataset, and authentication details
Google BigQuery destination configuration

It is that simple. While you relax, Hevo fetches the data and sends it to your destination warehouse.

“Hevo Data makes setting up and maintaining data pipelines extremely simple. The no-code interface, wide range of connectors, and automated schema mapping reduce the effort of integrating multiple data sources into a central warehouse. Its real-time replication capability ensures that analytics teams always have fresh data available without complex engineering setups.”

– Ravi Shankar S.

  Full stack developer

Advantages of using the Hevo method

  • Fast, No-Code Setup: Connect MySQL and BigQuery in minutes without building or maintaining scripts, servers, or cron jobs.
  • Automatic Error Handling & Schema Management: Built-in retries, auto-healing, and seamless handling of schema changes prevent data gaps and broken pipelines.
  • Fully Managed and Low Maintenance: Hevo takes care of infrastructure, scaling, updates, and reliability so teams don’t spend engineering time managing ETL operations.
  • Transparent Pricing and Complete Visibility: Predictable pricing and detailed monitoring dashboards help teams control costs and track pipeline health easily.

Method 2: Manual ETL Process to Connect MySQL to BigQuery

The manual approach to MySQL–BigQuery integration involves writing custom extract, transform, and load scripts. While this method offers full control over the pipeline, it also requires significant engineering effort and ongoing maintenance. Manual ETL is typically used for one-time migrations, periodic batch loads, or when teams want complete control over their export format.

There are two ways to implement manual ETL:

  1. Full Dump and Load
  2. Incremental Dump and Load

Below is a clear walkthrough of how each approach works.

1. Full Dump and Load

A full dump and load extracts the entire MySQL table and overwrites the corresponding table in BigQuery. This is usually done during initial migrations or for smaller tables that can be fully reloaded without performance concerns.


The high-level steps to be followed to replicate MySQL to BigQuery are:

  • Step 1: Extract Data from MySQL
  • Step 2: Clean and Transform the Data
  • Step 3: Upload to Google Cloud Storage (GCS)
  • Step 4: Upload to the BigQuery Table from GCS
Step 1: Extract Data from MySQL

You can extract data in several ways:

Option 1: Using SELECT … INTO OUTFILE

Exports table data directly into a delimited file on the database server. The MySQL user needs the FILE privilege, and the target path must be allowed by secure_file_priv.

SELECT *
INTO OUTFILE 'filename.csv'
CHARACTER SET 'utf8'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM your_table;
Option 2: Using mysqldump

Creates a logical backup of a table or database.

mysqldump -u <username> -h <host> -p db_name table_name > table_name.sql

You must convert the SQL output into CSV using a script or a conversion tool.
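If you would rather skip the SQL-to-CSV conversion step entirely, a small Python sketch using pandas and SQLAlchemy can export a table straight to CSV. This is only an illustration; the connection string, table name, and chunk size below are placeholders to adapt to your environment.

import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string; swap in your own host, credentials, and database.
engine = create_engine("mysql+pymysql://user:password@mysql_host:3306/db_name")

# Stream the table in chunks so large tables do not exhaust memory.
with engine.connect() as conn:
    for i, chunk in enumerate(pd.read_sql("SELECT * FROM table_name", conn, chunksize=100_000)):
        chunk.to_csv(
            "table_name.csv",
            index=False,
            mode="w" if i == 0 else "a",  # overwrite on the first chunk, append afterwards
            header=(i == 0),              # write the header row only once
            encoding="utf-8",
        )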

Option 3: Using SQL Query + CLI Utilities

Run a SQL query and redirect the output to a text file, then clean it with tools like sed or awk.

mysql -B -u user -h mysql_host -e "select * from table_name;" > table_name.txt
Step 2: Clean and Transform the Data

Before loading into BigQuery, ensure the following (a minimal cleanup sketch follows the type-mapping table below):

  • Data is UTF-8 encoded
  • Column types match BigQuery equivalents
  • Dates follow the YYYY-MM-DD format
  • Timestamps follow the YYYY-MM-DD HH:MM:SS format
  • Text fields are properly quoted
  • Unsupported types (ENUM, SET) are mapped to compatible formats
| MySQL Data Type | BigQuery Data Type |
| --- | --- |
| TINYINT | INT64 |
| SMALLINT | INT64 |
| MEDIUMINT | INT64 |
| INT | INT64 |
| BIGINT | INT64 |
| DECIMAL | NUMERIC |
| FLOAT | FLOAT64 |
| DOUBLE | FLOAT64 |
| BIT | BOOL |
| CHAR | STRING |
| VARCHAR | STRING |
| BINARY | BYTES |
| VARBINARY | BYTES |
| TINYTEXT | STRING |
| TEXT | STRING |
| MEDIUMTEXT | STRING |
| LONGTEXT | STRING |
| ENUM | No direct ENUM type; use any type that can represent the ENUM values |
| SET | No direct SET type; use any type that can represent the SET values |
| DATE | DATE |
| TIME | TIME |
| DATETIME | DATETIME |
| TIMESTAMP | TIMESTAMP |
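To make the checklist above concrete, here is a minimal cleanup sketch in Python. It assumes a tab-delimited export named table_name.txt (as produced by the mysql -B option above) with a header row, a DATETIME column called created_at, and an ENUM column called status; these names are illustrative only, so adjust them to your own table.

import csv

with open("table_name.txt", encoding="utf-8", newline="") as src, \
     open("table_name.csv", "w", encoding="utf-8", newline="") as dst:
    reader = csv.DictReader(src, delimiter="\t")
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        # MySQL's zero date is not a valid BigQuery DATETIME; blank it out.
        if row["created_at"] == "0000-00-00 00:00:00":
            row["created_at"] = ""
        # ENUM values arrive as plain strings once exported; normalize casing
        # as an example of a simple per-column transform.
        row["status"] = row["status"].lower()
        writer.writerow(row)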
Step 3: Upload Files to Google Cloud Storage (GCS)

You can upload files using:

Option 1: Upload Files Using gsutil (Command Line Tool)

gsutil is part of the Google Cloud SDK and is the most efficient method for CLI-based uploads.

  • Upload a single file: gsutil cp table_data.csv gs://your-bucket/path/
  • Upload an entire directory: gsutil cp -r /local/folder/ gs://your-bucket/path/
  • Upload from Amazon S3 to GCS directly: gsutil cp -r s3://source-bucket/path gs://your-gcs-bucket/path
Option 2: Upload Using Storage Transfer Service (Automated Transfers)

The Storage Transfer Service is useful when:

  • Your MySQL exports are stored in S3 or another cloud provider
  • You want scheduled automatic transfers
  • You need periodic syncing between storage systems
Option 3: Upload Using the Google Cloud Console (Web UI)

If your files are stored locally on your machine and you prefer a visual interface, the GCP Console provides a simple alternative to the command line.

Steps:

  1. In the GCP Console, go to Storage → Browser
  2. Select your bucket
  3. Click Upload Files
  4. Choose your CSV or other extracted files
  5. Wait for the upload to complete

This method is best for ad-hoc or one-time file uploads that do not require automation.
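If you prefer to script the upload instead of using gsutil or the Console, the google-cloud-storage Python client can do the same thing. This is a minimal sketch; the bucket name, object path, and local file name are placeholders.

from google.cloud import storage

# Authenticates via GOOGLE_APPLICATION_CREDENTIALS or gcloud default credentials.
client = storage.Client()
bucket = client.bucket("your-bucket")

# Upload the extracted CSV to gs://your-bucket/path/table_data.csv
blob = bucket.blob("path/table_data.csv")
blob.upload_from_filename("table_data.csv")
print(f"Uploaded to gs://{bucket.name}/{blob.name}")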

Step 4: Load Data from GCS into BigQuery
Option 1: Load Data Using the bq Command-Line Tool (Most Common Method)

The bq load command gives you full control over schema, file formats, and load behavior.

General syntax:

bq --location=[LOCATION] load \
  --source_format=CSV \
  [DATASET].[TABLE] \
  gs://bucket/path/to/file.csv \
  schema.json

Here, schema.json is a local JSON file that defines the table’s columns; you can omit it and pass --autodetect instead to let BigQuery infer the schema.
Option 2: Load Using the BigQuery Web Console (UI Method)

The BigQuery Console provides an intuitive, step-by-step interface for loading data manually.

Steps:

  1. Go to BigQuery in the Google Cloud Console
  2. Select your project and dataset
  3. Click Create Table
  4. Choose Google Cloud Storage as the source
  5. Paste the GCS file path (gs://bucket/file.csv)
  6. Configure format (CSV)
  7. Choose schema detection or upload a schema file
  8. Select write mode:
    • Write if empty
    • Append
    • Overwrite
  9. Click Create Table
Option 3: Load Data Programmatically Using the BigQuery API

BigQuery provides REST APIs and client libraries (Python, Java, Go, Node.js) for automated loads, especially in production pipelines or CI/CD systems.

A simplified Python example:

from google.cloud import bigquery

client = bigquery.Client()
table_id = "project.dataset.table"

# CSV source with schema auto-detection; BigQuery infers column names and types.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,
)

uri = "gs://my-bucket/orders.csv"

# Start the load job and wait for it to finish.
load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()

2. Incremental Dump and Load

When you need to refresh BigQuery with only new or updated rows, an incremental load is more efficient than reloading entire tables.

Step 1: Extract Incremental Records

Use a timestamp column (updated_at, last_modified, etc.) to pull only changed rows.

mysql -B -u user -h mysql_host \
  -e "select * from table_name where updated_at > '#last_run_ts#';" \
  > table_name_delta.csv
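If you would rather script the incremental pull, here is a minimal sketch using pymysql. The connection details, table name, and updated_at column are assumptions to adapt to your schema, and last_run_ts would come from wherever you persist the previous run’s watermark.

import csv
import pymysql

# Placeholder watermark; in practice, load this from a state file or metadata table.
last_run_ts = "2024-01-01 00:00:00"

conn = pymysql.connect(host="mysql_host", user="user", password="secret", database="db_name")
try:
    with conn.cursor() as cur:
        cur.execute(
            "SELECT id, value, updated_at FROM table_name WHERE updated_at > %s",
            (last_run_ts,),
        )
        with open("table_name_delta.csv", "w", newline="", encoding="utf-8") as f:
            writer = csv.writer(f)
            writer.writerow([col[0] for col in cur.description])  # header row
            writer.writerows(cur.fetchall())
finally:
    conn.close()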
Step 2: Merge Data in BigQuery

Load the extracted data into a staging table, then update the main table.

Option 1: Update existing rows and insert new ones
UPDATE dataset.final_table t
SET value = s.value
FROM dataset.staging_table s
WHERE t.id = s.id;

INSERT INTO dataset.final_table (id, value)
SELECT id, value
FROM dataset.staging_table
WHERE id NOT IN (SELECT id FROM dataset.final_table);
Option 2: Delete and reload
DELETE FROM dataset.final_table
WHERE id IN (SELECT id FROM dataset.staging_table);

INSERT INTO dataset.final_table
SELECT * FROM dataset.staging_table;
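BigQuery also supports a single MERGE statement that combines the update and insert (or delete and reload) into one atomic operation, which is usually simpler than running the statements above separately. Here is a minimal sketch using the Python client, with the same assumed id and value columns and placeholder table names:

from google.cloud import bigquery

client = bigquery.Client()

merge_sql = """
MERGE dataset.final_table t
USING dataset.staging_table s
ON t.id = s.id
WHEN MATCHED THEN
  UPDATE SET value = s.value
WHEN NOT MATCHED THEN
  INSERT (id, value) VALUES (s.id, s.value)
"""

client.query(merge_sql).result()  # Runs the merge and waits for completion.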

Pros

  • Better control over extraction, transformation, and load logic
  • Flexible for one-time or highly customized migrations

Cons

  • High engineering effort to build and maintain scripts
  • Breaks easily when schemas change or data volume grows
  • No real-time sync; requires manual or scheduled refresh
  • Complex monitoring, error handling, and retries must be built manually
  • Not scalable for production-grade pipelines

Method 3: Google Cloud Native – Using BigQuery Data Transfer Service (BQ DTS)

BigQuery Data Transfer Service (BQ DTS) is Google Cloud’s built-in solution for moving data from MySQL to BigQuery without writing code.

Below is the step-by-step procedure to set up MySQL to BigQuery using BQ DTS.

Step 1: Prepare Your MySQL Source

BQ DTS supports:

  • Cloud SQL for MySQL
  • On-premises MySQL
  • MySQL hosted on AWS, Azure, or other cloud providers

Before enabling transfers, ensure the following (a quick verification sketch follows this list):

  • Binary logging is enabled (binlog_format = ROW)
  • The MySQL user has the required read permissions
  • Network access is configured (public IP, VPC peering, or Cloud VPN/Interconnect)
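To double-check the binary logging prerequisite before configuring the transfer, you can query the server variables directly. A small Python sketch using pymysql; the host and credentials are placeholders, and the account only needs permission to read server variables.

import pymysql

# Placeholder connection details; adjust for your environment.
conn = pymysql.connect(host="mysql_host", user="admin_user", password="secret")
try:
    with conn.cursor() as cur:
        for var in ("log_bin", "binlog_format"):
            cur.execute("SHOW VARIABLES LIKE %s", (var,))
            print(cur.fetchone())  # Expect ('log_bin', 'ON') and ('binlog_format', 'ROW')
finally:
    conn.close()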

Step 2: Enable Required APIs in Google Cloud

In the Google Cloud Console, enable the following services:

  • BigQuery Data Transfer API
  • Cloud Storage API (used for staging intermediate files)
  • BigQuery API

This ensures DTS can extract and stage data from MySQL before loading it into BigQuery.

Step 3: Create a BigQuery Dataset

BQ DTS requires a target dataset where transferred tables will be created.

  1. Go to BigQuery Console
  2. Click + Create Dataset
  3. Select:
    • Location (must match GCS bucket region)
    • Default table expiration (optional)
  4. Click Create

Step 4: Configure the Transfer

  1. In the BigQuery Console, navigate to Transfers
  2. Click + Create Transfer
  3. Choose MySQL as the data source
  4. Configure:
    • Connection details (host, port, username, SSL options)
    • Authentication method (password or Cloud SQL connection)
    • Tables to include or exclude
    • Data refresh window (how many previous days to reload)

Step 5: Choose the Transfer Schedule

BQ DTS supports:

  • Daily, Hourly, or Custom scheduling
  • On-demand manual runs

Set your preferred schedule based on how frequently MySQL data changes.

Step 6: Establish Connectivity

Depending on your environment:

  • Cloud SQL MySQL: Direct connection
  • External MySQL (AWS/Azure/on-prem):
    • Use a public IP with an allowlist
    • OR configure private networking via Cloud VPN / Interconnect

BQ DTS validates the credentials and network accessibility before creating the transfer.

Step 7: Start the Initial Load

Once you save the configuration, DTS automatically:

  1. Extracts data from MySQL
  2. Stages it in GCS (managed internally)
  3. Loads it into BigQuery as new tables

Tables are created with schema mappings based on column definitions in MySQL.

Step 8: Monitor and Manage Transfers

BQ DTS provides built-in monitoring tools:

  • Transfer history
  • Failed vs successful runs
  • Execution logs
  • Alerts for network or schema issues

You can view logs in BigQuery, Cloud Logging, or set up email alerts.

Pros

  • Fully managed by Google Cloud
  • Works well for teams already on GCP

Cons

  • Not ideal for large, frequently changing, or complex datasets
  • Supports only batch transfers, not real-time sync
  • Limited transformation capabilities
  • Requires network configuration and MySQL permission setup

What are the Advantages of Connecting MySQL To BigQuery?


Syncing MySQL data into BigQuery helps teams keep operational workloads in MySQL while unlocking fast, scalable analytics in Google Cloud. This setup ensures smoother reporting, better performance, and stronger data governance.

1. Faster Analytics on Large Datasets

BigQuery is built for high-speed analytical queries. It handles large datasets far more efficiently than MySQL, without affecting production performance.

2. Consistent, Unified Source of Truth

Sending MySQL data to BigQuery keeps dashboards, reports, and models fully aligned. Analysts avoid manual exports and always work with up-to-date, consistent data.

3. Better Security and Access Control

Teams can analyze data in BigQuery without exposing or overloading the operational MySQL database, reducing risk and improving governance.

4. Optimized OLAP Capabilities

MySQL is designed for transactions, while BigQuery is designed for analytics. BigQuery’s serverless, columnar, and distributed architecture enables deeper insights and complex queries at scale.

Conclusion

Connecting MySQL to BigQuery unlocks faster analytics, reduces load on your transactional systems, and creates a single source of truth for your teams. While manual ETL and BQ DTS can work for specific use cases, they often require engineering time, scripting, and ongoing monitoring.

Hevo removes all of this complexity. Its automated MySQL to BigQuery pipeline handles extraction, schema changes, retries, and monitoring with zero maintenance. For teams that want a dependable, scalable, and transparent solution, Hevo is the most practical choice. 

Try it free and experience the difference.

FAQs on MySQL to BigQuery

1. How to transfer data from MySQL to BigQuery?

Transferring data from MySQL to Google BigQuery involves several steps. Here’s a general outline of the process:
1. Export Data from MySQL
2. Prepare the Data
3. Upload to GCS
4. Load Data into BigQuery

2. How to connect MySQL to Google BigQuery?

To connect MySQL to Google BigQuery, you can use several methods:
1. ETL Tools
2. Google Cloud Dataflow
3. Custom Scripts

3. Is BigQuery Based on MySQL?

No, BigQuery is not based on MySQL. BigQuery is a fully managed, serverless data warehouse developed by Google. It uses a proprietary columnar storage format and a distributed architecture optimized for analytical queries and large-scale data processing.

Freelance Technical Content Writer, Hevo Data

Faisal loves data science and combines his problem-solving ability and passion for writing to help data teams solve complex business problems.