Are you struggling to load data into BigQuery? Are you confused about which method is best for loading data into BigQuery? If yes, then this blog will answer your questions. In this article, you will learn how to load data into BigQuery and explore several ways of uploading different data types to Google BigQuery, including CSV and JSON files.

You will also learn about uploading data through an API or add-on. If you need to analyze terabytes of data in a few seconds, Google BigQuery is one of the most affordable options.

What is Google BigQuery?

Google BigQuery is a serverless, highly scalable, and cost-effective multi-cloud data warehouse designed for business agility. 

Here are a few features of Google BigQuery:

  • BigQuery allows you to analyze petabytes of data at high speed with zero operational overhead. 
  • No cluster deployment, no virtual machines, no keys or indexes to set up, and no software to install.
  • Stream millions of rows per second for real-time analysis.
  • Thousands of cores are used per query.
  • Separate storage and computing. 

To understand more about Google BigQuery, please refer to the following Hevo Data article. 

Want to Simplify BigQuery ETL with Hevo?

Hevo makes BigQuery ETL effortless with its no-code data pipelines. Automate data ingestion, transformation, and loading seamlessly while enjoying real-time updates and robust data integrity. Transform your ETL process and focus on insights, not complexities.

Here’s why you should choose Hevo:

  1. Plug-and-play transformations
  2. Real-time data transfer
  3. 24/5 Live Support
Get Started with Hevo for Free

Types of Data Load in BigQuery

The following types of data loads are supported in Google BigQuery: 

  • You can load data from Cloud Storage or a local file. Supported file formats include Avro, CSV, and JSON. 
  • Data exports from Firestore and Datastore can be uploaded into Google BigQuery.
  • You can load data from other Google services such as Google Ad Manager and Google Analytics. 
  • Streaming inserts can be actively loaded in BigQuery.
  • Data Manipulation Language (DML) statements can also be used for bulk data uploads.  

Data uploading through Google Drive is NOT yet supported, but data stored in Drive can be queried using an external table.


Data Ingestion Format

Choosing the proper data ingestion format is necessary for a successful data upload. The following factors play an important role in deciding the data ingestion format:

  1. Schema Support: One important feature of BigQuery is that it can create a table schema automatically based on the source data. Formats like Avro, ORC, and Parquet are self-describing, so no explicit schema is needed for them. For formats like JSON and CSV, you can either rely on schema auto-detection or provide an explicit schema (a small sketch of this is shown after this list). 
  2. Flat Data/Nested and Repeated Fields: Nested and repeated fields help express hierarchical data. Formats such as Avro, ORC, Parquet, and Firestore exports support data with nested and repeated fields. 
  3. Embedded Newlines: When data is loaded from JSON files, the rows need to be newline delimited. BigQuery expects newline-delimited JSON files to contain a single record per line. 
  4. Encoding: BigQuery supports UTF-8 encoding for nested, repeated, and flat data. For CSV files, BigQuery also supports ISO-8859-1 encoding for flat data. 
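
To make the schema and newline-delimited JSON points concrete, here is a minimal sketch using the google-cloud-bigquery Python client. The project, field names, and file contents are hypothetical placeholders; adjust them to your own data.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials are configured

# CSV and JSON are not self-describing, so either let BigQuery auto-detect
# the schema or supply one explicitly, as below (field names are illustrative).
schema = [
    bigquery.SchemaField("name", "STRING"),
    bigquery.SchemaField("age", "INTEGER"),
    bigquery.SchemaField("signup_date", "DATE"),
]

job_config = bigquery.LoadJobConfig(
    schema=schema,
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
)

# Newline-delimited JSON: exactly one complete record per line, e.g.
# {"name": "Ana", "age": 31, "signup_date": "2023-05-01"}
# {"name": "Ben", "age": 27, "signup_date": "2023-06-12"}
```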

Methods To Load Data into BigQuery

Method 1: Loading Data Into BigQuery Using An Automated Data Pipeline

This method is simple and requires only two easy steps. Using Hevo, an automated, no-code solution, you can migrate data from your source to BigQuery in a few minutes.

Step 1: Configure your source

An example of MySQL as a source is shown below.

Step 2: Configure your destination

Method 2: Loading Data Into BigQuery Manually

  1. Before you upload any data, you need to create a dataset and table in Google BigQuery. To do this, go to the BigQuery home page and select the resource in which you want to create a dataset. 
Load data into BigQuery: Select the resource
  2. In the Create dataset window, give your dataset an ID, select a data location, and set the default table expiration period.
    Note: If you select “Never” for table expiration, tables in the dataset will not expire automatically. For temporary tables, you can specify the number of days to store them.
  3. Next, create a table in the dataset (a scripted alternative using the Python client is sketched after these steps). 
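
If you prefer to script the dataset and table creation instead of using the console, here is a minimal sketch with the google-cloud-bigquery Python client. The project, dataset, and table names and the schema are hypothetical placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Create the dataset (the scripted equivalent of the "Create dataset" window).
dataset = bigquery.Dataset("my-project.my_dataset")
dataset.location = "US"
dataset.default_table_expiration_ms = 30 * 24 * 60 * 60 * 1000  # expire tables after 30 days
dataset = client.create_dataset(dataset, exists_ok=True)

# Create an empty table inside the dataset with a simple illustrative schema.
schema = [
    bigquery.SchemaField("id", "INTEGER"),
    bigquery.SchemaField("name", "STRING"),
]
table = bigquery.Table("my-project.my_dataset.my_table", schema=schema)
table = client.create_table(table, exists_ok=True)
print(f"Created {dataset.dataset_id} and {table.table_id}")
```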

After table creation, you can now load data into BigQuery. Let’s explore the different ways of uploading data to Google BigQuery:

1. Upload Data from CSV File

To upload data from a CSV file, go to the Create table window, select a data source, and use the upload function. Then work through the following options (a scripted equivalent using the Python client follows this list). 

  1. File Selection: Choose the file and format (e.g., CSV, JSON).
  2. Destination: Define the project and dataset name in Google BigQuery.
  3. Table Type: Select between native and external tables.
  4. Table Structure:
  • Let BigQuery auto-detect the schema, or
  • Manually add fields by editing the schema as text or via the ‘+ Add field’ button.
  5. Advanced Options: Adjust parsing settings for CSV files as needed.
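
For the scripted route, a local CSV file can be loaded with the Python client as in the sketch below. The table ID and file name are hypothetical, and schema auto-detection is assumed for brevity.

```python
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.my_dataset.my_table"  # hypothetical destination table

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row
    autodetect=True,       # let BigQuery infer the schema from the data
)

# Stream the local CSV file into a load job and wait for it to finish.
with open("data.csv", "rb") as source_file:
    load_job = client.load_table_from_file(source_file, table_id, job_config=job_config)
load_job.result()

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}.")
```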

2. Upload Data from JSON Files

To upload data from JSON files, repeat the steps to create or select the dataset and table you are working with, and then select JSON as the file format. You can upload a JSON file from your computer, Google Cloud Storage, or Google Drive. 

Further information about the JSON format is available in the Google Cloud documentation.
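
Loading a newline-delimited JSON file from your machine looks almost identical to the CSV case; only the source format changes. This is a minimal sketch with hypothetical table and file names.

```python
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.my_dataset.json_table"  # hypothetical destination table

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # or pass an explicit schema, as shown earlier
)

with open("data.json", "rb") as source_file:  # newline-delimited JSON: one record per line
    load_job = client.load_table_from_file(source_file, table_id, job_config=job_config)
load_job.result()
```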

3. Upload Data from Google Cloud Storage

Google Cloud Storage allows you to securely store and transfer data online. The following file formats can be uploaded from Google Cloud Storage to Google BigQuery:

  • CSV
  • JSON
  • Avro
  • Parquet 
  • ORC 
  • Datastore exports

You can read more about using Cloud Storage with big data in the following documentation. 
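
Loading from Cloud Storage is usually the fastest route for large files, because the data never passes through your machine. Here is a minimal sketch of a load from a Cloud Storage URI; the bucket, object path, and table ID are hypothetical, and Parquet is used because it is self-describing.

```python
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.my_dataset.gcs_table"    # hypothetical destination table
uri = "gs://my-bucket/exports/data-*.parquet"   # hypothetical bucket and object pattern

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,  # self-describing, so no schema needed
)

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # wait for the load job to complete

print(f"Loaded {client.get_table(table_id).num_rows} rows from Cloud Storage.")
```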


4. Upload Data from Other Google Services

  1. Configure BigQuery Data Transfer Service:
  • Select or create a data project.
  • Enable billing for the project (mandatory for certain services).
  2. Services Requiring Billing:
  • Campaign Manager
  • Google Ad Manager
  • Google Ads
  • YouTube – Channel Reports
  • YouTube – Content Owner Reports
  3. Starting the Service:
  • Go to the BigQuery home page and select Transfers from the left-hand menu.
  • Admin access is required to create a transfer.
  4. Select Data Source:
  • Choose your desired data source in the subsequent window.
  5. Access Methods:
  • Access the BigQuery Data Transfer Service via:
    • Google Cloud console
    • Classic BigQuery web UI
    • bq command-line tool
    • BigQuery Data Transfer Service API (a minimal Python sketch follows this list)
  6. Data Upload:
  • The service automatically uploads data to BigQuery on a regular schedule.
  • Note: The service cannot be used to download data from BigQuery.
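
For illustration only, here is a rough sketch of creating a transfer configuration through the Data Transfer Service API with the google-cloud-bigquery-datatransfer Python client. The project, dataset, data_source_id, params, and schedule below are assumptions; each service expects its own data source ID and parameters, so check the Transfers UI or that service's documentation for the exact values.

```python
from google.cloud import bigquery_datatransfer

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

# Hypothetical project; the config below mimics a Google Ads transfer.
parent = transfer_client.common_project_path("my-project")

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="my_dataset",        # hypothetical destination dataset
    display_name="Example Google Ads transfer",
    data_source_id="google_ads",                # assumed ID; varies per service
    params={"customer_id": "123-456-7890"},     # illustrative parameter only
    schedule="every 24 hours",
)

transfer_config = transfer_client.create_transfer_config(
    parent=parent,
    transfer_config=transfer_config,
)
print(f"Created transfer config: {transfer_config.name}")
```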

5. Download Data with the API

With Cloud Client Libraries, you can use your favourite programming language to work with the Google BigQuery API. 

You can read more about downloading data through the API in the Google Cloud documentation.

To start, you need to create or select the project with which you want to work. On the home page, go to the APIs section.
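
Once the API is enabled for your project, the Cloud Client Libraries let you pull data out with a few lines of code. The sketch below runs a query against one of Google's public datasets and prints the rows; swap in your own project and SQL.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses your default project and credentials

# Query a public dataset and iterate over the rows returned by the API.
query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(row["name"], row["total"])
```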

Learn more about Google Analytics to BigQuery Integration.


Conclusion

In this blog, you learned about Google BigQuery and how to load data into BigQuery. You also explored several ways of uploading different data types to Google BigQuery, including CSV and JSON files. But if you want to automate your data flow, try Hevo.

Hevo is a No-code Data Pipeline. It supports pre-built integrations from 150+ data sources. You can load data into BigQuery from your desired data source in a few minutes.

Give Hevo a try by signing up for a 14-day free trial today and see how Hevo will suit your organization’s needs. Check out the pricing details to find the right plan for you.

Load data into BigQuery and share your experience with us in the comment section below.

Frequently Asked Questions

1. Is BigQuery a database or data warehouse?

Google BigQuery is a fully managed, serverless data warehouse offered by Google Cloud.

2. What is the fastest way to load files into BigQuery?

The fastest way to load files into BigQuery generally involves using Google Cloud Storage as an intermediary, leveraging the bq command-line tool or API, optimizing file formats, and using features like partitioning and clustering.

3. How do I load data to Google BigQuery?

To load data to Google BigQuery, you can upload files directly (e.g., CSV, JSON) via the BigQuery web interface or use the BigQuery Data Transfer Service to import data from various Google services. Alternatively, you can use the bq command-line tool or BigQuery API for more advanced loading options.

4. How do I load data into an existing table in BigQuery?

To load data into an existing table in BigQuery, use the BigQuery web interface to select the table, then upload your data file or configure a data transfer. You can also use the bq command-line tool's bq load command, specifying the destination table and source file.
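
Programmatically, appending to an existing table comes down to setting the write disposition on the load job. A minimal sketch with the Python client, using a hypothetical table and CSV file:

```python
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.my_dataset.existing_table"  # hypothetical existing table

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,  # append rather than overwrite
)

with open("new_rows.csv", "rb") as f:
    client.load_table_from_file(f, table_id, job_config=job_config).result()
```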

5. How do I upload a database file to BigQuery?

To upload a database file to BigQuery, convert the database file (e.g., CSV or JSON) into a supported format and use the BigQuery web interface to import the file directly. Alternatively, you can use the bq command-line tool with the bq load command to specify the dataset and table for the upload.

Muhammad Faraz
Technical Content Writer, Hevo Data

Muhammad Faraz is an AI/ML and MLOps expert with extensive experience in cloud platforms and new technologies. With a Master's degree in Data Science, he excels in data science, machine learning, DevOps, and tech management. As an AI/ML and tech project manager, he leads projects in machine learning and IoT, contributing extensively researched technical content to solve complex problems.