Do you use Google BigQuery to store your data? Do you want to export your data from Excel to BigQuery? If yes, then this blog will answer all your queries. Excel is a popular spreadsheet application that supports mathematical and statistical calculations, graphs, charts, tables, and more. Moreover, data connectors and ETL tools often provide streamlined interfaces for importing Excel data into BigQuery, automating the intermediate conversion steps.

You can load data from multiple sources in Google BigQuery for better insights. This blog will teach you about different approaches to connecting Excel to BigQuery. You will also go through the limitations of these methods.

Note: Before you attempt to upload Excel data to BigQuery, check for data quality issues and adjust your BigQuery table schema accordingly. Also note the direction of the transfer: connecting BigQuery to Excel typically means pulling data from BigQuery into a spreadsheet, whereas this blog covers the reverse, pushing data from Excel into BigQuery.

Methods to Load Data from Excel to BigQuery

Connect Excel to BigQuery using Hevo’s no-code Data Pipeline

Hevo is the only real-time ELT no-code Data Pipeline platform that cost-effectively automates data pipelines that are flexible to your needs. With integrations for 150+ data sources (40+ free sources), Hevo helps you not only export data from sources and load it into destinations, but also transform and enrich your data to make it analysis-ready.

Start for free now!

Get Started with Hevo for Free

Method 1: Load Data from Excel to BigQuery Using CSV

The steps to upload an Excel file to BigQuery using CSV are listed below:

  • Go to the BigQuery web console and click “Create table”, then “Create a table from”.
  • Next, you can specify the CSV file, which will act as a source for your new table.
Excel to BigQuery: Create Table
  • The “Source” dropdown lets you select among various sources, such as Cloud Storage.
  • In “File format”, select CSV.
  • Select a dataset and give your table a name.
  • You can either upload a sample JSON file to specify the schema or leave the schema definition set to “auto-detect”.
Excel to BigQuery: Select Automatically detect schema

Some other configurable parameters include the field delimiter, the number of header rows to skip, the number of errors allowed, and whether to accept jagged rows.

  • Clicking on “Create Table” will now fetch your CSV, ascertain the schema, create the table, and populate it with the CSV data.
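
Note that these steps assume the Excel workbook has already been exported as CSV. As a minimal sketch of that conversion, you could use pandas in Python (the file names sales.xlsx and sales.csv below are placeholders, and the openpyxl package must be installed for .xlsx files):

import pandas as pd

# Read the first worksheet of the workbook (file name is a placeholder)
df = pd.read_excel("sales.xlsx", sheet_name=0)

# Write a plain CSV file that the BigQuery console can ingest
df.to_csv("sales.csv", index=False)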

Limitations

Some limitations of using CSV to connect Excel to Google BigQuery are listed below:

  • Files must be loaded individually; wildcards and multiple selections are not allowed when you load files from a local data source.
  • Excel files loaded from a local data source must be 10 MB or less and must contain fewer than 16,000 rows.

Method 2: Upload XLSX to BigQuery Using the BigQuery API

The BigQuery API allows you to store data in the cloud from various sources, including Excel. The API supports uploading files via the multipart method, which is a good fit for smaller files; if a multipart upload fails, it must be restarted from the beginning.

For larger files, the “resumable upload” method can be used: it lets you create an upload session and resume a partially completed upload from where it was interrupted. The downside of this strategy is that it requires developer resources, and you will need to maintain and adjust your programs over time.
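
As an illustration of the resumable approach, the official Python client library’s load_table_from_file method streams a local file to BigQuery using a resumable media upload under the hood. The sketch below is a minimal example, assuming your Excel sheet has already been saved as user_details.csv and that the dataset and table names are placeholders:

from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the header row exported from Excel
    autodetect=True,      # let BigQuery infer the schema
)

# load_table_from_file performs a resumable media upload under the hood
with open("user_details.csv", "rb") as f:
    load_job = client.load_table_from_file(
        f, "my_dataset.user_details", job_config=job_config
    )

load_job.result()  # block until the load job completes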

Method 3: Load Data from Excel to BigQuery Using Hevo Data

Hevo is a No-code Data Pipeline. It supports pre-built integration from 150+ data sources. It allows you to load Excel/CSV files from various sources, like S3, FTP/SFTP, Google Drive, Box, etc., and load them into BigQuery. With Hevo, you can easily create an ETL pipeline using multiple Excel sheets and load that data into BigQuery or any warehouse of your choice. Hevo also allows you to perform transformations and preprocess your Excel data, before loading it into BigQuery. 

Many of the limitations listed in the other methods either do not exist in Hevo or are handled by it automatically.

Some of the key features of Hevo Data are listed below:

  • Fully Managed: It requires no management and maintenance as Hevo is a fully automated platform.
  • Data Transformation: It provides a simple interface to perfect, modify, and enrich the data you want to transfer. 
  • Real-Time: Hevo offers real-time data migration. So, your data is always ready for analysis.
  • Schema Management: Hevo automatically detects the schema of the incoming data and maps it to the destination schema.
  • Live Monitoring: Advanced monitoring gives you a one-stop view to watch all the activities that occur within pipelines.
  • Live Support: The Hevo team is available round the clock to extend exceptional support to its customers through chat, email, and support calls.
SIGN UP HERE FOR A 14-DAY FREE TRIAL!

Method 4: Load Data from Excel to BigQuery Using DataPrep 

DataPrep is a cloud-based data service to visually explore, prune, pre-process, and prepare data for analysis and machine learning. Cloud Dataprep can directly import Excel workbooks and folders containing workbooks, as datasets. 

Steps to load data from Excel to BigQuery using DataPrep are listed below:

  1. Import Data in DataPrep
  2. Load Data from DataPrep to BigQuery

Step 1: Import Data in DataPrep

  • On the DataPrep console, inside the Library page, click “Import Data”.
  • Select the Excel Workbook that you need to import. 
  • By default, each worksheet in the workbook is imported as an individual dataset. You can change how the data is imported via the EDIT option (e.g., you can choose to import all worksheets into a single dataset).
  • After you add the datasets, you can edit their names and descriptions.

Step 2: Load Data from DataPrep to BigQuery

  • If you want any data transformations, define a recipe in DataPrep that works on the newly created datasets. 
  • In the left navigation bar, click the Jobs icon.
  • This opens the Job Definitions page; click a job identifier to open it on the Job Details page.
  • Click the Output Destinations tab and select BigQuery as the destination.
Excel to BigQuery: Load Data from DataPrep to BigQuery

Limitations 

Some limitations of using DataPrep for loading data from Excel to BigQuery are listed below:

  • If your data in Excel cells contains quotes, make sure every opening quote has a matching terminating quote; otherwise, the results are undefined.
  • Compressed and password-protected files are not supported.
  • Object and Array data types in DataPrep are written back to BigQuery as string values, so avoid nested structures in the source Excel files.
  • BigQuery does not support destinations with a dot (.) in the name. 

Method 5: Load Data from Excel to BigQuery Using Cloud Storage

You can export your Excel data as CSV and upload it to Google Cloud Storage (GCS). Next, you have the following methods to load this data into BigQuery:

  • Using the bq load command, via the command line.

The following syntax may be used to transfer CSV files from your local computer to the GCS bucket:

gsutil cp path/file_name.csv gs://bucket_name/

Next, import your CSV data into BigQuery using the bq load command. Indicate the destination dataset and table along with the file’s GCS location. You can also specify other parameters, such as the schema and the field delimiter, as needed.

bq load --source_format=CSV dataset_name.table_name gs://bucket_name/file_name.csv

After making the necessary field replacements, run the command. Use the bq show command to inspect the resulting table.

For example, if “user_details_json” is a table containing user data, the following bq command-line invocation displays its schema and metadata:

bq show test-applications-315905:test_dataset.user_details_json
  • Using the Cloud console or web UI, where you can specify a JSON file containing the desired schema definition (see the example schema after this list).
  • Using the jobs.insert API method, for example via the Python client library:
from google.oauth2 import service_account
from google.cloud import bigquery

# Create authentication credentials from a service account key file
project_id = "test-applications-xxxxx"
table_id = f"{project_id}.test_dataset.user_details_python_csv"
gcp_credentials = service_account.Credentials.from_service_account_file('test-applications-xxxxx-74dxxxxx.json')

# Create the BigQuery client
bq_client = bigquery.Client(credentials=gcp_credentials, project=project_id)

# Define the table schema and CSV load options
job_config = bigquery.LoadJobConfig(
    schema=[
        bigquery.SchemaField("user_id", "INTEGER"),
        bigquery.SchemaField("first_name", "STRING"),
        bigquery.SchemaField("last_name", "STRING"),
        bigquery.SchemaField("age", "INTEGER"),
        bigquery.SchemaField("address", "STRING"),
    ],
    skip_leading_rows=1,
    source_format=bigquery.SourceFormat.CSV,
)

# CSV file location (note: load_table_from_uri requires a gs:// URI,
# not an https:// URL)
uri = "gs://test_python_functions/user_details.csv"

# Create and run the load job
csv_load_job = bq_client.load_table_from_uri(
    uri, table_id, job_config=job_config
)

csv_load_job.result()  # wait for the load job to complete
  • Using client libraries (custom programming) for Java, Python, C#, Node.js, etc.
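
After a load job such as the Python example above completes, you can optionally sanity-check the result with the same client (reusing the bq_client and table_id variables from that example):

# Optional: confirm the load populated the table
destination_table = bq_client.get_table(table_id)
print(f"Loaded {destination_table.num_rows} rows into {table_id}.")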
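
For the Cloud console option mentioned above, the JSON schema file is simply an array of field definitions. A minimal, hypothetical schema matching the user details table used in these examples might look like this:

[
  {"name": "user_id", "type": "INTEGER", "mode": "NULLABLE"},
  {"name": "first_name", "type": "STRING", "mode": "NULLABLE"},
  {"name": "last_name", "type": "STRING", "mode": "NULLABLE"},
  {"name": "age", "type": "INTEGER", "mode": "NULLABLE"},
  {"name": "address", "type": "STRING", "mode": "NULLABLE"}
]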

Limitations

Some limitations to take care of are:

  • All your values must be scalar; nested or repeated data is not supported.
  • You cannot mix compressed and uncompressed files in a single load job.
  • Your DATE columns must use the “-” separator, and the only supported format is YYYY-MM-DD (year-month-day), e.g., 2024-01-31.
  • The same applies to TIMESTAMP columns; additionally, the hh:mm:ss (hour-minute-second) portion of the timestamp must use a colon (:) separator, e.g., 2024-01-31 13:45:00.

Learn more about how to import Excel into MySQL and Excel into PostgreSQL.

Excel to BigQuery: Use Cases

Transferring data from an Excel file to BigQuery offers various advantages, and below are some of the use cases:

  • Advanced Analytics: With BigQuery’s robust data processing capabilities, you can execute complex analysis and queries on your Excel data that would not be feasible in Excel alone.
  • Data Consolidation: If you are employing multiple data sources alongside Excel, syncing them with BigQuery enables you to centralize your data in one place, providing a comprehensive view of your operations. Moreover, you can set up a change data capture process to prevent discrepancies in your data.
  • Data Security and Compliance: BigQuery boasts robust data security features. By syncing Excel data to BigQuery, you can ensure your data is secure and set up advanced data governance and compliance management.
  • Scalability: BigQuery can accommodate large volumes of data without impacting performance, making it an ideal solution for growing businesses with expanding Excel data.
  • Reporting and Visualization: While Excel provides reporting tools, data visualization tools like Tableau, Power BI, and Looker (Google Data Studio) can connect to BigQuery, providing more advanced business intelligence options.

Conclusion

In this blog, you have learned about Excel and Google BigQuery. You also learned about five different approaches to load data from Excel to BigQuery. You can use any of the stated methods according to your requirements and business needs. All the methods encounter some limitations. So, if you are looking for a fully automated solution to load data from Excel to BigQuery, then try Hevo.

Hevo is a No-code Data Pipeline. It supports pre-built integration from 150+ data sources. Hevo provides you with a completely automated solution within minutes.

VISIT OUR WEBSITE TO EXPLORE HEVO

Want to take Hevo for a spin?

SIGN UP and experience the feature-rich Hevo suite first hand.

Tell us about your experience of loading data from Excel to BigQuery in the comment section below.

Pratik Dwivedi
Freelance Technical Content Writer, Hevo Data

Pratik writes about various topics related to the data industry and loves creating engaging content on data analytics, machine learning, AI, big data, and business intelligence.

Dimple
Customer Experience Engineer, Hevo Data

Dimple, an experienced Customer Experience Engineer, possesses four years of industry proficiency, with the most recent two years spent at Hevo. Her work significantly contributes to refining customer experiences within the innovative data integration platform.