The exponential growth of data generated by modern businesses across SaaS applications, marketing channels, and other sources has compelled many of them to move from on-premises databases to cloud-based data warehouses.

In traditional data warehousing solutions, business teams often find it challenging to run large queries or apply machine learning to study trends, creating a need for a highly scalable and fast solution.

Google BigQuery provides an effective and unified solution for data scaling demands. It can handle varying workloads and optimally allocate computing resources to query requests, ensuring high performance.

Teams often need to export data from BigQuery to CSV for further business analysis. In this article, you will learn three easy methods to export BigQuery data to CSV format.

What is BigQuery?


Google BigQuery is a fully managed, serverless enterprise cloud data warehouse. It uses Dremel technology, which executes SQL queries as distributed execution trees, and its column-based storage format gives it outstanding query performance.

Key Features:

  • Machine Learning: BigQuery ML allows users to train and run machine learning models in BigQuery using only SQL syntax (a minimal sketch follows this list).
  • Serverless Architecture: BigQuery manages servers and storage in the background, so users do not have to provision or maintain any infrastructure.
  • High Scalability: It scales seamlessly to handle petabytes of data.
  • SQL Compatibility: It supports ANSI SQL, which is useful for people who already know SQL and want to write and run queries. It also integrates with various BI tools for data visualization.
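
As a quick illustration of the BigQuery ML feature mentioned above, the sketch below trains a simple linear regression model by submitting SQL through the Python client; the dataset, table, and column names are purely hypothetical.

from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical dataset, table, and columns used purely for illustration.
create_model_sql = """
CREATE OR REPLACE MODEL `mydataset.sales_forecast_model`
OPTIONS (model_type = 'linear_reg', input_label_cols = ['revenue']) AS
SELECT ad_spend, region, revenue
FROM `mydataset.sales_history`
"""

client.query(create_model_sql).result()  # runs the training job and waits for it to finish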

Methods to Export BigQuery Table to CSV

The option to download a BigQuery table as a CSV file offers a valuable solution for users seeking flexibility and accessibility. BigQuery doesn’t allow you to export table data directly to your local storage, Google Sheets, etc. First, the data is exported to Google Cloud Storage (GCS), and from there, you can download it in CSV format to your desired destination (a sketch of this download step follows Method 3). 

Before beginning the export process, ensure that you have the following permissions to execute the BigQuery export to CSV operation:

  • bigquery.tables.export permissions to export data from the tables.
  • bigquery.jobs.create permissions to run an export job.
  • storage.objects.create and storage.objects.delete permissions to write data to an existing Cloud Storage bucket.
Use Hevo for Simplified BigQuery Data Migrations

Looking for the best ETL tools to connect your data sources? Rest assured, Hevo’s no-code platform helps streamline your ETL process. Try Hevo and equip your team to: 

  1. Integrate data from 150+ sources (60+ free sources).
  2. Utilize drag-and-drop and custom Python script features to transform your data.
  3. Leverage a risk management and security framework for cloud-based systems with SOC 2 compliance.
Start BigQuery Data Migrations

Method 1: Using Cloud Console

This is the easiest way to export BigQuery data to CSV, as it does not require any technical knowledge. 

  • Step 1: Go to the BigQuery page in the Google Cloud console. 
  • Step 2: Navigate to the Explorer panel and select the desired table from your project.
  • Step 3: From the details panel, click on the Export option and select Export to Cloud Storage.
  • Step 4: An Export table to Google Cloud Storage dialog box will pop up on your screen. Select the desired destination bucket, and specify the folder and filename where your data will be stored as CSV. 
  • Step 5: From the Export format drop-down menu, select CSV. For Compression, select None or GZIP (the only compression type supported for CSV exports).
  • Step 6: Click on the Export button to complete the process.

Limitations of this method

  • Table data cannot be exported to Google Sheets, Google Drive, or a local file. Cloud Storage is the only supported export location.
  • There is no guarantee about the size of the files produced when using the EXPORT DATA statement.
  • An export task may create a variable number of files.
  • INT64 (integer) data types are encoded as JSON strings when exported in JSON format in order to maintain 64-bit accuracy when other systems read the data.
  • A single export task cannot export data from more than one table.
  • When you export data via the Google Cloud interface, you are limited to the GZIP compression option.
  • Unless you utilize the EXPORT DATA statement and include an ORDER BY clause in the query_statement (see the sketch below), the order of the exported table data is not guaranteed.
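
Several of these limitations can be worked around with the EXPORT DATA statement, which runs as a regular SQL query. Below is a minimal sketch submitted through the Python client; the bucket, table, and created_at ordering column are placeholders.

from google.cloud import bigquery

client = bigquery.Client()

# Placeholder bucket, table, and ORDER BY column; the wildcard URI lets BigQuery
# split the output into multiple files if needed.
export_sql = """
EXPORT DATA OPTIONS (
  uri = 'gs://bucket-one/exports/mytable-*.csv',
  format = 'CSV',
  overwrite = true,
  header = true,
  field_delimiter = ','
) AS
SELECT *
FROM `maindataset.mytable`
ORDER BY created_at
"""

client.query(export_sql).result()  # runs the export job and waits for completion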

Method 2: Using the bq extract command in the bq command-line tool

Using the bq command-line tool, you can export BigQuery table data to CSV with the bq extract command. The general syntax is shown below:

bq --location=location extract \
--destination_format format \
--compression compression_type \
--field_delimiter delimiter \
--print_header=boolean \
project_id:dataset.table \
gs://bucket/filename.ext

  • bq --location=location extract: Extracts data from a BigQuery table in a specified location.
  • --destination_format format: Sets the format of the output file (e.g., CSV, JSON, etc.).
  • --compression compression_type: Specifies whether to compress the output (e.g., GZIP).
  • --field_delimiter delimiter: Defines the character separating fields in the output file (e.g., comma for CSV).
  • project_id:dataset.table gs://bucket/filename.ext: Exports the table to a file stored in Google Cloud Storage (GCS).

The above code snippet has the following parameters:

  • Location: The location of your BigQuery dataset, for example, 'US' for the United States multi-region.
  • Destination Format: The file format you want in Google Cloud Storage (GCS), such as CSV.
  • Compression Type: A compression type supported by the chosen file format (for CSV, GZIP or NONE).
  • Delimiter: The character that marks the boundary between columns in the exported file; for a tab delimiter, both \t and tab are accepted, while a comma is typical for CSV.
  • Boolean: Whether to print a header row in the exported data; the default value is true.
  • Project_ID, Dataset, Table: Details of the table you are exporting data from.
  • Bucket, Filename.ext: The file’s storage location in GCS and the filename with the desired extension.

Given below is an example of exporting ‘maindataset.mytable’ to a GZIP-compressed CSV file named onefile.csv in a GCS bucket named bucket-one.

bq extract \
--compression GZIP \
'maindataset.mytable' \
gs://bucket-one/onefile.csv

  • bq extract: Command to export data from a BigQuery table.
  • --compression GZIP: Compresses the exported data using GZIP.
  • 'maindataset.mytable': Specifies the BigQuery table to be exported (in maindataset and named mytable).
  • gs://bucket-one/onefile.csv: Destination file path in Google Cloud Storage (GCS) where the exported data will be saved.
  • The table data is exported as a compressed CSV file to the specified GCS location.

Method 3: Submitting an Extract Job via Client Libraries (APIs)

It is also possible to export a BigQuery table to CSV using client libraries in various programming environments such as C#, Go, Java, Node.js, PHP, Python, and Ruby. To begin, you need to install the client library for your language and then start writing queries. For instance, you can use the following sample Python code to export a table to CSV:

from google.cloud import bigquery

client = bigquery.Client()

bucket_name = 'bucket-one'  # GCS bucket that will receive the exported file
project = "project-one"     # project containing the source table
dataset_id = "one"          # dataset containing the table to export
table_id = "onefile"        # name of the table to export

destination_uri = "gs://{}/{}".format(bucket_name, "onefile.csv")
dataset_ref = bigquery.DatasetReference(project, dataset_id)
table_ref = dataset_ref.table(table_id)

extract_job = client.extract_table(
    table_ref,
    destination_uri,
    # Location must match that of the source table.
    location="US",
)  # API request; CSV is the default destination format
extract_job.result()  # Waits for job to complete.

print(
    "Exported {}:{}.{} to {}".format(project, dataset_id, table_id, destination_uri)
)
  • from google.cloud import bigquery: Imports the BigQuery library for interacting with Google BigQuery.
  • client = bigquery.Client(): Creates a BigQuery client to execute commands.
  • The code defines the project, dataset, table, and a Cloud Storage bucket where the table will be exported as a CSV file.
  • client.extract_table(...): Initiates an export job to extract the BigQuery table and save it to the specified Google Cloud Storage (GCS) location.
  • extract_job.result(): Waits for the export job to finish; the final print statement then confirms the export.
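
None of these methods write directly to your local machine, so the final step is to download the CSV file from Cloud Storage. Below is a minimal sketch using the google-cloud-storage Python client, reusing the bucket and filename from the example above; gsutil or the Cloud Console would work just as well.

from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.bucket("bucket-one")  # bucket the export job wrote to
blob = bucket.blob("onefile.csv")             # object created by the export job
blob.download_to_filename("onefile.csv")      # saves the CSV to the current working directory

print("Downloaded gs://bucket-one/onefile.csv to the local file onefile.csv")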

Learn how to export data from SQL*Plus to CSV for easy data manipulation and reporting in your preferred format.

Conclusion

In this article, you learned how to export a BigQuery table to CSV using three techniques. Google BigQuery has become immensely popular among companies with growing data storage needs, as it provides a reliable, secure, and scalable data warehouse with accelerated analytics.

If you are using Google BigQuery as a data warehousing and analysis platform for your business and are searching for a stress-free alternative to manual data integration, Hevo can automate this for you. With strong integration with 150+ data sources, Hevo allows you to not only export and load data but also transform and enrich it, making it analysis-ready in a jiffy.

Sign up for a 14-day free trial and simplify your Data Integration process. Check out the Hevo Pricing details to understand which plan fulfills all your business needs.

Frequently Asked Questions

1. How to export Oracle query results to CSV?

Using SQL*Plus, SQL Developer, or SQL*Loader.

2. How do I export an Access query to a CSV file?

Using Microsoft Access’s export feature or VBA code.

3. How to convert SQL query result to CSV in Python?

Using pandas with a database connector to execute the query and save results to a CSV file.
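
As a minimal sketch of that approach, assuming a local SQLite database file example.db and a hypothetical table named sales:

import sqlite3

import pandas as pd

# Any DB-API or SQLAlchemy connection works; SQLite is used here only for simplicity.
conn = sqlite3.connect("example.db")
df = pd.read_sql_query("SELECT * FROM sales", conn)  # 'sales' is a placeholder table name
df.to_csv("query_results.csv", index=False)          # write the query results to CSV without the index
conn.close()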

Sanchit Agarwal
Research Analyst, Hevo Data

Sanchit Agarwal is an Engineer turned Data Analyst with a passion for data, software architecture and AI. He leverages his diverse technical background and 2+ years of experience to write content. He has penned over 200 articles on data integration and infrastructures, driven by a desire to empower data practitioners with practical solutions for their everyday challenges.