Connect Google BigQuery to Google Sheets: 3 Easy Methods
Google Sheets offers its users a convenient way of interacting with data. Spreadsheets can be shared easily, and they are especially great for analyzing data, creating charts and graphs, organizing email lists, and managing accounting ledgers, among many other business use cases. These features have made businesses rely heavily on Google Sheets for day-to-day operations.
Given the ease with which data can be stored and analyzed on Google Sheets, businesses often look to export data from their Data Warehouse such as Google BigQuery to Google Sheets to serve a variety of use cases.
This article provides 3 methods for setting up your Google BigQuery to Google Sheets integration. In the first method, you will learn how to write a background Cloud Function that reacts to triggers in Cloud Storage. Once a CSV file is staged to Cloud Storage, the function will update a Google Sheets document with the contents of the file using the Sheets API.
The second method uses Google Sheets add-on functionality to migrate data and tables from Google BigQuery into Google Sheets in a scheduled manner. Our third method automates this whole process by using a Data Pipeline which can encompass your other data sources as well. Read along to decide which method suits you the best.
Table of Contents
- Prerequisites
- What is Google BigQuery?
- What is Google Sheets?
- Methods to Connect Google BigQuery to Google Sheets
- Conclusion
Prerequisites
- A Google Cloud Storage Bucket.
- A Google BigQuery table.
- Google Cloud SDK installed on your system.
- Linux or Mac OS running on your system.
What is Google BigQuery?
Google BigQuery is a Cloud-based Data Warehouse that provides seamless scalability, simplicity, and abstraction. Unlike its peers, Google BigQuery is serverless and hence provides services dynamically which means that no provisioning or hardware management is required from the user.
Google BigQuery is built in such a way that it’s best utilized while running complex and analytical queries. This helps a great deal in supporting the load of running such heavy queries on your relational Databases.
The specialty of this Data Warehouse lies in its integration with other Google services like Google Sheets, Google Drive, etc. This makes Google BigQuery a very attractive market choice. Moreover, it does not require maintenance of any kind and is present in all locations where Google Cloud is available.
To learn more about Google BigQuery, visit here.
What is Google Sheets?
Google Sheets is a free web-based spreadsheet program provided by Google. It allows users to create and edit spreadsheets, but also, more importantly, it allows multiple users to simultaneously collaborate on a single document in real-time. It’s a part of the Google suite of applications, a collection of free productivity apps owned and maintained by Google.
Despite being free, Google Sheets is a fully functional spreadsheet program, with most of the capabilities and features of more expensive spreadsheet software. Google Sheets is compatible with the most popular spreadsheet file formats, so you can continue working on your existing files.
With Google Sheets, like all Google Drive programs, your files are accessible via computer and/or mobile devices, and you can begin working right where you left off, using its auto-save feature.
To know more about Google Sheets, visit this link.
Methods to Connect Google BigQuery to Google Sheets
Method 1: Manual ETL Process to Set up Google BigQuery to Google Sheets Integration
The manual method of connecting Google BigQuery to Google Sheets consists of 7 steps: you use the Google BigQuery PHP client library to export table data into Cloud Storage, running the export job with a simple PHP script. You then deploy a background Cloud Function that listens for changes in Cloud Storage and updates the Google Sheets spreadsheet that you specified.
Method 2: Scheduling Data Transfer Using Google Sheets Add-on
Google Sheets offers a ton of add-ons to support data integration between Google Services like BigQuery and Google Sheets. This method connects your data in BigQuery to be visualized in Google Sheets using a free add-on OWOX BI.
Method 3: Using Hevo Data to Set up Google BigQuery to Google Sheets Integration
Hevo Data is an automated Data Pipeline platform that can move your data from Google BigQuery to Google Sheets very quickly without writing a single line of code. It is simple, hassle-free, and reliable.
Moreover, Hevo offers a fully-managed solution to set up data integration from Google Sheets for free and 100+ other data sources (including 40+ free data sources) and will let you directly load data to a Data Warehouse such as Google BigQuery, Snowflake, Amazon Redshift, etc. or the destination of your choice.
It will automate your data flow in minutes without writing any line of code. Its Fault-Tolerant architecture makes sure that your data is secure and consistent. Hevo provides you with a truly efficient and fully automated solution to manage data in real-time and always have analysis-ready data.
Get Started with Hevo for Free
Methods to Connect Google BigQuery to Google Sheets
You can transfer your data from Google BigQuery to Google Sheets using the following 3 methods:
- Method 1: Manual ETL Process to Set Up Google BigQuery to Google Sheets Integration
- Method 2: Scheduling Data Transfer Using Google Sheets Add-on
- Method 3: Using Hevo Data to Set Up Google BigQuery to Google Sheets Integration
Method 1: Manual ETL Process to Set up Google BigQuery to Google Sheets Integration
For the scope of this blog, you will be using Google BigQuery’s PHP library for exporting data from BQ to Cloud Storage. You will use a custom PHP exporter script to achieve this.
Currently, you cannot connect Google BigQuery directly to Google Sheets. However, you can export data from Google BigQuery to Cloud Storage in Avro, CSV, or JSON format and then import that data from Cloud Storage into Google Sheets. In other words, Cloud Storage acts as the staging area between the two services.
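For reference, a BigQuery extract job signals its output format through the job's destinationFormat field. A small sketch of the mapping (the format values are from the BigQuery docs; the helper name is illustrative):

```javascript
// Map a desired file extension to the destinationFormat value a
// BigQuery extract job expects (helper name is illustrative).
const EXTRACT_FORMATS = {
  csv: "CSV",
  json: "NEWLINE_DELIMITED_JSON",
  avro: "AVRO",
};

function destinationFormatFor(extension) {
  const format = EXTRACT_FORMATS[extension.toLowerCase()];
  if (!format) {
    throw new Error("Unsupported extract format: " + extension);
  }
  return format;
}
```

Note that only Avro and JSON can represent nested or repeated fields, which matters for the limitations discussed later.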
To set up your manual ETL process, the following steps are required:
- Step 1: Grant the Permissions
- Step 2: Create a Service Account Key
- Step 3: Set the Authentication Environment Variable
- Step 4: Install the Google BigQuery PHP Client Library
- Step 5: Export the Google BigQuery Table Data
- Step 6: Create a Cloud Function to upload CSV Files to a specific Cloud Bucket
- Step 7: Save the Cloud Function
Step 1: Grant the Permissions
The first step in exporting data from Google BigQuery to Google Sheets is to set up permissions: you need access to the Google BigQuery table that holds your data, permission to run an export job from Google BigQuery, and permission to write data to your Cloud Storage bucket. Grant your service account the following Cloud IAM roles:
- bigquery.admin Cloud IAM role.
- storage.admin Cloud IAM role.
- Also enable the Google Sheets API in the Google Cloud Console to allow read and write access to Google Sheets.
Step 2: Create a Service Account Key
In the Google Cloud Console:
- Open the Create Service Account Key page.
- From the list choose New service account.
- Enter a name of your choice in the Service account name field.
- From the Role list, choose Project > Owner.
- Click Create. A JSON file will be automatically downloaded to your computer. This file contains your key.
Step 3: Set the Authentication Environment Variable
Launch your CLI and set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the local path of the JSON file you just downloaded on your computer.
export GOOGLE_APPLICATION_CREDENTIALS="[LOCAL_PATH]"
For example:
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/MY_FIRST_PROJECT.json"
Step 4: Install the Google BigQuery PHP Client Library
From your CLI run this command:
composer require google/cloud-bigquery
This command installs the Google BigQuery PHP client library. This library provides an easy-to-use interface to the Google BigQuery API using familiar PHP constructs.
Step 5: Export the Google BigQuery Table Data
The following PHP script initializes a client and runs an extract job on your Google BigQuery table. The table data will be exported to the Google Cloud Storage bucket specified in the variables. Save this PHP script and run it from your CLI.
<?php
// Import the libraries via the Composer autoloader
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\BigQuery\BigQueryClient;

// Set the configuration values
$projectId = 'The Google project ID';
$datasetId = 'The BigQuery dataset ID';
$tableId = 'The BigQuery table ID';
$bucketName = 'The Cloud Storage bucket name';

// Initialize a BigQuery client by instantiating the BigQueryClient class
$bigQuery = new BigQueryClient([
    'projectId' => $projectId,
]);
$dataset = $bigQuery->dataset($datasetId);
$table = $dataset->table($tableId);
$destinationUri = "gs://{$bucketName}/{$tableId}.csv";

// Define the export format to use
$format = 'CSV';

// Create the extract job configuration
$extractConfig = $table->extract($destinationUri)->destinationFormat($format);

// Run the job (waits for the job to complete)
$job = $table->runJob($extractConfig);
printf('Exported %s to %s' . PHP_EOL, $table->id(), $destinationUri);
Step 6: Create a Cloud Function to Upload CSV Files to a Specific Cloud Bucket
Once CSV files are staged in Cloud Storage, the function will update a Google Sheets spreadsheet using the Sheets API. The function below targets the Node.js 8 runtime; because it relies on async/await, it will not run on the older Node.js 6 runtime. To start:
- Open the Functions Overview page in the Google Cloud Console and create a Function.
- Name your function.
- Select Cloud Storage in the Trigger field.
- In the Source code field, select Inline editor and paste the following script:
// Copyright 2018 Google LLC.
// SPDX-License-Identifier: Apache-2.0
const {google} = require("googleapis");
const {Storage} = require("@google-cloud/storage");
exports.csv2sheet = async (data,context) => {
var filename = data.name;
// Basic check that this is a *.csv file
if (!filename.endsWith(".csv")){
console.log("Not a .csv file, ignoring.");
return;
}
// Define name of new sheet
const sheetName = filename.slice(0,-4);
// Setup auth and Sheets API
// Block on auth + Get the Sheets API object
const auth = await google.auth.getClient({
scopes: [
"https://www.googleapis.com/auth/spreadsheets",
"https://www.googleapis.com/auth/devstorage.read_only"
]
});
// Setup the Sheets API
const sheetsAPI = google.sheets({version: 'v4',auth});
// Use the Sheets API to create an empty sheet
// This function creates a new sheet in the spreadsheet with the given name at position 2,
// With 26 columns and 2000 rows with the first row frozen.
// Returns its sheetId
function addEmptySheet(sheetName) {
return new Promise((resolve, reject) => {
const emptySheetParams = {
spreadsheetId: process.env.SPREADSHEET_ID,
resource: {
requests: [
{
addSheet: {
properties: {
title: sheetName,
index: 1,
gridProperties: {
rowCount: 2000,
columnCount: 26,
frozenRowCount: 1
}
}
}
}
]
}
};
sheetsAPI.spreadsheets.batchUpdate( emptySheetParams, function(err, response) {
if (err) {
reject("The Sheets API returned an error: " + err);
} else {
const sheetId = response.data.replies[0].addSheet.properties.sheetId;
console.log("Created empty sheet: " + sheetId);
resolve(sheetId);
}
});
});
}
// read data from a storage CSV file
// this function reads data from the CSV file uploaded to the storage bucket
// and returns a string of CSV values with carriage returns
function readCSVContent(file) {
return new Promise((resolve, reject) => {
const storage = new Storage();
let fileContents = Buffer.alloc(0); // the Buffer() constructor is deprecated
storage.bucket(file.bucket).file(file.name).createReadStream()
.on('error', function(err) {
reject('The Storage API returned an error: ' + err);
})
.on('data', function(chunk) {
fileContents = Buffer.concat([fileContents, chunk]);
})
.on('end', function() {
let content = fileContents.toString('utf8');
console.log("CSV content read as string : " + content );
resolve(content);
});
});
}
// populate the newly created sheet
// this function populates the sheet referenced by its ID with the data received (a CSV string)
// the function uses 'batchUpdate' to allow for multiple 'requests' to be sent in a single batch.
// Style: set first row font size to 11 and to Bold.
function populateAndStyle(theData, sheetId) {
return new Promise((resolve, reject) => {
const dataAndStyle = {
spreadsheetId: process.env.SPREADSHEET_ID,
resource: {
requests: [
{
pasteData: {
coordinate: {
sheetId: sheetId,
rowIndex: 0,
columnIndex: 0
},
data: theData,
delimiter: ","
}
},
{
repeatCell: {
range: {
sheetId: sheetId,
startRowIndex: 0,
endRowIndex: 1
},
cell: {
userEnteredFormat: {
textFormat: {
fontSize: 11,
bold: true
}
}
},
fields: "userEnteredFormat(textFormat)"
}
}
]
}
};
sheetsAPI.spreadsheets.batchUpdate(dataAndStyle, function(err, response) {
if (err) {
reject("The Sheets API returned an error: " + err);
} else {
console.log(sheetId + " sheet populated with " + theData.length + " rows and column style set.");
resolve();
}
});
});
}
// create a new sheet and remember its ID (based on the filename, removing the .csv extension)
const sheetId = await addEmptySheet(sheetName);
const theData = await readCSVContent(data);
await populateAndStyle(theData, sheetId);
}
- Select Node.js 8 in the Runtime dropdown.
- Click the “More” link at the bottom of the screen and use the dropdown to select your service account.
- Set an environment variable called SPREADSHEET_ID whose value is the ID of your spreadsheet (the long token in the spreadsheet's URL, not its display name).
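The spreadsheet ID sits between /d/ and /edit in the document's URL. A quick sketch (the helper name is my own) for pulling it out:

```javascript
// Extract the spreadsheet ID from a Google Sheets URL, e.g.
// https://docs.google.com/spreadsheets/d/<ID>/edit#gid=0
function spreadsheetIdFromUrl(url) {
  const match = url.match(/\/spreadsheets\/d\/([a-zA-Z0-9_-]+)/);
  if (!match) {
    throw new Error("Not a Google Sheets URL: " + url);
  }
  return match[1];
}
```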
Step 7: Save the Cloud Function
Click “Create” to deploy your function. Now, each time you export a CSV file to the Cloud Storage bucket, you will see the same data materialize in your Google Sheets document!
After implementing all the above 7 steps, your Google BigQuery to Google Sheets connection will be ready!
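The Cloud Storage trigger hands the function an event object carrying name and bucket fields, and the target sheet name is derived by stripping the .csv extension. You can sanity-check that naming logic locally with a small sketch that mirrors the function's checks:

```javascript
// Mirror of the csv2sheet naming logic: only *.csv objects are handled,
// and the target sheet name is the filename without its extension.
function sheetNameFor(event) {
  if (!event.name.endsWith(".csv")) {
    return null; // non-CSV uploads are ignored by the function
  }
  return event.name.slice(0, -4);
}
```

So an export of sales.csv into the bucket would produce (or refresh) a sheet named "sales", while any non-CSV object is ignored.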
Limitations of Building a Custom Code
You may face the following limitations if you manually set up the Google BigQuery to Google Sheets integration:
- You cannot export table data to a single file that exceeds 1 GB. Such a use case requires a wildcard in the destination URI to partition the export output into multiple files.
- You cannot export nested or repeated data using this method. For this, you need to instead export the data in Avro or JSON formats.
- Your Google BigQuery Dataset and Cloud Storage Bucket must all be co-located in the same location for this to work.
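For the first limitation, the workaround is a wildcard in the destination URI, which makes BigQuery shard the output into numbered files. A sketch of how the single-file URI from Step 5 changes (bucket and table names are placeholders):

```javascript
// Build an extract destination URI. With shard=true, BigQuery replaces
// the '*' with 000000000000, 000000000001, ... producing one file per shard.
function destinationUri(bucket, table, shard) {
  const name = shard ? `${table}-*.csv` : `${table}.csv`;
  return `gs://${bucket}/${name}`;
}
```

With sharding enabled you would then need to load each resulting file into the sheet, which is part of what makes the manual approach harder to maintain.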
Method 2: Scheduling Data Transfer Using Google Sheets Add-on
If the custom-script method feels intimidating, here’s a simpler approach to transfer your data from Google BigQuery to Sheets, for free.
This method employs a free Google Sheets add-on called OWOX BI, which can be downloaded and installed from the OWOX BI listing on the Google Workspace Marketplace.
To set up your data transfer from BigQuery to Sheets, all you need to do is follow these three simple steps:
Step 1: Install the OWOX BI add-on in your Google Sheets. Go to Add-ons in the menu bar and select Get add-ons. Google Sheets will redirect you to the Google Workspace Marketplace, where you can search for OWOX BI and install the add-on.
Alternatively, simply visit the link highlighted above and click on Install to install the OWOX BI add-on to your Google Sheets account.
Step 2: Once your OWOX BI add-on is installed, it will be made available under the Add-ons menu, as shown in the image below. You can access its features by simply clicking on the add-on.

Step 3: To get data from BigQuery to Google Sheets, click on Add a new report option. A new bar will appear on the right side of your screen.
From here, you can either choose the BigQuery project whose data you wish to see in your Sheets or create a new SQL query for the selected project. OWOX BI also offers options to select or customize a previously used query from the drop-down list. To edit an existing query, open the OWOX BI Query Editor and enter your new query.

You can also define dynamic parameters for your query if necessary.
Then, choose Add & Run. Your data is now ready to be transferred from Google BigQuery to a new Google Sheets sheet.
Method 3: Using Hevo Data to Set up Google BigQuery to Google Sheets Integration
Hevo Data, a No-code Data Pipeline, helps you directly transfer data from Google BigQuery and 100+ other data sources to Google Sheets, Data Warehouses such as Redshift, Databases, BI tools, or a destination of your choice in a completely hassle-free & automated manner.
Hevo is fully managed and completely automates the process of not only loading data from your desired source but also enriching the data and transforming it into an analysis-ready form without having to write a single line of code. Its fault-tolerant architecture ensures that the data is handled in a secure, consistent manner with zero data loss.
Hevo Data takes care of all your Data Preprocessing needs and lets you focus on key business activities and draw a much more powerful insight on how to generate more leads, retain customers, and take your business to new heights of profitability. It provides a consistent & reliable solution to manage data in real-time and always have analysis-ready data in your desired destination.
Sign up here for a 14-Day Free Trial!
The Advantages of Using Hevo:
- No Data Loss – Hevo has a fault-tolerant architecture that ensures data is moved from Google BigQuery to Google Sheets reliably, without data loss.
- 100’s of Out of the Box Integrations – In addition to Google BigQuery, Hevo brings data from Databases, Cloud Applications, SDKs, and so on into Google Sheets. This ensures that you have a reliable partner to cater to your growing data needs.
- Minimal Setup: The data load from Google BigQuery to Google Sheets can be set up on a point and click interface, without any developer assistance.
- Automatic Schema Detection and Mapping: Hevo scans the schema of incoming data automatically. When changes are detected, it handles them seamlessly by incorporating them in the destination.
- Exceptional Support: Hevo provides live support over email and chat. This ensures that you always have a team that can solve your troubles promptly.
Conclusion
The article introduced you to Google BigQuery and Google Sheets and explained why it is beneficial to transfer data from Google BigQuery to Google Sheets. It provided 3 step-by-step methods that can successfully connect the two platforms. The manual method, although effective, requires you to custom-code the whole ETL process.
Things would be simple if you only needed to export data once from Google BigQuery to Google Sheets: you could run a query in the Google BigQuery console, export the result to Google Cloud Storage, and then load it into Google Sheets. More often than not, though, customers want to connect Google BigQuery to Google Sheets for day-to-day reporting and analysis.
In addition to Google BigQuery, Hevo can load data from a multitude of other data sources including Databases, Cloud Applications, SDKs, and more. This allows you to scale up your data infrastructure on demand and start moving data from all the applications important for your business.
Visit our Website to Explore Hevo
Businesses can use automated platforms like Hevo Data to set up this integration and handle the ETL process. Hevo helps you directly transfer data from Google Sheets (for free) or a source of your choice to a Data Warehouse, Business Intelligence tools, or any other desired destination in a fully automated and secure manner, without having to write any code, providing a hassle-free experience.
Want to try Hevo? Sign Up for a 14-day free trial and experience the feature-rich Hevo suite first hand. Have a look at our unbeatable pricing, which will help you choose the right plan for you.
Do try this out yourself and let us know about your experience loading data from Google BigQuery to Google Sheets in the comments!