Are you trying to move your data from Google Analytics to BigQuery? Are you confused about how to do this easily? If yes, then you are in the right place. This blog covers various methods to connect Google Analytics to BigQuery in a few simple steps.

Methods to Export Data from Google Analytics to BigQuery

There are three methods to export data from Google Analytics to BigQuery:

Method 1: Migrating Data From Google Analytics to BigQuery Using Hevo

Hevo, a No-code Data Pipeline, can be set up to export data from Google Analytics to BigQuery for free without using BigQuery Export in 2 simple steps:

  • Connect the data source by authenticating Google Analytics, as shown in the image below.
Configure Source: Google Analytics Connection Settings
Configure Google Analytics as a Source
  • Configure the BigQuery Data Warehouse setting and set where you want to move your Google Analytics data, as shown in the image below.
Configure Destination: BigQuery Connection Settings
Configure BigQuery as a Destination

The resulting Hevo data pipeline will now reliably move Google Analytics data to the BigQuery Data Warehouse for further analysis.

Simplify your Data Analysis with Hevo today! 


Advantages of Using Hevo 

Hevo’s data integration platform lets you move data from Google Analytics to BigQuery seamlessly. Here are some other advantages:

  • Zero Data Loss – Hevo’s fault-tolerant architecture is built to ensure the completeness of your data, reliably moving it without any loss. 
  • Unlimited Integrations – Hevo can connect to any source and any destination, all while providing a common interface.
  • Low Time to Implementation – Once the simple setup procedure is complete, Hevo can begin migrating data almost immediately.
  • Automatic Schema Detection, Mapping, and Evolution – Hevo analyses the schema of the data it receives for replication and automatically maps it seamlessly onto the BigQuery table structure.
  • Fully Managed – The Hevo platform is fully managed and works out of the box. This will let you focus on extracting insights from your data and not worry about data availability.
  • Alerts and Notification – If any issues occur in the data replication, Hevo automatically notifies the relevant stakeholders of your team with real-time alerts via email or Slack, allowing them to take timely action.
  • Scalability – Hevo is built to handle data of any scale. With Hevo, your business can grow without any data hiccups. 
  • Exceptional Support – Technical support for Hevo is provided on a 24/7 basis over both email and Slack.

Effective business intelligence requires accurate and up-to-date data. Hevo moves your data accurately and in real-time, so your access to it is never compromised. 

Visit our Website to Explore Hevo

Method 2: Export Google Analytics to BigQuery using the BigQuery Data Transfer Service

Google Analytics 360 and BigQuery are both Google products, so it is not surprising that a pre-existing data migration solution for moving information from Google Analytics to BigQuery already exists: the BigQuery Data Transfer Service. However, this solution is neither the simplest nor the easiest to implement.

Follow the steps below to export Google Analytics to BigQuery using BigQuery Data Transfer Service:

Step 1: Create a Google APIs Console Project

In order to use the BigQuery Data Transfer Service for migrating data, you must first set up BigQuery Export. This requires logging into the Google APIs Console and either creating a new project or selecting an existing one, as shown in the image below.

Google APIs Console image

Step 2: Enable BigQuery within the Project

Edit the API Library settings of the chosen project to enable the BigQuery API.

Enable Google BigQuery image

Step 3: Set Up Billing for the Project

Ensure that a billing account has been set up; this is required before the service can be used for data replication. Billing can be validated by creating a dataset in your BigQuery project. If the dataset is created without errors, then billing has been set up correctly. 

In case you want to try this setup without having to input your billing details, you could also try to set up a BigQuery Sandbox.

Step 4: Add the Service Account to the Project

The Google Analytics service account must be added as a member of the project, with project-level permission set to Editor. This is required to allow Google Analytics to export data to BigQuery.

Step 5: Link BigQuery to Google Analytics 360

  • You need an email address that has:
    (a) EDIT access to Google Analytics Property.
    (b) OWNER access to the BigQuery project. 
  • In the Admin panel > Property, link BigQuery by entering your project ID.

Once completed, data transfer will commence within 24 hours. 
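Once the export begins, Google Analytics 360 data lands in BigQuery as daily `ga_sessions_YYYYMMDD` tables inside a dataset named after your Analytics view ID. As a quick sanity check on the linked export, you could run a query like the one this minimal, stdlib-only Python sketch builds; the project and dataset names here are placeholders, not values from this article:

```python
from datetime import date, timedelta

def sessions_query(project: str, dataset: str, days: int = 7) -> str:
    """Build a StandardSQL query counting sessions per day from the
    daily ga_sessions_YYYYMMDD tables produced by the GA360 export."""
    end = date.today()
    start = end - timedelta(days=days)
    return f"""
SELECT date, SUM(totals.visits) AS sessions
FROM `{project}.{dataset}.ga_sessions_*`
WHERE _TABLE_SUFFIX BETWEEN '{start:%Y%m%d}' AND '{end:%Y%m%d}'
GROUP BY date
ORDER BY date"""

# Example: print the query for a hypothetical project and view-ID dataset.
print(sessions_query("my-project", "123456789"))
```

The `ga_sessions_*` wildcard with a `_TABLE_SUFFIX` filter is the standard way to query a range of the daily export tables without listing each one.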

Limitations of Exporting Google Analytics to BigQuery Using the BigQuery Data Transfer Service

Despite the native status of the BigQuery Data Transfer Service and the BigQuery export, this solution has its drawbacks. Listed below are the limitations to exporting Google Analytics to BigQuery using BigQuery Data Transfer Service:

  • Limited Updates Set by Google: When using Google’s Data Streaming option, the update frequency of the Data Warehouse is determined by Google. If you are looking for real-time Data Streaming, this could be a limitation.
  • Pipeline Maintenance: Exporting via the BigQuery Data Transfer Service requires that several key components be completed and maintained.
    • The Google Analytics service account must always have EDIT access to the project. If, for whatever reason, this is changed, then all subsequent exports will fail until the permission has been restored.
    • If the BigQuery API was somehow disabled in the project settings, then the exports will fail.
  • Irretrievable Data Loss: The greatest drawback of using the BigQuery Data Transfer Service is the risk of data loss. If any of the previously mentioned issues were to occur, or anything else that may cause an interruption in the data export, then that data would be lost permanently. The BigQuery Data Transfer Service has no facilities to mitigate data loss due to a failed export.
  • Limited Scope and Usability: Another major issue with using the BigQuery Data Transfer Service is its limitation regarding integration. It cannot be used to integrate platforms and services outside of Google. In fact, there are even some Google services, such as Google Drive, that BigQuery does not integrate with.
  • Limited Scope for Transforming Data: Let’s say you want to change the time from PST to UTC while moving the data from Google Analytics to BigQuery, or you do not want to move the data of specific campaigns. With BigQuery Data Transfer Service, you will not be able to achieve these simple modifications.
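
As an illustration of the kind of in-flight transformation the Transfer Service cannot do, here is a minimal Python sketch that converts hit timestamps from PST to UTC and drops rows from excluded campaigns. The row shape and campaign names are hypothetical examples, not an actual Google Analytics schema:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical rows shaped loosely like Google Analytics hit data.
PST = timezone(timedelta(hours=-8), "PST")
rows = [
    {"campaign": "spring_sale", "hit_time": datetime(2023, 3, 1, 18, 30, tzinfo=PST)},
    {"campaign": "internal_test", "hit_time": datetime(2023, 3, 1, 19, 0, tzinfo=PST)},
]

# Campaigns we do not want to move to the warehouse.
EXCLUDED = {"internal_test"}

# Convert each timestamp to UTC and filter out excluded campaigns.
transformed = [
    {**r, "hit_time": r["hit_time"].astimezone(timezone.utc)}
    for r in rows
    if r["campaign"] not in EXCLUDED
]

print(transformed)  # one row remains, with its hit_time shifted to UTC
```

A pipeline tool that supports transformations (or custom code like the above) is needed for this; the Transfer Service simply replicates the data as-is.
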
Save 20 Hours of Frustration Every Week

Did you know that 75-90% of data sources you will ever need to build pipelines for are already available off-the-shelf with No-Code Data Pipeline Platforms like Hevo? 

Ambitious data engineers who want to stay relevant for the future automate repetitive ELT work and save more than 50% of the time they would otherwise spend maintaining pipelines. Instead, they use that time for higher-value work such as optimizing core data infrastructure, scripting non-SQL transformations for training algorithms, and more. 

Step off the hamster wheel and opt for an automated data pipeline like Hevo. With a no-code intuitive UI, Hevo lets you set up pipelines in minutes. Its fault-tolerant architecture ensures zero maintenance. Moreover, data replication happens in near real-time from 150+ Data Sources to the destination of your choice including Snowflake, BigQuery, Redshift, Databricks, and Firebolt. 

Start saving those 20 hours with Hevo today.

Get started for Free with Hevo!

Method 3: Using Google APIs Console

If your data is in Google Analytics 4, you can export it to a free instance of BigQuery sandbox through the following steps.

1. Create a Google APIs Console Project

Log into the Google APIs Console and either create a new project or select an existing one.

2. Enable BigQuery

  • From the Navigation menu, select APIs & Services, then click Library.
  • Select BigQuery API in the Google Cloud APIs section.
  • Click Enable on the next page.
  • Add a service account to your Cloud project. Make sure that firebase-measurement@system. is a project member with the Editor role assigned.

3. Link BigQuery to a Google Analytics 4 property

  • Go to the Admin tab after logging into your Google Analytics account. Find the Analytics property you need to link to BigQuery.
  • Click BigQuery Linking in the Property column.
  • Click Choose a BigQuery project to see your projects. Click Learn More to create a new BigQuery project.
  • Click Confirm after selecting your project. 
  • Select a location (applicable only if your project doesn’t have a dataset for the Analytics property) and click Next. 
  • Select the data you want to export.
  • If you need to include advertising identifiers, check “Include advertising identifiers for mobile app streams.”
  • Set the Frequency based on your requirement and export.
  • Then, click Submit. Your Google Analytics 4 information will be in your BigQuery project within 24 hours.
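
Once the export lands, GA4 data appears as daily `events_YYYYMMDD` tables inside a dataset named `analytics_<property_id>`. As a quick way to confirm the link is working, you could run a query like the one this stdlib-only Python sketch builds; the project ID, property ID, and date below are placeholders:

```python
def ga4_daily_events_query(project: str, property_id: str, day: str) -> str:
    """Build a StandardSQL query that counts events by name for one day
    of the GA4 BigQuery export (tables are named events_YYYYMMDD)."""
    return f"""
SELECT event_name, COUNT(*) AS events
FROM `{project}.analytics_{property_id}.events_{day}`
GROUP BY event_name
ORDER BY events DESC"""

# Example: print the query for a hypothetical project and GA4 property.
print(ga4_daily_events_query("my-project", "123456789", "20230301"))
```

Seeing familiar event names (such as `page_view` or `session_start`) in the result is a good sign the GA4-to-BigQuery link is healthy.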


This article introduced you to Google Analytics and Google BigQuery. Furthermore, it provided you with a comprehensive guide to exporting data from Google Analytics to BigQuery. It also provided the limitations of doing it manually and the advantages of using Hevo Data.

What’s more? Hevo integrates with a wide array of 150+ data sources, such as Cloud Applications, Cloud Storage, Databases, and more, opening the door for a wide range of future possibilities that your business may need.

Want to take Hevo for a spin?

Sign Up and experience efficient and effective data export from Google Analytics to BigQuery.

Share your experience of exporting your data from Google Analytics to BigQuery in the comment section below!

Vernon DaCosta
Freelance Technical Content Writer, Hevo Data

Vernon is enthusiastic about data science and loves to write on diverse topics related to data, software architecture, and integration.
