
Zendesk to BigQuery: Load Data in Minutes

Do you have a lot of customer service data in Zendesk? Do you want to take a deep dive into that data and perform advanced analytics? If so, transferring your data from Zendesk to a data warehouse like Google BigQuery is a good way to start. This blog post looks at two ways to move data from Zendesk to BigQuery, so that you can pick the one that best suits your needs.

Introduction to Zendesk:

Zendesk is a customer support platform that enables you to track customer support activity across multiple channels. Zendesk unifies these activities through its platform and enables you to track all this activity from a single location. It does this through its comprehensive ticketing system. Zendesk also has CRM functionality that can help you monitor the entire customer lifecycle from first contact to ensuring customer retention.

Zendesk data can be accessed by making calls to its REST API.
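As a minimal sketch of what such a call looks like, the snippet below builds the URL and Basic-auth header for Zendesk's API-token authentication scheme. The subdomain, email, and token values are hypothetical placeholders, not real credentials.

```python
import base64

# Hypothetical placeholders -- substitute your own Zendesk account details.
SUBDOMAIN = "yourcompany"
EMAIL = "agent@example.com"
API_TOKEN = "your_api_token"

def zendesk_request_parts(endpoint):
    """Build the URL and Basic-auth header for a Zendesk API-token request.

    With API tokens, Zendesk authenticates as "{email}/token:{api_token}".
    """
    url = f"https://{SUBDOMAIN}.zendesk.com/api/v2/{endpoint}"
    credentials = f"{EMAIL}/token:{API_TOKEN}".encode()
    headers = {"Authorization": "Basic " + base64.b64encode(credentials).decode()}
    return url, headers

url, headers = zendesk_request_parts("tickets.json")
```

You would then issue a GET request to `url` with those headers using any HTTP client.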

Key Highlights of Zendesk:

  • Easy to set up: Most of Zendesk’s features require no additional configuration and work immediately after setup.
  • Customer Support and Community: You can gather constant feedback from customers through the Zendesk platform. Furthermore, Zendesk provides around-the-clock support for your business.
  • Comprehensive Ticketing System: Zendesk’s ticketing system brings together requests from social media, email, chat, etc. in one location. This makes it easy and fast to respond.
  • Reporting Functionality: Zendesk provides high-level reporting where you can look at an overview of metrics for your customer support activity. 

Introduction to Google BigQuery:

Google BigQuery is a cloud-based data warehouse that is part of the Google Cloud Platform stack. It provides fast, scalable, and easy analysis of big data using standard SQL. BigQuery has earned a good reputation in the market due to its high performance, which can be attributed to its columnar storage and tree architecture. BigQuery is also a Google-managed service, which makes it easier to use than many competing data warehouses. You can read more about BigQuery's features in Google's official documentation.

Key Highlights of BigQuery:

  • Managed Service: Google handles performance tuning and backend configuration for BigQuery. This is different from many other data warehouses where you are required to perform these tasks yourself.
  • Distributed Architecture: You do not have to manually manage compute clusters as Google manages compute resources dynamically.
  • Easy to Use: You only need to load your data into BigQuery and then pay for what you use without having to build your own data center.  
  • Fast and Detailed Insights: BigQuery integrates seamlessly with many front-end analytics tools like Google Data Studio and Tableau. This makes it very easy to understand the data that you have loaded.

Loading Data from Zendesk to BigQuery

There are two popular methods to perform Zendesk to BigQuery data replication.

Method 1: Use a fully-managed automated data pipeline platform like Hevo Data

Method 2: Manually build custom ETL scripts to move data

Zendesk to BigQuery: Loading Data Using Hevo

It is very easy to move data from Zendesk to BigQuery with Hevo. Here are the steps:

  1. Authenticate and connect the Zendesk data source
  2. Connect to your BigQuery account and transfer the data

Hevo will automatically map your data to its relevant tables in BigQuery, giving you access to your Zendesk data in real-time. Sign up for a free, zero-risk 14-day trial with Hevo and experience a smooth data migration from Zendesk to BigQuery.

Hevo also helps you clean, transform and enrich the data before and after you move it to BigQuery, ensuring that your data is analysis-ready at any point in the data warehouse. 

Other advantages of Hevo:

  • Simplicity: Hevo is very intuitive and easy to use. You can set up a data transfer in just a few clicks.
  • Reliable Data Load: Hevo’s fault-tolerant architecture ensures that its data loads are done consistently and reliably with minimal data loss.
  • Real-time: Using Hevo allows you to gain insights in real-time. Hevo’s real-time streaming architecture ensures that you are able to move data instantly and without any delays.
  • Minimal Setup: Setting up Hevo requires minimal effort from your end because it is fully managed and completely automated. 
  • Scalability: Hevo can handle data from a wide variety of sources (sales and marketing applications, databases, analytics applications, etc.) at any scale. Thus, it is able to help you scale your data infrastructure to meet expanding needs.

Zendesk to BigQuery: Manually Writing ETL Scripts

  1. Extract the data from Zendesk
  2. Prepare the data
  3. Load the data into BigQuery

Step 1: Extract the Data from Zendesk

You can gain access to Zendesk data by making calls to its API. To do this, you must be an administrator in Zendesk, as you would not have the right to access your data otherwise. The data covers customers, groups, agents, tickets, and other aspects of operations. For example, GET /api/v2/tickets.json returns tickets in JSON of the following form (a maximum of 100 tickets per page):


{
  "tickets": [
    {
      "id":      35436,
      "subject": "Help I need somebody!",
      ...
    },
    {
      "id":      20057623,
      "subject": "Not just anybody!",
      ...
    }
  ]
}
Further information can be found in Zendesk’s REST API documentation.
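Because the tickets endpoint is paginated, an extraction script has to walk pages until the response signals there are no more. The sketch below works on already-fetched response dictionaries shaped like the example above; in a real script each page would come from an HTTP GET, and the `next_page` URL is the field Zendesk's offset pagination uses to point at the following page.

```python
def collect_tickets(pages):
    """Gather tickets from a sequence of Zendesk response dicts,
    stopping when next_page is null, as the live API would signal."""
    tickets = []
    for page in pages:
        tickets.extend(page.get("tickets", []))
        if page.get("next_page") is None:
            break
    return tickets

# Simulated responses shaped like GET /api/v2/tickets.json pages.
pages = [
    {"tickets": [{"id": 35436}],
     "next_page": "https://yourcompany.zendesk.com/api/v2/tickets.json?page=2"},
    {"tickets": [{"id": 20057623}], "next_page": None},
]
all_tickets = collect_tickets(pages)
```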

Step 2: Prepare the Data

Before the data can be loaded, you have to ensure that the data types coming from Zendesk map to their appropriate BigQuery equivalents. You also have to create a schema for the data tables that will receive the Zendesk data. Special accommodation (creating additional tables) will have to be made if there are nested objects in the JSON from Zendesk. Specific information on the data types for each Zendesk endpoint can be found in Zendesk's API documentation.

BigQuery provides support for most widely used data types. Specific information can be found in BigQuery's data types documentation.
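A minimal sketch of this preparation step is shown below: it infers a flat BigQuery schema from one record and sets nested objects aside for their own tables. The type mapping here is illustrative only; you should set each column's type from Zendesk's per-endpoint documentation rather than inferring it.

```python
# Illustrative mapping from JSON value types to BigQuery column types.
BQ_TYPE = {str: "STRING", int: "INT64", float: "FLOAT64", bool: "BOOL"}

def infer_schema(record):
    """Infer a flat BigQuery schema from one Zendesk record.
    Nested objects are pulled out so they can be loaded into separate tables."""
    schema, nested = {}, {}
    for key, value in record.items():
        if isinstance(value, dict):
            nested[key] = value          # needs its own table
        elif value is not None:
            schema[key] = BQ_TYPE.get(type(value), "STRING")
    return schema, nested

schema, nested = infer_schema({
    "id": 35436,
    "subject": "Help I need somebody!",
    "via": {"channel": "web"},           # nested object
})
```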

Step 3: Load the Data into BigQuery

The data can be loaded as follows:

  1. Use gsutil to upload the data to Google Cloud Storage (GCS).
  2. Use bq, the BigQuery command-line interface, to create the destination table(s) and specify their schema.
  3. Load the data into your table(s) with the bq load command.

More information on loading data through BigQuery's command-line interface can be found in Google's bq command-line tool documentation.
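The steps above can be sketched by composing the two CLI commands as strings, as below. The bucket, dataset, and table names are hypothetical placeholders, and this example uses schema auto-detection rather than an explicit schema file.

```python
# Hypothetical bucket, dataset, and table names -- substitute your own.
bucket, dataset, table = "my-zendesk-staging", "zendesk", "tickets"
local_file = "tickets.json"   # newline-delimited JSON exported from Zendesk

# Step 1: stage the file in Google Cloud Storage.
gsutil_cmd = f"gsutil cp {local_file} gs://{bucket}/{local_file}"

# Steps 2-3: create the table (schema auto-detected) and load the data.
bq_cmd = (
    f"bq load --source_format=NEWLINE_DELIMITED_JSON --autodetect "
    f"{dataset}.{table} gs://{bucket}/{local_file}"
)
```

Running these two commands (for example via a scheduled shell script) completes the manual load path.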

Zendesk to BigQuery: Limitations of the Manual Approach

  • Difficulties with Real-time Data: This data transfer method captures data at a single point in time. Thus, you will have to configure cron jobs to get even basic near-real-time functionality.
  • Time-consuming: This method involves writing a lot of custom code, which can be problematic in situations where you have to meet tight deadlines.
  • Requires Constant Maintenance: This method is especially susceptible to connectivity issues or Zendesk API issues. Thus, you always have to monitor the connection to prevent the ingestion of inaccurate data. 
  • Hard to Perform Data Transformations: It is hard to perform quick standardizations such as time zone or currency conversions under this method. You will need to write additional code to achieve that.

Do you want an easier method to transfer your data, without the limitations and hassle of manually writing code? Try Hevo Data. Sign up for a free trial today to simplify your data transfer process from Zendesk to BigQuery.
