Does your company use Intercom to manage its customer support activities? Are you trying to move the data generated by Intercom to the BigQuery Data Warehouse for deeper analytics? If so, you will find this blog helpful. This post outlines 2 methods of moving your data from Intercom to BigQuery, allowing you to evaluate both and select the one that best suits your needs.

This blog discusses both approaches in detail, allowing you to weigh the pros and cons critically. Before we get started, here is a brief introduction to Intercom and BigQuery for the uninitiated. The blog then provides a step-by-step guide to both methods so that you can set up your Intercom to BigQuery Integration.


Prerequisites

  • Working knowledge of Google BigQuery.
  • Working knowledge of Intercom.
  • An Intercom account.
  • Access to Intercom Developer Hub.
  • Required credentials to access Google BigQuery.

Introduction to Intercom

Intercom is a messaging platform that allows you to communicate with your business’s prospective and existing customers. It does this by providing messaging apps that can be customized and scaled to suit your Marketing, Sales, and Support needs. These can take the form of In-app Messages, E-Mails, and Chats. Intercom also offers tracking and segmentation functionality on the data it collects.

Key features of Intercom

  • Easy Collaboration: Intercom’s interface allows for quick and easy collaboration. You can also access old customer interactions from over a year ago.
  • Automated Campaigns: Intercom allows you to automatically trigger an E-Mail or Chat Message campaign based on user or visitor activity. This, in turn, lets you reach out to your target audience in context and drive business outcomes.
  • Real-time Metrics: Intercom allows you to quickly access metrics like Open Rates and Click-Throughs. However, often, businesses would need to import this data into BigQuery to combine it with other sources and get deeper insights.

Introduction to Google BigQuery


Google BigQuery, a part of the Google Cloud Products (GCP) stack, is a Cloud-based Data Warehouse. BigQuery enables Fast, Easy, and Scalable analysis of data using SQL. BigQuery has become a popular Data Warehouse on the market in part due to its columnar storage and tree architecture. It is offered as a Google-managed service and so is easier to use in comparison with similar tools on the market.

Key Features of Google BigQuery

  • Easy to Use: It is easy to set up BigQuery and start using it. There is no need to build or manage any infrastructure. It also has a pay-as-you-go pricing model that lowers the barrier to getting started instantly.
  • Fast and Detailed Insights: BigQuery seamlessly integrates with front-end analytics tools like Tableau and Google Data Studio, making it easy to gain insights into your data. 
  • Managed Service: Google handles performance tuning and backend configuration. This makes it different from similar tools on the market that require you to do this manually.
  • Distributed Architecture: Google manages the compute resources dynamically and so does not require you to manage the compute clusters manually. 

To learn more about Google BigQuery, visit here.

Methods to Connect Intercom to BigQuery

Method 1: Using Custom ETL Scripts to Connect Intercom to BigQuery

Intercom provides a robust set of APIs that allow you to extract data programmatically. You would then need to transform this data and load it into BigQuery. This requires you to invest engineering bandwidth in developers who understand both the Intercom and BigQuery infrastructures and can set up the data migration from scratch. It is also a time-consuming approach.

Method 2: Using Hevo Data to Connect Intercom to BigQuery

Hevo Data, a fully managed, No-code Data Pipeline platform, helps you load data from 150+ Sources (including 40+ free sources such as Intercom) to Google BigQuery in real-time, effortlessly and for free. With its minimal learning curve, Hevo can be set up in a matter of minutes, making users ready to load data without compromising performance. Its strong integration with various sources such as Databases, Files, and Analytics Engines gives users the flexibility to bring in data of all kinds as smoothly as possible, without having to write a single line of code.



Method 1: Using Custom ETL Scripts to Connect Intercom to BigQuery

Intercom provides a wide range of APIs that help extract various data points from Intercom. Intercom uses the OAuth protocol for authenticating its app integrations. The APIs are rate limited, and response headers inform developers of the number of requests remaining.
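As a sketch of how a custom script might respect those rate-limit headers, the helper below computes how long to pause before the next request. The header names (`X-RateLimit-Remaining`, `X-RateLimit-Reset`) follow Intercom's documented convention, but verify them against the current API reference before relying on them:

```python
import time

def seconds_to_wait(headers, now=None):
    """Return how long to sleep before the next Intercom API call.

    Assumes Intercom-style rate-limit headers (X-RateLimit-Remaining,
    X-RateLimit-Reset); check the current API docs for exact names.
    """
    now = time.time() if now is None else now
    remaining = int(headers.get("X-RateLimit-Remaining", "1"))
    if remaining > 0:
        return 0.0  # request budget left; no need to pause
    # The reset header is a Unix timestamp for when the window refreshes.
    reset_at = float(headers.get("X-RateLimit-Reset", now))
    return max(0.0, reset_at - now)
```

Calling `time.sleep(seconds_to_wait(response.headers))` between requests keeps a long-running extraction job from being throttled or blocked.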

Here are the basic building blocks of implementing custom code to copy data from Intercom to BigQuery:

Step 1: Create an Application in the Intercom Developer Hub

The first step is to create an application in the Intercom Developer Hub and retrieve an access token. Since we will only be using our own data, let us skip OAuth for now. After creating the application in the Developer Hub, the access token can be retrieved from the Configure > Authentication section, as shown in the image below.

Intercom to BigQuery - creating access token on Intercom developer hub.
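As a quick sketch, the snippet below builds the request headers that carry this access token, plus an optional smoke test against Intercom's `/me` endpoint (which identifies the app a token belongs to). The Bearer scheme and base URL follow Intercom's REST API conventions; treat the exact endpoint as an assumption to confirm in the docs:

```python
def intercom_headers(access_token):
    # Intercom's REST API accepts the Developer Hub access token
    # as a standard Bearer token.
    return {
        "Authorization": f"Bearer {access_token}",
        "Accept": "application/json",
    }

def verify_token(access_token):
    """Optional smoke test of the token against GET /me (endpoint name
    per Intercom's API docs -- confirm before relying on it).
    Requires the third-party `requests` package and network access."""
    import requests
    resp = requests.get(
        "https://api.intercom.io/me",
        headers=intercom_headers(access_token),
    )
    return resp.status_code == 200
```

Every subsequent extraction call can reuse `intercom_headers()` so the token lives in exactly one place.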

Step 2: Extract Data using Intercom API

You can extract your data by making calls to Intercom’s REST API. You can make simple GET requests to the API directly or with a client like Postman or curl. Intercom’s API provides endpoints with data on resources like conversations, tags, etc. The data is returned in JSON format. 

Here is an example request to extract data for a specific Intercom conversation.

GET /conversations/[id]

In response, you will get a JSON payload like the following:

  "type": "conversation",
  "id": "147",
  "created_at": 1400850973,
  "updated_at": 1400857494,
  "conversation_message": {
    "type": "conversation_message",
    "subject": "",
    "body": "
Hi Alice,

 We noticed you using our product. Do you have any questions?

- Virdiana

    "author": {
      "type": "admin",
      "id": "25"
    "attachments": [
        "name": "signature",
        "url": ""
  "user": {
    "type": "user",
    "id": "536e564f316c83104c000020"
  "assignee": {
    "type": "admin",
    "id": "25"
  "open": true,
  "read": true,
  "conversation_parts": {
    "type": "conversation_part.list",
    "conversation_parts": [
      //... List of conversation parts
  "tags": { "type": 'tag.list', "tags": [] } }

Note that this is just one of the many APIs exposed by Intercom. You can read more about the other APIs here.
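To extract a full table rather than a single record, list endpoints such as `GET /conversations` return results in pages. The sketch below walks cursor-style pagination until the last page; the exact shape of the `pages` object (`pages.next.starting_after`) is an assumption based on Intercom's list responses, so confirm it against the API reference:

```python
def next_cursor(page):
    """Pull the cursor for the next page out of a list response.

    Assumes Intercom's cursor pagination shape, where the response
    carries pages.next.starting_after; returns None on the last page.
    """
    nxt = (page.get("pages") or {}).get("next")
    if isinstance(nxt, dict):
        return nxt.get("starting_after")
    return None

def fetch_all_conversations(get_page):
    """Drain every page of conversations.

    `get_page(cursor)` is any callable that performs the HTTP GET
    (with the cursor as a query parameter when it is not None) and
    returns the decoded JSON page.
    """
    records, cursor = [], None
    while True:
        page = get_page(cursor)
        records.extend(page.get("conversations", []))
        cursor = next_cursor(page)
        if cursor is None:
            return records
```

Keeping the HTTP call behind the `get_page` callable makes the pagination loop easy to unit-test with canned responses.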

Step 3: Prepare the Data

There are a few things you need to keep in mind here: 

  • It might be helpful to create a schema for tables in your BigQuery Data Warehouse to receive the data from the specific endpoints you are importing from Intercom.
  • You also have to ensure that the data types in your Intercom data match those in BigQuery. BigQuery provides support for a wide variety of data types, and it allows loading data as is, in JSON format. You can read more about them here.
  • You can additionally flatten the JSON; whether to do so depends on your use case.
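As an illustration of the flattening option above, the helper below collapses a nested conversation record into a single-level dict whose keys can map directly to BigQuery columns. The underscore-joined column naming is just one reasonable convention, not something Intercom or BigQuery mandates:

```python
def flatten(record, parent_key="", sep="_"):
    """Recursively flatten nested dicts into one level.

    Lists are kept as-is so they can be loaded into BigQuery
    REPEATED columns (or serialized separately).
    """
    flat = {}
    for key, value in record.items():
        full_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten(value, full_key, sep))
        else:
            flat[full_key] = value
    return flat

# A trimmed-down conversation record from the earlier example:
conversation = {
    "type": "conversation",
    "id": "147",
    "assignee": {"type": "admin", "id": "25"},
    "open": True,
}
row = flatten(conversation)
# row now has single-level keys such as "assignee_type" and "assignee_id"
```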

Step 4: Load Data into BigQuery

There are multiple ways in which you can load data to BigQuery. Here are two simple approaches:

  1. Using BigQuery’s web UI. This is a manual approach, but it is fairly straightforward.
  2. Loading data to Google Cloud Storage and then to BigQuery using bq commands. You can read more about bq commands here.

You can choose the data load process that suits your needs best. 
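For either load path, note that BigQuery ingests JSON as newline-delimited records (NDJSON), one object per line, rather than a single JSON array. Here is a minimal sketch of preparing such a file for a `bq load`; the `--source_format=NEWLINE_DELIMITED_JSON` flag is from Google's CLI documentation, but double-check the exact invocation for your setup:

```python
import json

def write_ndjson(rows, path):
    """Write one JSON object per line -- the layout BigQuery expects
    for NEWLINE_DELIMITED_JSON loads."""
    with open(path, "w", encoding="utf-8") as f:
        for row in rows:
            f.write(json.dumps(row) + "\n")

# Example rows flattened from the Intercom API:
rows = [{"id": "147", "open": True}, {"id": "148", "open": False}]
write_ndjson(rows, "conversations.json")

# The file can then be loaded with, for example:
#   bq load --source_format=NEWLINE_DELIMITED_JSON \
#       mydataset.conversations conversations.json
```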

Limitations of Migrating from Intercom to BigQuery Natively

  • Time-consuming: This method is time-consuming as you have to write a lot of code to perform your ETL. This is also taxing on your Engineering resources.
  • Constant Maintenance: If the Intercom API changes or fails, your pipeline can silently load inaccurate data. Thus, you have to constantly monitor the API and maintain your scripts.
  • Hard to Perform Data Transformations: Fast standardizations and transformations, like time and currency conversions, are difficult to perform under this method.
  • Limitations in Loading Data in Real-time: You have to put in a lot of extra effort to write code and configure cron jobs to achieve even basic real-time functionality. Otherwise, you will only be able to execute the data load in batches, preventing you from performing real-time data analysis.

Method 2: Using Hevo Data to Connect Intercom to BigQuery


Hevo Data, a No-code Data Pipeline, helps you transfer data from 150+ sources (including 40+ free sources such as Intercom) to Google BigQuery for free and visualize it in a BI tool. Hevo is fully managed and completely automates the process of not only loading data from your desired source but also enriching the data and transforming it into an analysis-ready form, without your having to write a single line of code. Its fault-tolerant architecture ensures that the data is handled in a secure, consistent manner with zero data loss.

It provides a consistent & reliable solution to manage data in real-time and always have analysis-ready data in your desired destination. It allows you to focus on key business needs and perform insightful analysis using various BI tools. 

Hevo Data focuses on two simple steps to get you started and connect Intercom to BigQuery with ease:

Step 1: Creating a Data Pipeline for Intercom

Create a new Data Pipeline for your Intercom data and select Webhooks as the medium to transfer data from your data source, Intercom. If you already have a Data Pipeline set up, you need to switch to a new Webhooks URL, as the old Webhooks URL will be deprecated on 31st October 2020. Hevo Data will then provide you with an HTTP endpoint, which you need to specify in your Intercom account so that Intercom can push data to Hevo via a Webhook. The configuration of Intercom as a data source for Hevo is shown in the image below.

Configuring Intercom as data source.

Step 2: Integrating Data with BigQuery

Load data from Intercom to BigQuery by providing your BigQuery Database credentials, such as your authorized Google BigQuery account, along with a name for your Database, Destination, and Project ID. The configuration of BigQuery as the data destination is shown in the image below.

Configuring Google BigQuery as destination.

You can now start transferring your data, in JSON format and up to 16 MB in size, to a data warehouse of your choice using Hevo Data’s Webhook functionality, without writing any code. Hevo will automatically perform data refreshes to ensure that you always have up-to-date data in your data warehouse.

Check out what makes Hevo amazing:

  • Real-Time Data Export: Hevo, with its strong integration with 150+ sources, allows you to transfer data quickly & efficiently. This ensures efficient utilization of bandwidth on both ends.
  • Live Support: The Hevo team is available round the clock to extend exceptional support to its customers through Chat, E-Mail, and Support Calls.
  • Schema Management: Hevo takes away the tedious task of schema management & automatically detects schema of incoming data and maps it to the destination schema.
  • Minimal Learning: Hevo with its simple and interactive UI, is extremely simple for new customers to work on and perform operations.
  • Secure: Hevo has a fault-tolerant architecture that ensures that the data is handled in a secure, consistent manner with zero data loss.
  • Live Monitoring: Hevo allows you to monitor the data flow so you can check where your data is at a particular point in time.


Conclusion

This article introduced you to Intercom and Google BigQuery and explained their key features. It then discussed in detail the 2 easy methods of transferring your data from Intercom to BigQuery. The first method involves writing custom ETL scripts, which, although effective, will consume a lot of your time and engineering resources. The second method uses Hevo Data to set up the same integration in a matter of minutes, without writing any code.

Share your experience of moving data from Intercom to BigQuery in the comments section below! We would love to hear your thoughts.

Rashid Y
Technical Content Writer, Hevo Data

Rashid is a technical content writer with a passion for the data industry. Leveraging his problem-solving skills, he delivers informative and engaging content on data science. With a deep understanding of complex data concepts and a talent for clear, compelling communication, Rashid creates content that informs and captivates his audience.
