Your organization may have reached a stage where it needs to analyze its Intercom data to unearth valuable patterns and insights from your customers’ interactions. You have chosen to set up a Snowflake Data Warehouse to house your Intercom data (and possibly data from many other sources as well). If you want to learn about the methods to move data from Intercom to Snowflake, you have landed on the right post.

This blog aims to help you easily load data from Intercom to Snowflake. First, the article introduces Snowflake and Intercom and explains their key features. It then discusses the two methods for connecting Intercom to Snowflake.


Prerequisites

  • Working knowledge of Snowflake.
  • Working knowledge of Intercom.
  • An Intercom account.
  • Access to the Intercom Developer Hub.
  • Required credentials to access Snowflake.

Introduction to Snowflake


Snowflake is an SQL-based Data Warehouse solution that supports structured and semi-structured data sets. Its key benefit is the separation of Storage, Compute, and Metadata Services. Using Snowflake primitives, DBAs can centrally manage logical assets (Data Warehouses, Databases, etc.) rather than technical assets (servers, buckets, etc.).

Snowflake can spin up new nodes (CPU and storage) in seconds while the Data Warehouse remains online. It can also dramatically reduce costs compared to traditional Data Warehouse platforms.

Introduction to Intercom


Intercom is a customer communication platform. It enables teams at internet businesses to communicate with their customers on their websites, inside web and mobile applications, and by e-mail. Intercom exposes many resources through its API, which you can export to your database.

Key features of Intercom:

  • Easy Collaboration: Intercom’s interface allows for quick and easy collaboration. You can also access customer interactions from over a year ago.
  • Automated Campaigns: Intercom allows you to automatically trigger an E-Mail or Chat Message campaign based on user or visitor activity. This, in turn, allows you to reach out to your target audience contextually and drive business outcomes.
  • Real-time Metrics: Intercom allows you to quickly access metrics like Open Rates and Click-Throughs. However, businesses often need to import this data into Snowflake to combine it with other sources and get deeper insights.

Methods to Connect Intercom to Snowflake

Method 1: Using Hevo to Connect Intercom to Snowflake

Hevo, a fully managed, No-code Data Pipeline platform, helps you load data from 150+ Sources (including 30+ free sources such as Intercom) to Snowflake in real time, effortlessly and for free. With its minimal learning curve, Hevo can be set up in a matter of minutes, making users ready to load data without compromising performance. Its strong integration with various sources such as Databases, Files, and Analytics Engines gives users the flexibility to bring in data of all kinds as smoothly as possible, without having to write a single line of code.


Method 2: Using Custom ETL Scripts to Connect Intercom to Snowflake

Intercom provides a rich set of APIs that allow you to extract data. This approach requires you to understand the Intercom APIs, clean and transform the data, and finally load it into Snowflake. Additionally, you will need to set up a data validation mechanism to verify that the data loaded into Snowflake is accurate. Given that this is an effort-intensive process, project timelines tend to be fairly long.

Methods to Connect Intercom to Snowflake

You can transfer your data from Intercom to Snowflake using the following 2 methods:

Method 1: Using Hevo to Connect Intercom to Snowflake


Hevo, a No-code Data Pipeline, helps transfer data from 150+ sources (including 30+ free sources such as Intercom) to Snowflake for free and visualize it in a BI tool. Hevo is fully managed and completely automates the process of not only loading data from your desired source but also enriching the data and transforming it into an analysis-ready form, without your having to write a single line of code. Its fault-tolerant architecture ensures that the data is handled in a secure, consistent manner with zero data loss.

It provides a consistent & reliable solution to manage data in real-time and always have analysis-ready data in your desired destination. It allows you to focus on key business needs and perform insightful analysis using various BI tools. 

Sign up here for a 14-Day Free Trial!

Method 2: Using Custom ETL Scripts to Connect Intercom to Snowflake

Following are the steps to transfer data from Intercom to Snowflake using Custom Scripts:

Step 1: Extract Data from Intercom

It’s easy to export your Intercom data as a CSV file using the RESTful Web API. There are 4 types of datasets that you can export:

  • Inbox reports
  • Messages reports
  • Articles reports
  • Operators reports

You might want to get a list of all messages to analyze their performance, or to join message engagement data with user data external to Intercom to attribute performance. In this case, you will need to use the Export API and provide the range of dates that you want exported.

To learn more about the RESTful API, visit here.

To export all of your conversation data for a specific date range (in UNIX epoch time), you can use a simple curl request:

curl -X POST https://api.intercom.io/export/content/data \
  -H 'Authorization:Bearer <Your access token>' \
  -H 'Accept: application/json' \
  -H 'Content-Type: application/json' -d '
{
  "created_at_after": 1577836800,
  "created_at_before": 1582900395
}'
You can view the status of your job by using this curl command:

curl https://api.intercom.io/export/content/data/<job_identifier> \
  -H 'Authorization:Bearer <Your access token>' \
  -H 'Accept: application/json'

When a job has a status of complete, it will have generated a download URL; you can download your data from that URL.

HTTP/1.1 200 OK
{
  "download_expires_at": "1582870440",
  "download_url": ""
}

Your exported message data will be available in a gzipped CSV format. Download and decompress it with this command (the --compressed flag tells curl to decompress the response):

curl --compressed \
  -H 'Authorization:Bearer <Your access token>' \
  -H 'Accept: application/octet-stream' \
  https://api.intercom.io/download/content/data/<job_identifier> > messages.csv
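The three curl calls above (create the export job, poll its status, download the result) can be combined into one script. Below is a hedged Python sketch using only the standard library; the endpoint paths and job fields (`job_identifier`, `status`, `download_url`) follow Intercom's Export API as documented, but verify them against the current API reference before relying on this.

```python
import json
import time
import urllib.request

API = "https://api.intercom.io"

def auth_headers(token):
    """Headers shared by all Export API calls."""
    return {"Authorization": f"Bearer {token}", "Accept": "application/json"}

def export_payload(created_after, created_before):
    """Build the date-range body for POST /export/content/data."""
    return {"created_at_after": created_after, "created_at_before": created_before}

def is_complete(job):
    """True once the export job has finished and a download URL exists."""
    return job.get("status") == "complete" and bool(job.get("download_url"))

def _call(url, token, body=None):
    """POST `body` as JSON if given, otherwise GET; return the parsed response."""
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(url, data=data, headers=auth_headers(token))
    if body is not None:
        req.add_header("Content-Type", "application/json")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def run_export(token, created_after, created_before, poll_seconds=30):
    """Create an export job, poll until complete, and return the download URL."""
    job = _call(f"{API}/export/content/data", token,
                export_payload(created_after, created_before))
    while not is_complete(job):
        time.sleep(poll_seconds)
        job = _call(f"{API}/export/content/data/{job['job_identifier']}", token)
    return job["download_url"]

if __name__ == "__main__":
    print(run_export("<Your access token>", 1577836800, 1582900395))
```

The polling interval is a placeholder; large exports can take a while, so pick an interval that respects Intercom's rate limits.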

Step 2: Prepare Data to Load

Snowflake natively supports structured data, which means structured data can be loaded into SQL relational tables from delimited files. Any valid single-byte delimiter is supported; the default file format is CSV with a comma delimiter.
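Note that Intercom's export is comma-delimited, while the file format defined later in this post uses a pipe (`|`) delimiter. If you keep that delimiter, re-delimit the file before uploading. A minimal sketch (the helper name `to_pipe_delimited` is our own):

```python
import csv

def to_pipe_delimited(src_path, dst_path):
    """Rewrite a comma-delimited CSV as pipe-delimited.

    The csv module correctly handles quoted fields that contain commas,
    which a plain str.replace() would corrupt.
    """
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        writer = csv.writer(dst, delimiter="|")
        for row in csv.reader(src):
            writer.writerow(row)

if __name__ == "__main__":
    # Assumes messages.csv is the file downloaded in Step 1.
    to_pipe_delimited("messages.csv", "messages_pipe.csv")
```

Alternatively, define the Snowflake file format with `field_delimiter = ','` and skip this step entirely.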

Step 3: Upload the CSV File to Amazon S3

To upload the CSV file to Amazon S3, you must first create an Amazon S3 bucket to hold your data. Next, install the AWS CLI on your system and execute the following command:

aws s3 cp /path/to/messages.csv  s3://{YOUR_BUCKET_NAME}

Step 4: Create a Database with Predefined Table

A Database automatically includes a schema named ‘public’. However, we are going to create a table whose columns map to the CSV data we extracted from Intercom.


 Create a database.

create or replace database intercom_database;


Create target tables for your Intercom CSV data. The tables are temporary, meaning they persist only for the duration of the user session and are not visible to other users.


create or replace temporary table intercom_csv_table (
  user_id string,
  user_email string,
  conversation_id string,
  message_id string,
  message_title string,
  message_type string,
  message_channel string,
  campaign_reference string,
  campaign_name string,
  message_version  string,
  message_style string,
  delivered_as string,
  sent_as string,
  opened_at timestamp,
  clicked_at timestamp,
  hit_goal_at timestamp,
  responded_at timestamp,
  unsubscribed_at timestamp,
  bounced_at timestamp,
  marked_as_spam_at timestamp);


Create a warehouse


create or replace warehouse intercom_warehouse with
  auto_suspend = 120
  auto_resume = true;


Create a file format object. File format objects are specified depending on the type of data you plan to load.

create or replace file format intercom_csv_format
  type = 'CSV'
  field_delimiter = '|'
  skip_header = 1;

Step 5: Create an External Stage

An external stage specifies where the CSV files are stored so that the data can be loaded into a Snowflake table.

create or replace stage intercom_csv_stage url='s3://{$YOUR_AWS_S3_BUCKET}/'
  credentials=(aws_key_id='{$YOUR_KEY}' aws_secret_key='{$YOUR_SECRET_KEY}')
  encryption=(master_key = '8e2e0e469e494a58e16bb956cd071656')
  file_format = intercom_csv_format;

Step 6: Load Data into Snowflake

First, verify that the pre-staged Intercom files are visible in AWS S3:

list @intercom_csv_stage;

Load the data

copy into intercom_csv_table
  from @intercom_csv_stage/messages.csv
  on_error = 'skip_file';

Step 7: Validate your Data

Following the data load, verify that the correct files are present on Snowflake.


This process returns errors by query ID and saves the results to a table for future reference.


create or replace table save_copy_errors as
  select * from table(validate(intercom_csv_table, job_id => '<query_id>'));


Query the results table


select * from save_copy_errors;


Verify the loaded data


select * from intercom_csv_table;

Limitations of Using Custom Code to Connect Intercom to Snowflake

The limitations of using Custom Code to connect Intercom to Snowflake are as follows:

  • Cumbersome Process: There are many steps and decision points in this process, which introduces unnecessary complexity into your workflow. The process is also lengthy and tedious, and debugging is even more cumbersome because it lacks error-handling functionality.
  • Data is not available in Real-time: This method does not account for real-time data, so you will have to perform full data refreshes to include the latest data in your analysis. If your use case demands real-time data delivery, you will need to set up cron jobs to run the above process on a set schedule.
  • Maintenance Overhead: Intercom is continuously updating its API, which means you have to commit time and budget to enable your developers to keep up with new changes and replicate them on your infrastructure.
  • Data Transformations: If you have a use case where you need to transform your data before loading it into Snowflake – for example, standardizing the time zones of when conversations occurred – you will need to write additional code to handle this.
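For instance, if you run the pipeline from a daily cron job, you can compute the `created_at_after`/`created_at_before` epoch window for each run rather than re-exporting everything. A minimal sketch (the `rolling_window` helper is our own illustration):

```python
import time

def rolling_window(now_epoch, hours=24):
    """Return (created_at_after, created_at_before) covering the last `hours`,
    as UNIX epoch seconds suitable for the Intercom Export API payload."""
    return now_epoch - hours * 3600, now_epoch

if __name__ == "__main__":
    # Window for the last 24 hours, ending now.
    after, before = rolling_window(int(time.time()))
    print(after, before)
```

In practice you would persist the end of each window and use it as the start of the next, so that runs never skip or double-count conversations.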


This article provided a detailed explanation of 2 methods you can use to easily transfer data from Intercom to Snowflake. The second method requires a lot of your time and resources. If you have no qualms about troubleshooting errors and setting up your ETL process manually, then the second method is a good fit for you. However, if you want a simplified and automated data transfer process, you should opt for the first method.

Hevo rids you of the added complexity of writing and maintaining custom scripts to move data from not just Intercom but a multitude of other data sources. It efficiently transfers Intercom data to any other data destination for free. Hevo natively integrates with Sales and Marketing Applications, Analytics Applications, Databases, and more, enabling your organization to efficiently assemble an enterprise BI data warehouse.


By using a data integration tool, rather than writing the code yourself, you can save your engineers time and money because you will be allowing them to focus on other value-added activities.

Want to give Hevo a try?

Sign Up for a 14-day free trial and experience a seamless data migration from Intercom to Snowflake for free with the feature-rich Hevo suite first-hand. You may also have a look at our pricing, which will assist you in selecting the best plan for your requirements.

What are your thoughts on moving data from Intercom to Snowflake? Let us know in the comments. 

Software Developer, Hevo Data

Talha is a seasoned Software Developer, currently driving advancements in data integration at Hevo Data, where he has been instrumental in shaping a cutting-edge data integration platform for the past four years. With a significant tenure at Flipkart prior to his current role, he has brought innovative solutions to the space of data connectivity and software development.
