Unlock the full potential of your Intercom data by integrating it seamlessly with Snowflake. With Hevo’s automated pipeline, your data starts flowing effortlessly in just a few clicks.
You may be at a stage where your organization needs to analyze its Intercom data to unearth invaluable patterns and insights from customer interaction data. You have chosen to set up a Snowflake Data Warehouse to house your Intercom data (and possibly data from many other sources as well). If you want to learn about the methods to move data from Intercom to Snowflake, you have landed on the right post.
This blog helps you easily load data from Intercom to Snowflake. First, it introduces Snowflake and Intercom and explains their key features. It then discusses the two methods for connecting Intercom to Snowflake. Read along to learn how to load data from Intercom to Snowflake.
Prerequisites
- Working knowledge of Snowflake.
- Working knowledge of Intercom.
- An Intercom account.
- Access to Intercom Developer Hub.
- Required credentials to access Snowflake.
Introduction to Snowflake
Snowflake is an SQL-based Data Warehouse solution that supports structured and semi-structured data sets. The key benefit provided by Snowflake is the separation of Storage, Compute, and Metadata Services. Using Snowflake primitives, DBAs centrally manage logical assets (Data Warehouses, Databases, etc.) rather than physical assets (servers, buckets, etc.).
Snowflake can spin up new compute and storage nodes in seconds while the Data Warehouse remains online. It can also dramatically reduce cost compared to traditional Data Warehouse platforms.
Introduction to Intercom
Intercom is a customer communication platform. It enables teams in internet businesses to communicate with their customers on their websites, inside web and mobile applications, and by email. Intercom exposes many resources through its API, which you can export to your database.
Key features of Intercom:
- Easy Collaboration: Intercom’s interface allows for quick and easy collaboration. You can also access customer interactions from over a year ago.
- Automated Campaigns: Intercom allows you to automatically trigger an email or chat message campaign based on user or visitor activity. This, in turn, allows you to reach out to your target audience contextually and drive business outcomes.
- Real-time Metrics: Intercom allows you to quickly access metrics like Open Rates and Click-Throughs. However, businesses often need to import this data into a warehouse such as Snowflake to combine it with other sources and get deeper insights.
Methods to Connect Intercom to Snowflake
You can transfer your data from Intercom to Snowflake using the following 2 methods:
Method 1: Using Hevo to Connect Intercom to Snowflake
You can use Hevo Data’s No-Code Data Pipeline Platform to sync your Intercom data to Snowflake seamlessly. Here is a step-by-step explanation of how you can sync your data automatically from Intercom to Snowflake.
Prerequisites
Some of the prerequisites that you need to have before performing the integration:
- An active Intercom account that contains the data that is to be ingested.
- An active Snowflake account.
Step 1: Configure Intercom as your Source
You can select Intercom from the list of sources supported by Hevo.
Step 1.1: Select Intercom as your source.
Note: Hevo also supports Intercom Webhook as a free source, through which you can subscribe to real-time notifications whenever an event occurs.
Step 1.2: Configure your Intercom Account.
- You will then be redirected to the Intercom Welcome Page, where you can log in to your Intercom account.
Step 1.3: Select the Workspace you want Hevo to access from the dropdown.
Step 1.4: Click on Authorize Access.
Step 1.5: Provide a Pipeline Name and the email address associated with your Intercom account. Click on Test & Continue.
Step 2: Configure Snowflake as the Destination
You can select Snowflake as your Destination warehouse to load your data.
Step 2.1: Select Snowflake as the Destination.
Step 2.2: Enter the Snowflake configuration details: the Snowflake Account URL you obtained, along with the Database User, Database Password, Database Name, and Database Schema.
Step 2.3: You can now click on Save Destination.
After the connection has been successfully established between the source and the destination, data will start flowing automatically. That’s how easy Hevo makes it for you.
Method 2: Using Custom ETL Scripts to Connect Intercom to Snowflake
Following are the steps to transfer data from Intercom to Snowflake using Custom Scripts:
Step 1: Extract Data from Intercom
It’s easy to export your Intercom data as a CSV file using the RESTful Web API. There are 4 types of datasets that you can export:
- Inbox reports
- Messages reports
- Articles reports
- Operators reports
You might want to get a list of all messages to analyze their performance, or to join message engagement data with user data external to Intercom to attribute performance. In this case, you will need to use the Export API and provide the range of dates that you want exported.
To learn more about the Intercom REST API, refer to the documentation on the Intercom Developer Hub.
To export all of your conversation data for a specific date range (in UNIX epoch time), you can use a simple cURL request.
curl https://api.intercom.io/export/messages/data \
-X POST \
-H 'Authorization:Bearer <Your access token>' \
-H 'Accept: application/json' \
-H 'Content-Type: application/json' -d '
{
"created_at_after": 1577836800,
"created_at_before": 1582900395
}'
You can view the status of your job by using this curl command:
curl https://api.intercom.io/export/messages/data/{job_identifier} \
-X GET \
-H 'Authorization:Bearer <Your access token>' \
-H 'Accept: application/json'
When the job has a status of complete, it will have generated a download URL; you can then download your data by hitting that URL.
HTTP/1.1 200 OK
{
"job_identifier": "2b8ec70207b7af4",
"status": "complete",
"download_expires_at": "1582870440",
"download_url": "https://api.intercom.io/download/messages/data/message-data-export.d79d41-092f7-182e0-c24a5-124b9-c4c7a8.csv"
}
Your exported message data will be available as a gzipped CSV file. Download it with the following command; the --compressed flag tells curl to handle the gzip decompression for you.
curl --compressed https://api.intercom.io/download/messages/data/message-data-export.d79d41-092f7-182e0-c24a5-124b9-c4c7a8.csv \
-X GET \
-H 'Authorization:Bearer <Your access token>' \
-H 'Accept: application/octet-stream'
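If you prefer to script this step end to end instead of running the cURL commands one at a time, the same workflow can be expressed in a short program. The following is a minimal Python sketch, not Intercom’s official client: it assumes the requests library is installed and that the placeholder access token, date range, and output file name (messages.csv) are replaced with your own values.
import gzip
import time
import requests

ACCESS_TOKEN = "<Your access token>"  # placeholder: use your Intercom access token
HEADERS = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Accept": "application/json",
    "Content-Type": "application/json",
}

# 1. Start the export job for the date range (UNIX epoch seconds).
resp = requests.post(
    "https://api.intercom.io/export/messages/data",
    headers=HEADERS,
    json={"created_at_after": 1577836800, "created_at_before": 1582900395},
)
resp.raise_for_status()
job_id = resp.json()["job_identifier"]

# 2. Poll the job until its status is "complete".
while True:
    job = requests.get(
        f"https://api.intercom.io/export/messages/data/{job_id}", headers=HEADERS
    ).json()
    if job.get("status") == "complete":
        break
    time.sleep(30)  # wait before checking again

# 3. Download the export from the URL returned by the job.
download = requests.get(
    job["download_url"],
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
             "Accept": "application/octet-stream"},
)
download.raise_for_status()

# 4. The export is a gzipped CSV; decompress it if the HTTP client has not already done so.
data = download.content
if data[:2] == b"\x1f\x8b":
    data = gzip.decompress(data)
with open("messages.csv", "wb") as f:
    f.write(data)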
Step 2: Prepare Data to Load
Snowflake natively supports structured data, which means structured data can be loaded directly into SQL relational tables. Any valid single-byte field delimiter is supported; the default file format is CSV with a comma delimiter.
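Before uploading, it is worth a quick sanity check that the exported file parses as CSV and that its columns line up with the target table you will define in Step 4. Here is a minimal Python sketch, assuming the export from Step 1 was saved as messages.csv:
import csv

# Inspect the exported CSV before loading: confirm the header matches the
# target table you will create in Step 4, and count the data rows.
with open("messages.csv", newline="") as f:
    reader = csv.reader(f)
    header = next(reader)
    row_count = sum(1 for _ in reader)

print("Columns:", header)
print("Data rows:", row_count)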
Step 3: Upload the CSV File to Amazon S3
To upload the CSV file to Amazon S3, first create an Amazon S3 bucket to hold your data. Next, install the AWS CLI on your system and execute the following command:
aws s3 cp /path/to/messages.csv s3://{YOUR_BUCKET_NAME}
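Alternatively, if you prefer to upload from a script rather than the CLI, the same copy can be done with boto3. This is a minimal sketch that assumes boto3 is installed, your AWS credentials are already configured (for example via aws configure), and YOUR_BUCKET_NAME is a placeholder for your bucket:
import boto3

# Upload the prepared CSV to the S3 bucket that will back the Snowflake external stage.
s3 = boto3.client("s3")
s3.upload_file("messages.csv", "YOUR_BUCKET_NAME", "messages.csv")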
Step 4: Create a Database with Predefined Table
A database automatically includes a schema named PUBLIC. However, here we will create a target table whose columns map to the CSV data we extracted from Intercom.
/*-----------------------------------------------------------------------
Create a database.
-----------------------------------------------------------------------*/
create or replace database intercom_database;
/*-----------------------------------------------------------------------
Create a target table for your Intercom CSV data. The table is temporary, meaning it persists only for the duration of the user session and is not visible to other users.
------------------------------------------------------------------------*/
create or replace temporary table intercom_csv_table (
user_id string,
user_email string,
conversation_id string,
message_id string,
message_title string,
message_type string,
message_channel string,
campaign_reference string,
campaign_name string,
message_version string,
message_style string,
delivered_as string,
sent_as string,
opened_at timestamp,
clicked_at timestamp,
hit_goal_at timestamp,
responded_at timestamp,
unsubscribed_at timestamp,
bounced_at timestamp,
marked_as_spam_at timestamp);
/*----------------------------------------------------------------------
Create a warehouse
----------------------------------------------------------------------*/
create or replace warehouse intercom_warehouse with
warehouse_size='X-SMALL'
auto_suspend = 120
auto_resume = true
initially_suspended=true;
/*----------------------------------------------------------------------
Create a file format object. File format objects are specified depending
on the type of data you plan to load.
----------------------------------------------------------------------*/
create or replace file format intercom_csv_format
type = 'CSV'
field_delimiter = ','
skip_header = 1;
Step 5: Create an External Stage
An external stage specifies where the CSV files are stored so that the data can be loaded into a Snowflake table.
create or replace stage intercom_csv_stage url='s3://{$YOUR_AWS_S3_BUCKET}/'
credentials=(aws_key_id='{$YOUR_KEY}' aws_secret_key='{$YOUR_SECRET_KEY}')
encryption=(master_key = '8e2e0e469e494a58e16bb956cd071656')
file_format = (format_name = 'intercom_csv_format');
Step 6: Load Data into Snowflake
/*-----------------------------------------------------------------------
List the pre-staged Intercom data files in AWS S3
-----------------------------------------------------------------------*/
list @intercom_csv_stage;
/*-----------------------------------------------------------------------
Load the data
-----------------------------------------------------------------------*/
copy into intercom_csv_table
from @intercom_csv_stage/messages.csv
on_error = 'skip_file';
Step 7: Validate your Data
Following the data load, verify that the data was loaded into Snowflake correctly.
/*----------------------------------------------------------------------
This process returns errors by query ID and saves the results to a table for future reference.
----------------------------------------------------------------------*/
create or replace table save_copy_errors as select * from table(validate(intercom_csv_table, job_id=>'<query_id>'));
/*----------------------------------------------------------------------
Query the results table
---------------------------------------------------------------------*/
select * from save_copy_errors;
/*---------------------------------------------------------------------
Verify the loaded data
---------------------------------------------------------------------*/
select * from intercom_csv_table;
Limitations of Using Custom Code to Connect Intercom to Snowflake
The limitations of using Custom Code to connect Intercom to Snowflake are as follows:
- Cumbersome Process: There are many steps and decision points in this process, which introduces unnecessary complexity into your workflow. The process is also lengthy and tedious, and debugging is even more cumbersome because it lacks error-handling functionality.
- Data is not Available in Real-time: This method does not account for real-time data, which means you will have to perform full data refreshes to include the latest data in your analysis. If your use case demands real-time delivery, you will need to set up cron jobs to run the above process on a set schedule.
- Maintenance Overhead: Intercom is continuously updating its API which means you have to commit time and budgets to enable your developers to keep up with new changes and replicate the same on your infrastructure.
- Data Transformations: If you have a use case where you need to transform your data before loading it into Snowflake – for example, say you want to standardize time zones of when the conversations occurred, you will need to build additional code to handle this.
Conclusion
This article provided a detailed explanation of 2 methods you can use to easily transfer data from Intercom to Snowflake. The second method, using custom ETL scripts, requires a lot of your time and resources. If you have no qualms about troubleshooting errors and setting up your ETL process manually, then that method can work for you. However, if you want a simplified and automated data transfer process, you should opt for the first method and use Hevo.
Hevo rids you of the added complexity of writing and maintaining custom scripts to move data from not only Intercom but a multitude of other data sources. It efficiently transfers Intercom data to any other data destination for free.
By using a data integration tool, rather than writing the code yourself, you can save your engineers time and money because you will be allowing them to focus on other value-added activities.
Want to give Hevo a try?
Sign up for a 14-day free trial and experience seamless data migration from Intercom to Snowflake with the feature-rich Hevo suite firsthand. You can also have a look at our pricing, which will help you select the best plan for your requirements.
What are your thoughts on moving data from Intercom to Snowflake? Let us know in the comments.
FAQs to sync data between Intercom and Snowflake
1. How to load data from MySQL to Snowflake?
You can load data from MySQL to Snowflake by exporting MySQL data to CSV files and using Snowflake’s COPY INTO command or Snowpipe to load the data directly into Snowflake tables from a stage.
2. What are integrations in Snowflake?
In Snowflake, integrations typically refer to connections or linkages established between Snowflake and external systems, services, or tools.
3. What is SQL API in Snowflake?
In Snowflake, the SQL API refers to the REST API that allows developers and applications to interact programmatically with Snowflake using SQL commands.
4. How to list stages in Snowflake?
You can list stages in Snowflake using the SQL command SHOW STAGES.
Talha is a Software Developer with over eight years of experience in the field. He is currently driving advancements in data integration at Hevo Data, where he has been instrumental in shaping a cutting-edge data integration platform for the past four years. Prior to this, he spent 4 years at Flipkart, where he played a key role in projects related to their data integration capabilities. Talha loves to explain complex information related to data engineering to his peers through writing. He has written many blogs related to data integration, data management aspects, and key challenges data practitioners face.