Messengers and chatbots are an important part of a modern-day game plan for generating leads from your marketing efforts and converting them into revenue. There are several software-as-a-service offerings in this domain that help companies quickly set up messengers without having to implement custom software on their own.

This post will deal with copying data from Intercom to Redshift, one of the most popular data warehouse services on the market. It starts with a brief introduction to Intercom and Redshift before diving into the two methods you can use to connect them.

Introduction to Intercom


Intercom is a popular business messenger service that can be easily integrated into any website. Intercom helps in lead generation, lead conversion, and lead management. It supports automated workflows and conversations. It even allows visitors to perform various activities related to their customer journey, such as checking order status, confirming orders, and raising support tickets, right from the chatbox.

Even though Intercom provides its own dashboard and reporting facilities, it is quite natural for companies to import their generated lead data into their own data warehouse for analysis and decision-making. Having this data in their warehouse allows them to combine it with their existing data and perform far deeper analysis.

Here are a few salient features of Intercom:

  • Outbound Messages: You can use mobile push and targeted email along with in-app messages to encourage customers to take action, or to convert prospective leads into loyal customers. This feature helps you support and onboard your customers in your mobile app and on the web.
  • Team Inbox: You can answer customer questions through a shared, collaborative, and highly customizable inbox. This Intercom feature lets you support your customers in a scalable and personal way.
  • Product Tours: You can support and onboard customers while highlighting new features of your product with guided, adoption-driving tours.
  • Help Center Articles: Intercom allows you to publish to a knowledge base to provide your customers with self-serve support. You can also automatically share articles in Messenger, or through conversations with teammates.

Introduction to Redshift


Amazon Redshift is one of the most widely used Cloud Data Warehouses in today’s marketplace. Amazon Redshift can integrate with your Data Lake with considerable ease and offers up to 3x better price-performance than its competitors. Amazon Redshift allows you to combine and query exabytes of Semi-structured and Structured data across your operational database, Data Warehouse, and Data Lake using standard SQL.

Here are a few salient features of Redshift:

  • Redshift ML: Redshift ML simplifies creating, training, and deploying Amazon SageMaker models using SQL for database developers and Data Analysts.
  • Federated Query: Federated Query allows you to reach into your operational, relational databases. You can query live data across one or more Aurora PostgreSQL and Amazon RDS databases to get instant visibility into end-to-end business operations without the need for data movement.
  • Materialized Views: This Redshift feature allows you to achieve faster query performance on datasets ranging in size from gigabytes to petabytes, as illustrated in the example after this list. Data Compression, Columnar Storage, and Zone Maps considerably reduce the amount of I/O needed to perform queries.
  • Limitless Concurrency: Amazon Redshift provides fast performance consistently, even with thousands of concurrent queries, whether they query data directly in your Amazon S3 Data Lake, or in your Amazon Redshift Data Warehouse.
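
As a quick illustration of the materialized view feature, the following SQL precomputes a daily rollup that Redshift can then serve much faster than the underlying query (the orders table and its columns here are hypothetical):

create materialized view daily_orders as
select order_date,
       count(*)    as order_count,
       sum(amount) as total_amount
from orders
group by order_date;

-- Refresh later to pick up newly loaded rows.
refresh materialized view daily_orders;
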
Exporting Data from Intercom to Redshift

Broadly, there are two approaches to moving data from Intercom to Redshift:

Method 1: Using Intercom APIs to Connect Intercom to Redshift

Intercom provides a robust set of APIs that allow you to extract data programmatically. You would then need to clean this data and load it into Redshift. This requires you to invest engineering bandwidth to build and maintain the setup from scratch.

Method 2: Using Hevo Data to Connect Intercom to Redshift

If you are looking to instantly move data from Intercom to Redshift, this is a better and faster approach. Hevo can help you move Intercom data to Redshift through a point-and-click interface, without having to write any code.


Understanding the Methods to Connect Intercom to Redshift

These are the methods you can use to connect Intercom to Redshift in a seamless fashion:

Method 1: Using Intercom APIs to Connect Intercom to Redshift

Intercom’s APIs help users programmatically fetch and manage their data. Intercom uses the OAuth protocol for authenticating its app integrations. For account holders who only want to access their own data, Intercom also supports developer access-token-based authentication. The APIs are rate-limited, with response headers informing developers of the number of requests remaining.
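
For example, you can inspect these headers with curl’s -i flag, which prints the response headers along with the body (replace <access_token> with your own token; Intercom documents rate-limit headers such as X-RateLimit-Limit, X-RateLimit-Remaining, and X-RateLimit-Reset):

curl -i "https://api.intercom.io/contacts?per_page=1" \
  -H 'Authorization: Bearer <access_token>' \
  -H 'Accept: application/json'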

Let us now learn about the basic building blocks of implementing a custom script to copy data from Intercom to Redshift. We will be using Linux command-line utilities to show the steps involved.

The broad steps to this approach are as follows:

Prerequisites

In addition to running Intercom and Redshift accounts, you will need the following to successfully execute the steps below:

  • An Intercom account with access to the Intercom Developer Hub
  • The credentials required to access your Amazon S3 bucket and AWS Redshift cluster

Step 1: Creating an Application in the Intercom Developer Hub

The first step is to create an application in the Intercom Developer Hub and retrieve an access token. Since we will only be using our own data, we can skip OAuth for now. After creating the application in the Developer Hub, the access token can be retrieved from the Configure > Authentication section.
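
You can sanity-check the token before proceeding. Intercom exposes a /me endpoint that returns details of the admin the token belongs to when the token is valid (availability can vary by API version, so treat this as an optional check):

curl "https://api.intercom.io/me" \
  -H 'Authorization: Bearer <access_token>' \
  -H 'Accept: application/json'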


Step 2: Extracting Data Using Intercom API

Let us now use the access token to retrieve some data. We will be using the Contacts API to fetch the list of all contacts. The API is paginated, with a per_page parameter controlling the page size and a starting_after cursor deciding the next items to fetch. Here we will set per_page to 2, which means we want to fetch 2 contacts per request. The default is 50 and the maximum is 150.

curl -o contacts.json "https://api.intercom.io/contacts?per_page=2" \
  -H 'Authorization: Bearer <access_token>' \
  -H 'Accept: application/json'

The result of this command will be a JSON file containing a list of contacts. Each contact entry includes a name, email, the companies the user is associated with, a location, and many other details.
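
An abridged response looks roughly like the following (illustrative values, with most fields trimmed; the exact shape may vary with your API version):

{
  "type": "list",
  "data": [
    {
      "type": "contact",
      "id": "5ba682d23d7cf92bef87bfd4",
      "name": "Hoban Washburn",
      "email": "Hoban@seren.io",
      "location": { "city": "Dublin", "country": "Ireland" }
    }
  ],
  "pages": {
    "type": "pages",
    "next": { "starting_after": "<cursor>" },
    "per_page": 2
  }
}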

Step 3: Parsing the JSON Output

We will now use the JSON file from the previous step to extract the relevant entries. For now, we will focus on the id, name, email, and location of each contact. We will be using a command-line utility called jq to parse the JSON.

cat contacts.json | jq -r ".data[] | [.id,.name,.email,.location.city] | @csv" >> contacts.csv

The output will be a file named contacts.csv with contents as below.

"5ba682d23d7cf92bef87bfd4","Hoban Washburn","Hoban@seren.io","Dublin"

"5ba682d23d7cf92bef87bfd5","Ashik Washburn","aw@abc.io","Dublin"

Step 4: Loading Data to Amazon S3

Let us now move this file to an S3 bucket to stage it before loading it into Redshift.

aws s3 cp /contacts/contacts.csv s3://contacts_bucket/

Step 5: Copying Data to Amazon Redshift

You will use the COPY command from Redshift to load this CSV file from S3 to Redshift. Assuming a Redshift table called contacts has already been created, use the below command.

copy contacts
from 's3://contacts_bucket/contacts.csv'
iam_role 'arn:aws:iam::0123456789012:role/User_RedshiftRole'
csv;
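
If the contacts table does not exist yet, a minimal definition consistent with the four CSV columns might look like the following (column names and sizes here are illustrative, not prescriptive):

create table contacts (
    id    varchar(64),
    name  varchar(256),
    email varchar(256),
    city  varchar(128)
);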

That concludes the basic steps of executing a copy operation from Intercom to Redshift. The above steps can serve as a template for implementing an application to load data from Intercom to Redshift. When using this approach in production, however, several challenges will surface. Let us look at the typical problems a developer will have to overcome to execute such an operation reliably.

Limitations of Using Intercom APIs to Connect Intercom to Redshift

  • Intercom APIs are paginated. So if you have a large amount of data, the application has to include logic to keep track of the current page cursor and form requests accordingly.
  • The API is rate-limited. Spawning requests continuously without adhering to the rate limits can lead to an HTTP 429 Too Many Requests response. The application logic should have a provision to process the response headers, which include details of the number of requests still possible at any point in time.
  • Usually, such a load process needs to be executed continuously to ensure a near-real-time copy of the customer list in the data warehouse. Accomplishing this requires additional logic to handle updates, deletes, and duplicates. This is even tougher since Redshift does not enforce primary keys while loading data (a common workaround is sketched after this list).
  • Redshift’s automatic data type conversion may cause problems if the target table’s column types do not match the incoming data, so you should define the table schema carefully before running the COPY command.
  • This approach needs a staging system where the file is held and the requests are executed. In the age of software-as-a-service offerings, maintaining such infrastructure yourself is an added overhead.
  • The above steps handle only one specific API – the Contacts API. Intercom provides several other APIs with different JSON structures to fetch many other useful data elements. There will be a learning curve in making sense of all the APIs, their capabilities, and their response structures before implementing a comprehensive data load process.
  • In most cases, the fetched data will need to be transformed before loading it into the database. This requires additional steps in the load process.
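
For instance, since Redshift does not enforce primary keys, a common way to avoid duplicates on repeated loads is to COPY into a staging table and then merge. Here is a sketch assuming the contacts table used earlier (adjust names to your schema):

begin;

-- Stage the fresh extract in a temporary table with the same layout.
create temp table contacts_stage (like contacts);

copy contacts_stage
from 's3://contacts_bucket/contacts.csv'
iam_role 'arn:aws:iam::0123456789012:role/User_RedshiftRole'
csv;

-- Remove rows that are about to be replaced, then insert the new versions.
delete from contacts
using contacts_stage
where contacts.id = contacts_stage.id;

insert into contacts
select * from contacts_stage;

commit;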

A way to avoid all the above challenges is to use a No-code Data Pipeline platform like Hevo, which can execute data load operations from Intercom to Redshift in a fraction of the time.

Method 2: Using Hevo Data to Connect Intercom to Redshift


Hevo Data, a No-code Data Pipeline, helps you load data from any data source such as Databases, SaaS applications, Cloud Storage, SDKs, and Streaming Services, and simplifies the ETL process. It supports 100+ data sources, and setup is a 3-step process: select the data source, provide valid credentials, and choose the destination. Hevo loads the data onto the desired Data Warehouse, enriches the data, and transforms it into an analysis-ready form without writing a single line of code.

Its completely automated pipeline delivers data in real-time without any loss from source to destination. Its fault-tolerant and scalable architecture ensures that the data is handled in a secure, consistent manner with zero data loss, and supports different forms of data. The solutions provided are consistent and work with different Business Intelligence (BI) tools as well.

Get Started with Hevo for free

Here are the simple steps:

  • Step 1: Set up and configure your Intercom source by first signing in to your Intercom account. On the following screen, select the workspace you wish to access from the drop-down list. Click on the Authorize Access button to allow Hevo to access its objects. You then need to provide a suitable Pipeline Name on the Configure Your Intercom Source page to move data from Intercom to Redshift.
  • Step 2: Load data from Intercom to Redshift by providing your Redshift database credentials such as Database Port, Username, Password, Name, Schema, and Cluster Identifier, along with the Destination Name.

Hevo abstracts all the pain points and limitations mentioned above. Additionally, Hevo’s fault-tolerant algorithms automatically handle any errors in the data flow, thereby ensuring that your warehouse always has up-to-date data. Here is a quick overview of Hevo: 

  • Data in real-time: Hevo’s real-time architecture ensures that your data is moved instantly and without delay. Thus, you can access analysis-ready data in real-time with Hevo. 
  • Simplicity: Hevo’s interface is intuitive and easy to use. It requires minimal setup and starts transferring data almost instantly.
  • Transformations: Hevo provides preload transformations through Python code. It also allows you to run transformation code for each event in the pipelines you set up. You need to edit the event object’s properties received in the transform method as a parameter to carry out the transformation. Hevo also offers drag and drop transformations like Date and Control Functions, JSON, and Event Manipulation to name a few. These can be configured and tested before putting them to use.
  • Connectors: Hevo supports 100+ integrations to SaaS platforms, files, databases, analytics, and BI tools. It supports various destinations including Google BigQuery, Amazon Redshift, Snowflake Data Warehouses; Amazon S3 Data Lakes; and MySQL, MongoDB, TokuDB, DynamoDB, PostgreSQL databases to name a few.  
  • Reliable data load: Hevo has a fault-tolerant architecture that ensures that your data loads are reliable and consistent. This means that you can move data with zero loss.
  • Scalability: Hevo can handle a wide variety of data from 100+ sources like databases, analytics applications, sales and marketing applications, etc. at any scale. Thus, Hevo helps you scale your data infrastructure as your needs grow.
  • Zero maintenance: Hevo is fully managed and automated. Thus, setting it up does not require a lot of effort on your end.
Sign up here for a 14-day Free Trial!

Conclusion

Hevo rids you of the added complexity of writing and maintaining custom scripts to move data from Intercom to Redshift.

Hevo natively integrates with Sales and Marketing Applications, Analytics Applications, Databases, and so on enabling your organization to efficiently assemble an enterprise BI data warehouse.

Visit our Website to Explore Hevo

Extracting complex data from a diverse set of data sources can be a challenging task and this is where Hevo saves the day! Hevo offers a faster way to move data from Databases or SaaS applications into your Data Warehouse to be visualized in a BI tool. Hevo is fully automated and hence does not require you to code. You can also have a look at the unbeatable pricing that will help you choose the right plan for your business needs.

Want to take Hevo for a spin? Sign Up for a 14-day free trial and experience the feature-rich Hevo suite first hand.

Share your experience of establishing a connection from Intercom to Redshift in the comments section below!

Talha
Software Developer, Hevo Data

Talha is a seasoned Software Developer, currently driving advancements in data integration at Hevo Data, where he has been instrumental in shaping a cutting-edge data integration platform for the past four years. With a significant tenure at Flipkart prior to his current role, he has brought innovative solutions to the space of data connectivity and software development.
