Are you looking for a simple way to quickly load data from Intercom to MySQL and gain actionable insights? If so, you have reached the right place. This article describes two popular methods to move Intercom data to MySQL and provides a detailed step-by-step guide for both, so you can pick the one that best suits your needs.
Here is a snapshot of what this post covers:
- Method 1: Building a custom script using Intercom APIs
- Method 2: Transfer data from Intercom to MySQL without writing code
Intercom is a popular business messenger service that helps companies acquire, engage, and retain customers. A software-as-a-service offering like Intercom gives companies a complete customer engagement module without having to build one themselves. Intercom helps with both lead generation and lead conversion through a variety of services, including automated workflows and chatbots. Intercom’s messenger even allows users to perform various customer-journey activities directly from the chatbox.
Companies using services like Intercom often need to load the data generated by these services to their own database systems for further analysis. Having this data in their own systems allows them to expand the horizon of their data investigations because of the ability to combine this data with data from other applications. This allows them to unearth deeper insights.
You should have the following in place before starting to load data from Intercom to MySQL:
- An active Intercom account
- A MySQL database that is set up and accessible
- A basic understanding of REST APIs
Methods to load data from Intercom to MySQL
Data can be loaded from Intercom to MySQL using two methods:
This method utilizes APIs made available by Intercom to access and transfer data to a destination. You would need to engage engineering resources familiar with Intercom and MySQL to build a data migration infrastructure from scratch.
By using a No-code Data Pipeline such as Hevo, you can move data from Intercom to MySQL in real time without having to write any code. This ensures that you can reliably move your data without any dependency on the engineering team. You can instantly start analyzing your Intercom data in MySQL.
Method 1: Building a custom script using Intercom APIs
Intercom APIs enable customers to programmatically manage and track their Intercom activities and data. For the purpose of demonstration, let us assume that we are looking to fetch lead data from Intercom. Intercom’s Contacts API will help us here.
Steps to build custom code to move data from Intercom to MySQL:
- Get access to Intercom APIs
- Access data on leads using Contacts API
- Extract relevant entities from the data
- Create a MySQL table
- Load data to MySQL table
Step 1: Get access to Intercom APIs
The first step in building this script is to understand Intercom’s API authentication process. Intercom provides two options here: the OAuth protocol or a developer access token. Let’s begin by obtaining a developer access token.
The developer access token can be obtained by creating an app in the Intercom Developer Hub and then navigating to the Configure > Authentication section.
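Before wiring the token into a pipeline, it is worth verifying that it works. A minimal sketch, assuming the token from the Developer Hub (the check_token helper name is hypothetical; it relies on Intercom's GET /me endpoint, which identifies the app the token belongs to):

```shell
# Sanity check: GET /me returns details of the app/admin the token
# belongs to, so an HTTP 200 status confirms the token is valid.
# <access_token> is a placeholder for the token from the Developer Hub.
check_token() {
  curl -s -o /dev/null -w '%{http_code}' \
    "https://api.intercom.io/me" \
    -H "Authorization: Bearer $1" \
    -H "Accept: application/json"
}
# Example: check_token "<access_token>"   # prints 200 for a valid token
```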
Step 2: Access data on leads using Contacts API
The next step is to use this token along with CURL command-line utility to access the data.
curl -o intercom_contacts.json "https://api.intercom.io/contacts?per_page=2" -H "Authorization: Bearer <access_token>" -H "Accept: application/json"
Before going to the next step, let’s examine the basics of this API. The Contacts API used above is a cursor-paginated API. Two parameters, per_page and starting_after, control the number of entries returned per page and the cursor from which the next page starts. Here we have set per_page to 2, which means two entries will be returned.
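Fetching a subsequent page means reading the cursor from the previous response and passing it back as starting_after. A minimal sketch, assuming the response above was saved as intercom_contacts.json and that the cursor appears under pages.next.starting_after in Intercom's cursor-paginated responses (the get_next_cursor helper name is hypothetical):

```shell
# Pull the next-page cursor from a saved response; empty output means
# there are no more pages (or the file does not exist yet).
get_next_cursor() {
  jq -r '.pages.next.starting_after // empty' "$1" 2>/dev/null || true
}

cursor=$(get_next_cursor intercom_contacts.json)
if [ -n "$cursor" ]; then
  # Request the next page using the cursor (token is a placeholder).
  curl -s -o intercom_contacts_next.json \
    "https://api.intercom.io/contacts?per_page=2&starting_after=${cursor}" \
    -H "Authorization: Bearer <access_token>" \
    -H "Accept: application/json"
fi
```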
The result of this request is a JSON file with a considerable amount of information about each contact. Some of the relevant bits are the name, email address, companies with which each user is associated, location, sign-in time, last access time, etc. The above command saves this output to a file called intercom_contacts.json.
Step 3: Extract relevant entities from the data
In this step, we will use a command-line utility called jq to process this JSON and extract the relevant entities. For this exercise, we will limit our extraction to contact identifier, name, email, and location.
jq -r '.data[] | [.id, .name, .email, .location.city] | @csv' intercom_contacts.json >> intercom_contacts.csv
The output of the above command will be a CSV as follows.
"5ba682d23d7bdg92bef87bfd4","Jessinda Romanov","email@example.com","New York"
"5ba682d2spfcf92bef87bfd5","Matilda Kuriakin","firstname.lastname@example.org","Yorkshire"
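In practice, contacts often lack a name, email, or city. A null-safe variant of the same jq filter coalesces missing values to empty strings so every row stays well formed (the extract_contacts helper name is hypothetical):

```shell
# Same extraction as above, but replace null fields with empty strings
# so contacts without a name, email, or city still yield valid CSV rows.
extract_contacts() {
  jq -r '.data[] | [.id, .name // "", .email // "", .location.city // ""] | @csv' "$1"
}
# Example: extract_contacts intercom_contacts.json >> intercom_contacts.csv
```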
Step 4: Create a MySQL table
Now, let’s create a MySQL table to load this CSV data. Note that Intercom contact identifiers are alphanumeric strings, so contact_id is stored as a VARCHAR rather than a numeric type.
CREATE TABLE intercom_contacts (
  id INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
  contact_id VARCHAR(64) UNIQUE,
  name VARCHAR(50),
  email VARCHAR(50),
  location VARCHAR(50)
);
Step 5: Load data to MySQL table
Log in to the MySQL instance and use the command below to load the data. Because jq's @csv quotes each field and terminates lines with \n, the load specifies OPTIONALLY ENCLOSED BY '"' and LINES TERMINATED BY '\n'. (LOAD DATA LOCAL also requires local_infile to be enabled on both client and server.)
LOAD DATA LOCAL INFILE 'intercom_contacts.csv' INTO TABLE intercom_contacts FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n' (contact_id, name, email, location);
That concludes the building blocks of loading data from Intercom to MySQL. As is already evident, this is a very simplified version of the tasks involved in implementing this process. In a real-life scenario, there are numerous other factors and challenges that make the job harder for developers. Let us go through some of those problems below.
Limitations of this method:
- The above approach covers only one API and only a handful of fields from it. The Contacts API alone returns a large amount of information inside nested JSON objects, so processing the JSON and making sense of the complete set of APIs is considerably harder and involves a steep learning curve.
- The API is cursor-paginated, and the above process handles only a single page of results. In production, the application will need logic to keep track of the pagination cursor and form subsequent requests appropriately.
- The API is rate limited with the response headers containing information about the remaining requests and total available requests in specific time windows. Application logic should handle this to prevent getting a rate limit violation error from Intercom.
- Data transformation is a typical requirement during such a load process. The steps mentioned here do not handle it. In reality, most load processes will need transformations according to the destination database structure and design.
- Typically, such processes run continuously to keep a low-latency copy of the leads data available in MySQL. In such cases, functionality to handle duplicates, updates, etc. also needs to be implemented on the application side.
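One way to handle the duplicates-and-updates problem above is to push the merge into MySQL itself. A minimal sketch, assuming a table like the intercom_contacts one created earlier with a UNIQUE index on contact_id (the upsert_contacts helper name is hypothetical): stage the CSV into a temporary table, then merge it with INSERT ... ON DUPLICATE KEY UPDATE so repeated runs update rows instead of duplicating them.

```shell
# Emit the upsert statements; pipe them to the mysql client, e.g.
#   upsert_contacts | mysql --local-infile=1 <connection options>
# Assumes intercom_contacts.contact_id has a UNIQUE index so
# ON DUPLICATE KEY UPDATE can match existing rows.
upsert_contacts() {
  cat <<'SQL'
CREATE TEMPORARY TABLE staging LIKE intercom_contacts;
LOAD DATA LOCAL INFILE 'intercom_contacts.csv' INTO TABLE staging
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
  (contact_id, name, email, location);
INSERT INTO intercom_contacts (contact_id, name, email, location)
  SELECT contact_id, name, email, location FROM staging
  ON DUPLICATE KEY UPDATE
    name = VALUES(name), email = VALUES(email), location = VALUES(location);
SQL
}
```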
A shortcut that sidesteps all the above challenges is to use a cloud-based automated ETL service like Hevo, which specializes in data load processes from multiple software-as-a-service products to a variety of destinations. Hevo, a No-code Data Pipeline, provides the lowest time to production for such load processes and comes with support for on-the-fly data transformation.
Method 2: Transfer data from Intercom to MySQL without writing code
Hevo, a No-code Data Pipeline, provides an easy-to-use interface to copy data from Intercom to MySQL without writing any code. It requires minimal setup, allowing developers to focus on their core business logic rather than the configuration nightmares involved in setting up data pipelines.
Here are the simple steps:
- Link and configure Intercom account
- Connect your MySQL database and watch your data load instantly
Hevo rids you of all the limitations of building custom code and reliably moves your data to MySQL. Sign up for a 14-day free trial to experience hassle-free data export from Intercom to MySQL.
Some of the salient features of Hevo are:
- Simplicity: Hevo is simple to use and can be set up within minutes. With Hevo, you can start loading data from any source to a destination database or data warehouse in just a few clicks.
- Real-time data load: Hevo supports real-time streaming that ensures that you can move data instantly, without any delay. This will provide you with analysis-ready data in real-time.
- Reliability: Hevo has a fault-tolerant architecture that ensures zero data loss.
- Scalability: Hevo can help you scale up/down your data infrastructure as per your needs.
- Variety of sources and destinations: With Hevo, you may add new sources of data as required. Hevo integrates seamlessly with a variety of databases and cloud applications. You can find the complete list of integrations that Hevo supports here.
- No maintenance: Hevo is a fully-managed solution that requires minimal supervision and maintenance.
What is your preferred method to move data from Intercom to MySQL? Let us know in the comments.