So, you’re an Intercom user, right? It’s always a pleasure talking to someone who puts the customer experience first. That focus on providing top-notch support to your customers is what sets you apart.

At times, you’ll need to move your support & customer engagement data from Intercom to a data warehouse. That’s where you come in: you take on the responsibility of replicating data from Intercom to a centralized repository so that analysts and key stakeholders can make fast, business-critical decisions.

So, to make your work easier, we’ve prepared a simple and straightforward guide to help you replicate data from Intercom to Databricks. Read on for the 2 simple methods.

How to Replicate Data From Intercom to Databricks?

To replicate data from Intercom to Databricks, you can do either of the following:

  • Use CSV files, or 
  • Use a no-code automated solution. 

We’ll cover replication via CSV files next.

Replicate Data from Intercom to Databricks Using CSV Files

Intercom, a cloud-based support platform, stores data about conversations, leads, customers, and their engagement. Because each type of data is exported separately, you have to run multiple exports.

You can even export the reports in CSV format.

Follow along to replicate data from Intercom to Databricks in CSV format:

Step 1: Export CSV Files from Intercom 

For exporting User or Company data

  • First, select the users whose data you want to export.
  • Select the column icon in the top-right corner. Then select the attributes for which you want to export data.
  • Next, click on the “More” button at the top of the page and choose “Export” from the drop-down menu.
  • A dialog box appears with two options: “Export with the currently displayed columns” or “Export with all the columns.”
  • The CSV file will be delivered to you by email. Please note that the link to your CSV file will expire after one hour.
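
If you’d rather script the export than click through the UI, Intercom also offers a REST API. Here’s a minimal sketch in Python that pages through contacts and writes them to a CSV file. It assumes the cursor-based pagination of Intercom’s 2.x API; INTERCOM_TOKEN and the selected attributes are placeholders you’d adapt to your workspace:

```python
import csv
import requests

API_URL = "https://api.intercom.io/contacts"
HEADERS = {
    "Authorization": "Bearer INTERCOM_TOKEN",  # placeholder: use your own access token
    "Accept": "application/json",
}

def export_contacts(path="intercom_contacts.csv"):
    """Page through all contacts and write selected attributes to a CSV file."""
    cursor = None
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "email", "name", "created_at"])
        while True:
            params = {"per_page": 150}
            if cursor:
                params["starting_after"] = cursor  # cursor-based pagination
            resp = requests.get(API_URL, headers=HEADERS, params=params)
            resp.raise_for_status()
            body = resp.json()
            for contact in body.get("data", []):
                writer.writerow([contact.get("id"), contact.get("email"),
                                 contact.get("name"), contact.get("created_at")])
            next_page = body.get("pages", {}).get("next")
            if not next_page:
                break  # no more pages to fetch
            cursor = next_page.get("starting_after")

export_contacts()
```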

For exporting data from Reports

  • Go to the Reports tab, then select the report from which you want to export data. There are three reports by default: Lead Generation, Customer Engagement, and Customer Support. There might even be some custom reports.
  • Then, in the left navigation pane, click on the “Export” button.
  • Select the date range. To apply more filters, click on the “Add Filter” option and add them.
  • Now, you can select the fields from which you want to export data.
  • Then, click on the “Export CSV” button.

Step 2: Import CSV Files into Databricks

  • In the Databricks UI, go to the side navigation bar. Click on the “Data” option. 
  • Now, you need to click on the “Create Table” option.
  • Then drag the required CSV files to the drop zone. Alternatively, you can browse the files in your local system and then upload them.

Once the CSV files are uploaded, your file path will look like: /FileStore/tables/<fileName>-<integer>.<fileType>
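
From a notebook, you can sanity-check the uploaded file by reading it into a DataFrame. A minimal sketch, assuming a hypothetical upload path and the `spark` session and `display` helper that Databricks notebooks predefine:

```python
# Read the uploaded CSV into a Spark DataFrame.
df = spark.read.csv(
    "/FileStore/tables/intercom_users-1.csv",  # hypothetical path from your upload
    header=True,       # first row holds Intercom's column names
    inferSchema=True,  # let Spark infer column types
)
display(df)  # preview the data in the notebook
```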

[Image: Creating a table while exporting data from Intercom to Databricks]

Step 3: Modify & Access the Data

  • Click on the “Create Table with UI” button.
  • The data now gets uploaded to Databricks. You can access the data via the Import & Explore Data section on the landing page.
[Image: Modifying and accessing the data in Databricks]
  • To modify the data, select a cluster and click on the “Preview Table” option.
  • Then, change the attributes accordingly and select the “Create Table” option.
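
If you prefer scripting this step over the “Create Table with UI” flow, a notebook equivalent might look like the sketch below. The file and table names are hypothetical:

```python
# Register the uploaded CSV as a table so it can be queried with SQL.
(spark.read.csv("/FileStore/tables/intercom_users-1.csv",
                header=True, inferSchema=True)
      .write
      .mode("overwrite")           # replace the table on re-upload
      .saveAsTable("intercom_users"))

# The table is now queryable from any notebook attached to the cluster:
spark.sql("SELECT COUNT(*) AS row_count FROM intercom_users").show()
```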

The above 3-step Intercom to Databricks process is optimal for the following scenarios:

  • Small Data Volumes: This method is appropriate when the number of reports is small and no report contains a massive number of rows.
  • One-Time Data Replication: This method suits your requirements if your business teams need the data only once in a while.
  • Limited Data Transformation Options: Manually transforming data in CSV files is difficult & time-consuming. Hence, it is ideal if the data in your spreadsheets is clean, standardized, and present in an analysis-ready form. 
  • Dedicated Personnel: If your organization has dedicated people who have to perform the manual downloading and uploading of CSV files, then accomplishing this task is not much of a headache.

However, when the frequency of replicating data from Intercom increases, this process becomes highly monotonous. It adds to your misery when you have to transform the raw data every single time. With the increase in data sources, you would have to spend a significant portion of your engineering bandwidth creating new data connectors. Just imagine — building custom connectors for each source, transforming & processing the data, tracking the data flow individually, and fixing issues. Doesn’t it sound exhausting?

How about you focus on more productive tasks than repeatedly writing custom ETL scripts, downloading, cleaning, and uploading CSV files? This sounds good, right?

In that case, you can… 

Replicate Data from Intercom to Databricks Using an Automated ETL Tool

An automated tool is an efficient and economical choice that takes away a massive chunk of repetitive work. It has the following benefits:

  • It allows you to focus on core engineering objectives. By doing so, your business teams can jump on to reporting without any delays or data dependency on you.
  • Your support team can effortlessly filter, aggregate, and segment data from Intercom.
  • Without technical knowledge, your analysts can seamlessly standardize timezones, convert currencies, or simply aggregate campaign data for faster analysis (the sketch after this list shows the kind of work this saves). 
  • An automated solution provides you with a list of native in-built connectors. No need to build custom ETL connectors for every source you require data from.
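
To make that concrete, here’s the kind of post-load transformation an analyst would otherwise script by hand in a Databricks notebook. The table and column names below are hypothetical:

```python
from pyspark.sql import functions as F

# Standardize timestamps to UTC, then aggregate conversations per campaign.
events = spark.table("intercom_conversations")  # hypothetical table
daily = (events
         .withColumn("created_utc",
                     F.to_utc_timestamp("created_at", "America/New_York"))
         .groupBy(F.to_date("created_utc").alias("day"), "campaign")
         .agg(F.count("*").alias("conversations")))
daily.show()
```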

For instance, here’s how Hevo, a cloud-based ETL solution, makes data replication from Intercom to Databricks ridiculously easy: 

Step 1: Configure Intercom as your Source

  • You can select “Intercom App” or “Intercom Webhook” based on your source requirements.
  • Fill in the credentials required to configure Intercom as your source.
[Image: Configuring Intercom as the source with Hevo]

Step 2: Configure Databricks as your Destination

Now, you need to configure Databricks as the destination.

[Image: Configuring Databricks as the destination with Hevo]

After these 2 simple steps, Hevo will take care of building the pipeline for replicating data from Intercom to Databricks, based on the inputs you provided while configuring the source and the destination.

For in-depth knowledge of how a pipeline is built & managed in Hevo, you can also visit the official documentation for Intercom as a source and Databricks as a destination.

You don’t need to worry about security or data loss: Hevo’s fault-tolerant architecture replicates your data reliably, enriching it and transforming it into an analysis-ready form without you having to write a single line of code.

Here’s what makes Hevo stand out from the rest:

  • Fully Managed: You don’t need to dedicate time to building your pipelines. With Hevo’s dashboard, you can monitor all the processes in your pipeline, thus giving you complete control over it.
  • Data Transformation: Hevo provides a simple interface to cleanse, modify, and transform your data through drag-and-drop features and Python scripts. It can accommodate multiple use cases with its pre-load and post-load transformation capabilities.
  • Faster Insight Generation: Hevo offers near real-time data replication, giving you access to real-time insight generation and faster decision-making. 
  • Schema Management: Hevo’s auto schema mapping feature automatically detects the schema of incoming data and maps it to the destination schema.
  • Scalable Infrastructure: With the increased number of sources and volume of data, Hevo can automatically scale horizontally, handling millions of records per minute with minimal latency.
  • Transparent pricing: You can select your pricing plan based on your requirements. Different plans are clearly put together on its website, along with all the features it supports. You can adjust your credit limits and spend notifications for any increased data flow.
  • Live Support: The support team is available round the clock to extend exceptional support to its customers through chat, email, and support calls.

You can take our 14-day free trial to experience a better way to manage data pipelines.

Get started for Free with Hevo!

What Can You Achieve by Replicating Your Data from Intercom to Databricks?

Here’s a little something for the data analyst on your team. We’ve mentioned a few core insights you could get by replicating data from Intercom to Databricks. Does your use case make the list?

  • Which in-app issues have the highest average response time for the corresponding support tickets raised? (A sample query for this one follows the list.)
  • In which geography does a particular messaging have the most engaging customers?
  • What percentage of your targeted emails have converted signups into active customers?
  • What are the different segments of customers you should target for a particular message?
  • Which marketing channels produce leads that can be targeted with a customized message?
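
For instance, once the data lands in Databricks, the first question above might be answered with a query like this. The table and column names are hypothetical and depend on how your Intercom data is modeled:

```python
# Average first-response time per in-app issue type, slowest first.
spark.sql("""
    SELECT issue_type,
           AVG(first_response_seconds) / 60 AS avg_response_minutes
    FROM   intercom_tickets
    GROUP  BY issue_type
    ORDER  BY avg_response_minutes DESC
""").show()
```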

Summing It Up 

Exporting & uploading CSV files is the go-to solution when your data & financial analysts require fresh data from Intercom only once in a while. But as the frequency increases, so does the repetitive work. To channel your time into productive tasks, you can opt for an automated solution that accommodates regular data replication needs. This is genuinely helpful to finance & accounting teams, who need regular updates on marketing expenses, support costs of campaigns, recurring and annual revenue of your organization, etc.

Even better, your support teams get immediate access to data from multiple channels, letting them deep-dive into better market opportunities.

So, take a step forward. We’re ready to help you on this journey of building an automated no-code data pipeline with Hevo. Its 150+ plug-and-play native integrations help you replicate data smoothly from multiple tools to a destination of your choice. Its intuitive UI makes it easy to navigate. And with its pre-load transformation capabilities, you don’t even need to worry about manually finding errors or cleaning & standardizing the data yourself.

With a no-code data pipeline solution at your service, your company will spend less time calling APIs, preparing data, and building pipelines, and more time gaining insights from its data.

Skeptical? Why not try Hevo for free and decide for yourself? With Hevo’s 14-day free trial, you can build a data pipeline from Intercom to Databricks and try out the experience.

Here’s a short video that will guide you through the process of building a data pipeline with Hevo.

We’ll see you again the next time you want to replicate data from yet another connector to your destination. That is if you haven’t switched to a no-code automated ETL tool already.

We hope you have found the appropriate answer to the query you were searching for. Happy to help!

Manisha Jena
Research Analyst, Hevo Data

Manisha is a data analyst with experience in diverse data tools like Snowflake, Google BigQuery, SQL, and Looker. She has hands-on experience in using the data analytics stack to solve a variety of problems through analysis. Manisha has written more than 100 articles on diverse topics related to the data industry. Her quest for creative problem-solving through technical content writing, and the chance to help data practitioners with their day-to-day challenges, keeps her writing.
