Press "Enter" to skip to content

Zendesk to Snowflake: Move Data Instantly

Is your organization generating a lot of customer support data in Zendesk? Are you trying to make sense of this data? If so, it might make sense for you to transfer your data from Zendesk to a fully-managed data warehouse platform like Snowflake so you can perform deep analytics on it. This blog presents two ways of moving data from Zendesk to Snowflake, enabling you to pick the one that best suits your needs.

Introduction to Zendesk

Zendesk is a customer service support platform that unifies customer activity across multiple channels, enabling you to track support-related activity from one location. Zendesk also has CRM functionality and integrates easily with CRM tools like Salesforce. This enhances your ability to monitor the entire customer lifecycle in your organization.

Key Highlights of Zendesk:

  • Customer Feedback: You are able to receive constant feedback from customers through the Zendesk platform. This provides another avenue for your customer service desk to gauge the success of their efforts.
  • Ticketing System: Zendesk’s ticketing system brings together requests from multiple channels, enabling fast response times.
  • Reporting: Zendesk has reporting capabilities that provide a quick overview of your service desk activities over a specified period of time.
  • Integrations: Zendesk provides integrations for many popular CRM and marketing tools like Mailchimp and Salesforce. This makes it particularly handy for the modern business environment. 

Introduction to Snowflake

Snowflake is a cloud-based data warehouse. It is fully-managed and operates using the Software-as-a-Service model. Thus, its users do not have to get involved in any maintenance activities after setting it up. Snowflake is becoming increasingly popular owing to its ease of use, speed and scalability. Snowflake also has separate storage and compute layers. This enables them to be scaled independently of each other and also priced differently. You can read more about Snowflake warehouse here.

Key Highlights of Snowflake:

  • Low Maintenance: Snowflake does not require software and hardware installations or much infrastructure management from your internal team. This is because much of this is handled on the backend.
  • Speed: Scaling operations in Snowflake do not impact queries that are already running, so query performance stays consistent while you resize warehouses.
  • Cloning Functionality: Snowflake allows the creation of instant copies of objects in real time. These clones store pointers to the original data rather than deep copies, so cloning consumes far less space than copying data in other data warehouses.
  • Separate Storage and Compute Layers: Snowflake’s separate storage and compute layers enable versatile scalability – you can scale either one up or down without affecting the other. You also only pay for what you use, as each layer is billed independently of the other.

Loading the Data from Zendesk to Snowflake

The data can be moved from Zendesk to Snowflake in the following ways:

Method 1: Use an Official Snowflake Data Pipeline Partner like Hevo Data


Method 2: Manually Write Custom Code

This blog discusses both approaches in detail. You can then weigh the pros and cons of each approach and move your data as per your needs.

Zendesk to Snowflake: Using Hevo

Hevo, an official Snowflake partner, vastly simplifies the process of moving your data from Zendesk to Snowflake. These are the steps:

  • Authenticate and connect your Zendesk data source
  • Connect to your Snowflake account and start moving your data

Hevo ensures that all of your data from Zendesk is automatically mapped to the relevant tables in Snowflake. Sign up for a free, no-risk 14-day trial with Hevo to experience a smooth and straightforward data migration from Zendesk to Snowflake.

Furthermore, Hevo helps you clean, enrich and transform the data both before and after it is transferred into Snowflake.

Other key highlights of Hevo: 

  • Real-time: Hevo has a real-time streaming architecture that enables you to transfer data instantly and without delays. Thus, using Hevo will enable you to gain real-time insights.
  • Simplicity: Hevo is an intuitive, easy-to-use platform. Using Hevo ensures that your data is moved in just a few clicks.
  • Reliable Data Load: Hevo’s in-built fault-tolerant architecture ensures that the data loaded is reliable and consistent, with minimal data loss.
  • Scalability: Hevo can handle data from multiple sources at any scale. Hevo can bring in data from Sales and Marketing tools, Analytics applications, Databases and so on. Thus, using Hevo helps you scale your data infrastructure to meet growing needs.
  • Minimal and Simple Setup: Hevo is completely managed and fully automated. Therefore, it is easy to set up and requires minimal effort on your part. 

Zendesk to Snowflake: Manually Writing Custom Code

This process would require an investment in your engineering team. The high-level steps are as follows:

  1. Extract the data from Zendesk
  2. Prepare the data
  3. Load the data into Snowflake

Step 1: Extract the data from Zendesk

Zendesk provides access to its data through its REST API. Zendesk administrators can access this data by making calls to the API. It is important that an administrator accesses the data (or that you become one), as Zendesk restricts data access for security reasons. For example, to retrieve data on tickets, you can call GET /api/v2/tickets.json. The data returned will be in the following format:

{
  "tickets": [
    {
      "id":      35436,
      "subject": "Help I need somebody!",
      ...
    },
    {
      "id":      20057623,
      "subject": "Not just anybody!",
      ...
    },
    ...
  ]
}

Additional information on Zendesk’s API can be found here:

https://developer.zendesk.com/rest_api/docs/support/introduction
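For reference, here is a rough Python sketch of how this extraction step could be scripted using the requests library. The subdomain, email address, API token, and output file name are placeholders rather than values from this article; the sketch simply follows the next_page links that Zendesk’s ticket listing returns and writes one ticket per line.

import json
import requests

# Placeholder credentials - replace with your own Zendesk subdomain, email and API token.
ZENDESK_SUBDOMAIN = "yourcompany"
ZENDESK_EMAIL = "admin@example.com"
ZENDESK_API_TOKEN = "your_api_token"

def fetch_all_tickets():
    """Pull every ticket from the Zendesk Tickets API, following pagination links."""
    url = f"https://{ZENDESK_SUBDOMAIN}.zendesk.com/api/v2/tickets.json"
    auth = (f"{ZENDESK_EMAIL}/token", ZENDESK_API_TOKEN)
    tickets = []
    while url:
        response = requests.get(url, auth=auth)
        response.raise_for_status()
        payload = response.json()
        tickets.extend(payload["tickets"])
        url = payload.get("next_page")  # None once the last page is reached
    return tickets

if __name__ == "__main__":
    all_tickets = fetch_all_tickets()
    # Write one JSON object per line (NDJSON), a convenient layout for loading into Snowflake.
    with open("tickets.json", "w") as f:
        for ticket in all_tickets:
            f.write(json.dumps(ticket) + "\n")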

Step 2: Prepare the data

If some of the JSON data returned by Zendesk is nested rather than flat, additional tables may need to be created to accommodate it.

Special care must also be taken to ensure that the data coming out of Zendesk properly corresponds to its equivalent types in Snowflake. Specific information on Zendesk attributes and their data types can be found here https://developer.zendesk.com/rest_api/docs/support

Snowflake provides support for many widely used data types. Documentation on Snowflake data types and on preparing files for loading can be found in Snowflake’s “Preparing Your Data Files” guide.
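To make this preparation step concrete, here is a minimal Python sketch, assuming the tickets.json file produced in the previous step. It keeps a handful of flat attributes whose types map cleanly to Snowflake types (NUMBER, VARCHAR, TIMESTAMP_TZ) and splits the nested custom_fields list into a separate file destined for its own table; the exact columns shown are an illustrative assumption, and your own mapping will depend on which Zendesk attributes you need.

import json

def prepare_tickets(input_path="tickets.json",
                    tickets_out="tickets_flat.json",
                    custom_fields_out="ticket_custom_fields.json"):
    """Flatten raw Zendesk tickets into two NDJSON files: one row per ticket, one row per custom field."""
    with open(input_path) as src, \
         open(tickets_out, "w") as flat, \
         open(custom_fields_out, "w") as nested:
        for line in src:
            ticket = json.loads(line)
            # Keep a flat subset of attributes whose types map cleanly to Snowflake
            # (NUMBER, VARCHAR, TIMESTAMP_TZ).
            flat.write(json.dumps({
                "id": ticket["id"],                      # integer  -> NUMBER
                "subject": ticket.get("subject"),        # string   -> VARCHAR
                "status": ticket.get("status"),          # string   -> VARCHAR
                "created_at": ticket.get("created_at"),  # ISO 8601 -> TIMESTAMP_TZ
                "updated_at": ticket.get("updated_at"),
            }) + "\n")
            # Nested list attributes go to a separate file / table, keyed by ticket id.
            for field in ticket.get("custom_fields") or []:
                nested.write(json.dumps({
                    "ticket_id": ticket["id"],
                    "field_id": field.get("id"),
                    "value": field.get("value"),
                }) + "\n")

if __name__ == "__main__":
    prepare_tickets()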

Step 3: Load the data into Snowflake

We can load the data into Snowflake via SnowSQL (Snowflake’s command-line interface) using the COPY INTO SQL statement. The files can be loaded from Snowflake’s internal stages or from an external storage location such as Amazon S3, Google Cloud Storage or Microsoft Azure.

Loading the Data from Snowflake Stages:

We can use the PUT command to upload the data files into a Snowflake stage, and then use COPY INTO to load them into the target table (a worked sketch follows the list of commands below).

The following are commands to load the data from the respective stage types:

  • Created Internal Stage:

    COPY INTO TABLE1 FROM @Stage_Name
  • Table Stage Type:

    COPY INTO TABLE1 file_format = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE)
  • User Stage Type:

    COPY INTO TABLE1 FROM @~/staged file_format = (format_name = 'json_format')
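As a worked example of this internal-stage route, here is a hedged sketch using the snowflake-connector-python package. The connection parameters, file path, stage, file format, and table names are placeholder assumptions rather than anything prescribed by Snowflake or Zendesk; adapt them to your own account and schema.

import snowflake.connector

# Placeholder connection details - replace with your own account, user and warehouse.
conn = snowflake.connector.connect(
    account="your_account_identifier",
    user="your_user",
    password="your_password",
    warehouse="your_warehouse",
    database="your_database",
    schema="your_schema",
)
cur = conn.cursor()

# Illustrative objects: a JSON file format, an internal stage and the target table.
cur.execute("CREATE FILE FORMAT IF NOT EXISTS json_format TYPE = 'JSON'")
cur.execute("CREATE STAGE IF NOT EXISTS zendesk_stage")
cur.execute("""
    CREATE TABLE IF NOT EXISTS zendesk_tickets (
        id NUMBER,
        subject VARCHAR,
        status VARCHAR,
        created_at TIMESTAMP_TZ,
        updated_at TIMESTAMP_TZ
    )
""")

# PUT uploads the local file into the internal stage (the local path is a placeholder).
cur.execute("PUT file:///path/to/tickets_flat.json @zendesk_stage AUTO_COMPRESS=TRUE")

# COPY INTO loads the staged file, casting each JSON attribute to its column type.
cur.execute("""
    COPY INTO zendesk_tickets
    FROM (
        SELECT $1:id::NUMBER,
               $1:subject::VARCHAR,
               $1:status::VARCHAR,
               $1:created_at::TIMESTAMP_TZ,
               $1:updated_at::TIMESTAMP_TZ
        FROM @zendesk_stage/tickets_flat.json.gz
    )
    FILE_FORMAT = (FORMAT_NAME = 'json_format')
""")

cur.close()
conn.close()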

Loading from External Storage Locations:

You can either load directly from a specific storage location or create an external stage for the location and load from it. Loading from an external stage is easier because Snowflake encapsulates the required information, such as the credentials and location, so the stage can be referenced with a simple statement, for example: COPY INTO TABLE1 FROM @External_Stage_Name. The following is the code to load from Amazon S3, GCS and Microsoft Azure locations respectively (a sketch of the external-stage approach follows the list):

  • Amazon S3:

    COPY INTO TABLE1 FROM 's3://bucket'
    
    CREDENTIALS= (AWS_KEY_ID='YOUR AWS ACCESS KEY' AWS_SECRET_KEY='YOUR AWS SECRET ACCESS KEY')
    
    ENCRYPTION= (MASTER_KEY = 'YOUR MASTER KEY')
    
    FILE_FORMAT = (FORMAT_NAME = JSON_FORMAT)
  • GCS:

    COPY INTO TABLE1 FROM 'gcs://bucket'
    
    STORAGE_INTEGRATION = integration_name
    
    ENCRYPTION= (TYPE='GCS_SSE_KMS' KMS_KEY_ID='YOUR KMS KEY ID')
    
    FILE_FORMAT = (FORMAT_NAME = JSON_FORMAT)
  • Microsoft Azure:

    COPY INTO TABLE1 FROM 'azure://youraccount.blob.core.windows.net/container'
    
    STORAGE_INTEGRATION = integration_name
    
    ENCRYPTION= (MASTER_KEY = 'YOUR MASTER KEY')
    
    FILE_FORMAT = (FORMAT_NAME = JSON_FORMAT)
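For the external-stage route described above, a hedged Python sketch (reusing the cursor from the previous example) might look as follows. The storage integration s3_int, the bucket path, the stage name, and the target table are assumptions for illustration and must match objects configured in your own account.

# Reuses the cursor from the previous sketch; s3_int is an assumed, pre-configured storage integration.
cur.execute("""
    CREATE STAGE IF NOT EXISTS zendesk_s3_stage
    URL = 's3://your-bucket/zendesk/'
    STORAGE_INTEGRATION = s3_int
    FILE_FORMAT = (FORMAT_NAME = 'json_format')
""")

# With the stage in place, COPY only needs the stage name; credentials and location stay encapsulated.
cur.execute("""
    COPY INTO zendesk_tickets
    FROM (
        SELECT $1:id::NUMBER,
               $1:subject::VARCHAR,
               $1:status::VARCHAR,
               $1:created_at::TIMESTAMP_TZ,
               $1:updated_at::TIMESTAMP_TZ
        FROM @zendesk_s3_stage
    )
""")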

There are quite a few drawbacks associated with loading the data manually. Some of these are:

  • Real-time Limitations: Using this method does not allow for real-time data updates. A lot of extra code and cron jobs will have to be configured to enable this. 
  • Need for Engineering Bandwidth: This method is barely feasible for a small engineering team, as it demands engineering resources that many small teams simply do not have.
  • Maintenance: This method can return inaccurate data in the event that the Zendesk API is down or if there are any connectivity issues.
  • Hard to Perform Data Transformations: Additional code will need to be written to perform basic data transformations. This can make the process even more time-consuming.

Using Hevo Data to move your data from Zendesk to Snowflake ensures that you do not have to worry about the drawbacks that plague the manual process. Sign up for a free trial to vastly simplify your data migration.

 
