Transferring your data from Freshdesk to a Data Warehouse like Snowflake is a necessary first step if you want to generate advanced analytics. This blog will present 2 methods that will help you move your data from Freshdesk to Snowflake.

This article will provide an overview of both approaches so that you can weigh their pros and cons and select the method that best suits your use case.

Understanding Freshdesk


Freshdesk is a cloud-based helpdesk and customer service platform that helps you manage support-related communication across all channels. Freshdesk also creates reports from metrics on ticket volume, customer analysis, and agent performance. These reports can help you gain a better understanding of customers and team performance.

Freshdesk provides a lot of data on day-to-day support operations. This data can be accessed through Freshdesk’s REST API and loaded into the Snowflake data warehouse.

Understanding Snowflake 


Snowflake is a cloud-based data warehouse solution that uses the Software-as-a-Service (SaaS) model. Snowflake hosts its data on Amazon Web Services (S3), Microsoft Azure, and Google Cloud Platform. Snowflake has become known for its speed, scalability, and ease of use when compared to other data warehouse options. You can read more about Snowflake’s unique architecture here.

Understanding the Need to Load Data from Freshdesk to Snowflake

Freshdesk helps you understand how many customers are upset, how many customers have churned, whether issues are addressed on time, and so on. This information may come from Freshdesk tickets, emails, feedback forms, phone calls, social media, and other channels, which means separate data silos are created for each channel and region. To remove this problem, you can integrate all your customer-related data in one place and get a broader picture of it.

Methods to Move Data from Freshdesk to Snowflake

Method 1: Move Data from Freshdesk to Snowflake Using Manually Built ETL Scripts
This method requires you to write, schedule, and maintain your own ETL scripts that extract data from Freshdesk's REST API, stage it, and load it into Snowflake.

Method 2: Move Data from Freshdesk to Snowflake Using Hevo Data

Hevo can help you export data from Freshdesk to Snowflake in real-time, free of cost, without having to write a single line of code.

Get Started with Hevo for Free

With the help of Hevo’s pre-built integrations with 100+ sources such as Freshdesk, you can set up the whole data pipeline without any technical intervention and load data to Snowflake or a destination of your choice with ease.

Get started with Hevo today! Explore a 14-day free trial here!

Methods to Move Data from Freshdesk to Snowflake

Now you are going to learn about 2 methods you can use to move your data from Freshdesk to Snowflake:

Method 1: Move Data from Freshdesk to Snowflake Using Manually Built ETL Scripts

Connecting Freshdesk and Snowflake using manually built ETL scripts can be broken down into the following steps:

  1. Extracting the Data from Freshdesk
  2. Preparing and Mapping the Data
  3. Staging Process
  4. Loading the Data into Snowflake

Step 1: Extracting the Data from Freshdesk

The data can be extracted from Freshdesk by making simple calls to its REST API. The data will be returned in JSON format, as XML is no longer supported. A sample request may look like: GET https://domain.freshdesk.com/api/v2/tickets

Sample Result:

{
  "cc_emails" : ["user@cc.com"],
  "fwd_emails" : [ ],
  "reply_cc_emails" : ["user@cc.com"],
  "email_config_id" : null,
  "fr_escalated" : false,
  "group_id" : null,
  "priority" : 1,
  "requester_id" : 1,
  "responder_id" : null,
  "source" : 2,
  "spam" : false,
  "status" : 2,
  "subject" : "",
  "company_id" : 1,
  "id" : 20,
  "type" : null,
  "to_emails" : null,
  "product_id" : null,
  "created_at" : "2015-08-24T11:56:51Z",
  "updated_at" : "2015-08-24T11:59:05Z",
  "due_by" : "2015-08-27T11:30:00Z",
  "fr_due_by" : "2015-08-25T11:30:00Z",
  "is_escalated" : false,
  "description_text" : "Not given.",
  "description" : "<div>Not given.</div>",
  "custom_fields" : {
    "category" : "Primary"
  },
  "tags" : [ ],
  "requester": {
    "email": "test@test.com",
    "id": 1,
    "mobile": null,
    "name": "Rachel",
    "phone": null
  },
  "attachments" : [ ]
}

Additional information on the Freshdesk API can be found at: https://developers.freshdesk.com/api/#getting-started 
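To make this step concrete, here is a minimal Python sketch of the extraction, assuming the requests library and a hypothetical Freshdesk subdomain and API key (Freshdesk's v2 API accepts the API key as the Basic Auth username, with any placeholder string as the password):

import json
import requests

FRESHDESK_DOMAIN = "yourcompany"       # hypothetical subdomain
FRESHDESK_API_KEY = "your_api_key"     # hypothetical API key

def fetch_tickets(per_page=100, max_pages=10):
    """Page through /api/v2/tickets and return all tickets as a list."""
    tickets = []
    for page in range(1, max_pages + 1):
        resp = requests.get(
            f"https://{FRESHDESK_DOMAIN}.freshdesk.com/api/v2/tickets",
            auth=(FRESHDESK_API_KEY, "X"),
            params={"page": page, "per_page": per_page},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:          # an empty page means there is nothing left to fetch
            break
        tickets.extend(batch)
    return tickets

if __name__ == "__main__":
    # Write newline-delimited JSON so the file can be staged and loaded as-is later.
    with open("tickets_raw.json", "w") as f:
        for ticket in fetch_tickets():
            f.write(json.dumps(ticket) + "\n")

Other endpoints such as contacts, agents, and conversations can be extracted the same way by swapping the URL path.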

Step 2: Preparing and Mapping the Data

Some of the Freshdesk data may be nested and will need to be flattened before the data can be loaded. Additional tables may also need to be created. Freshdesk also provides support for many of the popular data types. Additional information on the types and their corresponding Freshdesk attribute names can be found here.

Special care must also be taken to make sure that the data types map properly into their corresponding types in Snowflake. Information on Snowflake data types can be found here: Preparing Your Data Files 
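As an illustration of the flattening, the nested requester and custom_fields objects from the sample ticket above can be promoted to top-level columns. This is a minimal Python sketch; the field names follow the sample response, and the mapping rules shown are illustrative rather than exhaustive:

import json

def flatten_ticket(ticket: dict) -> dict:
    """Flatten nested objects in a Freshdesk ticket into a single flat record."""
    # Keep scalar fields (ids, dates, flags) as they are.
    flat = {k: v for k, v in ticket.items() if not isinstance(v, (dict, list))}

    # Promote nested objects to prefixed top-level keys, e.g. requester_email.
    for key, value in (ticket.get("requester") or {}).items():
        flat[f"requester_{key}"] = value
    for key, value in (ticket.get("custom_fields") or {}).items():
        flat[f"custom_{key}"] = value

    # Lists such as tags or cc_emails could go to separate child tables;
    # here they are kept as JSON strings for simplicity.
    for key in ("tags", "cc_emails", "fwd_emails", "reply_cc_emails"):
        if key in ticket:
            flat[key] = json.dumps(ticket[key])
    return flat

if __name__ == "__main__":
    with open("tickets_raw.json") as src, open("tickets_prepared.json", "w") as dst:
        for line in src:
            dst.write(json.dumps(flatten_ticket(json.loads(line))) + "\n")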

Step 3: Staging Process

The data is ready to be staged after we have ensured that the data types are accurately mapped. 

There are two types of stages that can be created in Snowflake. These are:

  • Internal Stages 
  • External Stages

Each of these stages can be created using SQL code or via the Snowflake GUI. The SQL code has been included for the purposes of this blog.

The following are high-level steps outlined for creating the stages mentioned above:

Internal Stage
CREATE [ OR REPLACE ] [ TEMPORARY ] STAGE [ IF NOT EXISTS ] <internal_stage_name>
  [ FILE_FORMAT = ( { FORMAT_NAME = '<file_format_name>' | TYPE = { CSV | JSON | AVRO | ORC | PARQUET | XML } [ formatTypeOptions ] } ) ]
  [ COPY_OPTIONS = ( copyOptions ) ]
  [ COMMENT = '<string_literal>' ]
External Stage
  • Amazon S3:
CREATE STAGE "[Database Name]"."[Schema]"."[Stage Name]" URL='s3://<bucket/path>' CREDENTIALS=(AWS_KEY_ID='<your AWS key ID>' AWS_SECRET_KEY='<your AWS secret key>') ENCRYPTION=(MASTER_KEY='<master key if required>') COMMENT='[insert comment]'
  • Microsoft Azure:
CREATE STAGE "[Database Name]"."[Schema]"."[Stage Name]" URL='azure://<account>.blob.core.windows.net/<container>' CREDENTIALS=(AZURE_SAS_TOKEN='<your token>') ENCRYPTION=(TYPE='AZURE_CSE' MASTER_KEY='<master key if required>') COMMENT='[insert comment]'

There are other types of internal stages. However, these are generated automatically by Snowflake. The table stage is held within a table object and is best used when the staged data will be used exclusively for that specific table. The user stage is assigned to each user by the system; it cannot be altered or dropped and serves as a personal storage location.
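Whichever stage you choose, the extracted files still have to be uploaded into it before they can be loaded. Below is a minimal sketch of this step using the snowflake-connector-python package; the connection parameters, stage name, and file path are hypothetical placeholders. Note that PUT must be run from a client such as SnowSQL or the Python connector, not from the web UI worksheet:

import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",      # e.g. xy12345.us-east-1 (placeholder)
    user="your_user",
    password="your_password",
    warehouse="your_warehouse",
    database="your_database",
    schema="your_schema",
)
cur = conn.cursor()

# Create a named internal stage with a JSON file format.
cur.execute("""
    CREATE STAGE IF NOT EXISTS freshdesk_stage
      FILE_FORMAT = (TYPE = JSON)
""")

# Upload the prepared file from the local machine into the stage.
cur.execute("PUT file:///tmp/tickets_prepared.json @freshdesk_stage AUTO_COMPRESS=TRUE")

cur.close()
conn.close()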

Step 4: Loading the Data into Snowflake

To load the data into Snowflake, we use the COPY INTO DML statement through Snowflake’s SQL command-line interface, SnowSQL. Note that the FROM clause in your COPY INTO statement is optional when loading from a table stage, as Snowflake automatically checks that table’s stage for files. A scripted example using the Python connector is included after the per-stage commands below.

Internal Stages:
  • User Stage Type:
COPY INTO TABLE1 FROM @~/staged FILE_FORMAT=(FORMAT_NAME='json_format')
  • Table Stage Type:
COPY INTO TABLE1 FILE_FORMAT=(TYPE='JSON' STRIP_OUTER_ARRAY=TRUE)
  • Created Internal Stage:
COPY INTO TABLE1 FROM @Stage_name
External Stages:
  • Amazon S3

While you can load data directly from an Amazon S3 bucket, the preferred method is to first create an Amazon S3 external stage. The same applies to Microsoft Azure containers and GCP buckets.

COPY INTO TABLE1 FROM 's3://bucket' CREDENTIALS=(AWS_KEY_ID='YOUR AWS ACCESS KEY' AWS_SECRET_KEY='YOUR AWS SECRET ACCESS KEY') ENCRYPTION=(MASTER_KEY='YOUR MASTER KEY') FILE_FORMAT=(FORMAT_NAME=JSON_FORMAT)
  • Microsoft Azure
COPY INTO TABLE1 FROM 'azure://youraccount.blob.core.windows.net/container' STORAGE_INTEGRATION=integration_name ENCRYPTION=(MASTER_KEY='YOUR MASTER KEY') FILE_FORMAT=(FORMAT_NAME=JSON_FORMAT)
  • GCS
COPY INTO TABLE1 FROM 'gcs://bucket' STORAGE_INTEGRATION=integration_name ENCRYPTION=(MASTER_KEY='YOUR MASTER KEY') FILE_FORMAT=(FORMAT_NAME=JSON_FORMAT)
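Continuing the internal-stage example above, the COPY INTO step can also be scripted with the Python connector. The sketch below loads each staged JSON record into a single VARIANT column, which is one common pattern; the table and stage names are hypothetical, and you could instead map the flattened fields to typed columns designed in Step 2:

import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",      # placeholders; reuse the connection details from Step 3
    user="your_user",
    password="your_password",
    warehouse="your_warehouse",
    database="your_database",
    schema="your_schema",
)
cur = conn.cursor()

# One VARIANT column keeps the raw JSON queryable with Snowflake's semi-structured SQL.
cur.execute("CREATE TABLE IF NOT EXISTS freshdesk_tickets_raw (ticket VARIANT)")

# Each newline-delimited JSON record in the staged file becomes one row.
cur.execute("""
    COPY INTO freshdesk_tickets_raw
    FROM @freshdesk_stage
    FILE_FORMAT = (TYPE = JSON)
    ON_ERROR = 'CONTINUE'
""")

# A quick sanity check on the load.
cur.execute("SELECT COUNT(*) FROM freshdesk_tickets_raw")
print("Rows loaded:", cur.fetchone()[0])

cur.close()
conn.close()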

Understanding the Limitations of Using the Manual Method

Limitations of using custom code to connect Freshdesk and Snowflake are listed below:

  • Need Engineering Bandwidth: If you are a fast-growing organization with limited engineering resources, this can become a real bottleneck to get your Freshdesk data into Snowflake. The engineering team has its own priorities and targets and would need to realign its current priorities to accommodate these requests.
  • No Access to Real-time Data: Additional code and cron jobs are needed to replicate data when records are updated. This can be a slow, brittle, and time-consuming process.
  • Hard to Perform Data Transformations: Additional code will need to be written if you need to perform data transformations. This can make the process even more cumbersome.
  • Maintenance: This method will stop working if there are any changes to Freshdesk’s API or if their server is down. This can result in data loss. Hence, you would need to station engineering resources that can provide on-demand support. 

Method 2: Move Data from Freshdesk to Snowflake Using Hevo Data


Hevo is fully managed and completely automates the process of not only loading data from your desired source but also enriching the data and transforming it into an analysis-ready form without having to write a single line of code. Its fault-tolerant architecture ensures that the data is handled in a secure, consistent manner with zero data loss.

Sign up here for a 14-Day Free Trial!

It provides a consistent & reliable solution to manage data in real-time and always have analysis-ready data in your desired destination. It allows you to focus on key business needs and perform insightful analysis using various BI tools such as Power BI, Tableau, etc. 

Hevo focuses on two simple steps to move your data from Freshdesk to Snowflake:

  • Configure Source: Connect Hevo Data with Freshdesk by providing a unique name for your Pipeline, along with details about your authorized Freshdesk account. You can also choose the historical sync duration for your Freshdesk data.
  • Integrate Data: Complete Freshdesk to Snowflake migration by providing your destination name, account name, region of your account, database username and password, database and schema name, and the Data Warehouse name.

Check Out What Makes Hevo Amazing:

  • Secure: Hevo has a fault-tolerant architecture that ensures that the data is handled in a secure, consistent manner with zero data loss.
  • Schema Management: Hevo takes away the tedious task of schema management & automatically detects schema of incoming data and maps it to the destination schema.
  • Minimal Learning: Hevo, with its simple and interactive UI, is extremely simple for new customers to work on and perform operations.
  • Hevo Is Built To Scale: As the number of sources and the volume of your data grows, Hevo scales horizontally, handling millions of records per minute with very little latency.
  • Incremental Data Load: Hevo allows the transfer of data that has been modified in real-time. This ensures efficient utilization of bandwidth on both ends.
  • Live Support: The Hevo team is available round the clock to extend exceptional support to its customers through chat, email, and support calls.
  • Live Monitoring: Hevo allows you to monitor the data flow and check where your data is at a particular point in time.

Conclusion

In this blog, you learned about Freshdesk, Snowflake, and two different approaches to connect Freshdesk and Snowflake. You can manually build your ETL scripts and load your data from Freshdesk to Snowflake, but the manual method has a lot of limitations. These limitations can easily be overcome by using a ready platform like Hevo that works out of the box.

Visit our Website to Explore Hevo

Hevo is a No-code Data Pipeline that transfers data from Freshdesk to Snowflake for free. It will ensure that you get data that is both reliable and real-time. Sign up for a free trial to simplify your data transfer process from Freshdesk to Snowflake.

Sign Up for a 14-day free trial to try Hevo for free. You can also have a look at the unbeatable pricing that will help you choose the right plan for your business needs.

Share your experience of loading data from Freshdesk to Snowflake in the comment section below.

Rashid Y
Technical Content Writer, Hevo Data

Rashid is a technical content writer with a passion for the data industry. Leveraging his problem-solving skills, he delivers informative and engaging content on data science. With a deep understanding of complex data concepts and a talent for clear, compelling communication, Rashid creates content that informs and captivates his audience.
