Business intelligence (BI) tools like Google Data Studio, Power BI, and Tableau help organizations understand data and make effective business decisions.

However, BI tools alone cannot effectively map your customers’ journeys or build user profiles from engagement on websites and applications.

Yet user profiles are essential for an in-depth analysis of your customers and their experience with your software solutions.

With Amplitude, user profiling becomes easy thanks to features like Retention and Cohorts. You can pair Amplitude with data sources like Snowflake for identity resolution, data governance, and better analysis.

This article talks about 2 easy methods of Amplitude Snowflake Integration.

So let’s get started!

Prerequisites

  • Basic understanding of BI tools
  • An active Amplitude account (recurring exports require a paid plan)
  • An active Snowflake account
  • A Hevo account (for Method 1)

Amplitude to Snowflake Migration Methods

These are the methods you can use to migrate data from Amplitude to Snowflake:

Method 1: Amplitude To Snowflake Connection Using Hevo

Hevo provides a truly efficient and fully automated solution to manage data in real time and always have analysis-ready data in your data warehouse.

Using Hevo Data, you can connect Amplitude to Snowflake in the following 2 steps:

  • Step 1: Configuring Amplitude Analytics as a Source
    • Log in to your Amplitude account.
    • Navigate to “Settings” from the left navigation menu.
    • In the “Org Settings”, click on the “Projects” option.
    • Select the project whose data you want to export to Snowflake.
    • Copy the “API Key” and “Secret Key” from the project details.
    • Log in to your Hevo account.
    • Click on the “PIPELINES” option in the “Asset Palette”.
    • Click on the “+ Create” button in the “Pipelines List View”.
    • Select the “Amplitude Analytics” source on the “Select Source Type” page.
    • This opens the “Configure your Amplitude Analytics Source” page; provide the required details for the Amplitude Snowflake Integration there.
    • Click on “TEST & CONTINUE” to proceed and set up the destination.
  • Step 2: Configuring Snowflake as a Destination: Establish a connection to Snowflake by providing information about your Snowflake account and its credentials, such as the Database Name, Username, and Password. A sketch of the Snowflake-side setup such a destination typically needs follows below.
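Hevo’s UI handles the pipeline itself, but the Snowflake side needs a warehouse, database, and user for the destination to point at. Below is a minimal provisioning sketch, assuming you want dedicated objects for the pipeline; every name in it (HEVO_WH, HEVO_DB, HEVO_USER, HEVO_ROLE) is a hypothetical placeholder, and your sizing and privilege choices may differ.

```sql
-- Minimal sketch: provision dedicated Snowflake objects for the pipeline.
-- All names below are hypothetical placeholders; substitute your own.
CREATE WAREHOUSE IF NOT EXISTS HEVO_WH
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60;  -- suspend after 60s idle to save credits
CREATE DATABASE IF NOT EXISTS HEVO_DB;
CREATE ROLE IF NOT EXISTS HEVO_ROLE;
CREATE USER IF NOT EXISTS HEVO_USER
  PASSWORD = '<strong-password>'
  DEFAULT_ROLE = HEVO_ROLE
  DEFAULT_WAREHOUSE = HEVO_WH;
GRANT USAGE ON WAREHOUSE HEVO_WH TO ROLE HEVO_ROLE;
GRANT USAGE, CREATE SCHEMA ON DATABASE HEVO_DB TO ROLE HEVO_ROLE;
GRANT ROLE HEVO_ROLE TO USER HEVO_USER;
```

You would then enter the database name, username, and password from this setup into Hevo’s destination form.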
Reliably integrate data with Hevo’s Fully Automated No Code Data Pipeline

If you are anything like the 1000+ data-driven companies that use Hevo, more than 70% of the business apps you use are SaaS.

Integrating the data from these sources in a timely way is crucial to fuel analytics and the decisions that depend on it.

But given how fast API endpoints can change, creating and managing these pipelines can be a soul-sucking exercise.

Hevo’s no-code data pipeline platform lets you connect 150+ sources in a matter of minutes and deliver data in near real-time to your warehouse.

Moreover, the in-built transformation capabilities and the intuitive UI mean that even non-engineers can set up pipelines and achieve analytics-ready data in minutes.

This, combined with transparent pricing and 24×7 support, makes us the most loved data pipeline software in terms of user reviews.

Take our 14-day free trial to experience a better way to manage data pipelines.

Get started for Free with Hevo!

Method 2: Using Custom Code to Move Data from Amplitude to Snowflake

To set up Amplitude Snowflake Integration using custom code, you have to implement the following steps:

Amplitude Snowflake Integration: Export Amplitude Data to Snowflake

Amplitude users can load their Amplitude event data into their Snowflake account by setting up a recurring sync of their Amplitude data from the Amplitude UI. Users can also manually initiate a one-time sync of their historical data. A sync here is a scheduled export that keeps the data in Snowflake consistent with Amplitude. Note that this feature is available only to paid users; free trials do not include it.

Amplitude Snowflake Integration: Set Up a Recurring Data Export to Snowflake 

Setting up a recurring export is the core step of the Amplitude Snowflake Integration. It is a simple process, and each sync typically completes within 10 to 15 minutes.

Follow the steps below to set up a recurring export of your Amplitude data to Snowflake. The steps assume you are logged in to Amplitude as a paid user.

  • Step 1: Navigate to the Data Destinations tab.
  • Step 2: Click on the Snowflake panel under the ‘Add More Destinations’ tab. It will open the Export Data to Snowflake page.
  • Step 3: Under the Export Data to Snowflake tab, select the data you want to export to Snowflake. You can export event data, merged Amplitude IDs, or both.
  • Step 4: Review the Event Table and Merge IDs Table schemas and click on Next. It will open the ‘Setup Export’ tab.
  • Step 5: Enter the following details in the ‘Snowflake Credentials For Amplitude’ section.

Account name: The name of your Snowflake account. It is the first part of your Snowflake URL, i.e., the portion before ‘snowflakecomputing.com’.

Warehouse: It is the warehouse that Amplitude will use to load data.

Database: It is the database where the data will be stored. 

Username: It is the username that Amplitude uses to connect to the Snowflake account.

Password: It is the password associated with the username.

Note: Credentials are case-sensitive, so enter them carefully.

  • Step 6: Amplitude will dynamically generate a query that creates the required Snowflake objects; it appears to the right of the Snowflake credentials section. Copy that query, paste it into your Snowflake account, and run it. A sketch of what such auto-generated SQL typically looks like follows these steps.
  • Step 7: Click on Next. Amplitude will try to upload test data using the above credentials. If the upload is successful, click on Finish.
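The exact statements are generated in the Amplitude UI and should be copied from there. As a rough, hypothetical illustration, such a query usually creates a user and role for Amplitude and grants write access to the target database; all names below are placeholders.

```sql
-- Illustrative only: copy the real statements from the Amplitude UI.
-- AMPLITUDE_ROLE, AMPLITUDE_USER, AMPLITUDE_WH, and AMPLITUDE_DB are
-- hypothetical placeholder names.
CREATE ROLE IF NOT EXISTS AMPLITUDE_ROLE;
CREATE USER IF NOT EXISTS AMPLITUDE_USER
  PASSWORD = '<password>'
  DEFAULT_ROLE = AMPLITUDE_ROLE;
GRANT ROLE AMPLITUDE_ROLE TO USER AMPLITUDE_USER;
GRANT USAGE ON WAREHOUSE AMPLITUDE_WH TO ROLE AMPLITUDE_ROLE;
GRANT USAGE ON DATABASE AMPLITUDE_DB TO ROLE AMPLITUDE_ROLE;
GRANT USAGE, CREATE TABLE ON SCHEMA AMPLITUDE_DB.PUBLIC TO ROLE AMPLITUDE_ROLE;
```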

After the 7th step, all future events will automatically be sent to Snowflake. Once the export is live, you can run a quick query in Snowflake to confirm that data is arriving; a hedged example follows.
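This is a minimal sanity check, assuming the export has had time to run. The fully qualified table name below is hypothetical: Amplitude creates project-specific tables, so look up the actual names in your database first.

```sql
-- Hypothetical names: look up the actual table Amplitude created in
-- your database before running this.
SELECT COUNT(*) AS rows_loaded
FROM AMPLITUDE_DB.PUBLIC.EVENTS_12345;
```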

Amplitude Snowflake Integration: Exporting Historical Amplitude Data to Snowflake

  • Step 1: Click on Export Data and select a date range to export your historical data from Amplitude into Snowflake.
  • Step 2: After you confirm the date range, the export runs automatically. It can take anywhere from a day to several weeks, depending on your data size, warehouse size, cluster count, network bandwidth, and the number of concurrent historical data exports.

Amplitude Snowflake Integration: Connect Amplitude to Snowflake

Amplitude Snowflake Integration also enables you to ingest Snowflake data directly into your Amplitude project.

Depending on your Snowflake network policy, you may need to allowlist the following IP addresses so that Amplitude’s servers can access your Snowflake instance (a sketch of the corresponding network-policy SQL follows the list).

  • 52.33.3.219
  • 35.162.216.242
  • 52.27.10.221
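If your account enforces a network policy, one way to admit these addresses is sketched below. AMPLITUDE_ACCESS and AMPLITUDE_USER are hypothetical names; if you already have a policy in force, add these IPs to its existing ALLOWED_IP_LIST rather than creating a new one.

```sql
-- Sketch: allow Amplitude's servers through a Snowflake network policy.
-- AMPLITUDE_ACCESS and AMPLITUDE_USER are hypothetical names; merge these
-- IPs into any policy you already enforce instead of replacing it.
CREATE NETWORK POLICY AMPLITUDE_ACCESS
  ALLOWED_IP_LIST = ('52.33.3.219', '35.162.216.242', '52.27.10.221');
ALTER USER AMPLITUDE_USER SET NETWORK_POLICY = 'AMPLITUDE_ACCESS';
```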

Amplitude Snowflake Integration: Adding Snowflake as a Data Source

Follow the below steps to add Snowflake as a data source for Amplitude Snowflake Integration.

  • Step 1: Navigate to the Sources section and ensure that you have selected the correct project from the dropdown list. Click on the ‘I want to import data into Amplitude’ tab and then click on Snowflake.
  • Step 2: Amplitude will launch the configuration wizard, where you will enter the required credentials for the Snowflake instance.

Account: The name of your Snowflake account.

Database: The name of the database where Amplitude can find the data.

Warehouse: The warehouse Amplitude will use to load the data.

Username: The username Amplitude uses for authentication.

Password: The password Amplitude uses for authentication.

  • Step 3: Copy and run the auto-generated SQL code on the right side of the page to give Amplitude the proper access to your Snowflake instance.
  • Step 4: After running the code, click Next to test the connection. If all the credentials were entered correctly, the test will succeed.
  • Step 5: Click Next to move on to the data selection stage.
  • Step 6: You have several configuration options to choose from here:
    • Type of data: Tells Amplitude whether you will be ingesting event data or user property data.
    • Type of import:
      • Full Sync: Amplitude will periodically import the entire dataset, even if the data has already been imported. It is recommended for datasets whose rows change over time.
      • Time-based: Amplitude will periodically ingest the most recent rows in the data, as determined by the Timestamp column. The first import will bring in all available data, while subsequent imports will ingest only data with timestamps after the most recent import. To make this work, you need to include the timestamp column in the output of your SQL statement.
    • Frequency: You can choose from several scheduling options, ranging from five minutes to one month.
    • SQL query: The SQL query that Amplitude runs to determine which data to ingest (see the sketch after these steps).
  • Step 7: After setting the configuration options, click on Test SQL to see how the data is coming through from your Snowflake instance. If there are errors, they will appear below the Test SQL button.
  • Step 8: If there are no errors, click on Finish. You will see a notification indicating that you have successfully established a Snowflake source, and you will then be directed to the Sources listing page, where the newly created Snowflake source appears.
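For reference, a Time-based import query might look like the sketch below. The RAW_EVENTS table and its columns are hypothetical, and the exact output columns Amplitude expects are listed in the configuration wizard, so match your aliases to those.

```sql
-- Sketch of a Time-based import query. RAW_EVENTS and its columns are
-- hypothetical; alias your output to the columns the Amplitude wizard
-- asks for.
SELECT
    USER_ID       AS user_id,
    EVENT_NAME    AS event_type,
    EVENT_TS      AS "timestamp",      -- Time-based imports key off this column
    EVENT_PAYLOAD AS event_properties  -- JSON string of event attributes
FROM ANALYTICS_DB.PUBLIC.RAW_EVENTS;
```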

Frequently Asked Questions (FAQs)

In which environment is analysis done after connecting Amplitude and Snowflake?

Thanks to the Amplitude Snowflake ETL, you can access and analyze richer behavioral data with the help of Amplitude within the Snowflake environment.

Can charts be created after the integration between Amplitude and Snowflake?

The “Create Chart” dropdown menu can be found in the top-right corner of the navigation bar in Amplitude and is part of the native Amplitude experience. Analyses built on Snowflake data can be saved, shared, and added to dashboards like any other chart.

What metrics are verified by Amplitude while transferring events?

Amplitude verifies the event ID, client event time, and device ID for each event ingested from Snowflake. If the event is already in the database, it is dropped; otherwise, Amplitude writes it. If you use the Amplitude HTTP API, including an insert_id field is advised.

What are the use cases of Amplitude Snowflake integration?

  1. Provide direct access to specific behavioral product data sets as a paid service, so data consumers can supplement their existing data.
  2. Collaborate with outside parties by exchanging data between internal and external business partners, removing obstacles to streamlined processes, better customer service, and optimized spending.

Conclusion

In this article, you learned how to integrate Amplitude with Snowflake, with a focus on exporting data from and importing data into Amplitude. Besides connecting Amplitude with Snowflake accounts, you can also explore connecting Amplitude with tools like Amazon Redshift, Looker, Gatsby, Segment, and more.

Due to its superior features and connectivity, several companies, including Microsoft, PayPal, Autodesk, Twitter, Cisco Systems, and Accenture, use Amplitude in their day-to-day operations.

Hevo Data is an Official Snowflake Partner. It is a No-Code Data Pipeline that offers a faster way to move data from 150+ Data Sources into your Data Warehouse, such as Snowflake. Hevo is fully automated and hence does not require you to code.

VISIT OUR WEBSITE TO EXPLORE HEVO

Want to take Hevo for a spin?

SIGN UP and experience the feature-rich Hevo suite first hand. You can also have a look at the unbeatable pricing that will help you choose the right plan for your business needs.

Share your experience with Amplitude Snowflake Connector in the comments section below!

Manjiri Gaikwad
Freelance Technical Content Writer, Hevo Data

Manjiri loves data science and produces insightful content on AI, ML, and data science. She applies her flair for writing to simplify the complexities of data integration and analysis, solving problems faced by data professionals and businesses in the data industry.
