Unlock the full potential of your Amplitude data by integrating it seamlessly with Snowflake. With Hevo’s automated pipeline, get data flowing effortlessly—watch our 1-minute demo below to see it in action!
Business intelligence (BI) tools like Google Data Studio, Power BI, and Tableau help organizations understand data and make effective business decisions. However, BI tools alone cannot effectively map your customers’ journeys or build user profiles from engagement on websites and applications.
User profiles are essential for an in-depth analysis of your customers and their experience with your software solutions. With Amplitude, user profiling becomes easy thanks to features like Retention and Cohorts. You can pair Amplitude with a data warehouse like Snowflake for identity resolution, data governance, and deeper analysis.
This article talks about 2 easy methods of Amplitude Snowflake Integration.
Prerequisites
- An active Amplitude account (a paid plan is required for recurring exports)
- An active Snowflake account
- A Hevo account (for Method 1)
- Basic understanding of BI tools
Amplitude to Snowflake Migration Methods
These are the methods you can use to migrate data from Amplitude to Snowflake:
Method 1: Amplitude To Snowflake Connection Using Hevo
Hevo provides a truly efficient and fully automated solution to manage data in real time and always have analysis-ready data in your data warehouse.
Using Hevo Data, you can connect Amplitude to Snowflake in the following 2 steps:
- Step 1: Configuring Amplitude Analytics as a Source
- Log in to your Amplitude account.
- Navigate to “Settings” from the left navigation menu.
- Under “Org Settings”, click on the “Projects” option.
- Select the project whose data you want to export to Snowflake.
- Copy the “API Key” and “Secret Key” from the project details.
- Log in to your Hevo account.
- Click on the “PIPELINES” option from the “Asset Palette”.
- Click on the “+ Create” button from the “Pipelines List View”.
- Select the “Amplitude Analytics” source from the “Select Source Type” page.
- It will open the “Configure your Amplitude Analytics Source” page, where you provide the required details for the Amplitude Snowflake Integration.
- Click on “TEST & CONTINUE” to proceed and set up the destination.
- Step 2: Configuring Snowflake as a Destination: Establish a connection to Snowflake by providing information about your Snowflake Account and its credentials, such as Database Name, Username, and Password.
Method 2: Using Custom Code to Move Data from Amplitude to Snowflake
Using Custom Code for Amplitude Snowflake Integration, you have to implement the following:
Export Amplitude Data to Snowflake
Amplitude users can load their Amplitude event data into their Snowflake account by setting up a recurring sync from the Amplitude UI. Users can also manually initiate a sync of their historical data. A sync is a synchronization between databases that keeps them consistent. This feature is available only to paid users; free trial accounts do not have it.
Set Up a Recurring Data Export to Snowflake
This is an essential step in the Amplitude Snowflake Integration.
It is a simple, step-by-step process, and each sync completes within 10 to 15 minutes.
Follow the steps below to set up a recurring export of your Amplitude data to Snowflake. These steps assume you are logged in to Amplitude as a paid user.
- Step 1: Navigate to the Data Destination tab, as shown below.
- Step 2: Click on the Snowflake panel under the ‘Add More Destinations’ tab. It will open the Export Data to Snowflake page.
- Step 3: Under the Export Data to Snowflake tab, select the data you want to export to Snowflake. You can export data like event data, merged Amplitude IDs, or even both.
- Step 4: Review the Event Table and Merge IDs table schemas and click on Next. It will open the ‘Setup Export’ tab.
- Step 5: Enter the following details in the ‘Snowflake Credentials For Amplitude’ section.
Account name: It is the account name of your Snowflake account. It is the first part of your URL, i.e., the portion before ‘snowflakecomputing.com’.
Warehouse: It is the warehouse that Amplitude will use to load data.
Database: It is the database where the data will be stored.
Username: It is the username that Amplitude uses to connect to the Snowflake account.
Password: It is the password associated with the username.
Note: Credentials are case-sensitive, so enter them carefully.
- Step 6: Amplitude will dynamically generate a query that creates the required Snowflake objects; it appears to the right of the Snowflake credentials section. Copy that query, paste it into your Snowflake account, and run it (a representative sketch of such a setup script is shown after these steps).
- Step 7: Click on Next. Amplitude will try to upload the test data using the above credentials. If the upload is successful, click on Finish.
After the seventh step, all future events will automatically be sent to Snowflake.
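The exact setup query is generated for you by Amplitude in Step 6, so use that one. Purely as an illustration, a script of this kind typically creates a dedicated role, user, warehouse grant, and database grants so Amplitude can write data. Every object name and the password below are placeholders, not the values Amplitude will output.

```sql
-- Illustrative sketch only: Amplitude generates the real script in Step 6.
-- Every object name and the password here is a placeholder.
CREATE ROLE IF NOT EXISTS AMPLITUDE_ROLE;
CREATE USER IF NOT EXISTS AMPLITUDE_USER
  PASSWORD = '<strong-password>'
  DEFAULT_ROLE = AMPLITUDE_ROLE;
GRANT ROLE AMPLITUDE_ROLE TO USER AMPLITUDE_USER;

-- Give Amplitude the warehouse and database it will load data into.
GRANT USAGE ON WAREHOUSE AMPLITUDE_WH TO ROLE AMPLITUDE_ROLE;
GRANT USAGE ON DATABASE AMPLITUDE_DB TO ROLE AMPLITUDE_ROLE;
GRANT USAGE, CREATE TABLE, CREATE STAGE ON SCHEMA AMPLITUDE_DB.PUBLIC TO ROLE AMPLITUDE_ROLE;
```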
Exporting Historical Amplitude Data to Snowflake
- Step 1: Click on Export Data and select a date range to export your historical data from Amplitude into Snowflake.
- Step 2: After you select the date range, the export can take anywhere from a day to several weeks, depending on your data size, warehouse size, cluster count, network bandwidth, and the number of concurrent historical data exports. Once it finishes, a quick check in Snowflake (sketched below) confirms that the backfill landed.
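A simple count by day is enough to verify that historical events arrived. The table and column names below are placeholders; substitute the objects Amplitude actually created in your database.

```sql
-- Placeholder names: replace AMPLITUDE_DB.PUBLIC.EVENTS and event_time with the
-- table and timestamp column Amplitude created in your database.
SELECT DATE_TRUNC('day', event_time) AS event_day,
       COUNT(*)                      AS events_loaded
FROM AMPLITUDE_DB.PUBLIC.EVENTS
GROUP BY event_day
ORDER BY event_day;
```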
Connect Amplitude to Snowflake
The Amplitude Snowflake integration also lets you ingest Snowflake data directly into your Amplitude project.
Depending on your Snowflake network policy, you may need to allow the following IP addresses so that Amplitude’s servers can access your Snowflake instance (see the example after this list).
- 52.33.3.219
- 35.162.216.242
- 52.27.10.221
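If your Snowflake account enforces a network policy, these addresses have to be on its allowed list. The sketch below assumes an existing policy named my_existing_policy and is an illustration, not a drop-in script; keep your own ranges in the list, or other clients will be locked out.

```sql
-- Add Amplitude's addresses to an existing policy's allowed list.
-- 'my_existing_policy' and the first entry are placeholders for your own setup.
ALTER NETWORK POLICY my_existing_policy SET
  ALLOWED_IP_LIST = ('<your existing CIDR ranges>',
                     '52.33.3.219',
                     '35.162.216.242',
                     '52.27.10.221');
```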
Adding Snowflake as a Data Source
Follow the below steps to add Snowflake as a data source for Amplitude Snowflake Integration.
- Step 1: Navigate to the Sources section and ensure that you have selected the correct project from the dropdown list. Click on the ‘I want to import data into Amplitude’ tab and click on Snowflake.
- Step 2: Amplitude will launch the configuration wizard, where you will enter the required credentials for the Snowflake instance.
Account: It is the name of your Snowflake account.
Database: It is the database name where Amplitude can find data.
Warehouse: It is used by Amplitude to load the data.
Username: It is used by Amplitude for authentication.
Password: It is used by Amplitude for authentication.
- Step 3: Copy and run the auto-generated SQL code on the right side of the page to give Amplitude the proper access to your Snowflake instance.
- Step 4: After running the code, click Next to test the connection. You will see the page below if all the credentials are entered correctly.
- Step 5: Click Next to move on to the data selection stage.
- Step 6: You have several configuration options to choose from here.
- Type of data: It is used to tell Amplitude whether you will be ingesting event data or user property data.
- Type of import:
- Full Sync: In a Full Sync import, Amplitude will periodically import the entire dataset, even if the data has already been imported. It is recommended for datasets where rows change over time.
- Time-based: In a Time-based import, Amplitude will periodically ingest the most recent rows in the data, as determined by the Timestamp column. The first import will bring in all available data, and subsequent imports will ingest any data with timestamps after the most recent import. To make this work, you need to include the timestamp column in the output of your SQL statement.
- Frequency: You can choose from several scheduling options from five minutes to one month.
- SQL query: It is the SQL query that Amplitude runs to determine which data to ingest (a sketch of a typical time-based import query is shown after these steps).
- Step 7: After setting the configuration options, click on Test SQL to see how the data is coming through from your Snowflake instance. If there are errors, they will appear below the Test SQL button.
- Step 8: If there are no errors, click on Finish. You will see a notification indicating that you have successfully established a Snowflake source. You will then be directed to the Sources listing page, where you will see the newly created Snowflake source.
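For reference, here is a minimal sketch of what a time-based import query (Step 6) might look like. The source table RAW_EVENTS and its columns are hypothetical, and the exact output columns Amplitude expects should be confirmed in Amplitude’s documentation; the point is that the query returns the event fields plus the timestamp column used for incremental ingestion.

```sql
-- Hypothetical source table and columns; adjust to your own schema and to the
-- output columns Amplitude expects for event imports.
SELECT
  user_id,
  event_name                              AS event_type,
  DATE_PART(epoch_millisecond, event_ts)  AS time,              -- event time in epoch milliseconds
  OBJECT_CONSTRUCT('plan', plan_name)     AS event_properties,  -- example JSON properties
  event_ts                                AS timestamp_column   -- drives the time-based incremental sync
FROM ANALYTICS_DB.PUBLIC.RAW_EVENTS;
```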
Discover how to integrate Amplitude with HubSpot and leverage data to drive better marketing strategies and performance.
Frequently Asked Questions (FAQs)
In which environment is analysis done after connecting Amplitude and Snowflake?
Thanks to the Amplitude Snowflake ETL, you can access and analyze Amplitude’s richer behavioral data within the Snowflake environment.
Can charts be created after the integration between Amplitude and Snowflake?
Yes. The “Create Chart” dropdown menu can be found in the top-right corner of the navigation bar in Amplitude and is part of the native Amplitude experience. Analyses built on Snowflake data in Amplitude can be saved, shared, and added to dashboards like any other chart.
What metrics are verified by Amplitude while transferring events?
Amplitude verifies the event ID, client event time, and device ID for each event from Snowflake. If the event is already in the database, it is dropped; otherwise, Amplitude writes it. Including an insert_id field is advised if you use the Amplitude HTTP API.
What are the use cases of Amplitude Snowflake integration?
- Provide data consumers with direct access to specific behavioral product datasets as a paid service, supplementing their existing data.
- Collaborate with outside parties by exchanging data between internal and external business partners, removing obstacles to streamline processes, improve customer service, and optimize spending.
Conclusion
In this article, you learned how to integrate Amplitude with Snowflake, with a focus on exporting data from Amplitude to Snowflake and importing Snowflake data into Amplitude. Besides Snowflake, you can also explore connecting Amplitude with tools like Amazon Redshift, Looker, Gatsby, Segment, and more.
See how connecting Amplitude to Redshift can optimize your analytics process using these two simple methods.
Due to its superior features and connectivity, several companies, including Microsoft, PayPal, Autodesk, Twitter, Cisco Systems, and Accenture, use Amplitude in their day-to-day operations.
Share your experience with Amplitude Snowflake Connector in the comments section below!
Manjiri is a proficient technical writer and a data science enthusiast. She holds an M.Tech degree and leverages the knowledge acquired through that to write insightful content on AI, ML, and data engineering concepts. She enjoys breaking down the complex topics of data integration and other challenges in data engineering to help data professionals solve their everyday problems.