Azure Data Factory (ADF) is Microsoft's fully managed, cloud-based data integration service. It lets you build data pipelines that orchestrate data movement and transformation across your cloud workflows.

Snowflake is a fully managed SaaS (Software-as-a-Service) platform that offers a cloud-based data warehouse. It provides multi-cloud support and can be hosted on Google Cloud, AWS, and Microsoft Azure. Snowflake uses virtual warehouses, which are clusters of compute resources, to perform data processing tasks.

Both platforms are highly efficient services for data management and analytics. Integrating Azure Data Factory with Snowflake enhances data storage, scalability, and security, and enables robust data management, processing, governance, and cost optimization.

This article explains the Azure Data Factory Snowflake connection for effective data integration, management, and analytics. 

What is the Difference Between Snowflake, Azure Data Warehouse, and Azure Data Factory? 

Azure Data Warehouse (formally Azure SQL Data Warehouse) is now Azure Synapse Analytics. Here is a quick Snowflake vs. Azure Data Factory vs. Azure Data Warehouse comparison:

| Features | Snowflake | Azure Data Warehouse (Azure Synapse Analytics) | Azure Data Factory |
| --- | --- | --- | --- |
| Service | A SaaS (Software-as-a-Service) data warehouse. | A cloud analytics service offered by Microsoft Azure. | A hybrid data integration service within the Azure ecosystem. |
| Scalability | Facilitates auto-scaling, with storage and compute scaled separately. | Facilitates on-demand scaling for storage and compute resources. | Offers on-demand scaling to handle increased data workloads. |
| Data Analytics | Provides effective analytics through integration with various other platforms; these services incur additional costs. | Integrates with other Azure analytics tools, such as Synapse Studio, Power BI, and Azure Machine Learning, without additional charges. | Integrates with the analytics services of the Azure ecosystem, such as Azure Synapse, Power BI, or Synapse Studio. |
| Data Backup | As an alternative to backup, offers a Fail-safe feature that recovers lost data for up to 7 days. | Uses a built-in backup feature for data recovery. | Relies on Azure Resource Manager (ARM) templates or Azure DevOps for backup. |
| Costs | Charges according to usage, with separate charges for storage, compute, and data transfer. | Offers a pay-as-you-go model with the flexibility to use and pay for resources as required. | Provides a pay-as-you-go model, so you pay only for the resources you use for data integration. |

Does Snowflake Integrate with Azure?

Snowflake partners with the Azure ecosystem and can leverage its services, such as Azure Synapse Analytics, Azure Data Factory, Azure OpenAI, and Azure ML. Azure Data Factory and Snowflake integration enables you to combine the strengths of both platforms and gain useful insights from your enterprise data.

Azure Data Factory supports complex data transformations and allows you to orchestrate, schedule, and automate data pipelines before loading data into Snowflake. Snowflake's storage, processing, machine learning, Snowpark, and Snowsight features make it ideal for large-scale enterprise analytics applications. Thus, with Snowflake and Azure, you can effectively carry out data storage, processing, and analytics.

Snowflake Connector for Azure Data Factory

The native Azure Data Factory Snowflake connector supports the following three types of activities: 

  • Copy Activity: The Copy activity is the primary activity in an Azure Data Factory pipeline. It copies data from one data store (called the source) to another (called the sink) and offers more than 90 connectors as data sources. It enables you to use Snowflake as either a source or a sink when transferring data. 
  • Lookup Activity: The Lookup activity enables you to read metadata from the data source’s files and tables and is used to build dynamic, metadata-driven pipelines. It can also call stored procedures, but using it to modify data is not recommended; call an Azure Data Factory Snowflake stored procedure through the Script activity instead.
  • Script Activity: The Script activity enables you to run SQL commands against Snowflake, including data manipulation language (DML) and data definition language (DDL) statements and stored procedure calls. These capabilities facilitate data transformation and help you build efficient data pipelines with Snowflake, as the sketch after this list shows.
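
To make the Script activity concrete, here is a minimal, hedged sketch of an activity definition that calls a stored procedure in Snowflake. The linked service name (SnowflakeLinkedService) and the procedure (refresh_sales_summary) are hypothetical placeholders, not objects referenced elsewhere in this article:

```json
{
    "name": "CallSnowflakeProcedure",
    "type": "Script",
    "linkedServiceName": {
        "referenceName": "SnowflakeLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "scripts": [
            {
                "type": "NonQuery",
                "text": "CALL refresh_sales_summary();"
            }
        ]
    }
}
```

Setting "type" to "NonQuery" tells the activity not to expect a result set; use "Query" instead when the script returns rows that downstream activities should consume.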

You can create Snowflake as a linked service in Azure Data Factory and harness these activities for seamless data movement and transformation. 
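
If you prefer authoring linked services in JSON rather than through the UI, the definition typically takes a shape like the sketch below. This is based on the connection-string style of the Snowflake connector; every angle-bracketed value is a placeholder to replace with your own account details (in practice, store the password in Azure Key Vault rather than inline):

```json
{
    "name": "SnowflakeLinkedService",
    "properties": {
        "type": "Snowflake",
        "typeProperties": {
            "connectionString": "jdbc:snowflake://<account>.snowflakecomputing.com/?user=<user>&password=<password>&db=<database>&warehouse=<warehouse>&role=<role>"
        }
    }
}
```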

How to Copy Data to Snowflake From Azure Data Factory? 

[Image: Azure Data Factory Snowflake: Data Workflow]

You can write data to Snowflake from Azure Data Factory in the following ways:

  • Direct copy to Snowflake
  • Staged copy to Snowflake 
  • Using REST API 
  • Using Private Endpoint

Direct Copy to Snowflake

If your source data meets Snowflake’s data format criteria, you can copy it directly into Snowflake. Ensure that the following prerequisites are fulfilled:

  • The source should be an Azure Blob Storage linked service that uses shared access signature (SAS) authentication. 
  • The source data should be in Parquet, delimited text, or JSON format and meet certain additional conditions.

To create a Snowflake linked service in Azure Data Factory, you can follow the steps below:

  • Log in to your Azure account. Go to the Manage tab in your Azure Data Factory workspace and click Linked Services > New. 
[Image: Azure Data Factory Snowflake: Setting up Snowflake as a Linked Service in ADF]

  • Then, search for Snowflake and click on the Snowflake connector icon. 
[Image: Azure Data Factory Snowflake: Setting up Snowflake as a Linked Service in ADF]

  • Enter the configuration details of the service. Then, test the connection and create the new linked service. 
[Image: Azure Data Factory Snowflake: Setting up Snowflake as a Linked Service in ADF]

After the data source is set as a linked service, you can use the COPY command to load data from the source to Snowflake. 
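
Conceptually, the connector's direct copy is equivalent to issuing a COPY INTO statement against files in Blob storage. The sketch below is for illustration only; the table, account, container, and SAS token are hypothetical placeholders:

```sql
-- Load delimited files from an Azure Blob container into a Snowflake table.
-- Replace the account, container, path, and SAS token with your own values.
COPY INTO my_db.public.orders
FROM 'azure://myaccount.blob.core.windows.net/mycontainer/orders/'
CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1);
```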

Here is example JSON for the direct copy method of writing data to Snowflake through Azure Data Factory.

[Image: Azure Data Factory Snowflake: JSON Code for Direct Copy from ADF to Snowflake]
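
For reference, a direct-copy Copy activity typically looks like the following sketch, based on the documented shape of the original Snowflake connector (the newer Snowflake V2 connector uses slightly different type names). The activity and dataset names are hypothetical, and the source is assumed to be delimited text in Blob storage:

```json
{
    "name": "CopyDirectToSnowflake",
    "type": "Copy",
    "inputs": [
        { "referenceName": "DelimitedTextSourceDataset", "type": "DatasetReference" }
    ],
    "outputs": [
        { "referenceName": "SnowflakeSinkDataset", "type": "DatasetReference" }
    ],
    "typeProperties": {
        "source": {
            "type": "DelimitedTextSource",
            "formatSettings": { "type": "DelimitedTextReadSettings" }
        },
        "sink": {
            "type": "SnowflakeSink",
            "importSettings": {
                "type": "SnowflakeImportCopyCommand",
                "copyOptions": { "FORCE": "TRUE", "ON_ERROR": "SKIP_FILE" }
            }
        }
    }
}
```

The copyOptions shown (FORCE, ON_ERROR) are optional pass-through options for Snowflake's COPY command; omit them if the defaults suit your load.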

Staged Copy to Snowflake 

When the source data format is incompatible with Snowflake’s COPY command, you can use the staged copy method, which routes data through Azure Blob Storage. In this method, the Copy activity automatically converts the source data into a Snowflake-compatible format in interim storage. It then invokes the COPY command to transfer the data to Snowflake.

You first have to create an Azure Blob Storage linked service using the following steps:

  • Log in to your Azure account and go to the Manage tab in your Azure Data Factory workspace. Click on Linked Services > New. 
[Image: Azure Data Factory Snowflake: Setting up Azure Blob Storage as a Linked Service]

  • Then, search for Blob and select the Azure Blob Storage connector. 
[Image: Azure Data Factory Snowflake: Setting up Azure Blob Storage as a Linked Service]

  • Enter the configuration details and test the connection to create the new linked service. 
[Image: Azure Data Factory Snowflake: Setting up Azure Blob Storage as a Linked Service]

After creating a linked service, you can transfer the source data to Azure Blob Storage. The data can then be staged and loaded to Snowflake using the COPY command. 

Here is example JSON for the staged copy method, where Azure Data Factory connects to Snowflake via Azure Blob Storage.

[Image: Azure Data Factory Snowflake: JSON Code for Staged Copy from ADF to Snowflake]
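
For comparison with the direct copy sketch above, here is a hedged sketch of the staged variant. The key additions are enableStaging and stagingSettings, which point at the Blob linked service created earlier; all names are hypothetical placeholders, and the source is assumed to be an Azure SQL table:

```json
{
    "name": "CopyStagedToSnowflake",
    "type": "Copy",
    "inputs": [
        { "referenceName": "SourceDataset", "type": "DatasetReference" }
    ],
    "outputs": [
        { "referenceName": "SnowflakeSinkDataset", "type": "DatasetReference" }
    ],
    "typeProperties": {
        "source": { "type": "AzureSqlSource" },
        "sink": {
            "type": "SnowflakeSink",
            "importSettings": { "type": "SnowflakeImportCopyCommand" }
        },
        "enableStaging": true,
        "stagingSettings": {
            "linkedServiceName": {
                "referenceName": "AzureBlobStagingLinkedService",
                "type": "LinkedServiceReference"
            },
            "path": "mystagingcontainer/path"
        }
    }
}
```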


Using REST API 

Another method for writing data to Snowflake through Azure Data Factory is to use a REST API source. Follow the steps below to understand this method.

Setting up Your Linked Services

To copy data to Snowflake from Azure Data Factory using REST APIs, you should first set up REST as a linked service:

  • Log in to your Azure Data Factory account and go to the Manage tab. Click on Linked Services > New. 
[Image: Azure Data Factory Snowflake: Using REST API to Connect ADF Snowflake]

  • Then, search for REST and select the REST connector. Click Continue. 
[Image: Azure Data Factory Snowflake: Using REST API to Connect ADF Snowflake]

  • Now, enter the configuration details and test the connection to create REST as a linked service. Click on Save to confirm your credentials. 
[Image: Azure Data Factory Snowflake: Using REST API to Connect ADF Snowflake]

After setting up the REST source, the Snowflake sink, and Azure Blob Storage as linked services, you can proceed to use the Copy activity to write data into Snowflake.

Building the Copy Activity 

  • Add the Copy activity to your pipeline and give it a descriptive name. 
[Image: Azure Data Factory Snowflake: Using REST API to Connect ADF Snowflake]

  • After this, click on the Source tab. Select your Linked Service, which will be your source dataset, and then add the required information. 
[Image: Azure Data Factory Snowflake: Using REST API to Connect ADF Snowflake]

  • Set up the Sink tab. Here, Snowflake is the sink dataset.
[Image: Azure Data Factory Snowflake: Using REST API to Connect ADF Snowflake]

  • Now, go to the Mapping tab to map all the fields in your Azure Data Factory pipeline. 
[Image: Azure Data Factory Snowflake: Using REST API to Connect ADF Snowflake]

  • Go to Settings, connect the Blob Storage Linked Service you created earlier, and choose your desired storage path. 
[Image: Azure Data Factory Snowflake: Using REST API to Connect ADF Snowflake]

You can then test the pipeline by clicking on the Debug button. This completes setting up the connection between Azure Data Factory and Snowflake. You can now upload your data to Snowflake. 
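
The pipeline assembled through these steps corresponds roughly to the JSON sketch below. All names are hypothetical placeholders, and because a REST response is generally not in a COPY-compatible format, the sketch assumes staging through the Blob linked service:

```json
{
    "name": "CopyRestToSnowflake",
    "type": "Copy",
    "inputs": [
        { "referenceName": "RestSourceDataset", "type": "DatasetReference" }
    ],
    "outputs": [
        { "referenceName": "SnowflakeSinkDataset", "type": "DatasetReference" }
    ],
    "typeProperties": {
        "source": { "type": "RestSource", "requestMethod": "GET" },
        "sink": {
            "type": "SnowflakeSink",
            "importSettings": { "type": "SnowflakeImportCopyCommand" }
        },
        "enableStaging": true,
        "stagingSettings": {
            "linkedServiceName": {
                "referenceName": "AzureBlobStagingLinkedService",
                "type": "LinkedServiceReference"
            }
        }
    }
}
```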

Using Private Endpoint

You can connect Azure Data Factory to Snowflake using a private endpoint. A private endpoint is a network interface that uses private IP addresses from your virtual network, allowing you to connect securely to services exposed through Azure Private Link. Follow the steps below to connect Azure Data Factory to Snowflake using private endpoints.

  • First, you must contact the Snowflake technical support service to get Snowflake’s endpoint service resource ID for the Azure region of your Snowflake account.
  • After logging in to your Snowflake account, run the system function SYSTEM$GET_PRIVATELINK_CONFIG() to retrieve the privatelink-account-url, regionless-privatelink-account-url, and privatelink-ocsp-url values.

-- A role with the necessary privileges, such as ACCOUNTADMIN, is required
use role accountadmin; 

-- Flatten the JSON returned by SYSTEM$GET_PRIVATELINK_CONFIG() into key/value rows
select key, value::varchar
from table(flatten(input => parse_json(SYSTEM$GET_PRIVATELINK_CONFIG())));

  • Using Snowflake’s resource ID, create a managed private endpoint for Azure Data Factory and add the FQDN (Fully Qualified Domain Name) values retrieved in the previous step. Ensure that you add all the Snowflake endpoints, such as the account locator URL, the regionless account URL, and the OCSP URL, as fully qualified domain names with private link hostnames. 
[Image: Azure Data Factory Snowflake: Using Private Endpoint to Connect ADF to Snowflake]

  • The current status of the connection should be Pending. 

[Image: Azure Data Factory Snowflake: Using Private Endpoint to Connect ADF to Snowflake]

  • Now, retrieve the managed private endpoint resource ID by clicking on the Managed Private Endpoint (MPE) name. Provide this ID to Snowflake support and wait for it to be approved. 
[Image: Azure Data Factory Snowflake: Using Private Endpoint to Connect ADF to Snowflake]

Once approved, the managed endpoint connection status changes to Approved. 

[Image: Azure Data Factory Snowflake: Using Private Endpoint to Connect ADF to Snowflake]

  • To test the private link connectivity to Snowflake from the MPE in Azure Data Factory, click Integration runtimes in the left-side menu and then click AutoResolveIntegrationRuntime.
[Image: Azure Data Factory Snowflake: Using Private Endpoint to Connect ADF to Snowflake]

  • A pop-up will appear. Go to Virtual Network, choose Enable for Interactive authoring, and then click Apply.
[Image: Azure Data Factory Snowflake: Using Private Endpoint to Connect ADF to Snowflake]

  • From the left-side menu, select Linked Services > New.
[Image: Azure Data Factory Snowflake: Using Private Endpoint to Connect ADF to Snowflake]

  • Search for Snowflake and click on the Snowflake connector icon. Then, wait for interactive authoring to be enabled. 
[Image: Azure Data Factory Snowflake: Using Private Endpoint to Connect ADF to Snowflake]

  • After this, enter all the configuration details, test the connection, and set up Snowflake as a linked service in Azure Data Factory.
[Image: Azure Data Factory Snowflake: Using Private Endpoint to Connect ADF to Snowflake]

Once your connection is set, you can load data to Snowflake from Azure Data Factory. 

Challenges for Azure Data Factory Snowflake Integration 

Some of the challenges of Azure Data Factory and Snowflake connection are as follows:

  • It is difficult to copy data directly from Azure Data Factory to Snowflake because the data must already be in a Snowflake-supported format for a seamless transfer. 
  • The data source for the direct copy method must be an Azure Blob Storage linked service with shared access signature authentication.
  • Setting up Snowflake as a linked service is a slightly complex process.
  • Snowflake’s COPY command enables parallel data loading, but managing this process can be challenging, especially for large datasets. 

Benefits of Hevo Over Azure Data Factory for Snowflake Integration

To simplify the complexities of the Azure Data Factory and Snowflake connection, you can use a third-party ingestion and integration tool. These tools facilitate hassle-free data transfer from various sources to Snowflake. One such tool is Hevo Data, a no-code data integration platform.

Hevo Data is a no-code ELT platform that provides real-time data integration and a cost-effective way to automate your data pipeline workflow. With over 150 source connectors, you can integrate your data into multiple platforms, conduct advanced analysis on your data, and produce useful insights.

Here are some of the most important features provided by Hevo Data:

  • Data Transformation: Hevo Data allows you to transform your data for analysis with simple Python-based and drag-and-drop data transformation techniques. This feature allows you to transform data into a Snowflake-compatible format, which you can directly load into Snowflake without configuring any other linked service.
  • Automated Schema Mapping: Hevo Data automatically arranges the destination schema to match the incoming data. This feature helps identify similar data elements in the source and automatically matches them in the respective fields of the Snowflake schema. It also lets you choose between Full and Incremental Mapping. Thus, you can ensure data consistency during Snowflake migration. 
  • Incremental Data Load: It ensures proper bandwidth utilization at both the source and the destination by allowing near real-time data transfer of the modified data. This feature ensures that only new or updated data is transferred to Snowflake. It improves query performance and helps optimize the usage of Snowflake resources to reduce expenses. 

Azure Data Factory does not offer these features, and you may find it challenging to carry out the integration process through the tool. Instead, you can switch to Hevo Data, which has a host of efficient features for successful data integration with Snowflake.

Use Cases for Migrating Data to Snowflake 

  • Secure Data Storage: Loading data to Snowflake enables secure storage. The platform provides a role-based access mechanism and facilitates data backup, which can be retrieved in case of discrepancies. 
  • Business Intelligence: Snowflake easily integrates with several BI tools, such as Looker, Power BI, or Tableau. It enables you to perform various business intelligence operations, like creating interactive dashboards and reports to gain valuable data insights. 
  • Machine Learning: Your organization may want to leverage machine learning for improved data analytics. You can achieve this by transferring data to Snowflake, whose zero-copy cloning function lets you duplicate datasets instantly so you can train and test ML models (see the example after this list). 
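
As an illustration, a zero-copy clone is a single SQL statement; the table names below are hypothetical:

```sql
-- Create a writable, zero-copy clone of a production table for model experiments.
-- The clone shares storage with the original until either table changes.
CREATE TABLE sales_data_ml_clone CLONE sales_data;
```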

Conclusion 

This blog is a comprehensive guide to Azure Data Factory Snowflake data integration. It provides detailed information on both platforms and explains how to write data to Snowflake via Azure Data Factory. To leverage the benefits of the Snowflake data warehouse, you can take the assistance of a third-party tool like Hevo Data.

It offers an extensive library of connectors that enable you to transfer data from several sources into Snowflake. Hevo also facilitates automated pipeline setup, data transformation, and other robust features to streamline data transfer to Snowflake. You can schedule a demo to take advantage of Hevo’s features.

FAQs 

  1. Does Azure Data Factory work with Snowflake? 

Yes, Azure Data Factory works with Snowflake to facilitate data orchestration, transformation, scheduling, and management through a data pipeline. 

  2. What are the advantages of Snowflake over Azure?

Snowflake provides a simpler user interface and better documentation, resource allocation, data integration, and debugging capabilities than Azure. You can opt for Snowflake if you prefer an easy-to-deploy data warehouse service with nearly unlimited automatic scaling and high performance. You can use Azure Synapse Analytics if you want a data warehouse service with a high price-to-performance ratio.

Nitin Birajdar
Lead Customer Experience Engineer

Nitin, a seasoned professional with 9 years of experience, excels in ETL, Data Engineering, SaaS, and AI as a Customer Experience Lead. His expertise drives innovation and excellence, ensuring optimal solutions for diverse data challenges.
