Businesses today often implement the Snowflake API Integration to load data from different applications into the Snowflake Data Warehouse. Using this integration, data enrichment service providers can respond to data requests from consumers of Snowflake Data Marketplace. By calling these APIs on behalf of customers directly from a Snowflake account, a data provider can greatly simplify the consumer’s work.
This blog will introduce Snowflake and REST APIs along with their unique features. It will further explain the data flow between REST APIs and Snowflake and discuss 2 methods you can use to easily set up the Snowflake API Integration. Read along to learn the steps and benefits of these methods!
Prerequisites
- A Snowflake account.
- The Java or Python Software Development Kit (SDK) installed, depending on the client’s requirements.
- Working knowledge of REST APIs.
What is Snowflake?
Snowflake is a fully relational, SQL-based Cloud Data Warehouse that offers a Database as a Service (DBaaS) platform to users. Snowflake can give your business the flexibility and agility required to meet changing data needs. Its flexible Cloud Storage allows you to store almost unlimited amounts of structured and semi-structured data in one location, thus consolidating data from your various data sources. Moreover, this virtual Data Warehouse is scalable and enables your business to meet growing data needs without additional purchases or overheads.
Key Features of Snowflake
The following features of Snowflake are responsible for its growing popularity in the global market:
- Scalability: Snowflake provides storage and compute facilities separately. A Database stores the data in Snowflake, while computation is performed by virtual Data Warehouses. Thus it can provide high scalability at low cost.
- Low Maintenance: Snowflake is designed to ensure that minimal user interaction and effort are required for any performance or maintenance-related activity.
- Query Optimization: Snowflake’s automatic query optimization saves you the trouble of manually tuning your queries.
- Load Balancing: Snowflake lets you separate your routine workloads into different virtual Data Warehouses so that analytical loads are not affected by a busy cluster during peak routine loads.
To learn more about Snowflake, visit here.
What is REST API?
An API (Application Programming Interface) is a set of instructions defining how applications can integrate. A REST (Representational State Transfer) API is an API that follows the design principles of the Representational State Transfer architectural style. For this reason, REST APIs are sometimes referred to as RESTful APIs. REST APIs provide a flexible and lightweight way for applications to connect and communicate with each other. REST is generally preferred because it consumes less bandwidth, which makes it more suitable for efficient internet use. Moreover, you can use programming languages like JavaScript or Python to build customized RESTful APIs.
Key Features of REST APIs
The following features make REST APIs a preferred choice for developers:
- Scalability: REST APIs are popular for their scalability. With the separation of client and server, a development team can scale the product seamlessly.
- Ease of Management: REST APIs allow you to migrate from one server to another or make changes to the Database at any time. The frontend and the backend of an application can therefore be hosted on different servers, which is a significant management advantage.
- Independent Development: Since the client and server are separated, a REST API makes it easy for you to develop the different areas of a project independently. In addition, a REST API adapts to whatever syntax and platform you are working with, which lets you try out different environments during development.
To learn more about REST API, visit here.
What is the Snowflake API Process Flow?
The process flow of Snowflake API Integration consists of the following 3 steps:
- Data files from REST APIs are copied to an internal (Snowflake) stage or an external stage such as Amazon S3, Google Cloud Storage, etc., as shown in the below image (Labeled as 1).
- A client calls the “insertFiles” endpoint with a list of files to be inserted using a defined pipe, as shown in the below image (Labeled as 2). The REST endpoint moves these files to an Ingestion Queue.
- A virtual Data Warehouse provided by Snowflake will load the data from queued files into the assigned target table according to the parameters defined in the created pipe.
The above steps indicate that the client application calls a public REST endpoint and provides it with a list of file names and a referenced pipe name. The client can use either the Java SDK or the Python SDK, at your convenience. If a submitted file name matches a file present in the stage, the file is added to the loading queue. These files are then loaded into the target destination using the parameters defined in the Snowflake API pipe.
How to CREATE Snowflake API Integration?
Creates a new API integration object in the account, or replaces an existing one.
- An API integration object stores information about an HTTPS proxy service, which includes:
- The cloud platform provider (e.g. Amazon AWS).
- The type of proxy service (in case the cloud platform provider offers more than one type).
- The identifier and access credentials for a cloud platform role with sufficient privileges to use the proxy service. On AWS, for example, the role’s ARN (Amazon Resource Name) serves as the identifier and access credentials.
- Snowflake can use this cloud platform role to access resources on the proxy service (for example, an instance of the cloud platform’s native HTTPS proxy service or an instance of an Amazon API Gateway) once the role has been granted appropriate privileges.
- On those proxy services, an API integration object specifies allowed (and optionally blocked) endpoints and resources.
Each cloud platform has a different syntax.
For Amazon API Gateway
CREATE [ OR REPLACE ] API INTEGRATION [ IF NOT EXISTS ] <integration_name>
API_PROVIDER = { aws_api_gateway | aws_private_api_gateway | aws_gov_api_gateway | aws_gov_private_api_gateway }
API_AWS_ROLE_ARN = '<iam_role>'
[ API_KEY = '<api_key>' ]
API_ALLOWED_PREFIXES = ('<...>')
[ API_BLOCKED_PREFIXES = ('<...>') ]
ENABLED = { TRUE | FALSE }
[ COMMENT = '<string_literal>' ]
;
Required Parameters:
- integration_name: Specifies the name of the API integration. The name follows the rules for Object Identifiers and must be unique among the API integrations in your account.
- provider_info: Specifies the HTTPS proxy service type. The following are valid values:
- aws_api_gateway: for Amazon API Gateway using regional endpoints.
- aws_private_api_gateway: for Amazon API Gateway using private endpoints.
- aws_gov_api_gateway: for Amazon API Gateway endpoints in the US government’s GovCloud.
- aws_gov_private_api_gateway: for Amazon API Gateway endpoints in the US government’s GovCloud that are also private endpoints.
- iam_role: This is the ARN (Amazon resource name) of a cloud platform role on Amazon AWS.
- api_key: The API key (also known as a “subscription key”) is a unique identifier for your account.
- ENABLED = { TRUE | FALSE }: Specifies whether this API integration is enabled. Any external function that relies on the API integration will not work if it is disabled. The value is case-insensitive, and the default is TRUE.
- API_ALLOWED_PREFIXES = ('<...>'): Treats each URL as a prefix. For instance, specifying:
https://xyz.amazonaws.com/production/
means that all resources under https://xyz.amazonaws.com/production/ are permitted. For example, https://xyz.amazonaws.com/production/ml1 is permitted.
To maximize security, you should limit the allowed locations as much as possible.
For Azure API Management
CREATE [ OR REPLACE ] API INTEGRATION [ IF NOT EXISTS ] <integration_name>
API_PROVIDER = azure_api_management
AZURE_TENANT_ID = '<tenant_id>'
AZURE_AD_APPLICATION_ID = '<azure_application_id>'
[ API_KEY = '<api_key>' ]
API_ALLOWED_PREFIXES = ( '<...>' )
[ API_BLOCKED_PREFIXES = ( '<...>' ) ]
ENABLED = { TRUE | FALSE }
[ COMMENT = '<string_literal>' ]
;
Required Parameters:
- integration_name: Specifies the name of the API integration. The name follows the rules for Object Identifiers and must be unique among the API integrations in your account.
- tenant_id: Specifies the Office 365 tenant ID to which all Azure API Management instances belong. Because an API integration can only authenticate to a single tenant, the allowed and blocked locations must all refer to API Management instances in that tenant.
- To find your tenant ID, log in to Azure and go to Azure Active Directory » Properties. The tenant ID is displayed in the Directory ID field.
- azure_application_id: The “Application (client) id” of the Azure AD (Active Directory) app for your remote service. If you followed the Creating External Functions on Microsoft Azure instructions, this is the Azure Function App AD Application ID you recorded in the worksheet.
- api_key: The API key (also known as a “subscription key”) is a unique identifier for your account.
- ENABLED = { TRUE | FALSE }: Specifies whether this API integration is enabled. Any external function that relies on the API integration will not work if it is disabled. The value is case-insensitive, and the default is TRUE.
- API_ALLOWED_PREFIXES = ('<...>'): Explicitly limits external functions that use the integration to referencing one or more HTTPS proxy service endpoints (e.g. Azure API Management services) and resources within those proxies. Accepts a comma-separated list of URLs, which are treated as prefixes (see the prefix discussion above).
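For illustration, here is a hedged sketch of an Azure-based integration; the tenant ID, application ID, and URL below are placeholder values, not real credentials:
create or replace api integration my_azure_api_integration
  api_provider = azure_api_management
  azure_tenant_id = '01234567-89ab-cdef-0123-456789abcdef'          -- placeholder tenant ID
  azure_ad_application_id = 'fedcba98-7654-3210-fedc-ba9876543210'  -- placeholder application ID
  api_allowed_prefixes = ('https://my-apim-service.azure-api.net/') -- placeholder API Management URL
  enabled = true;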
For Google Cloud API Gateway
CREATE [ OR REPLACE ] API INTEGRATION [ IF NOT EXISTS ] <integration_name>
API_PROVIDER = google_api_gateway
GOOGLE_AUDIENCE = '<google_audience_claim>'
API_ALLOWED_PREFIXES = ( '<...>' )
[ API_BLOCKED_PREFIXES = ( '<...>' ) ]
ENABLED = { TRUE | FALSE }
[ COMMENT = '<string_literal>' ]
;
Required Parameters:
integration_name: Specifies the name of the API integration. The name follows the rules for Object Identifiers and must be unique among the API integrations in your account.
google_audience: Used as the audience claim when generating the JWT (JSON Web Token) to authenticate to the Google API Gateway. See the Google service account authentication documentation for more information on how to authenticate with Google.
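Similarly, a hedged sketch for a Google Cloud API Gateway integration is shown below; the audience claim and gateway URL are hypothetical placeholders:
create or replace api integration my_google_api_integration
  api_provider = google_api_gateway
  google_audience = 'my-gateway-1a2b3c4d.uc.gateway.dev'                 -- placeholder audience claim
  api_allowed_prefixes = ('https://my-gateway-1a2b3c4d.uc.gateway.dev/') -- placeholder gateway URL
  enabled = true;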
Usage Notes
- CREATE API INTEGRATION is only available to Snowflake users who have the ACCOUNTADMIN role or a role with the global CREATE INTEGRATION privilege.
- The API integration can only be used directly by Snowflake roles with OWNERSHIP or USAGE privileges on it (e.g. by creating an external function that specifies that API integration).
- A cloud platform account and role within that account are tied to an API integration object, but not to a specific HTTPS proxy URL. In a cloud provider account, you can create multiple instances of an HTTPS proxy service, and you can use the same API integration to authenticate to multiple proxy services in that account.
- You can have multiple API integration objects in your Snowflake account, for example, for different cloud platform accounts.
- The same API integration object, and thus the same HTTPS proxy service, can be used by multiple external functions.
Examples
This example demonstrates the creation of an API integration and its subsequent use in a CREATE EXTERNAL FUNCTION statement:
create or replace api integration demonstration_external_api_integration_01
api_provider=aws_api_gateway
api_aws_role_arn='arn:aws:iam::123456789012:role/my_cloud_account_role'
api_allowed_prefixes=('https://xyz.execute-api.us-west-2.amazonaws.com/production')
enabled=true;
create or replace external function local_echo(string_col varchar)
returns variant
api_integration = demonstration_external_api_integration_01
as 'https://xyz.execute-api.us-west-2.amazonaws.com/production/remote_echo';
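Once created, the external function can be invoked in SQL like any other scalar function. For instance, a simple call to the function defined above:
select local_echo('hello world');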
Time to stop hand-coding your data pipelines and start using Hevo’s No-Code, Fully Automated ETL solution. With Hevo, you can replicate data from a growing library of 150+ plug-and-play integrations and 15+ destinations — SaaS apps, databases, data warehouses, and much more.
Hevo’s ETL empowers your data and business teams to integrate multiple data sources and prepare your data for transformation. Hevo’s Pre and Post Load Transformations enable your business team to have analysis-ready data without writing a single line of code!
Gain faster insights, build a competitive edge, and improve data-driven decision-making with a modern ETL solution. Hevo is the easiest and most reliable data replication platform that will save your engineering bandwidth and time multifold.
Start your data journey with the fastest ETL on the cloud!
Get Started with Hevo for Free
How to ALTER Snowflake API Integration?
Modifies the properties of an existing API integration.
ALTER [ API ] INTEGRATION [ IF EXISTS ] <name> SET
[ API_AWS_ROLE_ARN = '<iam_role>' ]
[ AZURE_AD_APPLICATION_ID = '<azure_application_id>' ]
[ API_KEY = '<api_key>' ]
[ ENABLED = { TRUE | FALSE } ]
[ API_ALLOWED_PREFIXES = ('<...>') ]
[ API_BLOCKED_PREFIXES = ('<...>') ]
[ COMMENT = '<string_literal>' ]
ALTER [ API ] INTEGRATION <name> SET TAG <tag_name> = '<tag_value>' [ , <tag_name> = '<tag_value>' ... ]
ALTER [ API ] INTEGRATION <name> UNSET TAG <tag_name> [ , <tag_name> ... ]
ALTER [ API ] INTEGRATION [ IF EXISTS ] <name> UNSET {
API_KEY |
ENABLED |
API_BLOCKED_PREFIXES |
COMMENT
}
[ , ... ]
Parameters:
name: The identifier of the integration to alter. If the identifier contains spaces or special characters, the entire string must be enclosed in double quotes. Identifiers enclosed in double quotes are also case-sensitive.
SET …: Sets one or more API integration properties/parameters (separated by blank spaces, commas, or new lines):
ENABLED = { TRUE | FALSE }:
Specifies whether the integration is enabled or suspended.
TRUE allows the integration to run.
FALSE suspends the integration for maintenance. Any attempt to integrate Snowflake with the third-party service then fails.
API_AWS_ROLE_ARN = '<iam_role>':
The ARN (Amazon Resource Name) of a cloud platform role.
This parameter applies only if API_PROVIDER is set to aws_api_gateway.
AZURE_AD_APPLICATION_ID = '<azure_application_id>':
The “Application (client) id” of the Azure AD (Active Directory) app for your remote service.
This parameter applies only if API_PROVIDER is set to azure_api_management.
API_KEY = '<api_key>': The API key (also known as a “subscription key”) is a unique identifier for your account.
API_ALLOWED_PREFIXES = ('<...>'):
Explicitly limits external functions that use the integration to referencing one or more HTTPS proxy service endpoints (e.g. Amazon AWS API Gateway) and resources within those proxies. Accepts a comma-separated list of URLs, which are treated as prefixes. For instance, specifying:
https://xyz.amazonaws.com/production/
permits all resources under https://xyz.amazonaws.com/production/.
API_BLOCKED_PREFIXES = ('<...>'):
Lists endpoints and resources in the HTTPS proxy service that cannot be called from Snowflake.
The rules for location values are the same as for API_ALLOWED_PREFIXES above.
API_BLOCKED_PREFIXES takes precedence over API_ALLOWED_PREFIXES: a prefix that matches both is blocked. In other words, Snowflake allows all values that match API_ALLOWED_PREFIXES except those that also match API_BLOCKED_PREFIXES.
You do not need to explicitly block a value that is already outside API_ALLOWED_PREFIXES.
TAG tag_name = '<tag_value>' [ , tag_name = '<tag_value>' , ... ]:
Specifies the tag name (i.e. the key) and the tag value.
The tag value is always a string of up to 256 characters. A maximum of 20 unique tag keys can be set on an object.
COMMENT = '<string_literal>':
Specifies a comment for the integration as a string literal.
UNSET …: Unsets one or more properties/parameters for the API integration, restoring them to their default values:
ENABLED
API_BLOCKED_PREFIXES
TAG tag_name [ , tag_name ... ]
COMMENT
Usage Notes:
The API_PROVIDER setting cannot be changed.
When using the Snowflake service, customers should make sure that no personal data (except for User objects), sensitive data, export-controlled data, or other regulated data is entered as metadata.
Example:
The example below shows how to resume a suspended integration:
alter api integration myint set enabled = true;
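As a further hedged sketch, the statements below set a tag and block a sub-path of an allowed prefix; the tag name, tag value, and URL are hypothetical, and the tag is assumed to already exist as a tag object (created with CREATE TAG):
alter api integration myint set tag cost_center = 'data_engineering';  -- assumes a cost_center tag object exists
alter api integration myint set api_blocked_prefixes = ('https://xyz.amazonaws.com/production/internal/');  -- blocked even though the parent prefix is allowed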
How to DROP Snowflake API Integration?
Removes an integration from the account.
DROP [ { API | NOTIFICATION | SECURITY | STORAGE } ] INTEGRATION [ IF EXISTS ] <name>
Parameters:
name: The identifier for the integration to be dropped. The entire string must be enclosed in double-quotes if the identifier contains spaces, special characters, or mixed-case characters. Case is also taken into account when using identifiers enclosed in double quotes (e.g. “My Object”).
API | NOTIFICATION | SECURITY | STORAGE: The integration type is specified here.
Usage Notes:
Integrations that have been dropped cannot be recovered; they must be recreated.
Example:
Drop an integration:
show integrations like 't2%';
drop integration t2;
show integrations like 't2%';
Drop the integration once more, but don’t throw an error if it doesn’t exist:
drop integration if exists t2;
How to SHOW Snowflake API Integration?
Lists the integrations in your account to which you have access privileges. The output returns integration metadata and properties.
SHOW [ { API | NOTIFICATION | SECURITY | STORAGE } ] INTEGRATIONS [ LIKE '<pattern>' ]
Parameters:
API | NOTIFICATION | SECURITY | STORAGE: Only integrations of the specified type are returned.
LIKE '<pattern>': Filters the command output by object name. The filter uses case-insensitive pattern matching, with support for the SQL wildcard characters (% and _). For example, the following patterns produce the same results:
... LIKE '%testing%' ...
... LIKE '%TESTING%' ...
Usage Notes:
- The API | NOTIFICATION | SECURITY | STORAGE parameter is currently the only supported type filter.
- The command can be run without the need for a running warehouse.
- You can use the RESULT_SCAN function to post-process the output of this command, which treats the output as a table that can be queried.
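For example, a hedged sketch of post-processing SHOW output with RESULT_SCAN (the quoted column names follow the output table below):
show api integrations;
-- query the result of the previous SHOW command as a table
select "name", "category", "enabled"
from table(result_scan(last_query_id()))
where "enabled" = 'true';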
Output:
The command output provides integration properties and metadata in the following columns:
| Column | Description |
|---|---|
| name | Name of the integration |
| type | Type of the integration |
| category | Category of the integration |
| enabled | The current status of the integration, either TRUE (enabled) or FALSE (disabled) |
| created_on | Date and time when the integration was created |
Examples:
Show all notification integrations:
show notification integrations;
Show all integrations you have access to whose name begins with 'line':
show integrations like 'line%';
How to DESCRIBE Snowflake API Integration?
Describes the properties of an integration.
DESC is a short form of DESCRIBE.
Syntax:
DESC[RIBE] [ { API | NOTIFICATION | SECURITY | STORAGE } ] INTEGRATION <name>
Parameters:
name: Specifies the identifier for the integration to describe. If the identifier contains spaces or special characters, the entire string must be enclosed in double quotes. Identifiers enclosed in double quotes are also case-sensitive.
Usage Notes:
You can use the RESULT_SCAN function to post-process the output of this command, which treats the output as a table that can be queried.
If the integration is an API integration, the output includes an API_KEY column. If an API key was entered, API_KEY returns a masked value. (Neither the original unencrypted key nor the encrypted version of the key is displayed.)
If the TYPE property of a security integration is set to OAUTH (i.e. Snowflake OAuth), Snowflake returns two additional security integration properties in the query result that cannot be set using either the CREATE SECURITY INTEGRATION or ALTER SECURITY INTEGRATION commands:
OAUTH_ALLOWED_AUTHORIZATION_ENDPOINTS: A list of all supported endpoints where a client application can receive an authorization code from Snowflake.
OAUTH_ALLOWED_TOKEN_ENDPOINTS: A list of all endpoints that a client application can use to exchange an authorization code for an access token or to get a refresh token.
Example:
Create a notification integration:
create notification integration myint ( ... );
Then describe the integration:
desc integration myint;
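The same command works for API integrations. For example, to describe the integration created earlier in this article:
desc api integration demonstration_external_api_integration_01;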
Methods to Set Up the Snowflake API Integration
You can easily set up the Snowflake API Integration using the following 2 methods:
Method 1: Snowflake API Integration Using Snowpipe
Snowpipe provides a serverless data download option that sets up your Snowflake API Integration using the following steps:
Step 1: Create a New Stage
Snowpipe allows you to load data from the following 2 types of staging areas:
- Named internal (Snowflake) stages or external stages such as Amazon S3, Google Cloud Storage, or Microsoft Azure.
- Table stages.
Use the CREATE STAGE command to create a named stage, or use an existing stage. You will temporarily store your files in this staging area before loading them into the destination tables using Snowpipe.
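As a hedged illustration, the following creates a named internal stage using the placeholder names (mydb, myschema, mystage) that the rest of this article builds on:
create or replace stage mydb.myschema.mystage
  file_format = (type = 'CSV');  -- assumes CSV source files; adjust to match your data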
Step 2: Create a New Pipe to Load Data
Use the “CREATE PIPE” command to build a new pipe in your Snowflake system, with a “COPY INTO” statement that imports data from the Ingestion Queue into Snowpipe’s tables. For more information regarding creating a pipe, visit here.
Keep in mind that you must have the “CREATE PIPE” privilege, as well as “USAGE” privileges on the Schema, Stage, and Database, to create a pipe using Snowpipe.
An example code is given below:
create pipe if not exists mydb.myschema.mypipe as copy into mydb.myschema.mytable from @mydb.myschema.mystage;
This code will transfer the data present in the staging area “mystage” to the table named “mytable”.
Step 3: Authenticate Security Measures
You must generate a public-private key pair for each user who wishes to perform continuous data loading operations by calling Snowpipe’s REST endpoints. Moreover, if you want to limit this Snowflake API Integration to a single user, you only need to set up key pair authentication for that user once. After that, grant the user the access control privileges required for each data load.
To generate a key pair, visit here.
To grant the least privileges necessary, it is recommended to create a separate user and role pair to use when importing files through a pipe in the Snowflake API Integration (the user must be created with this role as their default role).
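After generating the key pair, associate the public key with the Snowflake user. A hedged sketch is shown below; the key value is a truncated placeholder, and jsmith is the example user used later in this article:
alter user jsmith set rsa_public_key = 'MIIBIjANBgkqh...';  -- paste the full public key contents, omitting the BEGIN/END header lines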
Step 4: Grant Privileges to Users
Once the security key pair is generated for each user, go ahead and grant the user sufficient privileges on the objects involved in the data load. This includes permissions to move data among the following:
- The target Database
- Schema
- Table
- The stage object
- The pipe
You can use the “GRANT … TO ROLE” command for assigning these user privileges.
The following code shows an example of creating a role named “snowpipe1” that can access the “mydb.myschema” Database objects and use the pipe named “mypipe” to load data. The role is then granted to a user so they can work with the Snowflake API system.
-- Create a role to contain the Snowpipe privileges
use role securityadmin;
create or replace role snowpipe1;
-- Grant the required privileges on the database objects
grant usage on database mydb to role snowpipe1;
grant usage on schema mydb.myschema to role snowpipe1;
grant insert, select on mydb.myschema.mytable to role snowpipe1;
grant usage, read on stage mydb.myschema.mystage to role snowpipe1;
-- Grant the OWNERSHIP privilege on the pipe object
grant ownership on pipe mydb.myschema.mypipe to role snowpipe1;
-- Grant the role to a user
grant role snowpipe1 to user jsmith;
-- Set the role as the default role for the user
alter user jsmith set default_role = snowpipe1;
Step 5: Transfer Data Files to the Stage
To upload files using the Snowflake API Integration, copy them to the stage (internal or external) that you created in Step 1, as follows:
- External Stage: Copy the required files to your external stage using the tools offered by the Cloud storage service.
- Internal Stage: Copy the required files to an internal stage using the “PUT” command. The syntax for the PUT command is:
PUT file://<path_to_file>/<filename> internalStage
[ PARALLEL = <integer> ]
[ AUTO_COMPRESS = TRUE | FALSE ]
[ SOURCE_COMPRESSION = AUTO_DETECT | GZIP | BZ2 | BROTLI | ZSTD | DEFLATE | RAW_DEFLATE | NONE ]
[ OVERWRITE = TRUE | FALSE ]
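For example, a hedged invocation from a SnowSQL session that uploads a local CSV file to the stage created earlier (the file path is a placeholder):
put file:///tmp/data/mydata.csv @mydb.myschema.mystage
  auto_compress = true;  -- gzip-compresses the file during upload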
That’s it! Your Snowflake API Integration is ready!
Method 2: Snowflake API Integration Using Hevo Data
Hevo Data, a No-code Data Pipeline, helps you directly transfer data from REST APIs and 150+ other data sources to Data Warehouses like Snowflake, Databases, BI tools, or a destination of your choice in a completely hassle-free & automated manner for free. Hevo is fully managed and completely automates the process of not only loading data from your desired source but also enriching the data and transforming it into an analysis-ready form without having to write a single line of code. Its fault-tolerant architecture ensures that the data is handled in a secure, consistent manner with zero data loss.
Hevo Data takes care of all your data preprocessing needs and lets you focus on key business activities and draw a much more powerful insight on how to generate more leads, retain customers, and take your business to new heights of profitability. It provides a consistent & reliable solution to manage data in real-time and always has analysis-ready data in your desired destination.
Sign up here for a 14-Day Free Trial!
Hevo Data, an official Snowflake partner, offers you a seamless way to set up the Snowflake API Integration in 2 simple steps:
- Authenticate Source: You must authenticate REST API as your data source in Hevo data as shown in the below image.
Configure REST API Source
- Configure Destination: Set your Snowflake account as the data destination for the Hevo Data Pipeline as shown in the image below.
Configure Snowflake Destination
More Reasons to Choose Hevo Data
- Faster Implementation: A very quick 2-step process to get your pipeline set up. After that, everything’s automated while you watch data sync to Snowflake or any other destination in real time.
- Fully Managed Service & Live Support: Every aspect of the data transfer process is fully managed by Hevo. Hevo also provides a dedicated team of talented support engineers to guide you through any issues that arise.
- Real-time Alerts & Notifications: With Hevo, you are always in the know about your data transfer pipelines. Receive real-time multiple-format notifications across your various devices.
- 100% Complete & Accurate Data Transfer: The entire ETL spectrum is covered by Hevo. Quality rules and expert engineering also ensure that data is always 100% transmitted so there’s no risk of data corruption or missing data.
- Scalability Assured: Hevo simply scales to meet your needs today and tomorrow thanks to its world-class infrastructure and platform.
- A Wide Array of Data Source Integrations: Hevo can bring data from not only Rest API but a wide array of data sources ranging from Sales and Marketing tools to Databases and more.
What are the benefits of the Snowflake API Integration?
Using the Snowflake API Integration, data enrichment service providers can quickly join the Snowflake Data Marketplace and instantly respond to consumer inquiries in any cloud or Snowflake region using REST APIs. This eliminates the need to design new Data Flows at the vendor’s end, and Data Replication between regions is also not required. Apart from this, the Snowflake API Integration provides you with the following benefits:
- Instant insights: The Snowflake API Integration instantly delivers relevant and up-to-date data to all of your business users. Moreover, it provides conflict-free access to the files stored in your Snowflake account.
- Cost-effective: With the Snowflake API Integration in place, you only pay for the per-second use of Snowpipe to load data, instead of the cost of running a Data Warehouse continuously or by the hour. Thus this tool charges you only for your usage and nothing more.
- Flexibility: Your technical resources can interface directly with the programmable REST API, using the Java and Python SDKs. This allows you to enable highly customized data load use cases.
- Low Maintenance: The Snowflake API Integration automatically provisions the correct capacity for the loaded data. This saves you the trouble of manually adjusting the required data capacity. Moreover, there are no servers for you to worry about.
Conclusion
This article provided a brief introduction to Snowflake and REST APIs and explained their key features. It also discussed the basic data flow between REST APIs and Snowflake. The article then listed 2 methods you can use to seamlessly set up your Snowflake API Integration. The first method uses Snowpipe (a Snowflake tool) to transfer data from REST APIs to Snowflake; however, it requires you to manually write custom code to connect these 2 platforms.
Visit our Website to Explore Hevo
If you are looking for a simpler method, Hevo Data can take care of your data transfer process. It provides a faster way to move data from 150+ data sources such as REST APIs, Databases, CRMs, etc., to the Snowflake Data Warehouse. Hevo is fully automated and hence does not require you to write any code.
Want to take Hevo for a spin? Sign Up for a 14-day free trial and experience the feature-rich Hevo suite first hand.
Share your understanding of the Snowflake API Integration in the comments below!