The world is becoming increasingly computerized. Soon, every part of our lives will be connected to the Internet, giving consumers conveniences such as instant access to information. While this is a positive development, the sheer volume of data generated by digitalization is staggering. Snowflake is a Cloud Data Warehouse that helps users manage this colossal volume of data and build efficient applications on top of it.

Developing and managing data-intensive applications has traditionally been expensive and time-consuming. Snowflake’s modern solution, on the other hand, is changing the game.

In this article, you will learn about Snowflake Apps. You will also gain a holistic understanding of Snowflake, its key features, and the steps for building Snowflake Apps.


What is Snowflake?


Snowflake is a Data Warehouse-as-a-Service platform built for the cloud. Its data architecture differs from that of traditional warehouses such as Amazon Redshift: it separates compute from storage and uses the scalable, elastic object storage of the cloud provider it runs on (Amazon S3, Azure Blob Storage, or Google Cloud Storage) as its internal storage layer, which can hold both structured and semi-structured data ingested from cloud or on-premise sources.

The Snowflake Data Warehouse secures and protects data through features such as SSO and, for external stages, Amazon S3 policy controls, Azure SAS tokens, and Google Cloud Storage access permissions. You can also scale your storage independently, depending on your storage needs.

The key benefits of leveraging Snowflake are as follows:

  • Given the elastic nature of the cloud, you can scale your virtual warehouse up to take advantage of extra compute resources, for example to run a high volume of queries or to load data faster.
  • With Snowflake, you can combine semi-structured and structured data for analysis and load it into the database without first transforming or converting it into a fixed relational schema (see the sketch after this list).
  • Snowflake’s multi-cluster architecture takes care of concurrency issues such as query queuing and delays.
  • Snowflake’s architecture enables companies to seamlessly share data with any data consumer.
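
As a quick illustration of the semi-structured point above, JSON can be loaded into a VARIANT column and queried directly. The table and JSON here are hypothetical and not part of the demo that follows:

-- Load JSON as-is into a VARIANT column; no upfront relational schema required.
CREATE TABLE RAW_EVENTS (V VARIANT);

INSERT INTO RAW_EVENTS
SELECT PARSE_JSON('{"user": "alice", "action": "ride_start", "station": 72}');

-- Query fields with path notation and cast as needed.
SELECT V:user::STRING AS USER_NAME, V:station::NUMBER AS STATION_ID
FROM RAW_EVENTS;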

Key Features of Snowflake


Here are some of the benefits of using Snowflake as a Software as a Service (SaaS) solution:

  • Snowflake enables you to enhance your analytics pipeline by transitioning from nightly batch loads to real-time data streams, improving both the quality and speed of your analytics. Safe, concurrent, and governed access to your Data Warehouse across the organization further raises the quality of analytics at your company.
  • Snowflake uses a caching paradigm to swiftly deliver results from cache. To avoid regenerating a report when nothing has changed, Snowflake serves persisted query results for repeated queries (a short demonstration follows this list).
  • Snowflake allows you to break down data silos and provide access to meaningful insights across the enterprise, resulting in better data-driven decision-making. This is a crucial first step toward improving partner relationships, optimizing pricing, lowering operational expenses, increasing sales effectiveness, and more.
  • Snowflake allows you to better understand user behavior and product usage. You can also use the whole scope of data to ensure customer satisfaction, drastically improve product offerings, and foster Data Science innovation.
  • Snowflake allows you to create your own Data Exchange, which lets you securely share live, governed data. It also encourages you to improve data relationships throughout your business units, as well as with your partners and customers.
  • Secure Data Lake: You can use a secure Data Lake to store all compliance and cybersecurity data in one place. Snowflake Data Lakes ensure quick incident response times: by consolidating high-volume log data in a single location, you can see the complete picture of an incident and efficiently analyze years of log data in seconds.
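
To make the caching behavior concrete, here is a small demonstration; it assumes the TRIPS table created later in this article:

-- Result caching is on by default; shown here for clarity.
ALTER SESSION SET USE_CACHED_RESULT = TRUE;

-- The first run computes the aggregate on the warehouse.
SELECT DATE_TRUNC('DAY', STARTTIME) AS RIDE_DAY, COUNT(*) AS RIDES
FROM TRIPS
GROUP BY 1;

-- Re-running the identical query is served from the result cache and consumes
-- no warehouse compute, as long as the underlying data has not changed.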

Simplify Snowflake ETL and Analysis with Hevo’s No-code Data Pipeline

A fully managed No-code Data Pipeline platform like Hevo Data helps you integrate and load data from 100+ different sources (including 40+ free sources) to a Data Warehouse such as Snowflake, or a Destination of your choice, in real-time and in an effortless manner. With its minimal learning curve, Hevo can be set up in just a few minutes, allowing users to load data without having to compromise performance. Its strong integration with a wide range of sources allows users to bring in data of different kinds smoothly, without having to code a single line. 

Get Started with Hevo for Free

Check out some of the cool features of Hevo:

  • Completely Automated: The Hevo platform can be set up in just a few minutes and requires minimal maintenance.
  • Transformations: Hevo provides preload transformations through Python code. It also allows you to run transformation code for each event in the Data Pipelines you set up. You need to edit the event object’s properties received in the transform method as a parameter to carry out the transformation. Hevo also offers drag and drop transformations like Date and Control Functions, JSON, and Event Manipulation to name a few. These can be configured and tested before putting them to use.
  • Connectors: Hevo supports 100+ integrations to SaaS platforms, files, Databases, analytics, and BI tools. It supports various destinations including Google BigQuery, Amazon Redshift, Snowflake Data Warehouses; Amazon S3 Data Lakes; and MySQL, SQL Server, TokuDB, DynamoDB, PostgreSQL Databases to name a few.  
  • Real-Time Data Transfer: Hevo provides real-time data migration, so you can have analysis-ready data always.
  • 100% Complete & Accurate Data Transfer: Hevo’s robust infrastructure ensures reliable data transfer with zero data loss.
  • Scalable Infrastructure: Hevo has in-built integrations for 100+ sources (including 40+ free sources) that can help you scale your data infrastructure as required.
  • 24/7 Live Support: The Hevo team is available round the clock to extend exceptional support to you through chat, email, and support calls.
  • Schema Management: Hevo takes away the tedious task of schema management & automatically detects the schema of incoming data and maps it to the destination schema.
  • Live Monitoring: Hevo allows you to monitor the data flow so you can check where your data is at a particular point in time.
Sign up here for a 14-Day Free Trial!

Prerequisites

  • Snowflake privileges required to create a user, database, and warehouse.
  • Ability to install and run software on your computer.
  • Basic knowledge of Git and editing JSON.
  • Basic to intermediate knowledge of SQL.
  • Access to run SQL queries in the Snowflake console or SnowSQL.
  • Installation of VSCode.
  • Installation of NodeJS.

Building Snowflake Apps

A Cloud-based Data platform provides the infrastructure for developing data apps that can meet modern customer demands. Today, many data apps are built on traditional data stacks, such as legacy on-premises and “Cloud-Washed” Data Warehouses. These lack the characteristics that contribute to the success of modern apps.

Snowflake Cloud Data Platform includes the stack required to build and scale modern data applications. Snowflake is built on and for the cloud, which provides significant advantages that become clear when you examine its architecture, deployment, and operations.

Snowflake powers a multitude of applications across a wide range of industries and use cases. In this article, you’ll learn how to build a data application and API that uses Snowflake as its analytical engine.

The steps involved in building Snowflake Apps are as follows:

Step 1: Setting up the Database and Warehouse

A warehouse is required for the application to query the data, and a database is required to store the data displayed on the dashboard.

You can connect to Snowflake and run the following commands in the Snowflake console or with SnowSQL to create the database and warehouses:

USE ROLE ACCOUNTADMIN;
-- Ad-hoc warehouse for loading and exploring the data:
CREATE WAREHOUSE DATA_APPS_ADHOC WITH WAREHOUSE_SIZE='medium';
USE WAREHOUSE DATA_APPS_ADHOC;
-- Database and schema that hold the demo data:
CREATE DATABASE DATA_APPS_DEMO;
CREATE SCHEMA DATA_APPS_DEMO.DEMO;
-- Smaller warehouse for the application itself, with timeouts to cap runaway queries:
CREATE WAREHOUSE DATA_APPS_DEMO WITH WAREHOUSE_SIZE='small' STATEMENT_TIMEOUT_IN_SECONDS=15 STATEMENT_QUEUED_TIMEOUT_IN_SECONDS=15;
-- External stage pointing at the S3 bucket that holds the Citi Bike demo data:
CREATE STAGE "DATA_APPS_DEMO"."DEMO"."DEMOCITIBIKEDATA" ENCRYPTION=(TYPE='AWS_SSE_S3') URL='s3://demo-citibike-data/';
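
Optionally, you can confirm that the objects exist before moving on; these checks are not part of the original setup script:

SHOW WAREHOUSES LIKE 'DATA_APPS%';
SHOW DATABASES LIKE 'DATA_APPS_DEMO';
SHOW STAGES IN SCHEMA DATA_APPS_DEMO.DEMO;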

Step 2: Create a Service User for the Application

You will now create a user account, separate from your own, that the application will use to query Snowflake. In line with security best practices, this account uses key-pair authentication and has limited access in Snowflake.

The data used in the example showcased in this process to build Snowflake apps resides in an S3 bucket referenced by the external stage created in Step 1.

Step 3: Create an RSA key for Authentication

To generate a private and public key pair, run the following commands. These keys are required to authenticate the service account.

$ cd ~/.ssh
$ openssl genrsa -out snowflake_demo_key 4096
$ openssl rsa -in snowflake_demo_key -pubout -out snowflake_demo_key.pub

Step 4: Create the User and Role in Snowflake

  • Step 1: Before creating the user account, you need to:
    • Copy the contents of the ~/.ssh/snowflake_demo_key.pub file, starting just after the PUBLIC KEY header and ending just before the PUBLIC KEY footer.
    • Paste it over the RSA_PUBLIC_KEY_HERE placeholder, which follows the RSA_PUBLIC_KEY attribute, before running the CREATE USER command.
  • Step 2: Now, create the user account. To do so, log in to the Snowflake console as the ACCOUNTADMIN role.
  • Step 3: Execute the SQL statements below to create the user account and grant it access to the data required by the application.
USE ROLE ACCOUNTADMIN;
CREATE ROLE DATA_APPS_DEMO_APP;

GRANT USAGE ON WAREHOUSE DATA_APPS_DEMO TO ROLE DATA_APPS_DEMO_APP;
GRANT USAGE ON DATABASE DATA_APPS_DEMO TO ROLE DATA_APPS_DEMO_APP;
GRANT USAGE ON SCHEMA DATA_APPS_DEMO.DEMO TO ROLE DATA_APPS_DEMO_APP;

CREATE USER "DATA_APPS_DEMO" RSA_PUBLIC_KEY='RSA_PUBLIC_KEY_HERE' DEFAULT_ROLE=DATA_APPS_DEMO_APP DEFAULT_WAREHOUSE=DATA_APPS_DEMO DEFAULT_NAMESPACE=DATA_APPS_DEMO.DEMO MUST_CHANGE_PASSWORD=FALSE;

GRANT SELECT ON FUTURE TABLES IN SCHEMA DATA_APPS_DEMO.DEMO TO ROLE DATA_APPS_DEMO_APP;
GRANT SELECT ON FUTURE MATERIALIZED VIEWS IN SCHEMA DATA_APPS_DEMO.DEMO TO ROLE DATA_APPS_DEMO_APP;
GRANT ROLE DATA_APPS_DEMO_APP TO USER DATA_APPS_DEMO;
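
To verify the setup (optional; these statements are not part of the original script), you can inspect the role’s grants and confirm the public key was registered on the user:

SHOW GRANTS TO ROLE DATA_APPS_DEMO_APP;
-- In the DESCRIBE output, RSA_PUBLIC_KEY_FP shows a fingerprint once a key is set.
DESCRIBE USER DATA_APPS_DEMO;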

Step 5: Import the data

The commands that follow load the data into temporary tables and then create the permanent tables used by the Snowflake data apps.

You can import the base dataset used in this example by running the following SQL in the Snowflake console or SnowSQL:

USE ROLE ACCOUNTADMIN;
USE WAREHOUSE DATA_APPS_ADHOC;
CREATE TEMPORARY TABLE TEMP_TRIPS (
	TRIPDURATION NUMBER(38,0),
	STARTTIME TIMESTAMP_NTZ(9),
	STOPTIME TIMESTAMP_NTZ(9),
	START_STATION_ID NUMBER(38,0),
	END_STATION_ID NUMBER(38,0),
	BIKEID NUMBER(38,0),
	USERTYPE VARCHAR(16777216),
	BIRTH_YEAR NUMBER(38,0),
	GENDER NUMBER(38,0)
);

CREATE TEMPORARY TABLE TEMP_WEATHER (
	STATE VARCHAR(80),
	OBSERVATION_DATE DATE,
	DAY_OF_YEAR NUMBER(3,0),
	TEMP_MIN_F NUMBER(5,1),
	TEMP_MAX_F NUMBER(5,1),
	TEMP_AVG_F NUMBER(5,1),
	TEMP_MIN_C FLOAT,
	TEMP_MAX_C FLOAT,
	TEMP_AVG_C FLOAT,
	TOT_PRECIP_IN NUMBER(4,2),
	TOT_SNOWFALL_IN NUMBER(4,2),
	TOT_SNOWDEPTH_IN NUMBER(4,1),
	TOT_PRECIP_MM NUMBER(8,1),
	TOT_SNOWFALL_MM NUMBER(8,1),
	TOT_SNOWDEPTH_MM NUMBER(8,1)
);

COPY INTO TEMP_TRIPS from @DEMOCITIBIKEDATA
file_format=(type=csv skip_header=1) pattern='.*trips.*.csv.gz';

CREATE TABLE TRIPS AS
(SELECT * FROM TEMP_TRIPS ORDER BY STARTTIME);

COPY INTO TEMP_WEATHER from @DEMOCITIBIKEDATA
file_format=(type=csv skip_header=1) pattern='.*weather.*.csv.gz';

CREATE TABLE WEATHER AS
(SELECT * FROM TEMP_WEATHER ORDER BY OBSERVATION_DATE);
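
A few optional sanity checks (not part of the original script) confirm that the loads succeeded:

SELECT COUNT(*) AS TRIP_ROWS FROM TRIPS;
SELECT COUNT(*) AS WEATHER_ROWS FROM WEATHER;
SELECT MIN(OBSERVATION_DATE) AS FIRST_DAY, MAX(OBSERVATION_DATE) AS LAST_DAY FROM WEATHER;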

Step 6: Get the source code for the project

The application you will run is written in Node.js. This example is a Citi Bike dashboard that lets users view bike usage over time and under various weather conditions.

The source code of the Citi Bike dashboard is available on GitHub.

  • Step 1: You can download the code to a folder of your choice. Now, open the project in VSCode.
  • Step 2: In VSCode, open the terminal and install the following modules:
npm install
npm install -g artillery

Step 7: Configure the Application

  • Step 1: Copy the contents of config-template.js into config.js. Then, modify the following settings to match the values you created in the preceding steps:
    • snowflake_account
    • snowflake_user
    • snowflake_private_key
  • Step 2: Run SELECT CURRENT_ACCOUNT(); in Snowflake to get the snowflake_account value (see the query after this list). The user is DATA_APPS_DEMO, and snowflake_private_key is the full path to the previously created private key.
  • Step 3: In a terminal, run npm start to start the application. If you see errors, check that the configuration is correct.
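
The following query, run in the Snowflake console or SnowSQL, returns the account value referenced in Step 2. Note that, depending on your cloud and region, the account identifier your driver expects may also need a region suffix; when in doubt, check the hostname in your Snowflake URL:

SELECT CURRENT_ACCOUNT(), CURRENT_REGION();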

Step 8: Test the Application

Open a web browser and go to http://localhost:3000.

If the page loads and graphs appear, your application is operational! 
To test the dashboard’s interactivity, try different date ranges. Alternatively, you can let the application choose random dates by clicking the link in the upper-right corner of the page.

In real time, the application processes gigabytes of data for the dashboard, aggregating counts over time and joining the trips and weather data. The application then returns the results to the website via an API.
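
For intuition, here is a simplified, illustrative sketch of the kind of aggregation the API performs; the application’s actual queries may differ:

SELECT T.RIDE_DAY, T.RIDES, W.TEMP_AVG_F, W.TOT_PRECIP_IN
FROM (
    SELECT DATE_TRUNC('DAY', STARTTIME)::DATE AS RIDE_DAY, COUNT(*) AS RIDES
    FROM TRIPS
    GROUP BY 1
) T
JOIN WEATHER W ON W.OBSERVATION_DATE = T.RIDE_DAY
ORDER BY T.RIDE_DAY;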

For further information about building Snowflake Apps, you can visit here.

Conclusion

In this article, you have learned about Snowflake Apps. It also provided information on Snowflake, its key features, and the steps for building Snowflake Apps in detail. For further information on the Snowflake Primary Key Constraint, the Cast & Try_cast commands, and the Copy command, you can visit the following links.

Hevo Data, a No-code Data Pipeline, provides you with a consistent and reliable solution to manage data transfer between a variety of sources and a wide variety of desired destinations with a few clicks.

Visit our Website to Explore Hevo

Hevo Data with its strong integration with 500+ data sources (including 40+ Free Sources) allows you to not only export data from your desired data sources & load it to the destination of your choice but also transform & enrich your data to make it analysis-ready. Hevo also allows integrating data from non-native sources using Hevo’s in-built Webhooks Connector. You can then focus on your key business needs and perform insightful analysis using BI tools. 

Want to give Hevo a try?

Sign Up for a 14-day free trial and experience the feature-rich Hevo suite first hand. You may also have a look at our pricing, which will assist you in selecting the best plan for your requirements.

Share your experience of understanding Snowflake Apps in the comment section below! We would love to hear your thoughts.

Former Research Analyst, Hevo Data

Manisha is a data analyst with experience in diverse data tools like Snowflake, Google BigQuery, SQL, and Looker. She has written more than 100 articles on diverse topics related to the data industry.

No-code Data Pipeline for Snowflake

Get Started with Hevo