Many business organizations leverage the data they generate by storing it in robust Data Warehouses and performing analyses using Business Intelligence [BI] tools to gather actionable insights for optimizing strategies. Snowflake is one such Data Warehouse used by companies, and Looker is a popular Data Visualization tool and BI platform that can help analyze the data stored in Snowflake. That is why, more often than not, a Looker Snowflake integration is essential for most organizations.

In this article, you will be introduced to Looker, Snowflake, and their key features. You will also learn about the prerequisites of the Looker Snowflake Integration and the steps you can follow to implement it.

Introduction to Snowflake 


Snowflake is one of the most robust Cloud Data Warehouse platforms available in the market today. It allows users to utilize Cloud architecture not just to store data but also to share it in real-time. The flexibility of the system comes from a new SQL query engine combined with a custom architecture designed specifically for the Cloud. This Cloud Data Warehouse platform is ideal for businesses that do not want to dedicate in-house resources to the setup, maintenance, and support of in-house servers, as every operation is done on the Cloud.

Snowflake provides best-in-class performance even with enormous volumes of data and can also scale up or down easily according to the user’s requirements. Snowflake can natively ingest, store, and query diverse data, both Structured and Semi-Structured, such as CSV, XML, JSON, Avro, etc. You can query this data with ANSI-standard, ACID-compliant SQL in a fully relational manner.
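As a quick sketch of what this looks like in practice (the table and field names below are hypothetical), Snowflake’s colon-path notation lets you address nested JSON fields stored in a VARIANT column directly in SQL:

```sql
-- Hypothetical table: events(raw VARIANT) holding one JSON document per row.
-- The colon operator traverses the JSON path; ::string casts the result.
select
    raw:customer.name::string as customer_name,
    raw:items[0].sku::string  as first_sku
from events
where raw:event_type::string = 'purchase';
```

No upfront schema definition is needed for the JSON payload; the same relational query syntax applies to structured and semi-structured columns alike.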

More information regarding Snowflake can be found here.

Key Features of Snowflake

Snowflake has the following features:

  • Cloud Service Provision: The Snowflake Data Warehouse platform is completely hosted on Cloud infrastructure offered by Amazon Web Services, Microsoft Azure, and Google Cloud, with the option to choose the geographic region of the stored data. 
  • High Scalability: Users can rapidly scale resources up and down in accordance with their business requirements, making the platform more cost-effective.
  • Focus on Security: Along with security features related to data storage, such as encryption, there is a range of security measures to control data access, such as restricting certain IP Addresses, Two-Factor Authentication, etc. 
  • Low Administrative Requirements: As Snowflake is available as Software-as-a-Service [SaaS], there is little need to set up IT infrastructure or maintain an IT administrator for the deployment. 
  • Support for Data Sharing: Sharing data with other Snowflake users is very easy. For users not on the platform, special reader accounts can be created to access the shared content. 
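As an illustrative sketch of the data-sharing feature (the share, database, and account names here are hypothetical), exposing a table to another Snowflake account takes only a few statements:

```sql
-- Create a share and expose a database, schema, and table to it
create share sales_share;
grant usage on database sales_db to share sales_share;
grant usage on schema sales_db.public to share sales_share;
grant select on table sales_db.public.orders to share sales_share;

-- Add the consumer account to the share (replace with the real account identifier)
alter share sales_share add accounts = <consumer_account>;
```

The consumer account can then create a read-only database from the share without any data being copied or moved.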

Introduction to Looker


Looker is a Web-based Data Visualization and Business Intelligence platform used by various organizations to create Business Reports and real-time Dashboards. It is capable of transforming Graphical User Interface (GUI) based user input into SQL queries and sending them directly to the database in live mode. It provides users with numerous visualizations and customization options tailored to their needs. The platform supports multiple data sources and deployment methods, so you can use it for all your analytical needs without compromising the transparency, reliability, security, or privacy of your data, making it the right choice for mission-critical workloads.

Looker works perfectly with data connections to Cloud-based Data Warehouses like Google BigQuery, Amazon Redshift, or Snowflake, which can be scaled up or down as per requirements to manage levels of user concurrency and query load and thus optimize business cost. It supports a Data Modeling layer that is separate from the components that visualize data. Developers can leverage the functionality offered by this layer to transform data, perform numerous join operations across tables, etc. This feature is considered very important, as it enables multiple developers to work simultaneously on a single model and then merge their work using GitHub Sync.

More information about Looker can be found here.

Key Features of Looker

The key features of Looker are as follows:

  • Create Custom Applications: Users can create custom applications that provide a tailored visualization experience to different organizations according to their needs.
  • Support for Advanced Integrations: Platforms like Google BigQuery, Snowflake, and Amazon Redshift, along with 50+ SQL dialects, are supported through various Connectors and Integrations. 
  • Latest BI Tools: Support for up-to-date data, Real-Time Dashboards, and the latest Business Intelligence tools is provided to improve reporting. 
  • Support for Advanced Hosting: To ensure data reliability and safety, multiple Cloud-based platforms, including Google Cloud Platform (GCP) and Amazon Web Services (AWS), are supported. 
  • Looker ML Functionality: This feature, also known as LookML, is used to describe the measures and dimensions of all the projects stored and analyzed on Looker. 
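For illustration, a minimal LookML view (the table and field names are hypothetical) declaring one dimension and one measure might look like this:

```lookml
view: orders {
  sql_table_name: analytics.orders ;;

  dimension: order_id {
    primary_key: yes
    type: number
    sql: ${TABLE}.order_id ;;
  }

  measure: total_revenue {
    type: sum
    sql: ${TABLE}.amount ;;
  }
}
```

Dashboards and Explores built on this view then generate the underlying SQL automatically from these definitions.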

Detailed information about LookML can be found here.


Prerequisites

  • Working knowledge of Snowflake
  • Working knowledge of Looker 
  • An active Looker account 
  • An active Snowflake account and Cloud Data Warehouse 
  • Provisioned access on Snowflake
Hevo Data, Seamless Data Migration to Looker

Hevo is a No-code Data Pipeline that offers a fully managed solution to set up data integration from 150+ data sources (including 50+ Free Data Sources) and will let you directly load data to Snowflake or a Data Warehouse of your choice. It will automate your data flow in minutes without writing any line of code. Its fault-tolerant architecture makes sure that your data is secure and consistent. Hevo provides you with a truly efficient and fully automated solution to manage data in real-time and always have analysis-ready data.

Check out what makes Hevo amazing:

  • Secure: Hevo has a fault-tolerant architecture that ensures that the data is handled in a secure, consistent manner with zero data loss.
  • Schema Management: Hevo takes away the tedious task of schema management & automatically detects schema of incoming data and maps it to the destination schema.
  • Minimal Learning: Hevo, with its simple and interactive UI, is extremely simple for new customers to work on and perform operations.
  • Hevo Is Built To Scale: As the number of sources and the volume of your data grows, Hevo scales horizontally, handling millions of records per minute with minimal latency.
  • Incremental Data Load: Hevo allows the transfer of data that has been modified in real-time. This ensures efficient utilization of bandwidth on both ends.
  • Live Support: The Hevo team is available round the clock to extend exceptional support to its customers through chat, email, and support calls.
  • Live Monitoring: Hevo allows you to monitor the data flow and check where your data is at a particular point in time.

Simplify your data analysis with Hevo today! Sign up here for a 14-day free trial!

Procedure to Implement Looker Snowflake Integration 

Looker Snowflake Integration Step 1: Creation of a Looker User on Snowflake

You need to run each of these commands individually to create a Looker User on Snowflake. 

Before that, you are required to check the “All Queries” option on the Snowflake Connection Panel. 

-- change role to ACCOUNTADMIN
use role ACCOUNTADMIN;
-- create role for looker
create role if not exists looker_role;
grant role looker_role to role SYSADMIN;
   -- Note that we are not making the looker_role a SYSADMIN,
   -- but rather allowing users with the SYSADMIN role to modify the looker_role
-- create a user for looker
create user if not exists looker_user
password = '<enter password here>';
grant role looker_role to user looker_user;
alter user looker_user
set default_role = looker_role
default_warehouse = 'looker_wh';
-- execute this part only if the user roles are to be changed
-- change role
use role SYSADMIN;
-- create a warehouse for looker (optional)
create warehouse if not exists looker_wh
-- set the size based on your dataset
warehouse_size = medium
warehouse_type = standard
auto_suspend = 1800
auto_resume = true
initially_suspended = true;
grant all privileges
on warehouse looker_wh
to role looker_role;

-- grant read only database access (repeat for all database/schemas)
grant usage on database <database> to role looker_role;
grant usage on schema <database>.<schema> to role looker_role;

-- rerun the following any time a table is added to the schema
grant select on all tables in schema <database>.<schema> to role looker_role;
-- or
grant select on future tables in schema <database>.<schema> to role looker_role;

-- create schema for looker to write back to
use database <database>;
create schema if not exists looker_scratch;
grant ownership on schema looker_scratch to role SYSADMIN revoke current grants;
grant all on schema looker_scratch to role looker_role;
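To confirm the setup before moving on, you can inspect what the new role and user can access using Snowflake’s standard SHOW and DESCRIBE commands:

```sql
-- Verify the privileges granted to the Looker role
show grants to role looker_role;

-- Verify the role membership and defaults of the Looker user
show grants to user looker_user;
describe user looker_user;
```

The output should list the warehouse, database, and schema privileges granted above, along with looker_role and looker_wh as the user’s defaults.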

Looker Snowflake Integration Step 2: Adding the Database Connection in Looker 

To add a new Database Connection in Looker, you need to navigate to the Admin panel of the interface. Subsequently, select “Connections” and then “New Connection”.

A Configuration Section will open up where you will be required to enter the following information:

Name: Provide an arbitrary name to refer to this connection later. 

Host: This section is for providing the Hostname of your Snowflake deployment. It is of the format <account_name>.snowflakecomputing.com. 

Port: The default Port is 443; depending upon your deployment, you can change the Port number. 

Database: Provide the name of the default database that is required for use. 

Username and Password: These are the credentials of the user that will connect to Looker. 

Schema: This is the default Database Schema that is used in your Snowflake deployment. 

Temp Database: This section needs to be set to the scratch Database Schema (such as the looker_scratch schema created earlier) if PDTs [Persistent Derived Tables] are enabled.

The following selections also need to be made:

Dialect: Snowflake needs to be selected. 

Persistent Derived Tables: This option needs to be checked to enable PDTs [Persistent Derived Tables].

Some optional settings that you might configure: 

Max Connections: The maximum size of the Connection Pool.

Connection Pool Timeout: Session Timeout for Connection Pool.

Database Time Zone: Default is UTC.

Query Time Zone: Default is UTC.

You can also provide additional JDBC [Java Database Connectivity] parameters supported by the Snowflake JDBC driver. You can override each of these by setting an alternative value in the Additional Params field, for example: 

&AUTOCOMMIT=FALSE

After making these configurations, you can click on the “Test These Settings” button to check if the connection is successful. 

Finally, you need to click on the “Add Connection” button to save the connection. 

Looker Snowflake Integration Step 3: Setting Access on Per-Group or Per-User Basis 

You can allocate different computing resources to different users using Looker Attributes. This can help assign Snowflake Warehouses to specific Looker users or even groups.

Firstly, add the groups or users in Looker and then proceed to enter the User Attributes to store the Snowflake Warehouse names as shown below:

looker snowflake step 1

Then proceed to assign Warehouse name values to the users or groups as shown below: 

looker snowflake step 2

Subsequently, you will be required to provide Additional Parameters on the “Connection Settings” page using the format shown below: 

warehouse={{ _user_attributes['snowflake_warehouse'] }}

Looker Snowflake Integration Step 4: Implementing Snowflake’s Autosuspend Feature

Using the Autosuspend feature, the Warehouse gets suspended after a specified period of inactivity. Any query issued after this period will therefore return an error; however, this error is not visible to Third-Party Dashboards such as the ones on Looker. 

To counter this shortcoming, there is also an Auto-Resume feature, which resumes the Warehouse whenever it is queried. The time periods for these features can be configured in the Warehouses tab. 
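These settings can also be adjusted in SQL rather than through the Warehouses tab; for example, shortening the idle timeout of the warehouse created earlier (the 600-second value is purely illustrative):

```sql
-- auto_suspend is specified in seconds; auto_resume restarts the
-- warehouse transparently the next time Looker issues a query
alter warehouse looker_wh
  set auto_suspend = 600
      auto_resume = true;
```

A shorter timeout saves credits during idle periods, at the cost of a brief resume delay on the first query after suspension.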

Looker Snowflake Integration Step 5: Setting Up OAuth for Connection 

To set up an OAuth-based connection, you will require a user account with the ACCOUNTADMIN role on Snowflake. 

Firstly, you are required to run a command of the following form in Snowflake, where <looker_hostname> is the hostname of the Looker Instance: 

create security integration looker
type = oauth
enabled = true
oauth_client = looker
oauth_redirect_uri = 'https://<looker_hostname>/external_oauth/redirect';

To obtain the OAuth Client ID and Client Secret, you need to run the following command: 

select SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('LOOKER');
You will be required to enter these credentials while creating a new connection, and provide the information in the given window. 

Looker Snowflake Integration Step 6: Running Queries and Signing in on Snowflake 

After the OAuth Snowflake connection is set up and running, users are required to log in to run queries; otherwise, they will receive the error message shown below. 

looker snowflake error message

The login can be easily completed by entering the “OAuth Connection Credentials” on the “Accounts” page on Looker as shown below.  

looker snowflake step 4

You are required to click on the “Allow” button to give Looker access to your Snowflake account. 

Looker Snowflake Integration Step 7: Implementing User Permissions 

You can control the User Permissions on Looker as per your requirements for accessing features and data on the platform. For starters, the default roles and permissions are appropriate until the deployment is expanded to multiple users in the organization. 

More information regarding User Permissions on Looker can be found here.

Limitations of Implementing Looker Snowflake Integration using Snowflake Database Connector

  • The implementation requires the user to have technical knowledge of both platforms, along with the ability to understand developer documentation and implement additional parameters using a Command-Line Interface. 
  • The Connection Administrator would require knowledge of permission handling on both platforms. 
  • The Autosuspend and Auto-Resume features can cause delays in running queries while analyzing data using Looker. 


Conclusion

In this article, you learned about Snowflake, Looker, and their key features, the prerequisites, the steps to implement your Looker Snowflake Integration, and the limitations of the manual method.

If you are interested in learning about connecting Google BigQuery with Looker, you can find the guide here, and a guide about connecting Amazon Redshift to Looker can be found here.

Integrating and analyzing data from a huge set of diverse sources can be challenging; this is where Hevo comes into the picture. Hevo Data, a No-code Data Pipeline, helps you transfer data from a source of your choice in a fully automated and secure manner without having to write code repeatedly. Hevo, with its strong integration with 150+ sources & BI tools, allows you to not only export & load data but also transform & enrich it, making it analysis-ready in a jiffy.

Get started with Hevo today! Sign up here for a 14-day free trial!

Abhishek Duggal
Former Research Analyst, Hevo Data

Abhishek is a data analysis enthusiast with a passion for data, software architecture, and writing technical content. He has experience writing articles on diverse topics such as data integration and infrastructure.

No-code Data Pipeline for Snowflake

Get Started with Hevo