Snowflake Logs Simplified – Drivers and Connectors 101

Last Modified: December 29th, 2022


Organizations rely on integration with different applications to collect data as well as send data for analysis. However, since different applications require different Drivers and Connectors, monitoring them becomes a tedious task for administrators. This challenge is more prominent in Data Warehouses, where organizations usually connect numerous Drivers and Connectors. To simplify monitoring and fixing issues in the connection, you can collect logs if you are using a Snowflake Data Warehouse. 

In this article, you will learn about different Drivers and Connectors provided by Snowflake and how to generate the Snowflake Logs through them.

Prerequisites

  • Basic knowledge of Cloud Computing.

What is Snowflake?

Snowflake Logs - Snowflake logo

In July 2012, Snowflake was established by 3 Data Warehousing specialists: Benoit Dageville, Thierry Cruanes, and Marcin Zukowski. After operating in stealth mode for two years, Snowflake launched publicly in October 2014. In June 2020, it was rebranded as the Snowflake Data Cloud.

In September of the same year, Snowflake took the industry by storm with its unprecedented IPO. Ever since, Snowflake has been growing immensely, and today it is one of the most popular Data Warehouses. Currently, its headquarters is situated in Bozeman, Montana.

Snowflake is a ‘Data Warehouse-as-a-Service’ platform offering cloud-based data storage and analytics solutions. In other words, Snowflake utilizes cloud computation and storage to provide limitless processing and storage capabilities.

Snowflake runs on popular cloud providers such as Amazon Web Services, Microsoft Azure, and Google Cloud Platform. It is significantly more versatile and faster to use than any traditional Data Warehouse platform.

Key Features of Snowflake

Here are a few features of Snowflake:

1) Cloud Agnostic

Snowflake is cloud agnostic: users can easily integrate it into their existing cloud infrastructure and deploy it in regions of their choice.

2) Scalability 

Snowflake’s multi-cluster shared data architecture separates compute and storage. This approach allows users to easily scale resources up and down without disrupting the service.

3) Security

Snowflake supports various authentication methods, including two-factor authentication and federated authentication for SSO. Access to objects in the account is governed by a hybrid model of discretionary access control (DAC) and role-based access control (RBAC).

4) Concurrency and Workload Separation

Before Snowflake came into existence, users had to wait for the availability of resources in traditional Data Warehouse systems, resulting in concurrency concerns. There are no concurrency issues in Snowflake because of its multi-cluster architecture: workloads are separated and run against their own compute clusters, referred to as Virtual Warehouses.

Replicate Data in Minutes Using Hevo’s No-Code Data Pipeline

Hevo Data, a Fully-managed Data Pipeline platform, can help you automate, simplify & enrich your data replication process in a few clicks. With Hevo’s wide variety of connectors and blazing-fast Data Pipelines, you can extract & load data from 100+ Data Sources straight into your Data Warehouse or any database. To further streamline and prepare your data for analysis, you can process and enrich raw granular data using Hevo’s robust & built-in Transformation Layer without writing a single line of code!


Hevo is the fastest, easiest, and most reliable data replication platform that will save your engineering bandwidth and time multifold. Try our 14-day full access free trial today to experience an entirely automated hassle-free Data Replication!

Snowflake Installation

Step 1: Go to Snowflake. Enter the First Name, Last Name, Email, Company, Country, and click the Continue button.

Snowflake Logs - Snowflake Signup page

Step 2: In the next screen, select an edition, cloud provider, and the region. Next, click on Get Started.

Step 3: At the same time, an email ‘Activate Your Snowflake Account’ will be sent to your registered email address. Once you click on Activate, it will ask you to create the initial username and password.

Snowflake Logs - Sign in

Step 4: The screen as shown below appears.

Snowflake Logs - Home page

Snowflake Drivers and Connectors

Snowflake supports a variety of popular programming languages and development platforms. Users can use Snowflake’s native client Connectors and Drivers to develop applications that use any of these programmatic interfaces.

Using the Snowflake Drivers/Connectors, users can connect to the Snowflake Data Warehouse and perform data access operations such as read-write and metadata import. 

Generating Snowflake Logs for Drivers and Connectors 

Snowflake supports a variety of connection methods:

  • Drivers: using third-party client services and applications that support JDBC or ODBC.
  • Connectors: developing applications that connect through Snowflake’s native Connectors for languages such as Python and Spark.


The 2 types of Drivers are listed below:

1) ODBC Driver

An ODBC Driver uses the Open Database Connectivity (ODBC) interface, which allows applications to access data in Database Management Systems (DBMS) using SQL. ODBC allows maximum interoperability, meaning a single application can access multiple DBMSs.

The requirements for the ODBC Driver vary depending on the platform. Also, the minimum Driver version that supports the GET and PUT commands varies depending on the cloud service that hosts the Snowflake account.

1.1) Downloading ODBC for Snowflake Logs

The Snowflake ODBC Driver installation is available from the Snowflake Client Repository.

The repository serves the client components through its download endpoints.

1.2) Installing ODBC for Snowflake Logs

Double-click the downloaded ODBC installer file. By default, the Driver will be installed in the following location: C:\Program Files.

1.3) Generating Snowflake Logs

On Windows, add a string value for the Driver to the registry. Also, update the following settings: LogLevel=6, CurlVerboseMode=true, and LogPath=C:\PATH.
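Put together, the logging settings look like this (a sketch; LogPath is a placeholder that should point at a writable directory on your machine):

```ini
; Sketch of the Snowflake ODBC Driver logging settings.
; LogPath is a placeholder; choose a writable directory.
LogLevel=6
CurlVerboseMode=true
LogPath=C:\PATH
```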


2) JDBC Driver

A JDBC Driver uses the Java Database Connectivity (JDBC) API, which provides a standard method to access data using Java. All users get access to the Driver downloads at no additional cost.

The JDBC Driver requires Java 1.8 version or higher and a 64-bit environment. Most client applications that support JDBC for connecting to a database server can utilize the Driver. 

2.1) Downloading JDBC for Snowflake Logs

Visit the Maven Central Repository and download the latest version.

2.2) Generating Snowflake Logs

In order to generate log files with the JDBC Driver, add tracing=ALL to the JDBC connection string.
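For example, appended as a query parameter to the JDBC connection string (the account identifier below is a placeholder):

```
jdbc:snowflake://<account_identifier>.snowflakecomputing.com/?tracing=ALL
```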


Here are some Connectors for different platforms:

1) Snowflake Connector for Python

The Snowflake Connector for Python is a Python interface for connecting to Snowflake and performing all standard operations. The connector is a pure Python library with no JDBC or ODBC dependencies. It can be installed with pip on Linux, macOS, and Windows systems that already have a supported version of Python installed.

The Snowflake Connector for Python was used to develop SnowSQL, although installing the connector is not required to install SnowSQL.

1.1) Installing the Python Connector 

Step 1: If you do not already have Python on your system, download and install a supported version first.

Step 2: Now, in the command prompt, run:

pip install snowflake-connector-python==<version>

where <version> is the version of the connector you wish to install.

1.2) Generating Snowflake Logs

In order to generate log files, add the following lines to the application code.

import logging

for logger_name in ['snowflake', 'botocore']:
    logger = logging.getLogger(logger_name)
    logger.setLevel(logging.DEBUG)  # capture all connector messages
    ch = logging.FileHandler('python_connector.log')
    ch.setFormatter(logging.Formatter('%(asctime)s - %(threadName)s %(filename)s:%(lineno)d - %(funcName)s() - %(levelname)s - %(message)s'))
    logger.addHandler(ch)  # without this, nothing is written to the file
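As a quick stdlib-only sanity check (the connector itself is not required), the same handler wiring can be exercised against any logger to confirm records land in the log file. The logger name and file path here are demo placeholders:

```python
import logging
import os
import tempfile

# Demo: wire a FileHandler the same way as above, emit one record,
# and confirm it lands in the log file.
log_path = os.path.join(tempfile.gettempdir(), 'python_connector_demo.log')
logger = logging.getLogger('snowflake.demo')
logger.setLevel(logging.DEBUG)
ch = logging.FileHandler(log_path, mode='w')
ch.setFormatter(logging.Formatter(
    '%(asctime)s - %(threadName)s %(filename)s:%(lineno)d - '
    '%(funcName)s() - %(levelname)s - %(message)s'))
logger.addHandler(ch)

logger.debug('connector initialized')
ch.flush()

with open(log_path) as f:
    contents = f.read()
print('connector initialized' in contents)  # True
```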

What Makes Hevo’s ETL Process Best-In-Class

Providing a high-quality ETL solution can be a difficult task if you have a large volume of data. Hevo’s automated, No-code platform empowers you with everything you need for a smooth data replication experience.

Check out what makes Hevo amazing:

  • Fully Managed: Hevo requires no management and maintenance as it is a fully automated platform.
  • Data Transformation: Hevo provides a simple interface to perfect, modify, and enrich the data you want to transfer.
  • Faster Insight Generation: Hevo offers near real-time data replication so you have access to real-time insight generation and faster decision making. 
  • Schema Management: Hevo can automatically detect the schema of the incoming data and map it to the destination schema.
  • Scalable Infrastructure: Hevo has in-built integrations for 100+ sources (with 40+ free sources) that can help you scale your data infrastructure as required.
  • Live Support: The Hevo team is available round the clock to extend exceptional support to its customers through chat, email, and support calls.
Sign up here for a 14-day free trial!

2) Snowflake Connector for Spark

Snowflake Connector for Spark, also known as the Spark Connector, integrates Snowflake into the Apache Spark environment, enabling Spark to read and write data to Snowflake. There is a separate version of the connector for each version of Spark. The connector runs as a Spark plugin and is distributed as a Spark package (spark-snowflake).

Snowpark is an excellent alternative to the Spark Connector, enabling users to perform these tasks inside Snowflake rather than in a separate Spark cluster. The Spark Connector uses the Snowflake JDBC Driver to establish a connection to Snowflake.

2.1) Downloading Apache Spark for Snowflake Logs

Step 1: Download the Apache Spark Connector as per the version of Scala on your system.

Step 2: Install the Snowflake JDBC Driver.

2.2) Generating Snowflake Logs

In order to enable JDBC Driver logging in the Spark Connector, append "tracing" -> "all" to the "sfOptions" parameter map.

In Scala:

var sfOptions = Map(
  "sfURL" -> ("" + ""),
  "sfUser" -> "test",
  "sfPassword" -> "**********",
  "sfRole" -> sys.env.getOrElse("SF_ROLE", "PUBLIC"),
  "sfDatabase" -> sys.env.getOrElse("SF_DB", "DB"),
  "sfSchema" -> sys.env.getOrElse("SF_SCHEMA", "PUBLIC"),
  "sfWarehouse" -> sys.env.getOrElse("SF_WH", "COMPUTE_WH"),
  "tracing" -> "all"
)

Generating Snowflake Logs for CLI Clients

We will be using SnowSQL, Snowflake’s new-generation command-line client. You can use it to run queries, create database objects, and conduct various administrative activities.

SnowSQL connects to Snowflake to execute SQL queries and perform all DDL (Data Definition Language) and DML (Data Manipulation Language) operations. Using SnowSQL, users can control all aspects of the Snowflake Data Cloud, including uploading, querying, modifying, and removing data.

1) Platform-Specific Version Requirements

  • Linux: Ubuntu 16.04 or later.
  • macOS: 10.14 or later.
  • Windows: Windows 8 or later.

2) Download SnowSQL

On the Snowflake Interface, on the upper right side, click on Help > Download the SnowSQL CLI client.


3) SnowSQL Installation

Step 1: Open the command prompt, type snowsql, and hit enter.

Step 2: Now, to log in to Snowflake through SnowSQL, type snowsql -a <account_name> -u <username>, where <account_name> is your account identifier and <username> is your Snowflake username.

4) Generating Snowflake Logs

Add -o log_level=DEBUG to the command-line arguments, for example: snowsql -a <account_name> -u <username> -o log_level=DEBUG.


In the config file of SnowSQL, update log_level=DEBUG:

For Windows: %USERPROFILE%\.snowsql\config
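Inside that file, the setting lives under the [options] section; a minimal sketch:

```ini
[options]
log_level = DEBUG
```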


Using Drivers, Connectors, or CLI clients to connect to Snowflake makes it easy to generate log files. In this article, we learned about Snowflake and its key features.

Further, we learned about different Drivers and Connectors used to connect to the Snowflake platform. In the end, we learned how to generate the Snowflake Logs using these Drivers and Connectors.

While Snowflake Services are useful, maintaining the correct environment on a regular basis is a difficult undertaking. Further, extracting data from a variety of sources and integrating it into your Data Warehouse can be a daunting task. This is where Hevo comes to your aid to make things easier! Hevo Data is a No-code Data Pipeline and has awesome 100+ pre-built Integrations that you can choose from.

Visit our Website to Explore Hevo

Hevo can help you integrate your data from numerous sources and load them into destinations like Snowflake to analyze real-time data. It will make your life easier and data migration hassle-free.

Want to take Hevo for a spin? Sign Up for a 14-day free trial and see the difference!

Share your experience of learning about the Snowflake Logs in the comments section below. We would love to hear from you!

Freelance Technical Content Writer, Hevo Data

Shravani is a data science enthusiast who loves to delve deeper into complex topics on data science and solve the problems related to data integration and analysis through comprehensive content for data practitioners and businesses.
