Grafana is an open-source solution for data analytics and monitoring: it pulls up metrics that make sense of massive amounts of data and visualizes them on customizable dashboards. Grafana connects to a wide range of data sources, such as Graphite, Prometheus, InfluxDB, Elasticsearch, MySQL, and PostgreSQL.

The Grafana Snowflake Integration lets you use these dashboards to explore your Snowflake data in detail. Snowflake is one of the few enterprise-ready cloud data warehouses that brings simplicity without sacrificing features. It automatically scales, both up and down, to strike the right balance of performance vs. cost.

Snowflake’s claim to fame is that it separates compute from storage. This is significant because almost every other database, Redshift included, combines the two together, meaning you must size for your largest workload and incur the cost that comes with it.

Introduction to Snowflake


Snowflake’s Data Cloud is based on a cutting-edge data platform that is available as Software-as-a-Service (SaaS). Snowflake provides data storage, processing, and analytic solutions that are faster, easier to use, and more adaptable than traditional systems.

The Snowflake data platform isn’t based on any existing database or “big data” software platform like Hadoop. Instead, Snowflake combines a completely new SQL query engine with an innovative cloud-native architecture.

Snowflake gives the user all of the capability of an enterprise analytic database, along with a number of additional features and capabilities.

Data Platform as a Cloud Service

Snowflake is a real software-as-a-service solution.

  • There is no hardware (virtual or physical) to select, install, configure, or manage.
  • There is virtually no software to install, configure, or administer.
  • Snowflake handles ongoing maintenance, management, updates, and tuning.

Snowflake Architecture

The Snowflake database design is a mix of shared-disk and shared-nothing database architectures. Snowflake uses a central data repository for persisting data that is accessible from all compute nodes in the platform, similar to shared-disk systems.

Snowflake, however, performs queries utilizing MPP (massively parallel processing) compute clusters, in which each node in the cluster maintains a piece of the full data set locally, akin to shared-nothing systems.

This method combines the ease of data management of a shared-disk design with the performance and scale-out advantages of a shared-nothing architecture.

The Snowflake architecture is made up of three layers:

  • Database Storage
  • Query Processing
  • Cloud Services

Learn more about Snowflake.

Introduction to Grafana

Grafana is more than just a collection of features. It lets you switch between metrics, logs, and traces with ease, connect your data, and get to the bottom of a problem more quickly. It helps de-risk feature launches by reducing mean time to recovery (MTTR).

Give your team the tools they need, and let the platform handle the rest so they can focus on what they do best.

Unify your data, not your database

With Grafana, you don’t have to ingest data into a backend store or vendor database. Instead, Grafana takes a novel approach to offering a “single-pane-of-glass” by unifying all of your existing data, regardless of where it resides.

Data everyone can see

Grafana was created on the premise that data should be available to everyone in your company, not just the Ops person.

Flexibility and versatility

Any data may be translated and transformed into flexible and versatile dashboards. Grafana, unlike other technologies, allows you to create dashboards tailored to you and your team.

Grafana Snowflake Integration

You can query and visualize Snowflake data from Grafana using the Snowflake data source plugin.

Grafana Snowflake Integration: Requirements

The following are the prerequisites for the Snowflake data source:

  • Grafana Enterprise with a valid license.
  • A Grafana user with the server admin or org admin role.
  • A Snowflake user. No specific Snowflake role is required for this data source.
  • A Snowflake user’s role determines which tables the user can access, so make sure your user has the roles required to query your data.

1) Grafana Snowflake Integration: Configure Snowflake

A Snowflake user with a username and password is required to configure the Snowflake data source.

For this data source, Grafana recommends creating a new user with limited rights.

2) Grafana Snowflake Integration: Create a user

  • To connect to Snowflake, you must either create a user or log in with an existing one. All queries sent from Grafana will be executed as this user.
  • If you want different users to run different queries or workloads, create multiple Snowflake data sources with different configurations.
  • To create a user, connect to your Snowflake instance and run the CREATE USER command.
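As a minimal sketch, a dedicated, limited-rights user for Grafana could be created like this (the username, password, and default role below are placeholders; substitute your own values):

```sql
-- Hypothetical example: create a dedicated user for Grafana queries.
-- The name, password, and default role are placeholders.
CREATE USER grafana_reader
  PASSWORD = 'REPLACE_WITH_A_STRONG_PASSWORD'
  DEFAULT_ROLE = grafana_role
  MUST_CHANGE_PASSWORD = FALSE;
```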

3) Grafana Snowflake Integration: Grant a role

  • Now that the Snowflake user has been created, use the GRANT ROLE command to assign the user a role. Granting a role allows the user to perform the operations that the role authorizes.
  • The user’s role determines which warehouses and tables the user can access.
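For illustration, the following sketch creates a role, grants it read-only access, and assigns it to the user created earlier (all object names here are placeholders; adjust the warehouse, database, and schema to your environment):

```sql
-- Hypothetical example: grant a role read access and assign it to a user.
-- grafana_role, grafana_reader, my_wh, my_db are placeholder names.
CREATE ROLE IF NOT EXISTS grafana_role;
GRANT USAGE ON WAREHOUSE my_wh TO ROLE grafana_role;
GRANT USAGE ON DATABASE my_db TO ROLE grafana_role;
GRANT USAGE ON SCHEMA my_db.public TO ROLE grafana_role;
GRANT SELECT ON ALL TABLES IN SCHEMA my_db.public TO ROLE grafana_role;
GRANT ROLE grafana_role TO USER grafana_reader;
```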

4) Grafana Snowflake Integration: Configure the data source in Grafana

These are the same connection parameters that are used when configuring the data source with provisioning.

Fill in the following fields to create a data source:

  • Name: A unique name for this Snowflake data source.
  • Account: The name of the Snowflake account that Snowflake has assigned to you. The account name is the complete string to the left of snowflakecomputing.com in the URL you received from Snowflake when the account was provisioned. If the Snowflake instance is not in the us-west-2 region, the account name must include the region, for example xyz123.us-east-1. If the Snowflake instance is not hosted on Amazon Web Services, the account name must also include the platform, for example xyz123.us-east-1.gcp.
  • Username: The username of the Snowflake account that will be queried.
  • Password: The password of the Snowflake user that will run the queries.
  • Region (deprecated): The region of the Snowflake instance. This field is deprecated in favor of specifying the region as part of the account name.
  • Role: This option lets the user connect to the Snowflake instance with a role other than the user’s default. The role must still be granted to the user with the GRANT ROLE command before it can be assumed.
  • Warehouse: The warehouse that will be used by default for queries.
  • Database: The database that will be used by default for queries.
  • Schema: The schema that will be used by default for queries.

5) Grafana Snowflake Integration: Configure the data source with provisioning

Grafana’s provisioning system allows you to configure data sources via config files.

Grafana Snowflake: configure data source
Image Source: grafana.com
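As a sketch, a provisioned Snowflake data source might look like the following YAML. The field values are placeholders, and the exact jsonData/secureJsonData keys depend on the plugin version, so treat this as illustrative rather than definitive:

```yaml
# Hypothetical provisioning file, e.g. /etc/grafana/provisioning/datasources/snowflake.yaml
# All values below are placeholders; key names may vary by plugin version.
apiVersion: 1
datasources:
  - name: Snowflake
    type: grafana-snowflake-datasource
    jsonData:
      account: xyz123.us-east-1
      username: grafana_reader
      role: grafana_role
      warehouse: my_wh
      database: my_db
      schema: public
    secureJsonData:
      password: REPLACE_WITH_PASSWORD
```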

6) Grafana Snowflake Integration: Query the data source

The provided query editor is a standard SQL query editor. Grafana also supports a few macros that can help you write more complex time series queries.

Grafana Snowflake: query data sources
Image Source: grafana.com
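For example, a time series query might use Grafana’s macros like this (the table and column names here are made up for illustration; $__timeFilter and $__timeGroup are expanded by Grafana before the query is sent to Snowflake):

```sql
-- Hypothetical time series query; the "metrics" table and its columns
-- are placeholders. The macros expand to Snowflake SQL using the
-- dashboard's current time range.
SELECT
  $__timeGroup(created_at, '1h') AS time,
  avg(cpu_usage) AS avg_cpu
FROM metrics
WHERE $__timeFilter(created_at)
GROUP BY 1
ORDER BY 1;
```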

7) Grafana Snowflake Integration: Inspecting the query

Because Grafana supports macros that Snowflake does not understand, the fully rendered query is accessible in the Query Inspector, where it can be copied and pasted directly into Snowflake.

Click the Query Inspector button to see the entire interpolated query, which will appear under the “Query” tab.

Conclusion

This article gave a comprehensive overview of Grafana and Snowflake, along with a step-by-step guide to setting up the Grafana Snowflake Integration.

While using the Grafana Snowflake Integration is insightful, setting up the proper environment can be a hectic task. To make things easier, Hevo comes into the picture.

Muhammad Faraz
Technical Content Writer, Hevo Data

Muhammad Faraz is an AI/ML and MLOps expert with extensive experience in cloud platforms and new technologies. With a Master's degree in Data Science, he excels in data science, machine learning, DevOps, and tech management. As an AI/ML and tech project manager, he leads projects in machine learning and IoT, contributing extensively researched technical content to solve complex problems.
