10 Things Every Snowflake Admin Should be Doing to Optimize Credits

on Data Warehouses, Snowflake • March 10th, 2022

In today’s data-driven world, organizations need to find efficient ways of processing and storing their data. With the evolution of technology over the recent decade, companies are faced with two options for Data Storage and Processing: a Physical Server environment or a Virtual Server environment. 

In truth, both of these systems have their underlying benefits and challenges, so the choice eventually falls on the firm in question and will primarily be influenced by its needs. That said, Virtual Servers offer several compelling advantages. For instance, they shine when it comes to Scalability: you don’t have to buy new hardware or discard existing resources whenever you want to scale up or down, since everything is housed off-site. 

Furthermore, Virtual Servers occupy no physical space on your premises and are far more efficient than Physical Servers. To put this into perspective, the Virtualization Software Market was valued at roughly 63 billion dollars as of 2020, which shows just how popular this sector has become.

With this in mind, this post will dive into the nuts and bolts of Snowflake. More specifically, after reading, you will have a rough idea of strategies any Snowflake Admin can utilize to optimize credit usage on the platform. 

Prerequisites 

This post assumes that you have Snowflake set up and running.  

What is Snowflake?


Whenever you hear of Snowflake, terms like Serverless and Cloud technology come to mind. In simple terms, Snowflake is a Cloud-native, Serverless solution designed so that companies do not have to invest in their own Data Marts, Data Warehouses, and Data Lakes. The platform was unveiled back in 2012 and has since become a household name in the Data Warehousing and Analytics industry. Below are some of the platform’s top benefits: 

Key Benefits of Snowflake

  • Unlimited Scalability Capabilities: This is perhaps the feature that has put Snowflake in its current position on the map. Its elastic engine delivers virtually unlimited scalability and, coupled with multi-cluster resource allocation, Snowflake can handle any sort of workload that comes its way. 
  • Integrations: This highly efficient feature allows customers to connect Snowflake with third-party applications. It is made possible by Integration Platform as a Service (iPaaS) tools such as SnapLogic that enable users to create data pipelines. 
  • Minimal Administration: Snowflake is designed to run with minimal oversight from the Snowflake Admin. Accordingly, features such as auto-scaling enable it to increase or decrease the Virtual Warehouse size depending on the current workload. 
  • Cloud Provider Agnostic Solution: Snowflake is a cloud-provider-agnostic solution, meaning it is available on all three major cloud providers: AWS, GCP, and Azure. 

By now, you should have a rough idea of what Snowflake is and the benefits it offers users. Let’s understand credits: What are they, and how do they relate to Snowflake? 

What are Snowflake Credits? 

Credits are Snowflake’s unit of measure for consumption. In simple terms, they are the currency used to pay for Snowflake’s resources. Credits are only consumed when Snowflake is actually doing work, such as when a Virtual Warehouse is running or when certain platform features are in use. 

Below is how Snowflake Admins can manage credit usage to ensure you don’t overspend on Snowflake. 

Simplify Snowflake Data Transfer with Hevo’s No-code Pipeline

Hevo Data is a No-code Data Pipeline that helps you transfer data from 100+ sources (including 40+ Free Data Sources) to Snowflake in real time in an effortless manner. After setting up Hevo, you can also easily carry out tasks such as creating Snowflake users.

Get Started with Hevo for Free

Key Features of Hevo Data:

  • Fully Managed: Hevo Data is a fully managed service and is straightforward to set up.
  • Transformations: Hevo provides preload transformations through Python code. It also allows you to run transformation code for each event in the Data Pipelines you set up. You need to edit the event object’s properties received in the transform method as a parameter to carry out the transformation. Hevo also offers drag and drop transformations like Date and Control Functions, JSON, and Event Manipulation to name a few. These can be configured and tested before putting them to use.
  • Connectors: Hevo supports 100+ integrations to SaaS platforms, files, databases, analytics, and BI tools. It supports various destinations including Amazon Redshift, Snowflake Data Warehouses; Amazon S3 Data Lakes; and MySQL, SQL Server, TokuDB, DynamoDB, PostgreSQL databases to name a few.  
  • Ensure Unique Records: Hevo Data helps you ensure that only unique records are present in the tables if Primary Keys are defined.
  • Multiple Sources: Hevo Data has various connectors incorporated with it, which can connect to multiple sources with ease. 
  • Automatic Mapping: Hevo Data automatically maps the source schema to perform analysis without worrying about the changes in the schema.
  • Real-time Data Transfer: Hevo Data works on both batch as well as real-time data transfer. 
  • Resume from Point of Failure: Hevo Data can resume the ingestion from the point of failure if it occurs. 
  • Advanced Monitoring: Advanced monitoring gives you a one-stop view to watch all the activity that occurs within pipelines.
  • 24/7 Support: With 24/7 Support, Hevo provides customer-centric solutions to the business use case.

Steps to load Snowflake data using Hevo Data:

  • Sign up on Hevo Data and select Snowflake as the destination.
  • Provide the user credentials and connect to the server.
  • Select the database and schema to load the data.
Sign up here for a 14-Day Free Trial!

10 Things Every Snowflake Admin Should be Doing to Optimize Credits


As a Snowflake Admin, you may be wondering what you should do to optimize credit usage on the platform. Below are ten tips you should keep in mind to minimize Snowflake credit costs. 

1) Keep an Eye Out for Reader Accounts 

Reader Accounts are created when data is shared with non-Snowflake accounts. Users can then run queries and other operations in these accounts while the provider bears the overall cost. With this in mind, it is always a good idea for a Snowflake Admin to set data limits on these accounts and configure alerts for situations where a user leaves a Virtual Warehouse running. 
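As a hedged sketch, a reader account is provisioned with `CREATE MANAGED ACCOUNT` (the account name and credentials below are placeholders):

```sql
-- Provision a reader account for a non-Snowflake data consumer.
-- Because it is a managed account, the provider pays for its compute,
-- so limits and monitoring matter. All names here are placeholders.
CREATE MANAGED ACCOUNT reader_acct
  ADMIN_NAME = reader_admin,
  ADMIN_PASSWORD = 'ChangeMe123!',
  TYPE = READER;
```

Within the reader account, the provider can then create warehouses with aggressive auto-suspend settings and attach resource monitors, since consumers cannot create compute resources of their own there.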

2) Track your Account Usage

As a Snowflake Admin, you should track specific account usage metrics such as Query History and Warehouse Metering History. Ideally, you should review these metrics over a set period. This way, you will spot the workloads that consumed excessive credits and find ways of dealing with them. 
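For example, the `SNOWFLAKE.ACCOUNT_USAGE` views expose this history. A minimal query (the 7-day window and result limit are arbitrary choices) to surface the heaviest queries might look like:

```sql
-- Top queries by total warehouse time over the last 7 days.
-- Requires a role with access to the SNOWFLAKE.ACCOUNT_USAGE schema.
SELECT query_text,
       warehouse_name,
       COUNT(*)                       AS runs,
       SUM(total_elapsed_time) / 1000 AS total_elapsed_seconds
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
GROUP BY query_text, warehouse_name
ORDER BY total_elapsed_seconds DESC
LIMIT 20;
```

A similar query against `snowflake.account_usage.warehouse_metering_history` (summing its `credits_used` column per warehouse) shows which warehouses are burning the most credits.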

For instance, you may find a query running 20 times a day and completing in roughly 10 seconds each run, for a total compute time of 200 seconds. This presents a significant issue because Snowflake’s minimum charge each time a warehouse resumes is 60 seconds, so those 20 runs can be billed as 1200 seconds (20 × 60). These are the kinds of issues you should look out for and deal with effectively, for example by batching such queries together. 

3) Loading Large Files: Split Them Before Running the COPY Command 

You may find yourself in a situation where you are loading large files from another system into Snowflake, and you may be tempted to load each one as a single huge file for faster loading. However, this is never a good idea. Instead, you should break each large file into smaller chunks and load them using the COPY command. Snowflake will then divide the load into parallel workloads and load several files simultaneously. Less compute time will be needed, meaning you will pay for fewer credits than you would loading one heavy file. 
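As an illustrative sketch (the table, stage, and file-name pattern below are placeholders), a single COPY over many small staged files lets Snowflake parallelize the load:

```sql
-- Load many pre-split, compressed chunks in one parallel COPY.
-- "sales", "@my_stage", and the pattern are placeholder names.
COPY INTO sales
  FROM @my_stage/sales/
  PATTERN = '.*part_[0-9]+[.]csv[.]gz'
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```

Snowflake’s documentation generally recommends compressed file chunks in the range of 100–250 MB for best load parallelism.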

4) Make Use of Resource Monitors

The monthly budget is usually broken down in most companies, and each department is allocated a specific amount. This means that the Snowflake Admin will have a particular amount they are supposed to spend monthly on Snowflake-related tasks (credits, for the most part). With this in mind, it is recommended to set up resource monitors to alert you when the monthly credit quota has been reached. In such scenarios, there are several follow-up measures that the Snowflake Admin can take, such as aborting all the running queries and suspending the Virtual Warehouse. 
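A hedged sketch of such a setup (the quota, thresholds, and names are example values, not recommendations) looks like this:

```sql
-- Monthly quota of 100 credits: warn at 75%, cut compute at 100%.
CREATE RESOURCE MONITOR monthly_quota
  WITH CREDIT_QUOTA = 100
       FREQUENCY = MONTHLY
       START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 75 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- Attach the monitor to a warehouse (placeholder name).
ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = monthly_quota;
```

`DO SUSPEND` lets running queries finish before suspending the warehouse; `DO SUSPEND_IMMEDIATE` aborts them outright, matching the stricter follow-up measure described above.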

5) Ensure That All Snowflake Users Are Educated on Warehouse Usage and Selection

As a Snowflake Admin, you must ensure that all the Snowflake users in your company adhere to best practices on the platform. Therefore, you should review the Query History over a predetermined time frame, note consumption and completion times, and follow up with users whose usage patterns waste credits. 

6) Check and Update the Default Query Timeout Value

Have you ever thought about what happens when a query runs for too long? Well, by default Snowflake only aborts a query after it has been running for 172,800 seconds, or two days. This means you could be charged for two full days of compute. Hence, it is always good to lower this default to a smaller value to cap runaway costs. 
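This limit is controlled by the `STATEMENT_TIMEOUT_IN_SECONDS` parameter; a one-hour cap is shown below purely as an example value (the warehouse name is a placeholder):

```sql
-- Abort any query on this warehouse after 1 hour instead of 2 days.
ALTER WAREHOUSE analytics_wh SET STATEMENT_TIMEOUT_IN_SECONDS = 3600;

-- Or apply the cap account-wide.
ALTER ACCOUNT SET STATEMENT_TIMEOUT_IN_SECONDS = 3600;
```

The parameter can also be set per session or per user, with the most specific setting taking precedence.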

7) Right-size Your Virtual Warehouses 

As mentioned earlier, compute usage heavily influences credit costs, and the smaller the warehouse, the lower the compute cost per hour. Therefore, it is always good to start with a smaller warehouse, run your queries on it, and only size up once you understand your actual computing needs. 
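In practice (warehouse name and settings below are illustrative placeholders), that means creating an X-Small warehouse first and resizing only if queries queue or run too slowly:

```sql
-- Start with the smallest size; one X-Small costs 1 credit/hour,
-- and each size step up roughly doubles the credit rate.
CREATE WAREHOUSE IF NOT EXISTS test_wh
  WITH WAREHOUSE_SIZE = 'XSMALL'
       AUTO_SUSPEND = 60
       AUTO_RESUME = TRUE;

-- Scale up only once the workload demonstrably needs it.
ALTER WAREHOUSE test_wh SET WAREHOUSE_SIZE = 'SMALL';
```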

8) Suspend Any Virtual Warehouses during Idle time

Virtual Warehouses are used to run queries in Snowflake and incur compute costs while they are running. Therefore, you should auto-suspend these warehouses during idle time; otherwise, you will incur unnecessary credit costs. The Snowflake web interface offers a minimum auto-suspend of 5 minutes, but you can use the following statement to reduce it to whichever value (in seconds) you want: 

alter warehouse <warehouse_name> set auto_suspend = <num_in_seconds>;

9) Check on Data Source Region and Cloud Service Provider Availability

Snowflake is a cloud-agnostic solution, meaning it supports all the major Cloud Service Providers with the same functionality. However, you will incur data transfer costs when moving data between Cloud Service Providers or regions. Therefore, you need to consider the provider and region where your data resides to avoid unnecessary transfer costs in the future. 

10) Zero-Copy Cloning

Cloning is an efficient strategy you can utilize when the same data is required for several purposes, such as development and testing. Because only metadata is created for the cloned table, you do not incur additional storage costs for the same data until the clone and the original start to diverge. 
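As a minimal sketch (table names are placeholders), a zero-copy clone is a single statement:

```sql
-- Zero-copy clone: only metadata is written at creation time; the
-- clone shares the original's storage until either table changes.
CREATE TABLE orders_dev CLONE orders;
```

Databases and schemas can be cloned the same way, which is a cheap way to give teams full development copies without duplicating storage.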

Conclusion 

This post is designed for Snowflake Admins looking for ingenious ways of reducing credit usage while maintaining the same level of functionality on the platform. By following the strategies laid out above, you can significantly reduce credit usage on Snowflake. Snowflake Admins should, of course, only apply these strategies in line with their company policies. In case you want to export data from a source of your choice into a desired destination like Snowflake, then Hevo Data is the right choice for you! 

Visit our Website to Explore Hevo

Hevo Data provides its users with a simpler platform for integrating data from 100+ sources for Analysis. It is a No-code Data Pipeline that can help you combine data from multiple sources. You can use it to transfer data from multiple data sources into Data Warehouses such as Snowflake, Databases, or a destination of your choice. It provides you with a consistent and reliable solution for managing data in real time, ensuring that you always have Analysis-ready data in your desired destination.

Want to take Hevo for a ride? Sign Up for a 14-day free trial and simplify your Data Integration process. Do check out the pricing details to understand which plan fulfills all your business needs.

Share your experience of learning about Snowflake Admin Credit Usage! Let us know in the comments section below!
