Snowflake CAST & TRY_CAST Commands 101: Syntax & Usage Simplified

By: Sanchit Agarwal | Published: January 4, 2022


When working with an On-premise Data Storage solution, it is not always feasible to scale your Storage and Computation resources instantly without incurring hefty costs. An economical and faster alternative is to opt for Cloud Data Warehousing and Analytics Solutions such as Snowflake. Built especially for the Cloud, Snowflake can scale Storage and Computational resources independently on the go. Snowflake also makes life easy for Data Analysts by providing complete ANSI SQL support for Data Blending, Analysis, and Transformations.

To analyse your data efficiently, one of the important tasks is to convert columns/values from their default data types to the correct data types. You can use the Snowflake CAST and TRY_CAST commands to perform the required data type conversions.

In this article, you will learn how to effectively use the Snowflake CAST and TRY_CAST commands to convert a value of one data type into another data type.


What is Snowflake?


Snowflake is a one-stop Cloud SaaS (Software as a Service) platform that can act as a Data Warehouse, Data Lake as well as perform activities associated with Data Science, Data Analytics, Data Sharing, Data Engineering, etc. Owing to its architecture, you can scale the storage capacity and the Computation resources independently. Operating as a True SaaS Data Platform, you don’t need to install or configure any software and all the maintenance tasks & updates are managed completely by Snowflake.

Powered by a robust SQL Query Engine, Snowflake lets Data Analysts process complex Queries much faster than any traditional Data Analytics solution. Snowflake is also cost-effective, as you only pay for the resources you use at any given moment, and it automatically scales according to the workload. The Data Sharing capabilities of Snowflake also allow you to securely share specific data within your organisation as well as with external parties such as vendors & clients.

Key Features of Snowflake

Founded in 2012, Snowflake has evolved into a Data Warehousing & Analytics giant that offers the following eye-catching features:

  • Semi-Structured Data Support: Today, data is generated in several formats from sensors, mobile devices, etc. Snowflake can store files in JSON, Avro, ORC, Parquet, and XML formats. Snowflake also supports a schema-on-read data type called VARIANT (size limit: 16 MB) that can store both Structured and Semi-Structured Data.
  • Zero-Copy Cloning: Cloning is an important task when testing changes, performing ad hoc analysis, or building dev/test environments from a Production Database. Snowflake’s Zero-Copy feature allows you to instantly clone a database, schema, or table (see the sketch after this list). The advantage is that it doesn’t create a new physical copy: the changes made to the clone are tracked in the metadata store, thereby eliminating any additional storage costs.
  • Data Recovery: It often happens that tables are dropped by mistake. Snowflake allows you to instantly recover such a table using the UNDROP command, as long as you remain within the recovery window (also shown in the sketch after this list). Snowflake also retains the historical data of your file changes via the Snowflake Time Travel feature. Using it, you can easily recover data that has been affected by data deletion, data modification, or software or hardware failure.
  • Best-in-Class Query performance: Snowflake uses a Columnar Data Store structure making it more efficient to retrieve small portions of data from the tables. It also provides better performance for repeated queries by fetching results stored in Results Cache. Snowflake automatically executes Micro-Partitioning and Data Clustering that provides an optimized table structure.
  • Data Sharing: Saving the additional storage costs, Snowflake allows you to share data across your organisation without creating a separate copy. It also fosters collaboration by allowing you to invite members to work on the same data collectively. For users outside Snowflake, you can provide them access to your data via Reader Accounts.
  • Data Security and Governance: Snowflake supports a variety of authentication methods such as Multi-factor authentication (MFA), Federated authentication & single sign-on (SSO), and OAuth. You can also set Column & Row-level access permissions. Your data is automatically encrypted via AES-256 Encryption, both in transit and at rest.
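As a quick illustration of the cloning and recovery features above, here is a minimal sketch with hypothetical object names (orders, dev_orders):

create table dev_orders clone orders;   -- Zero-Copy Clone for a dev/test environment
drop table orders;                      -- table dropped by mistake
undrop table orders;                    -- instantly restored while within the recovery window
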
Supercharge Snowflake ETL and Analysis Using Hevo’s No-code Data Pipeline

Hevo Data is a No-code Data Pipeline that offers a fully managed solution to set up Data Integration for 100+ Data Sources (Including 40+ Free sources) and will let you directly load data to a Data Warehouse such as Snowflake or the Destination of your choice. It will automate your data flow in minutes without writing any line of code. Its fault-tolerant architecture makes sure that your data is secure and consistent. Hevo provides you with a truly efficient and fully automated solution to manage data in real-time and always have analysis-ready data. 

Get Started with Hevo for Free

Let’s look at some of the salient features of Hevo:

  • Fully Managed: It requires no management and maintenance as Hevo is a fully automated platform.
  • Data Transformation: It provides a simple interface to perfect, modify, and enrich the data you want to transfer. 
  • Real-Time: Hevo offers real-time data migration. So, your data is always ready for analysis.
  • Schema Management: Hevo can automatically detect the schema of the incoming data and map it to the destination schema.
  • Connectors: Hevo supports 100+ Integrations to SaaS platforms such as WordPress, FTP/SFTP, Files, Databases, BI tools, and Native REST API & Webhooks Connectors. It supports various destinations, including Data Warehouses such as Google BigQuery, Amazon Redshift, Snowflake, and Firebolt; Amazon S3 Data Lakes; Databricks; and Databases such as MySQL, SQL Server, TokuDB, DynamoDB, and PostgreSQL, to name a few.
  • Secure: Hevo has a fault-tolerant architecture that ensures that the data is handled in a secure, consistent manner with zero data loss.
  • Hevo Is Built To Scale: As the number of sources and the volume of your data grows, Hevo scales horizontally, handling millions of records per minute with very little latency.
  • Live Monitoring: Advanced monitoring gives you a one-stop view to watch all the activities that occur within Data Pipelines.
  • Live Support: Hevo team is available round the clock to extend exceptional support to its customers through chat, email, and support calls.
Sign up here for a 14-Day Free Trial!

How to use the Snowflake CAST command?

Snowflake CAST is a data-type conversion command. Snowflake CAST works similarly to the TO_<data_type> conversion functions (such as TO_NUMBER and TO_TIMESTAMP). If a particular data type conversion is not possible, it raises an error. Let’s understand the Snowflake CAST command in detail via the syntax and a few examples.

A) Snowflake CAST Syntax

CAST( <source_expr> AS <target_data_type> )

As an alternative to CAST, you can use the :: operator for the same functionality.

<source_expr> :: <target_data_type>
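As a quick illustration (not one of the original examples), the following statements are equivalent ways of converting a string to a number; all three return 12.30:

select cast('12.3' as number(10,2));   -- CAST syntax
select '12.3'::number(10,2);           -- :: operator
select to_number('12.3', 10, 2);       -- equivalent TO_ conversion function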

The Snowflake CAST command has the following 2 arguments:

  • source_expr: This is the expression of a supported data type that you need to convert.
  • target_data_type: This contains the desired data type to which you want to convert the expression. You can also provide the Precision/Scale properties for numbers/decimals. Here, Scale refers to the number of digits after the decimal point, and Precision refers to the total number of digits in the number (including the digits after the decimal point). See the short illustration after this list.
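For instance, in decimal(6,2), the precision is 6 (total digits) and the scale is 2 (digits after the decimal point). A minimal sketch:

select cast('1234.5' as decimal(6,2));

Output:

1234.50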

B) Snowflake CAST Examples

Let’s observe the following use cases and the outputs to understand the working of the Snowflake CAST command:

  • Using the Snowflake CAST command to convert a string consisting of a number to a decimal with a scale of 2:
select cast('1.6845' as decimal(6,2));

Output:

1.68

Here, since a scale of 2 cannot hold all four fractional digits of the input expression, the Snowflake CAST function rounds the value.

  • Using the :: notation to convert a string to a decimal with a scale of 5:
select '1.6845'::decimal(6,5);

Output:

1.68450
  • Using the Snowflake CAST command to convert a number to an integer:
select cast(4.78 as integer);

Output:

5

Here, Snowflake rounds the number to the nearest integer rather than truncating it.
  • Using the Snowflake CAST command to convert a string containing a Date to a Timestamp:
select cast('10-Sep-2021' as timestamp);

Output:

2021-09-10 00:00:00.000
  • When the provided precision is insufficient to hold the input value, the Snowflake CAST command raises an error. For example, if the precision is set to 4 but the input value has a total of 5 digits, the conversion fails.
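The original screenshot of the error is not recoverable, so here is a hypothetical query that reproduces the situation:

select cast('1.6845' as decimal(4,4));

This raises an error along the lines of Numeric value '1.6845' is out of range, since the value needs a precision of 5 (1 digit before the decimal point plus 4 after it), but only 4 is provided.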

How to use the Snowflake TRY_CAST command?

Snowflake TRY_CAST is a special case of the Snowflake CAST command. It also converts a value of one data type into another; however, it returns NULL instead of raising an error when the conversion is not possible. Let’s understand the Snowflake TRY_CAST command in detail via the syntax and a few examples.

A) Snowflake TRY_CAST Syntax

TRY_CAST( <source_string_expr> AS <target_data_type> )

Both arguments are similar to those of the Snowflake CAST function, except that the TRY_CAST command only accepts a string as the input value. For the target_data_type argument, it supports VARCHAR (or any of its synonyms), NUMBER (or any of its synonyms), DOUBLE, BOOLEAN, DATE, TIME, TIMESTAMP, TIMESTAMP_LTZ, TIMESTAMP_NTZ, or TIMESTAMP_TZ.
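Because the source expression must be a string, passing a non-string value causes an error at compile time. A minimal sketch:

select try_cast('4.78' as number(6,2));  -- works: returns 4.78
select try_cast(4.78 as number(6,2));    -- fails: the source expression is not a string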

B) Snowflake TRY_CAST Examples

Let’s observe the following use cases and the outputs to understand the working of the Snowflake TRY_CAST command:

  • Using the Snowflake TRY_CAST command to convert a string containing a date to a timestamp:
select try_cast('05-Mar-2016' as timestamp);

Output: 

2016-03-05 00:00:00.000
  • The Snowflake TRY_CAST command returns NULL when the input value has more characters than the length provided in the target data type:
select try_cast('ABCD' as char(2));

Output:

NULL
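
In practice, TRY_CAST is handy for cleaning user-entered or semi-structured data. A sketch, assuming a hypothetical staging table raw_orders with a VARCHAR column amount_text:

-- Keep only the rows whose amount converts cleanly to a number;
-- rows with unparseable values yield NULL and are filtered out.
select order_id,
       try_cast(amount_text as number(10,2)) as amount
from raw_orders
where try_cast(amount_text as number(10,2)) is not null;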

Conclusion

In this article, you have learned how to effectively use the Snowflake CAST and TRY_CAST commands to perform data type conversions. The Snowflake CAST function allows you to set precision and scale parameters for converting the data type and the value according to your business logic. When the data conversion is not successful, Snowflake CAST returns an error message. To remedy that, you can use the Snowflake TRY_CAST function, which returns a NULL value whenever the data type conversion is not possible.

As your business begins to grow, data is generated at an alarming rate across all of your company’s SaaS applications. To meet these growing storage and computing needs, you would need to invest a portion of your engineering bandwidth to Integrate data from all sources, Clean & Transform it, and finally load it to a Cloud Data Warehouse such as Snowflake for further business analytics. All of these challenges can be efficiently handled by a Cloud-Based ETL tool such as Hevo Data.

Visit our Website to Explore Hevo

Hevo Data, a No-code Data Pipeline can seamlessly transfer data from a vast sea of 100+ sources to a Data Warehouse like Snowflake or a Destination of your choice to be visualised in a BI Tool. It is a reliable, completely automated, and secure service that doesn’t require you to write any code!  

If you are using Snowflake as a Data Warehousing and Analytics Platform and searching for a no-fuss alternative to Manual Data Integration, then Hevo can effortlessly automate this for you. Hevo, with its strong integration with 100+ sources & BI tools, allows you to not only export & load data but also transform & enrich your data & make it analysis-ready in a jiffy.

Want to take Hevo for a ride? Sign Up for a 14-day free trial and simplify your Data Integration process. Do check out the pricing details to understand which plan fulfills all your business needs.

Tell us about your experience of working with the Snowflake CAST and TRY_CAST commands! Share your thoughts with us in the comments section below.

Sanchit Agarwal
Former Research Analyst, Hevo Data

Sanchit Agarwal is a data analyst at heart with a passion for data, software architecture, and writing technical content. He has experience writing more than 200 articles on data integration and infrastructure. He finds joy in breaking down complex concepts in simple and easy language, especially those related to database migration techniques and challenges in data replication.
