Snowflake Create Role Command: Syntax, Usage & Examples Simplified 101

Manisha Jena • Last Modified: December 29th, 2022


The world is becoming increasingly computerized. Soon, every part of our lives will be connected to the Internet, providing consumers with more conveniences such as instant access to information. While this is a positive development, the sheer volume of data generated as a result of digitalization is staggering. Snowflake is one such Cloud Data Warehouse that helps users manage this colossal volume of data.

You will often need to create a new role or replace an existing role in the system, and then grant object privileges to that role. The Snowflake CREATE ROLE command is used for this purpose.

In this article, you will learn about the Snowflake CREATE ROLE command. You will also gain a holistic understanding of Snowflake, its key features, SQL for Snowflake, the types of SQL commands, DDL commands, User & Security DDL commands, Role Management, and the usage of the Snowflake CREATE ROLE command in detail.


What is Snowflake?


Snowflake is a Data Warehouse-as-a-Service platform built for the cloud. Its architecture differs from that of traditional warehouses such as Amazon Redshift in that it separates storage from compute: data is held in scalable, elastic cloud object storage managed by Snowflake, while independently sized virtual warehouses provide the compute for querying and loading structured and semi-structured data.

The Snowflake Data Warehouse secures the cloud storage backing your account, whether through Amazon S3 policy controls, Azure SAS tokens, or Google Cloud Storage access permissions, and supports SSO for user authentication. You can also scale your storage independently as your needs grow.

The key benefits of leveraging Snowflake are as follows:

  • Given the elastic nature of the cloud, you can scale up your virtual warehouse to take advantage of extra compute resources to, say, run a high volume of queries or load data faster.
  • With Snowflake, you can combine semi-structured and structured data for analysis and load it into the database without needing to transform or convert it into a fixed relational schema beforehand.
  • Snowflake has a multi-cluster architecture that takes care of concurrency issues like failures and delays.
  • Snowflake’s architecture enables companies to leverage it to seamlessly share data with any data consumer.

Key Features of Snowflake


Here are some of the benefits of using Snowflake as a Software as a Service (SaaS) solution:

  • Snowflake enables you to enhance your Analytics Pipeline by transitioning from nightly batch loads to real-time data streams, allowing you to improve the quality and speed of your analytics. By allowing safe, concurrent, and controlled access to your Data Warehouse across your organization, you can improve the quality of analytics at your company.
  • Snowflake uses a caching paradigm to swiftly deliver results from the cache. To avoid re-generating a report when nothing has changed, Snowflake employs persisted query results.
  • Snowflake allows you to break down data silos and provide access to meaningful insights across the enterprise, resulting in better data-driven decision-making. This is a crucial first step toward bettering partner relationships, optimizing pricing, lowering operational expenses, increasing sales effectiveness, and more.
  • Snowflake allows you to better understand user behavior and product usage. You can also use the whole scope of data to ensure customer satisfaction, drastically improve product offers, and foster Data Science innovation.
  • Snowflake allows you to create your own Data Exchange, which allows you to securely communicate live, controlled data. It also encourages you to improve data relationships throughout your business units, as well as with your partners and customers.
  • Secure Data Lake: You can use a secure Data Lake to store all compliance and cybersecurity data in one place. Snowflake Data Lakes ensure quick incident response times. This allows you to understand the complete picture of an incident by clubbing high-volume log data in a single location, and efficiently analyzing years of log data in seconds.

Simplify Snowflake ETL and Analysis with Hevo’s No-code Data Pipeline

A fully managed No-code Data Pipeline platform like Hevo Data helps you integrate and load data from 100+ different sources (including 40+ free sources) to a Data Warehouse such as Snowflake or a Destination of your choice in real-time in an effortless manner. Hevo, with its minimal learning curve, can be set up in just a few minutes, allowing users to load data without having to compromise performance. Its strong integration with numerous sources allows users to bring in data of different kinds in a smooth fashion without having to write a single line of code.

Get Started with Hevo for Free

Check out some of the cool features of Hevo:

  • Completely Automated: The Hevo platform can be set up in just a few minutes and requires minimal maintenance.
  • Transformations: Hevo provides preload transformations through Python code. It also allows you to run transformation code for each event in the Data Pipelines you set up. You need to edit the event object’s properties received in the transform method as a parameter to carry out the transformation. Hevo also offers drag and drop transformations like Date and Control Functions, JSON, and Event Manipulation to name a few. These can be configured and tested before putting them to use.
  • Connectors: Hevo supports 100+ integrations to SaaS platforms, files, Databases, analytics, and BI tools. It supports various destinations including Google BigQuery, Amazon Redshift, Snowflake Data Warehouses; Amazon S3 Data Lakes; and MySQL, SQL Server, TokuDB, DynamoDB, PostgreSQL Databases to name a few.  
  • Real-Time Data Transfer: Hevo provides real-time data migration, so you can have analysis-ready data always.
  • 100% Complete & Accurate Data Transfer: Hevo’s robust infrastructure ensures reliable data transfer with zero data loss.
  • Scalable Infrastructure: Hevo has in-built integrations for 100+ sources (including 40+ free sources) that can help you scale your data infrastructure as required.
  • 24/7 Live Support: The Hevo team is available round the clock to extend exceptional support to you through chat, email, and support calls.
  • Schema Management: Hevo takes away the tedious task of schema management & automatically detects the schema of incoming data and maps it to the destination schema.
  • Live Monitoring: Hevo allows you to monitor the data flow so you can check where your data is at a particular point in time.
Sign up here for a 14-Day Free Trial!

What is SQL for Snowflake?


Snowflake’s Data Cloud platform has a data warehouse workload that supports the most widely used standardized version of SQL (ANSI) for comprehensive relational database querying. It can also combine semi-structured data, such as JSON, with structured, relational data. Snowflake makes JSON data more accessible and allows users to integrate it with structured data.

Snowflake allows users to access JSON data with SQL queries and seamlessly join it to traditional tabular data. This innovative querying strategy enables users to store JSON documents in a relational database using a new data type (VARIANT) that is automatically optimized in the background for columnar and MPP access.
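As a hedged sketch of this approach (the table and JSON field names are hypothetical), a VARIANT column can be queried with ordinary SQL:

```sql
-- A table with a single VARIANT column holding raw JSON documents.
CREATE TABLE raw_events (payload VARIANT);

-- Path notation (payload:field) extracts JSON attributes;
-- ::string casts the semi-structured value to a SQL type.
SELECT
    payload:user_id::string    AS user_id,
    payload:event_type::string AS event_type
FROM raw_events
WHERE payload:event_type::string = 'login';
```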

The different types of SQL Commands supported by Snowflake are:

  • DDL (Data Definition Language) Commands
  • DML (Data Manipulation Language) Commands
  • Query Syntax & Operators
  • TCL (Transaction Control Language) Commands

What are DDL Commands?

DDL stands for Data Definition Language. Snowflake objects including users, virtual warehouses, databases, schemas, tables, views, columns, functions, and stored procedures are created, manipulated, and modified using DDL commands.

They are also used to conduct several account-level and session-level operations, such as parameter setting, variable initialization, and transaction initiation.

The following commands serve as the base for all DDL commands:

  • ALTER <object>
  • CREATE <object>
  • DESCRIBE <object>
  • DROP <object>
  • SHOW <objects>
  • USE <object>

Each of the above DDL commands requires an object type and an identifier. The remaining parameters and options available for the command are object-specific.
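As an illustration of this shared pattern, here is each base command applied to a hypothetical database object:

```sql
CREATE DATABASE my_db;                          -- object type + identifier
ALTER DATABASE my_db SET COMMENT = 'Demo DB';   -- remaining options are object-specific
DESCRIBE DATABASE my_db;
SHOW DATABASES;
USE DATABASE my_db;
DROP DATABASE my_db;
```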

The different categories of DDL Commands are as follows:

  • Account & Session DDL Commands
  • User & Security DDL Commands
  • Warehouse & Resource Monitor DDL Commands
  • Database, Schema, & Share DDL Commands
  • Table, View, & Sequence DDL Commands
  • Data Loading / Unloading DDL Commands
  • DDL Commands for User-Defined Functions, External Functions, and Stored Procedures
  • Data Pipeline DDL Commands

What are User & Security DDL Commands?

Snowflake has a comprehensive collection of SQL commands for managing users and security. These commands can only be used by users who have been granted roles with the OWNERSHIP privilege on the managed object. Generally, only the ACCOUNTADMIN and SECURITYADMIN roles carry this privilege.

However, individual users can perform the following tasks for themselves:

  • Change their password (only through the web interface).
  • View their user information (via DESCRIBE USER).
  • Change their default role, virtual warehouse, or namespace (via ALTER USER).
  • Change their session parameters (via ALTER SESSION).
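For illustration, the self-service tasks above map to commands like the following (the user, role, and warehouse names are hypothetical):

```sql
-- View your own user information.
DESCRIBE USER jane;

-- Change your own default role, warehouse, or namespace.
ALTER USER jane SET DEFAULT_ROLE = analyst, DEFAULT_WAREHOUSE = my_wh;

-- Change a parameter for the current session only.
ALTER SESSION SET TIMEZONE = 'UTC';
```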

The different types of User & Security DDL Commands fall under the following categories:

  • User Management
  • Role Management
  • Object Tagging Management
  • Access Control Management
  • Network Policy Management
  • Session Policy Management
  • Third-Party Integrations
    • API Integrations
    • Notification Integrations
    • Security Integrations
    • Storage Integrations

What is Role Management?

Snowflake uses roles to control access to objects in the system:

  • Roles are granted access privileges to system objects (databases, tables, etc.).
  • Users are granted roles in order for them to be able to create, modify, and utilize the objects for which the roles have privileges.
  • Roles can be granted to other roles to aid in the definition of hierarchical access privileges.
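The three points above can be sketched with hypothetical object, role, and user names:

```sql
-- 1. Grant access privileges on system objects to a role.
GRANT USAGE ON DATABASE sales_db TO ROLE analyst;
GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.public TO ROLE analyst;

-- 2. Grant the role to a user so they can use those privileges.
GRANT ROLE analyst TO USER jane;

-- 3. Grant the role to another role to build a hierarchy:
--    manager now inherits all of analyst's privileges.
GRANT ROLE analyst TO ROLE manager;
```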

The following DDL commands can be used to manage roles in the system:

  • CREATE ROLE Command
  • ALTER ROLE Command
  • DROP ROLE Command
  • SHOW ROLES Command
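As a brief sketch of how these four commands fit together over a role's lifecycle (the role names are hypothetical):

```sql
-- Create a role, rename it, inspect existing roles, and remove it.
CREATE ROLE reporting_role COMMENT = 'Read-only reporting access';
ALTER ROLE reporting_role RENAME TO bi_role;
SHOW ROLES;          -- lists the roles visible to you
DROP ROLE bi_role;
```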

What is Snowflake CREATE ROLE Command?

The Snowflake Create Role Command is used to create a new role or to replace an existing role.

After creating roles using the Snowflake Create Role Command, you can grant the role object privileges and then grant the role to other roles or specific users to provide access control security for system objects.

Note: Only users with the USERADMIN role or higher, or another role that has been granted the CREATE ROLE privilege on the account, can create roles.

The following categories explain the Snowflake CREATE ROLE Command in detail:

A) Syntax

The syntax for the Snowflake Create Role Command is as follows:

CREATE [ OR REPLACE ] ROLE [ IF NOT EXISTS ] <name> [ [ WITH ] TAG ( <tag_name> = '<tag_value>' [ , <tag_name> = '<tag_value>' , ... ] ) ] [ COMMENT = '<string_literal>' ]

B) Required Parameters

  • <name>

The “name” parameter serves as the role’s identifier. It must be unique within your account.

Unless the entire identifier string is enclosed in double quotes (e.g., "My object"), the identifier must begin with an alphabetic character and cannot contain spaces or special characters. Identifiers enclosed in double quotes are also case-sensitive.

For more details on the required parameters, you can see the Identifier Requirements.
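For illustration, the two identifier styles behave differently (the role names are examples only):

```sql
-- Unquoted: case-insensitive, stored and resolved as uppercase.
CREATE ROLE myrole;        -- stored as MYROLE

-- Double-quoted: case-sensitive, may contain spaces;
-- the quotes are required on every subsequent reference.
CREATE ROLE "My Role";
DROP ROLE "My Role";
```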

C) Optional Parameters

The optional parameters of Snowflake Create Role Command are as follows:

  • TAG ( tag_name = 'tag_value' [ , tag_name = 'tag_value' , ... ] )

It specifies the tag name (i.e., the key) and the tag value. The tag value is always a string and can have a maximum of 256 characters. A maximum of 20 unique tag keys can be assigned to an object.

  • COMMENT = 'string_literal'

It specifies a comment for the role.

Default: No value
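Putting both optional parameters together, here is a hedged sketch (the tag and role names are hypothetical; note that a tag object must be created before it can be referenced):

```sql
-- The tag itself is a schema-level object and must exist first.
CREATE TAG cost_center;

CREATE ROLE finance_analyst
  WITH TAG (cost_center = 'finance')
  COMMENT = 'Read-only access for the finance team';
```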

D) Example

The example which showcases the usage of the Snowflake Create Role Command is as follows:

create role myrole;
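In practice, creating the role is usually followed by granting it privileges and assigning it to a user. A minimal sketch with hypothetical warehouse, database, and user names:

```sql
create role myrole;
grant usage on warehouse my_wh to role myrole;
grant usage on database my_db to role myrole;
grant role myrole to user jane;
```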

For further information on the Snowflake CREATE ROLE command, you can refer to the official Snowflake documentation.


In this article, you have learned about the Snowflake CREATE ROLE command. This article also provided information on Snowflake, its key features, SQL for Snowflake, the types of SQL commands, DDL commands, User & Security DDL commands, Role Management, and the usage of the Snowflake CREATE ROLE command in detail. You can also explore our articles on the Snowflake Primary Key Constraint, the Cast & Try_cast commands, and the COPY command.

Hevo Data, a No-code Data Pipeline provides you with a consistent and reliable solution to manage data transfer between a variety of sources and a wide variety of Desired Destinations with a few clicks.

Visit our Website to Explore Hevo

Hevo Data with its strong integration with 100+ data sources (including 40+ Free Sources) allows you to not only export data from your desired data sources & load it to the destination of your choice but also transform & enrich your data to make it analysis-ready. Hevo also allows integrating data from non-native sources using Hevo’s in-built Webhooks Connector. You can then focus on your key business needs and perform insightful analysis using BI tools. 

Want to give Hevo a try?

Sign Up for a 14-day free trial and experience the feature-rich Hevo suite firsthand. You may also have a look at our pricing, which will assist you in selecting the best plan for your requirements.

Share your experience of understanding Snowflake Create Role Command in the comment section below! We would love to hear your thoughts.
