The Snowflake COPY command is a simple yet powerful command that loads data into Snowflake tables from sources such as cloud storage or local files. Whether you are new to Snowflake or already know the ropes, this guide walks you through the essential aspects of the COPY command with practical examples.

In this article, you will learn why you might need to copy Snowflake data from one table or location to another, the COPY command's syntax and required parameters, and a few practical examples.

Introduction to Snowflake

Snowflake is a fully managed data warehouse that runs entirely on public cloud infrastructure; it cannot be installed or hosted on a private cloud or on-premises hardware.

Snowflake uses a hybrid architecture that combines elements of traditional shared-disk and shared-nothing database architectures.

Key features of Snowflake

  • Cloud-Based Platform: Operates as a SaaS model, eliminating the need for hardware setup and maintenance.
  • Centralized and Scalable Storage: Provides a central repository for seamless data access and scales effortlessly based on data volume.
  • High Performance: Utilizes MPP (Massively Parallel Processing) for efficient query processing, with each cluster storing part of the data locally.
  • Advanced SQL Support: Offers ANSI SQL and advanced functionalities like merge, lateral view, and statistical functions.
  • Enhanced Insights and Collaboration: Breaks down data silos, enabling better decision-making, improved partner relationships, and actionable insights.
  • Custom Data Sharing: Facilitates secure, governed data sharing to build better customer and partner relationships.
  • Robust Security: Ensures secure data storage and fast incident response, along with governed handling of both structured and semi-structured data.
Build your Data Pipeline to Connect Snowflake in just a few clicks! 

Looking for the best ETL tools to connect your Snowflake account? Rest assured, Hevo’s no-code platform seamlessly integrates with Snowflake streamlining your ETL process. Try Hevo and equip your team to: 

  1. Integrate data from 150+ sources (60+ free sources).
  2. Simplify data mapping with an intuitive, user-friendly interface.
  3. Instantly load and sync your transformed data into Snowflake.

Choose Hevo and see why Deliverr says: “The combination of Hevo and Snowflake has worked best for us.”

Get Started with Hevo for Free

What is Snowflake COPY?

The Snowflake COPY command allows you to load data from staged files on internal/external locations to an existing table or vice versa.

Snowflake offers two types of COPY commands:

  1. COPY INTO <location>: This command copies (unloads) data from an existing table to one of the following locations:
    • A named internal stage (or a table/user stage).
    • A named external stage that references Amazon S3, Google Cloud Storage, or Microsoft Azure.
    • An external location, i.e., a URL pointing directly to Amazon S3, Google Cloud Storage, or Microsoft Azure.
  2. COPY INTO <table>: This command copies (loads) data from staged files into an existing table. The staged files can be present at:
    • A named internal stage (or a table/user stage).
    • A named external stage that references Amazon S3, Google Cloud Storage, or Microsoft Azure.
    • An external location, i.e., a URL pointing directly to Amazon S3, Google Cloud Storage, or Microsoft Azure.
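As a quick sketch of the unload direction (COPY INTO <location>), the following writes a table's contents out to a named internal stage; the stage name @my_unload_stage and table name sales are illustrative:

```sql
-- Unload the sales table to a named internal stage as gzipped CSV.
-- @my_unload_stage is a hypothetical stage name.
COPY INTO @my_unload_stage/sales_export/
FROM sales
FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP')
OVERWRITE = TRUE;
```

The rest of this article focuses on the more common load direction, COPY INTO <table>.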

Why Do You Need Snowflake COPY Command?

In analytics, you often need to copy data from one table or location to another to share, collaborate, and perform analysis within a team. The COPY command helps on three fronts:

  • Efficiency: COPY is optimized for high-speed data ingestion.
  • Flexibility: It supports multiple file formats and customization options.
  • Error Handling: You can control error thresholds and log problematic rows for review.

Learn about how Snowflake differs from platforms like Redshift and BigQuery through our detailed blog.

COPY Command Usage

Prerequisites for Using the COPY Command

Before diving into the COPY command, ensure the following:

  • Permissions: You have the necessary privileges for the target table and stage.
  • Stage Creation: You have an external or Snowflake stage set up.
  • File Accessibility: Your data files are accessible to Snowflake.
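To illustrate the stage-creation prerequisite, the following sketch creates a reusable CSV file format and a named internal stage, then stages a local file; the names my_csv_format and my_stage are illustrative:

```sql
-- my_csv_format and my_stage are hypothetical names.
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = 'CSV'
  SKIP_HEADER = 1
  FIELD_OPTIONALLY_ENCLOSED_BY = '"';

CREATE OR REPLACE STAGE my_stage
  FILE_FORMAT = my_csv_format;

-- Upload a local file into the stage (run from the SnowSQL CLI):
-- PUT file:///tmp/sales_data.csv @my_stage;
```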

You can use the COPY command by using the following syntax:

COPY INTO [<namespace>.]<table_name>
FROM { internalStage | externalStage | externalLocation }
[ FILES = ( '<file_name>' [ , '<file_name>' ] [ , ... ] ) ]
[ PATTERN = '<regex_pattern>' ]
[ FILE_FORMAT = ( { FORMAT_NAME = '[<namespace>.]<file_format_name>' |
TYPE = { CSV | JSON | AVRO | ORC | PARQUET | XML } [ formatTypeOptions ] } ) ]
[ copyOptions ]
[ VALIDATION_MODE = RETURN_<n>_ROWS | RETURN_ERRORS | RETURN_ALL_ERRORS ]

Required Parameters  

  • [<namespace>].<table_name>: Specifies the name of the target table into which data is loaded.
  • Namespace is the database and/or schema of the table, in the form <database_name>.<schema_name> or <schema_name>. If a database and schema are already in use for the session, this part is optional.
  • FROM: This specifies the internal or external location that contains the data file to load into the table.
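For example, a fully qualified target table combines the namespace and table name as follows; the names my_db, public, and my_stage are illustrative:

```sql
-- my_db, public, and my_stage are hypothetical names.
COPY INTO my_db.public.sales
FROM @my_stage;
```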

Optional Parameters

  • FILES: This specifies the list of file names to load into the table. The file must be staged in either Snowflake’s internal or external location. 
  • PATTERN: A regular expression pattern to match the specific file names.
  • FILE_FORMAT: Specifies the data file format (CSV, JSON, AVRO, ORC, PARQUET, etc.).
  • VALIDATION_MODE: This instructs the command to validate the data files instead of loading them into the target table, letting you perform a dry run that surfaces errors before any data is committed.
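A dry run with VALIDATION_MODE can be sketched like this; the table and stage names are illustrative:

```sql
-- Validate staged files without loading any rows;
-- the command returns any parse errors it finds.
COPY INTO sales
FROM @my_stage
FILE_FORMAT = (TYPE = 'CSV')
VALIDATION_MODE = RETURN_ERRORS;
```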

Examples of Using the COPY Command

Let’s explore how to use the COPY command with examples.

1. Loading Data from Amazon S3

Suppose you have a CSV file named sales_data.csv in an S3 bucket. Here’s how you can load it into a Snowflake table named sales:

COPY INTO sales
FROM 's3://my-bucket/data/'
FILES = ('sales_data.csv')
CREDENTIALS = (AWS_KEY_ID='your_key_id' AWS_SECRET_KEY='your_secret_key')
FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY='"');

2. Loading Data from a Snowflake Internal Stage

If your file is stored in a Snowflake stage named @my_stage:

COPY INTO sales
FROM @my_stage
FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY='"')
ON_ERROR = 'CONTINUE';

3. Loading JSON Data with Auto-Parsing

For JSON files, you can use MATCH_BY_COLUMN_NAME to automatically map JSON keys to matching table columns:

COPY INTO sales
FROM @my_stage
FILE_FORMAT = (TYPE = 'JSON')
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;

4. Loading Files Using Patterns

To load every gzipped CSV file under a stage path, combine PATTERN with a stage reference (here @my_stage):

COPY INTO mytable
FROM @my_stage
FILE_FORMAT = (TYPE = 'CSV')
PATTERN = '.*/.*/.*[.]csv[.]gz';

Some Typical Errors and How to Fix Them

1. File Not Found

Cause: An incorrect stage or file path.

Fix: Check the file path, and make sure the file exists at the specified stage.

2. File Format Mismatch

Cause: The file format described in FILE_FORMAT does not match the format of the actual data file.

Fix: Double-check the FILE_FORMAT definition against the file's delimiter, encoding, and compression.

3. Permission Denied

Cause: Insufficient access or permission for the stage or table.

Fix: Make sure you have READ access on the stage and INSERT privilege on the table. 
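A minimal sketch of the required grants, assuming an internal stage my_stage, a target table sales, and a role loader_role (all names illustrative):

```sql
-- loader_role, my_db, my_stage, and sales are hypothetical names.
GRANT USAGE ON DATABASE my_db TO ROLE loader_role;
GRANT USAGE ON SCHEMA my_db.public TO ROLE loader_role;
GRANT READ ON STAGE my_db.public.my_stage TO ROLE loader_role;
GRANT INSERT ON TABLE my_db.public.sales TO ROLE loader_role;
```

For an external stage, grant USAGE on the stage instead of READ.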

Best Practices for Using the COPY Command

  • Use External Stages: For large datasets, external stages like Amazon S3 or Azure Blob Storage are more scalable options.
  • Leverage File Compression: Compressed files (e.g., GZIP) will reduce load time and cost.
  • Enable Data Validation: Use ON_ERROR and logging options to catch issues early.
  • Partition Your Data: Split large datasets into smaller files for better parallel processing.
  • Monitor Performance: Use Snowflake’s query history and logs to identify bottlenecks. 
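For the monitoring tip, one way to inspect recent loads is the COPY_HISTORY table function; the table name sales is illustrative:

```sql
-- Review COPY activity on the sales table over the last hour.
SELECT file_name, status, row_count, first_error_message
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
  TABLE_NAME => 'SALES',
  START_TIME => DATEADD('hour', -1, CURRENT_TIMESTAMP())
));
```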

Conclusion

The Snowflake COPY command is a simple and effective command that streamlines data-loading tasks. By understanding its syntax, options, and best practices, you can optimize your data pipeline and achieve seamless data ingestion into your Snowflake environment. Try these commands yourself and experiment to find what suits your workload best.

For a simpler and faster method to load data into Snowflake, Sign Up for a 14-day free trial and experience the feature-rich Hevo suite firsthand.

FAQs

1. What is the COPY command in Snowflake?

The COPY command in Snowflake is used to load data from external storage (e.g., S3, Azure Blob) into Snowflake tables or unload data from tables to external storage.

2. How do you copy Snowflake results?

Use the COPY INTO command to export query results from a table to an external stage or cloud storage in a specified format.
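For example, exporting a query result to a stage can be sketched as follows; the stage and table names are illustrative:

```sql
-- Unload the result of a query to a named internal stage as CSV.
COPY INTO @my_stage/results/
FROM (SELECT region, SUM(amount) AS total FROM sales GROUP BY region)
FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'NONE')
HEADER = TRUE;
```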

3. How to automate COPY command in Snowflake?

Automate using Snowflake tasks and streams, or by integrating with orchestration tools like Apache Airflow or cloud schedulers.
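A minimal sketch of scheduling a COPY with a Snowflake task, assuming a warehouse my_wh, table sales, and stage my_stage (all names illustrative):

```sql
-- Run the load every 60 minutes using a task.
-- my_wh, sales, and my_stage are hypothetical names.
CREATE OR REPLACE TASK load_sales_hourly
  WAREHOUSE = my_wh
  SCHEDULE = '60 MINUTE'
AS
  COPY INTO sales
  FROM @my_stage
  FILE_FORMAT = (TYPE = 'CSV');

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK load_sales_hourly RESUME;
```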

Vishal Agrawal
Technical Content Writer, Hevo Data

Vishal Agarwal is a Data Engineer with 10+ years of experience in the data field. He has designed scalable and efficient data solutions, and his expertise lies in AWS, Azure, Spark, GCP, SQL, Python, and other related technologies. By combining his passion for writing and the knowledge he has acquired over the years, he wishes to help data practitioners solve the day-to-day challenges they face in data engineering. In his article, Vishal applies his analytical thinking and problem-solving approaches to untangle the intricacies of data integration and analysis.