Summary of Data Loading Features

This topic provides a quick reference to the features supported when using the COPY INTO <table> command to bulk load data from staged files into Snowflake tables.

In this Topic:

- Data File Details
- Compression of Staged Files
- Encryption of Staged Files

Data File Details

Location of files
  - Local environment: Files are first staged in a Snowflake internal location or an external location (e.g. an S3 bucket), then loaded into a table.
  - AWS S3: Files can be loaded directly from any user-supplied S3 bucket in Amazon.

File formats
  - Delimited (CSV, TSV, etc.): Any single-character delimiter is supported; the default is comma (i.e. CSV).
  - Avro: Includes automatic detection and processing of Snappy-compressed Avro files.
  - XML: Supported as a preview feature.

File encoding
  - UTF-8
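As a minimal sketch of loading a delimited file with a non-default delimiter (the table name, stage name, and options shown are hypothetical; see the COPY INTO reference for the full option list):

```sql
-- Hypothetical names: my_table and my_csv_stage are placeholders.
-- Load pipe-delimited files from an internal stage into a table.
COPY INTO my_table
  FROM @my_csv_stage
  FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = '|' SKIP_HEADER = 1);
```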

Compression of Staged Files

Uncompressed files
  - gzip: When staging uncompressed files in a Snowflake internal location, the files are automatically compressed using gzip, unless compression is explicitly disabled.

Already-compressed files
  - gzip: When loading staged files that have already been compressed with gzip, Snowflake can automatically detect the compression method, or you can explicitly specify the method that was used to compress the files.
  - Brotli or Zstandard: Loading files compressed using Brotli or Zstandard is currently a preview feature. Auto-detection is not yet supported for these methods; when staging or loading files compressed with either of these methods, the compression method must be explicitly specified.
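Because Brotli and Zstandard are not auto-detected, the compression method has to be named explicitly both when staging and when loading. One possible shape, assuming a Brotli-compressed CSV file (the stage, table, and file paths are hypothetical):

```sql
-- Hypothetical names throughout. Stage a pre-compressed file,
-- declaring its compression method so it is not re-compressed:
PUT file:///tmp/data.csv.br @my_stage SOURCE_COMPRESSION = BROTLI;

-- Load it, again naming the compression method explicitly:
COPY INTO my_table
  FROM @my_stage
  FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'BROTLI');
```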

Encryption of Staged Files

Unencrypted files
  - 128-bit or 256-bit keys: When staging unencrypted files in a Snowflake internal location, the files are automatically encrypted using 128-bit keys. 256-bit keys can be enabled for stronger encryption; however, additional configuration is required.

Already-encrypted files
  - Files in S3: Files that are already encrypted can be loaded into Snowflake; however, they must be loaded from an S3 bucket, and the key used to encrypt the files must be provided to Snowflake.
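For files that were encrypted client-side before being placed in S3, the encryption key is supplied alongside the bucket credentials. A hedged sketch (the bucket path, credential values, and key value are all placeholders, not working values):

```sql
-- Hypothetical bucket; credential and key values are placeholders.
COPY INTO my_table
  FROM 's3://my_bucket/encrypted/'
  CREDENTIALS = (AWS_KEY_ID = '<aws_key_id>' AWS_SECRET_KEY = '<aws_secret_key>')
  ENCRYPTION = (MASTER_KEY = '<base64_encoded_key>')
  FILE_FORMAT = (TYPE = 'CSV');
```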