Overview of Data Loading

This topic describes concepts related to loading data into Snowflake tables.

Bulk Loading Using COPY

This section describes bulk data loading into Snowflake tables using the COPY INTO <table> command. The process is the same whether you are loading from data files on your local file system, in Amazon S3 buckets, or in Microsoft Azure containers.
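
For example, the following is a minimal sketch of a bulk load from an external location. The table name, bucket path, credentials, and file format options are hypothetical placeholders:

    -- Hypothetical bulk load of staged CSV files from an S3 bucket.
    COPY INTO mytable
      FROM 's3://mybucket/data/'
      CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);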

Data Loading Process

Data loading is performed in two separate steps:

  1. Upload (i.e. stage) one or more data files into either an internal stage (i.e. within Snowflake) or an external location:

    Internal: Use the PUT command to stage the files.
    External: Currently, Amazon S3 and Microsoft Azure are the only services supported for staging external data. Snowflake assumes the files have already been staged in one of these locations. If they have not been staged yet, use the upload interfaces/utilities provided by the service that hosts the location.
  2. Use the COPY INTO <table> command to load the contents of the staged file(s) into a Snowflake database table.

    This step requires a running virtual warehouse that is also the current warehouse (i.e. in use) for the session. The warehouse provides the compute resources to perform the actual insertion of rows into the table. Both steps are sketched in the example after this list.
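
For illustration, here is a minimal sketch of the two steps using a table's internal stage, run from a client such as SnowSQL. The table name, warehouse name, and file path are hypothetical placeholders:

    -- Step 1: stage a local data file in the table's internal stage.
    -- PUT is executed from a client (e.g. SnowSQL), not the web interface.
    PUT file:///tmp/data/contacts.csv @%mytable;

    -- Step 2: make a running warehouse current for the session, then load.
    USE WAREHOUSE mywh;
    -- With no FROM clause, COPY INTO loads from the table's own stage (@%mytable).
    COPY INTO mytable
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);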

Tasks for Loading Data

For information about the tasks associated with loading data, see:

  - Bulk Loading from a Local File System Using COPY
  - Bulk Loading from Amazon S3 Using COPY
  - Bulk Loading from Microsoft Azure Using COPY

In addition, Snowflake provides a data loading wizard in the web interface. For more information, see Loading Using the Web Interface (Limited).

Continuous Loading Using Snowpipe

The process and tasks for loading data continuously using Snowpipe are described in Loading Continuously Using Snowpipe.
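
As a minimal, hypothetical sketch of what such a definition looks like (the pipe, table, stage, and file format are placeholders; the linked topic covers the actual setup steps):

    -- Hypothetical pipe: continuously load files as they arrive in a stage.
    CREATE PIPE mypipe
      AUTO_INGEST = TRUE
      AS
      COPY INTO mytable
        FROM @mystage
        FILE_FORMAT = (TYPE = 'JSON');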