Overview of Data Loading

This topic describes concepts related to loading data into Snowflake tables.

Bulk Data Loading Using COPY

This section describes bulk data loading into Snowflake tables using the COPY INTO <table> command. The information is the same regardless of whether you are loading from data files on your local file system or in Amazon S3 buckets.

Data Loading Process

Data loading is performed in two separate steps:

  1. Upload (i.e. stage) one or more data files into either an internal stage (i.e. within Snowflake) or an external location:

    Internal: Use the PUT command to stage the files.
    External: Use the upload interfaces/utilities provided by the service that hosts the location to stage the files. Currently, Amazon S3 is the only service supported for staging external data.
  2. Use the COPY INTO <table> command to load the contents of the staged file(s) into a Snowflake database table.

    This step requires a running virtual warehouse that is also the current warehouse (i.e. in use) for the session. The warehouse provides the compute resources to perform the actual insertion of rows into the table. Both steps are illustrated in the sketch that follows this list.
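
The following is a minimal sketch of both steps using an internal table stage. The table name (mytable), local file path, warehouse name (mywh), and file format options are placeholder assumptions, not values from this topic:

    -- Step 1 (internal stage): upload the local file with PUT
    -- (run from a Snowflake client such as SnowSQL)
    PUT file:///tmp/data/mydata.csv @%mytable;

    -- Step 2: make sure a running warehouse is in use for the session,
    -- then load the staged file(s) into the table
    USE WAREHOUSE mywh;

    COPY INTO mytable
      FROM @%mytable
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

    -- If the files are staged in an external location (Amazon S3),
    -- COPY INTO can instead reference the S3 URL along with credentials:
    -- COPY INTO mytable
    --   FROM 's3://mybucket/data/'
    --   CREDENTIALS = (AWS_KEY_ID='...' AWS_SECRET_KEY='...')
    --   FILE_FORMAT = (TYPE = 'CSV');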

Tasks for Loading Data

For information about the tasks associated with loading data, see:

In addition, Snowflake provides a data loading wizard in the web interface. For more information, see Loading (Limited) Data Using the Web Interface.

Continuous Data Loading Using Snowpipe

The process and tasks for loading data continuously using Snowpipe are described in Loading Data Continuously Using Snowpipe.
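
As a minimal illustration only (the linked topic has the full details), a pipe object wraps a COPY INTO <table> statement that Snowpipe runs as new files arrive; the pipe, table, and stage names below are placeholder assumptions:

    CREATE PIPE mypipe
      AS
      COPY INTO mytable
        FROM @mystage
        FILE_FORMAT = (TYPE = 'CSV');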