Use SQL to import data
You can use one of the following options to import data:
- Use bulk insert SQL query: Bulk insert is ideal for large data volumes, and integrator.io builds the insert query for you automatically with this option.
Note: The default number of records bound in an array is 100, but you can use the Batch size field to adjust this number.
- Use SQL query: Execute a SQL query once for each record. You can write your SQL command in the SQL Query builder.
- Use SQL query once per page of data: Execute a SQL query once per page of data. You can write your SQL command in the SQL Query builder.
The following sketch illustrates the kind of bulk insert SQL query this option generates; the CONTACTS table and its columns are assumptions for this example:

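-- Hypothetical bulk insert statement; integrator.io binds an array of
-- up to "Batch size" records (default 100) to the parameter markers,
-- so a single round trip inserts the whole batch.
INSERT INTO CONTACTS (ID, EMAIL, CREATED_AT)
VALUES (?, ?, ?);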
For more information on Snowflake bulk insert functionality, see Binding an array for bulk insert. For more information on Snowflake query size limitations, see Query size limits.
Use bulk insert SQL query for JSON semi-structured data

When using the bulk insert option, you can take advantage of Snowflake technology to insert a JSON payload (up to 16 MB compressed) into a single column of a single row.

When you map the JSON payload, you must choose JSON for your data type. For more information on Snowflake semi-structured data, see Semi-structured concepts.
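As a minimal sketch, assuming a hypothetical EVENTS table whose PAYLOAD column is a Snowflake VARIANT: the JSON string is bound as a parameter and converted with PARSE_JSON. (Snowflake requires the INSERT ... SELECT form here because PARSE_JSON cannot appear in a plain VALUES clause.)

-- Hypothetical table; PAYLOAD is a VARIANT column that stores the
-- JSON document for each record.
CREATE TABLE IF NOT EXISTS EVENTS (
  ID STRING,
  PAYLOAD VARIANT
);

-- PARSE_JSON converts the bound JSON string into a VARIANT value.
INSERT INTO EVENTS (ID, PAYLOAD)
SELECT ?, PARSE_JSON(?);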
Create a Shopify_customer table in Snowflake
This example demonstrates how to create a Shopify_customer table in Snowflake:
- Create a flow with Shopify as the source application and Snowflake as the destination application.

- Click the mapping icon in the Snowflake import, and configure your mapping to create or replace the Shopify_customer table.

In this example, the fields are mapped as follows:
- Shopify id maps to Snowflake ID as a string value.
- Shopify accepts_marketing maps to Snowflake ACCEPTS_MKTG_BOOLEAN as a boolean value.
- The Shopify handlebars expression {{{jsonSerialize this}}} maps to Snowflake PROPERTIES as a JSON value.
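Under these mappings, the resulting table would resemble the following sketch; the column types are inferred from the mappings above, and the exact DDL that integrator.io generates may differ:

-- Sketch of the resulting table; types are inferred from the mappings.
CREATE OR REPLACE TABLE SHOPIFY_CUSTOMER (
  ID STRING,
  ACCEPTS_MKTG_BOOLEAN BOOLEAN,
  PROPERTIES VARIANT  -- holds the JSON produced by {{{jsonSerialize this}}}
);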
Use SQL query once per page of data for update/insert (MERGE SQL)
MERGE is useful for inserting, updating, and deleting values in a table based on values in a second table or a subquery. Use this method if the second table is a change log that contains new rows (to be inserted), modified rows (to be updated), or marked rows (to be deleted) relative to the target table.
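As a minimal sketch, assuming a hypothetical TARGET table keyed on ID and a CHANGES change-log table with a DELETED flag (all names here are placeholders):

-- Hypothetical MERGE; CHANGES plays the role of the change log.
MERGE INTO TARGET t
USING CHANGES c
  ON t.ID = c.ID
WHEN MATCHED AND c.DELETED THEN DELETE           -- marked rows
WHEN MATCHED THEN UPDATE SET t.EMAIL = c.EMAIL   -- modified rows
WHEN NOT MATCHED THEN INSERT (ID, EMAIL)         -- new rows
  VALUES (c.ID, c.EMAIL);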
Note: You must add an identifier to logically group and identify the records within one page. You can use a preSavePage hook script to accomplish this task. Here is an example script:
function preSavePage(options) {
  // Use the current timestamp as a shared identifier for every record
  // in this page of data.
  const pageIdentifier = Date.now();
  for (let i = 0; i < options.data.length; i++) {
    options.data[i].pageMarker = pageIdentifier;
  }
  return {
    data: options.data,
    errors: options.errors,
    abort: false
  };
}
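Every record in the page now carries the same pageMarker value, so the MERGE statement you build in the SQL query builder can reference it to restrict the statement to the current page's records (for example, through a handlebars expression such as {{data.0.pageMarker}}; the exact path is an assumption and depends on your flow).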
Configuration demo video
Use COPY API to move data from Amazon S3 into Snowflake
Use this strategy if you want to load all data stored in an Amazon S3 file once per flow run. The following steps explain how to move data stored in an S3 file into Snowflake with the COPY API:
- Extract all data from the source application into an S3 file.

- Provide the COPY command in the query builder with Snowflake syntax, as in the sketch below.
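A minimal sketch of such a COPY command; the table name, bucket path, credentials, and file format are placeholders for this example:

-- Hypothetical COPY command; table, bucket path, credentials, and
-- file format are placeholders for this example.
COPY INTO SHOPIFY_CUSTOMER
  FROM 's3://my-bucket/exports/'
  CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);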