You can export data from Snowflake in one of two ways:

- Retrieve data with a SQL command: Use SQL queries to retrieve data from the Snowflake platform and export it into integrator.io.
- Extract data via the COPY API into Amazon S3: Use an Amazon S3 bucket to store extracted Snowflake data, then retrieve the copied file from Amazon S3 for use in a flow.
You can export data with standard Snowflake SQL queries, or with SQL queries that access semi-structured data using special operators and functions. The following example describes how to retrieve JSON data from Snowflake:
1. In Flow Builder, click Add source.
2. Select Snowflake as your source application, and choose the Snowflake connection you want to use to export data.
3. Click Next. The Create export panel opens.
4. Enter a name and an optional description for the export.
5. In the What would you like to export? section, enter your SQL query in the Query field. For more information on constructing queries, see the Snowflake query documentation.
Tip
Click to open the Build SQL query editor and ask Celigo AI to generate a SQL query for you. Describe the query you need in plain English in the space provided, and Celigo AI will attempt to generate a SQL query based on your prompt.
Note
To successfully export extremely small or extremely large floating-point numbers, wrap the column in TO_CHAR(). For example:
"query": "select TO_CHAR(FLOATNUM) from TEST_DEMODB.PUBLIC.ALLDATATYPES"
For example, the following query flattens the addresses array stored in a Shopify customer record and returns the name and address fields as strings:

```sql
select
  value:first_name::string as "first name",
  value:last_name::string as "last name",
  value:city::string as "city",
  value:zip::string as "zip"
from shopify_customer,
  lateral flatten(input => properties:addresses);
```
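If it helps to visualize what LATERAL FLATTEN does, here is a plain-Python analogue of the query above. The sample customer record is invented for illustration; in Snowflake, `properties:addresses` would be a JSON array stored in the table row.

```python
# Hypothetical sample record standing in for one row of shopify_customer;
# the real data lives in Snowflake's semi-structured VARIANT column.
customer = {
    "properties": {
        "addresses": [
            {"first_name": "Ada", "last_name": "Lovelace", "city": "London", "zip": "N1"},
            {"first_name": "Alan", "last_name": "Turing", "city": "Wilmslow", "zip": "SK9"},
        ]
    }
}

# LATERAL FLATTEN emits one output row per array element (exposed as `value`);
# the ::string casts correspond to extracting each field as text.
rows = [
    {
        "first name": str(addr["first_name"]),
        "last name": str(addr["last_name"]),
        "city": str(addr["city"]),
        "zip": str(addr["zip"]),
    }
    for addr in customer["properties"]["addresses"]
]

for row in rows:
    print(row)
```

Each element of the `addresses` array becomes its own output row, which is exactly how the flattened records arrive in integrator.io.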
If you already use S3 buckets for storing and managing your data files, you can use your existing buckets and folder paths when unloading data from Snowflake tables, and then retrieve that data with your Amazon S3 connection for use in a flow.
Note
If you have not yet configured Snowflake-to-Amazon-S3 storage integration, see the Snowflake documentation on S3 external storage. For information about unloading Snowflake data to Amazon S3, see the Snowflake user guide.
If you have configured your Snowflake account to use Amazon S3 storage, enter the following command in the Query field to automatically copy Snowflake data to Amazon S3:
```sql
COPY INTO @my_ext_unload_stage/dl_file_prefix
FROM shopify_customer
OVERWRITE = TRUE
FILE_FORMAT = (FORMAT_NAME = 'mycsvformat' COMPRESSION = 'NONE');
```
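After the COPY INTO statement completes, you can confirm from Snowflake that the unloaded files exist before retrieving them with your Amazon S3 connection. A quick check, using the same stage and file prefix as the example above:

```sql
-- Lists the files unloaded under the stage path from the example above
LIST @my_ext_unload_stage/dl_file_prefix;
```

Each listed file should then be visible in the corresponding S3 folder that your storage integration points to.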