Tip
Click to open the Build SQL query editor and ask Celigo AI to generate a BigQuery query for you. Enter a description of your query in plain English in the space provided, and Celigo AI will attempt to generate a query based on your prompt.
The bulk insert data option is ideal for large data volumes. integrator.io automatically builds the insert query for each batch of records. Each batch contains 100 records by default, but you can use the Batch size field to tune your imports as needed. Queries are limited to 100KB, so for larger batch sizes, the number of records sent in a single request is reduced to stay within that limit.
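To illustrate how batching under both a record-count and a query-size cap can work, here is a minimal sketch (a hypothetical illustration, not Celigo's actual implementation; the record shape and the use of JSON size as a proxy for query size are assumptions):

```python
import json

MAX_QUERY_BYTES = 100 * 1024  # assumed 100KB query limit
BATCH_SIZE = 100              # default records per batch

def build_batches(records, batch_size=BATCH_SIZE, max_bytes=MAX_QUERY_BYTES):
    """Split records into batches of at most `batch_size` records,
    closing a batch early if its serialized size would exceed `max_bytes`."""
    batches = []
    current, current_bytes = [], 0
    for record in records:
        record_bytes = len(json.dumps(record).encode("utf-8"))
        if current and (len(current) >= batch_size
                        or current_bytes + record_bytes > max_bytes):
            batches.append(current)
            current, current_bytes = [], 0
        current.append(record)
        current_bytes += record_bytes
    if current:
        batches.append(current)
    return batches

rows = [{"id": i, "name": "x" * 50} for i in range(250)]
print([len(b) for b in build_batches(rows)])  # → [100, 100, 50]
```

With small records the 100-record count limit dominates; with wide records the byte cap closes batches sooner, which mirrors the behavior described above.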
Note
For bulk insert, if your destination columns use data types such as money, numeric, decimal, or smalldatetime, verify that the values to be inserted fall within the accepted range of the destination column. If a value is out of range, use another data type that supports the required range. To verify the ranges for your server version, see Data types (Transact-SQL).
By using metadata, you can now:
- Search for a specific database, dataset, table, or column to add to your query.
- Use lookups to find data.
- Refresh your data to get new updates automatically.
- Use Mapper 2.0 to map imported records.
Note
In your import’s Destination table search, you can reference any schema and table name, regardless of what you used in your connection, with the format: database.schema.table name.
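For example, a fully qualified Destination table search entry might look like this (the database, schema, and table names here are hypothetical):

```
salesdb.reporting.orders
```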
Execute a SQL query once for each record.
You can write your SQL command in the SQL query text field, or click Edit to the right of the field to open the SQL query builder AFE.
Execute a SQL query once per page of data. You can write your SQL command in the SQL query text field, or click Edit to the right of the field to open the SQL query builder AFE.
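As a sketch of the difference between the two options, the queries below use handlebars-style placeholders. The table name, field names, and the exact placeholder paths (`record` for per-record queries, `data` for per-page queries) are assumptions for illustration; check the variables exposed in your own SQL query builder AFE:

```sql
-- Once per record: placeholders resolve against the current record
INSERT INTO salesdb.reporting.orders (id, amount)
VALUES ({{record.id}}, {{record.amount}});

-- Once per page of data: iterate over all records in the page
INSERT INTO salesdb.reporting.orders (id, amount)
VALUES
{{#each data}}
  ({{id}}, {{amount}}){{#unless @last}},{{/unless}}
{{/each}};
```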