I'm trying to get better performance when loading data into BigQuery by batching rows together in my insertAll requests (https://cloud.google.com/bigquery/docs/reference/rest/v2/tabledata/insertAll). I use the URL formatted as:
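For reference, the documented shape of the endpoint (with `{projectId}`, `{datasetId}`, and `{tableId}` as placeholders, not my actual values) is:

```
POST https://bigquery.googleapis.com/bigquery/v2/projects/{projectId}/datasets/{datasetId}/tables/{tableId}/insertAll
```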
and I also set the batch limit to 100, with a modification to my Handlebars template so it iterates through the rows. As an example:
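A sketch of what that template looks like, following the `tableDataInsertAllRequest` body format from the docs (the context names `batch`, `id`, and `payload` here are illustrative, not necessarily my exact bindings):

```handlebars
{
  "kind": "bigquery#tableDataInsertAllRequest",
  "rows": [
    {{#each batch}}
    {
      "insertId": "{{this.id}}",
      "json": {{{this.payload}}}
    }{{#unless @last}},{{/unless}}
    {{/each}}
  ]
}
```

The `{{#unless @last}},{{/unless}}` keeps the JSON valid by omitting the trailing comma after the final row.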
The problem arises, I think, because of the way BigQuery responds to a batch: it doesn't acknowledge each row's insertId on success, it only reports the rows that failed, using the following response schema:
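This is the `tableDataInsertAllResponse` schema as documented in the reference above; note that `insertErrors` contains one entry per *failed* row (keyed by its `index` in the request), so a fully successful batch comes back with no per-row entries at all:

```json
{
  "kind": "bigquery#tableDataInsertAllResponse",
  "insertErrors": [
    {
      "index": 0,
      "errors": [
        {
          "reason": "...",
          "location": "...",
          "debugInfo": "...",
          "message": "..."
        }
      ]
    }
  ]
}
```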
This, I think, is what leads to the error: "Processing of submitResponse did not return same record count as the current batch size." The response simply doesn't contain one record per submitted row for the count to match against.
Has anyone had any luck implementing batching for BigQuery, or know of a way to handle or ignore this response behavior?