Shopify Payout Transactions: response stream exceeded limit of 5242880 bytes.

Hello,

I'm trying to export data from Shopify using a custom flow that exports Shopify payout transactions by payout ID. However, I keep hitting the error "response stream exceeded limit of 5242880 bytes". Customer support hasn't been much help; they say this is a flow-building error that requires professional services. I've tried all sorts of tricks, such as lowering the page size and limiting the number of records per page from Shopify.

Please help me if you have come across this issue before.


Comments

3 comments
  • Harish Bakka I reviewed the ticket and the meeting recording and have a solution for you. This is actually the solution we use for our own integration apps.

     

    To summarize: you have an export that gets Shopify payouts, a lookup step that gets the transactions for each exported payout ID, and finally an import that puts each transaction into Google BigQuery. The issue you're running into is the 5 MB page size limit, caused by having so many transactions for a given payout (a good problem to have, since it means sales are good!).

     

    To solve for this, you'll need 2 flows. The first flow grabs the payouts, inserts the payout data into Google BigQuery, then ends with a lookup step that gets the transactions. On that final lookup step, you'll add a preSavePage script that makes an API call to the second flow, sending it the transaction data returned by the lookup. The second flow then receives the incoming transactions and inserts them into Google BigQuery. The reason this works is that the preSavePage script returns empty objects: the records are no longer needed in flow 1 because they've already been sent to flow 2. Since the script returns empty records, flow 1 never hits the 5 MB page limit.

     

    Here is the setup below. Note that you'll need to use your own export ID, which can be found on the integrator.io listener (not to be confused with a webhook: choose Celigo integrator.io, then choose listen), and update the script with your ID. Also, disregard my use of Celigo integrator.io bubbles (on the mirror endpoint) everywhere except the listener; I just used them as placeholders for the real Shopify and Google BigQuery bubbles.

    import { exports } from 'integrator-api';

    function preSavePage (options) {
      // Send this page of transaction records to the second flow's
      // integrator.io listener (replace the _id with your own export id).
      exports.run({_id: '6668f68176867e29a18f722e', listenerData: options.data});

      // Return an empty object for each record so flow 1 carries no data
      // forward and never hits the 5 MB page size limit.
      let output = [];
      options.data.forEach((d) => {
        output.push({});
      });

      return {
        data: output,
        errors: options.errors,
        abort: false,
        newErrorsAndRetryData: []
      };
    }
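
    One caveat with the script above: if the exports.run call fails, the page has already been reduced to empty objects, so those transactions would be dropped without a trace. A hedged variant is below; it assumes exports.run throws on failure and that record-level errors can be returned as simple objects with code/message/source fields, so treat it as a sketch and check the preSavePage documentation for the exact error shape.

    import { exports } from 'integrator-api';

    function preSavePage (options) {
      let output = [];
      let errors = options.errors;

      try {
        // Forward this page of transactions to the second flow's listener.
        exports.run({_id: '6668f68176867e29a18f722e', listenerData: options.data});

        // Hand back an empty object per record so flow 1 stays under 5 MB.
        options.data.forEach((d) => {
          output.push({});
        });
      } catch (e) {
        // If the hand-off fails, surface an error in flow 1 instead of
        // silently losing the page. (Assumed error shape; adjust as needed.)
        errors.push({
          code: 'LISTENER_HANDOFF_FAILED',
          message: 'Could not send transactions to flow 2: ' + e.message,
          source: 'preSavePage'
        });
      }

      return {
        data: output,
        errors: errors,
        abort: false,
        newErrorsAndRetryData: []
      };
    }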

  • Hi Tyler Lamparter,

    Thank you for the context. However, is there a way we can do this in a single flow rather than adding another flow? An additional flow would add cost too.

    Also, why can't the lookup export split its records into multiple pages? Since you're able to send the data to an additional listener, why can't Celigo just send it to an additional BigQuery tile?

    After all, it's the Shopify call where you're exporting data with a 250-record page limit and sending it over page by page.

    Thank you
    Harish

  • Harish Bakka the only way to get this into one flow would be to use a virtual import in the lookup's preSavePage script that imports directly to Google BigQuery instead of sending to another listener flow (a rough sketch is at the end of this comment). The issue with this is that virtual imports don't respect a connection's concurrency, so if you run into a rate limit error you wouldn't have a retry mechanism. Given the volume of data, I wouldn't recommend it.

     

    If you are concerned about flow counts or have no more flows, you could get real hacky and incorporate the second flow into an already existing flow. You'd need a branch condition to send the multiple exports down the right path.

     

    As for the why: the page size that matters most is the page size on the main export, which is the export that gets the payouts. That page carries over to all downstream steps. When you hit the lookup, the pages from the lookup get merged back into the main page, and it's that merging that puts you over 5 MB. The page size on the lookup is adjustable so you can control how many records reach the preSavePage script at once.

     

    We have a couple of roadmap items that would solve this so you wouldn't need two flows, but I don't have a timeline to share. The idea would be to let you re-recordize everything after the lookup results.
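
    For completeness, a virtual import from the preSavePage script might look roughly like the sketch below. It assumes imports.run takes an import _id and a data array, similar to exports.run in my earlier script; confirm the exact signature and response handling against the integrator-api docs before relying on it, and keep in mind it won't honor the connection's concurrency or give you retries.

    import { imports } from 'integrator-api';

    function preSavePage (options) {
      // Virtual import: push this page of transactions straight into the
      // Google BigQuery import instead of handing off to a second flow.
      // (Assumed signature; replace the _id with your BigQuery import id.)
      imports.run({_id: 'YOUR_BIGQUERY_IMPORT_ID', data: options.data});

      // Return empty records so the main flow stays under the 5 MB page limit.
      let output = [];
      options.data.forEach((d) => {
        output.push({});
      });

      return {
        data: output,
        errors: options.errors,
        abort: false,
        newErrorsAndRetryData: []
      };
    }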

