Configure connections to optimize throughput and governance

Comments

2 comments

  • Anna Lam

    For example, if the concurrency level is set to 6 and a long-running export is running in one of your flows, then you will still have 5 messages that can be processed in parallel. Those additional 5 messages could also be long-running exports, or they could be very quick record imports. It does not matter.

    In this example, could those additional 5 messages be from the same long-running export? Or does it need to be from a different export?

    If we are running only one export with a concurrency level of 6, does that mean up to 6 records from that same export can be processed concurrently? If so, what happens when another export using the same connection suddenly starts? How do those two exports split the concurrency levels?

  • Tyler Lamparter Senior Solutions Consultant

    A long-running export (if it's on the source side of the flow) will typically use only 1 concurrency, because it makes one API request at a time for a page of data. The concurrency level controls how many API requests we are making to the endpoint, which can be different from the number of actual records — for example, when an API returns a list of records in one response, or when we make a batch import request to an API.

    In the case of a long-running export, we make one request to get the first page of results, then subsequent requests for further pages of data. This inherently uses 1 concurrency, because we have to get a response back before requesting the next page.
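    To make that concrete, here is a minimal sketch of sequential paging. The function names (`fetch_page`, `run_export`) and the page size are illustrative, not Celigo's actual API — the point is that each request waits on the previous response, so only one "slot" of concurrency is ever occupied:

    ```python
    def fetch_page(page_number, page_size=2):
        """Simulated endpoint: returns one page of record IDs, empty when done."""
        records = list(range(10))  # pretend the endpoint holds 10 records
        start = page_number * page_size
        return records[start:start + page_size]

    def run_export():
        """Pages through the export sequentially: the next request is only
        issued after the previous response arrives, so at most one API
        request is in flight at any moment (1 concurrency)."""
        page, all_records = 0, []
        while True:
            batch = fetch_page(page)  # one in-flight request at a time
            if not batch:
                break
            all_records.extend(batch)
            page += 1
        return all_records

    print(run_export())  # collects all 10 simulated records, page by page
    ```
    
    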

    Where an export could take up more concurrency is when the export is a lookup step within your flow — for example, if you need to make an API request per record in order to get additional information about the original record. In this case, the lookup step would use additional concurrency because it can call the API in parallel for each specific record.
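    A rough sketch of that per-record lookup pattern, assuming a concurrency level of 6 and using a thread pool as a stand-in for the connection's request queue (the `lookup`/`enrich` names and the tracking counters are hypothetical, just to show the bounded parallelism):

    ```python
    import threading
    import time
    from concurrent.futures import ThreadPoolExecutor

    CONCURRENCY = 6  # matches the connection's concurrency level

    in_flight = 0    # requests currently "on the wire"
    peak = 0         # highest parallelism observed
    lock = threading.Lock()

    def lookup(record_id):
        """Simulated per-record API call; tracks how many run in parallel."""
        global in_flight, peak
        with lock:
            in_flight += 1
            peak = max(peak, in_flight)
        time.sleep(0.01)  # stand-in for network latency
        result = {"id": record_id, "extra": record_id * 10}
        with lock:
            in_flight -= 1
        return result

    def enrich(records):
        """Lookup step: up to CONCURRENCY record-level requests at once."""
        with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
            return list(pool.map(lookup, records))

    enriched = enrich(range(20))
    print(f"{len(enriched)} records enriched; peak parallel requests: {peak}")
    ```

    The pool never lets more than 6 lookups run at once, which is exactly how a long-running export elsewhere on the same connection would leave 5 slots for everything else.
    
    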

    Hopefully that helps!
