Flow API endpoints

Comments

20 comments

  • Pablo Gonzalez
    Celigo University Level 4: Legendary
    Engaged

    Hello,

    What are the parameters (both required and optional) for `/flows/<_id>/clone`, and what does the response look like?

    Thank you!

  • Tyler Lamparter Principal Product Manager
    Awesome Follow-up
    Engaged
    Top Contributor
    Answer Pro
    Celigo University Level 4: Legendary

    Pablo Gonzalez here you go. We'll get docs updated as well.

    {
      "connectionMap": {
          "<current connection id>": "<new connection id>",
          "<current connection id>": "<new connection id>"
      },
      "sandbox": false,
      "name": "Name Flow Here",
      "_integrationId": "<integration id where the cloned flow should go>",
      "_flowGroupingId": "<flow group id where the cloned flow should go - optional>"
    }
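For reference, the call above could be scripted like the following Python sketch. This is an illustrative sketch, not official client code: the bearer-token header is an assumption about auth, the helper names are made up, and the IDs are placeholders.

```python
BASE = "https://api.integrator.io/v1"

def build_clone_payload(connection_map, name, integration_id,
                        flow_grouping_id=None, sandbox=False):
    """Assemble the clone body shown above; _flowGroupingId is optional."""
    payload = {
        "connectionMap": connection_map,   # {current connection id: new connection id, ...}
        "sandbox": sandbox,
        "name": name,
        "_integrationId": integration_id,
    }
    if flow_grouping_id is not None:
        payload["_flowGroupingId"] = flow_grouping_id
    return payload

def clone_flow(token, flow_id, payload):
    import requests  # imported lazily so the payload helper stays dependency-free
    # POST /flows/<_id>/clone with a bearer token (auth scheme assumed)
    resp = requests.post(f"{BASE}/flows/{flow_id}/clone",
                         json=payload,
                         headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.json()
```

Leaving `flow_grouping_id` unset simply omits `_flowGroupingId` from the body, matching its optional status above.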
  • Pablo Gonzalez

    Hi Tyler Lamparter,

    Awesome, thank you for the info. I saw the connectionMap and wondered why it was written twice, so I decided to come back to it another time. Looking at it again today, it makes sense: it literally maps each connection in the source flow to a connection in the destination flow. I was able to successfully clone my flow via the API. Thank you again for your quick response!

    - Pablo G.

  • Emil Daniel Ordoñez Avila

    Hi Tyler Lamparter. Am I able to send non-date parameters using the flow run endpoint?

    Example: 

    {
       "export":{
          "sku":"XX-XXXX-SAMPLE-SKU",
          "startDate":"2023-01-05T00:00:00.000",
          "endDate":"2023-01-06T00:00:00.000"
        }
    }

    I know the endDate and startDate map to currentExportDateTime and lastExportDateTime respectively, but I'm not able to use additional parameters such as the sku in the example.

    This behavior would be helpful for APIs that don't filter by date.

    Using the Run Export API, I'm able to send that SKU parameter, but not on the Flow API.
  • Tyler Lamparter Principal Product Manager

    Emil Daniel Ordoñez Avila unfortunately not at this time. It is currently slated as an enhancement for summer/fall of this year.

  • Geoff Tipley
    Engaged
    Awesome Follow-up

    Tyler Lamparter being able to run another flow and pass in custom data for it to reference would be HUGE for me. Is there any way to work around this by having a flow change a value in Integration settings?

  • Tyler Lamparter Principal Product Manager

    Geoff Tipley yes, you can update the integration settings prior to calling the flow to run. The only downside I see of doing this is that a flow can only have 1 job running at a time, and I assume you need to run this for dozens, hundreds, or thousands of SKUs, right? I'm curious what your workflow is here.

    Could another solution be to have a webhook on flow #2, have flow #1 post data to that webhook, then have a lookup step that runs the report for that sku? I'm not sure how many results are expected for that one lookup.

  • Geoff Tipley

    My project is syncing data between two Salesforce environments. If I'm syncing an Opportunity and it has an Account that hasn't been synced yet, I want to sync that Account and then continue with my Opportunity flow. My current fix should avoid the issue you named where only one instance of a flow can run at a time. I'm going to include the three components of my Account flow inside the Opportunity flow. Using the same components should allow it to be maintainable since I shouldn't have to make the same changes to two different flows when revising Account flow components.

    For a webhook approach, what would that take, incidentally? I imagine you create the Webhook listener in Celigo as a second source to the Account flow with Secret URL verification for instance, save that Public URL, and in the Account flow make an HTTP component to post data to that URL? I'm not sure it would help anything in this case because I believe the Account flow would run asynchronously and have no way of responding to the Opportunity flow with the IDs that were created.

  • Tyler Lamparter Principal Product Manager

    Geoff Tipley yeah, it sounds like the webhook process wouldn't work in your case. You're basically wanting your opportunity syncing flow (flow #1) to call another flow for account syncing, wait for that flow to complete, then move on to create the opportunity record in flow #1 so that you only have a single place to maintain account mappings and logic, right?


    Reusing the import/lookup steps like you are doing will work, but it would be nice to visually show that reuse differently (i.e. sub flow).

  • Tyler Lamparter Principal Product Manager

    Adding another flow run option here in case anyone needs it. Since IO now has the ability to run single exports on a flow, the /run body can include the exportIds that you want to run. Use the same /flows/<_id>/run endpoint as above, but with the body below (substituting your own exportIds, of course).

    {
       "_exportIds": [
           "655bd************1bf2",
           "655bdb1***********7bf94"
       ]
    }
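A quick Python sketch of how this could be called. The bearer-token header is an assumption about auth, and the helper names are illustrative; only the endpoint path and `_exportIds` body shape come from the comment above.

```python
def build_run_body(export_ids=None):
    """No body runs the whole flow; _exportIds runs just the listed exports."""
    return {"_exportIds": list(export_ids)} if export_ids else None

def run_flow(token, flow_id, export_ids=None):
    import requests  # imported lazily so build_run_body stays dependency-free
    # POST /flows/<_id>/run, optionally scoped to specific exports
    resp = requests.post(
        f"https://api.integrator.io/v1/flows/{flow_id}/run",
        json=build_run_body(export_ids),
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    return resp.json()
```

Passing `json=None` sends no body, which falls back to running the whole flow as described earlier in the thread.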

  • Tyler Lamparter Principal Product Manager

    Another one we're working to add to the docs is replacing a connection at the flow level. To do so, use the following endpoint with the following payload.

    PUT /v1/flows/<_flowId>/replaceConnection

    {
      "_connectionId": "<_existingConnectionId>",
      "_newConnectionId": "<_newConnectionId>"
    }
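As a rough Python sketch of the same call (bearer-token auth is assumed; the function names are made up for illustration):

```python
def build_replace_body(existing_connection_id, new_connection_id):
    """Body for PUT /flows/<_flowId>/replaceConnection, as shown above."""
    return {
        "_connectionId": existing_connection_id,
        "_newConnectionId": new_connection_id,
    }

def replace_connection(token, flow_id, existing_id, new_id):
    import requests  # imported lazily so the body helper stays dependency-free
    resp = requests.put(
        f"https://api.integrator.io/v1/flows/{flow_id}/replaceConnection",
        json=build_replace_body(existing_id, new_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    return resp
```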

  • Bijan Samiee

    Tyler Lamparter Is it possible to perform a "Download Flow" via API call, similar to how the action is available in the Integration Flows list on the dashboard? If not, would the only solution be to make calls for all the connections, exports, imports, flows, and scripts separately? Thank you!

    0
  • Tyler Lamparter Principal Product Manager

    Bijan Samiee yes, there is one available. It returns a pre-signed URL that you can then use to download.

    Request:

    GET https://api.integrator.io/v1/flows/<flowId>/template
    Response:

    {
      "signedURL": "https://integrator-templates.s3.amazonaws.com/<flowId>.zip?AWSAccessKeyId=XXXXXXX&Expires=1707272837&Signature=XXXXXXXXX",
      "key": "<flowId>.zip"
    }
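Putting the two steps together in a Python sketch (the bearer-token header is an assumption; the pre-signed URL itself needs no extra auth because the signature is in the query string):

```python
def parse_template_response(body):
    """Pull the pre-signed URL and zip key out of the response shown above."""
    return body["signedURL"], body["key"]

def download_flow_zip(token, flow_id, dest_path):
    import requests  # imported lazily so the parser stays dependency-free
    # Step 1: ask the API for a pre-signed download URL
    meta = requests.get(
        f"https://api.integrator.io/v1/flows/{flow_id}/template",
        headers={"Authorization": f"Bearer {token}"},
    )
    meta.raise_for_status()
    signed_url, key = parse_template_response(meta.json())
    # Step 2: fetch the zip directly from S3; the signature lives in the URL
    with open(dest_path, "wb") as f:
        f.write(requests.get(signed_url).content)
    return key
```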
  • Tyler Lamparter Principal Product Manager

    You can also download an integration via:

    GET https://api.integrator.io/v1/integrations/<integrationId>/template
  • Josh Braun

    In this comment (https://docs.celigo.com/hc/en-us/articles/7707985934363/comments/15350161055899) from Tyler Lamparter, a feature was mentioned that would allow non-start/end date fields to be sent to the flow/<id>/run endpoint.

    Was this ever completed? If so, how is it done? 

    The only other way I can see to do this is using a webhook or creating a MyAPI that calls the flow's exports/imports, but it would be silly to create a flow only to have to rewrite it in JavaScript in an API just to pass in custom data.

    Webhooks work, but you don't get a job ID or anything returned.

  • Tyler Lamparter Principal Product Manager

    Josh Braun, no this was not done. You'd need to:

    1. Use a webhook as a source of a flow and then a lookup step to pass the date params in. This would be fine as long as the data returned is less than 5 MB.
    2. Use a MyAPI as a source where the JavaScript handles it and you set up a paging mechanism on the export and MyAPI.
    3. Update the settings on the flow and reference the flow settings within your export. This would be limited to 1 set of start/end dates per flow run, so it's less ideal, but it matches the behavior current flows already have, where only 1 job can run at a time.
  • Josh Braun

    Tyler Lamparter

    Thanks for the response. 

    1. Webhooks do work; however, nothing is returned from the request. Additional lookups would have to be done to find the job and get status/errors. (At least my webhook tests showed no response body.)
    2. Could you explain the MyAPI and paging solution a bit more? My understanding of the MyAPI solution is that I would have to recreate the Flow I already created but in the script file. So if my Flow has 1 export and 5 lookups/imports I need to script all those steps. The Flow would effectively only be used for reference but all the components are used in the script.
    3. Settings could work, or at least could work in the future. But if they are limited only to a start/end date and don't allow other settings (at least to be passed in externally) I'm not sure that would work for us here. 

    Ultimately the goal is to send IDs for the Flow's export to use. For example sending IDs 1, 2, and 3, and the export uses those IDs to find the records 1, 2, and 3 to export instead of all records between a start and end date or a since-last-run date. 

  • Tyler Lamparter Principal Product Manager

    Josh Braun curious why you need a response back? If a webhook is the source and you then do some lookups, you could send the result wherever you need it. Maybe I don't fully understand your use case yet.

  • Josh Braun

    Tyler Lamparter

    Our use case is that we have a UI in our system that the user uses to select the records to send to whatever system. We'd call the Celigo endpoint (MyAPI or webhook) to pass the IDs in and begin the process for the selected records, and we would then provide status updates/errors reported from the flow for those records.

  • Tyler Lamparter Principal Product Manager

    Josh Braun that makes sense. For this use case, I assume you would prefer to get the status back in real-time vs. having it processed asynchronously? A MyAPI is the only tool we currently have that would allow you to respond back in real-time with the outcome. If you used a webhook or passed settings to a flow, those would be asynchronous and you would need the flow itself to write the status back to your application.

