Understand and create custom flows
Relative URI | Method | Success code | Description
---|---|---|---
/flows | GET | 200 | Get all flows.
/flows | POST | 201 | Create a flow.
/flows/<_id> | GET | 200 | Get a specific flow.
/flows/<_id> | PUT | 200 | Update a specific flow.
/flows/<_id> | PATCH | 204 | Update part of a specific flow.
/flows/<_id> | DELETE | 204 | Delete a specific flow.
/flows/<_id>/clone | POST | 201 | Clone a specific flow.
/flows/<_id>/run | POST | 201 | Run a specific flow on demand.
/flows/<_id>/replaceConnection | PUT | 200 | Replace a connection at the flow level.
/flows/<_id>/audit | GET | 200 | Get a log for a specific flow.
/flows/<_id>/template | POST | 200 | Download a flow.
/flows/<_id>/dependencies | GET | 200 | Get all resources using or used by this flow.
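All of the endpoints above share the same base URL and bearer-token header. As a rough sketch (not an official client; the helper name and structure are my own), a request for any row of the table can be composed in Python with only the standard library:

```python
import urllib.request

API_BASE = "https://api.integrator.io/v1"  # integrator.io API base URL

def flow_request(token: str, path: str, method: str = "GET",
                 body: bytes = None) -> urllib.request.Request:
    """Compose (but do not send) a request for one of the /flows endpoints."""
    headers = {"Authorization": f"Bearer {token}"}
    if body is not None:
        headers["Content-Type"] = "application/json"
    return urllib.request.Request(API_BASE + path, data=body,
                                  headers=headers, method=method)

# "Get all flows" row of the table:
req = flow_request("my_api_token", "/flows")
print(req.get_method(), req.full_url)  # GET https://api.integrator.io/v1/flows
```

Pass the composed request to `urllib.request.urlopen` (or any HTTP client) to actually send it.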
The following fields can be updated using PATCH:

Field | Sub-field | Data type
---|---|---
name | | string
schedule | | string (cron expression)
timezone | | string
disabled | | boolean
skipRetries | | boolean
_integrationId | | string
pageGenerators | _exportId | string
pageProcessors | type | string ("export" or "import")
pageProcessors | _exportId | string
pageProcessors | _importId | string
pageProcessors | proceedOnFailure | boolean
pageProcessors | responseMapping | object
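To illustrate the format, here is a hedged Python sketch that builds a JSON Patch-style body (op/path/value operations) for the fields listed above. Both the patch format and the field allowlist should be confirmed against your own account before relying on this; the helper name is my own:

```python
import json

# Top-level flow fields assumed patchable (taken from this article's examples).
PATCHABLE = {"name", "schedule", "timezone", "disabled", "skipRetries",
             "_integrationId", "pageGenerators", "pageProcessors"}

def build_patch_body(changes: dict) -> str:
    """Turn {field: new_value} into a JSON Patch body, rejecting unknown fields."""
    unknown = set(changes) - PATCHABLE
    if unknown:
        raise ValueError(f"not patchable: {sorted(unknown)}")
    ops = [{"op": "replace", "path": f"/{field}", "value": value}
           for field, value in changes.items()]
    return json.dumps(ops)

print(build_patch_body({"disabled": True}))
# [{"op": "replace", "path": "/disabled", "value": true}]
```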
Get a specific flow

```
GET /v1/flows/55e••••••••••••••••••367 HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token
```

```json
{
  "_id": "55e••••••••••••••••••367",
  "lastModified": "2017-06-19T21:27:09.945Z",
  "name": "Send GitHub Events to Slack",
  "disabled": false,
  "timezone": "America/Los_Angeles",
  "_exportId": "55e••••••••••••••••••366",
  "_importId": "55e••••••••••••••••••d14",
  "_integrationId": "58f••••••••••••••••••1bc",
  "skipRetries": false,
  "createdAt": "2017-06-19T21:27:09.900Z"
}
```
Get a specific flow with multiple exports and imports linked together

```
GET /v1/flows/598••••••••••••••••••d9a HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token
```

```json
{
  "_id": "598••••••••••••••••••d9a",
  "lastModified": "2017-08-19T17:04:54.005Z",
  "name": "Update Usage Stats for all Trialers",
  "disabled": false,
  "schedule": "? 0 2 ? * *",
  "timezone": "America/Los_Angeles",
  "_integrationId": "593••••••••••••••••••74e",
  "skipRetries": false,
  "pageProcessors": [
    { "type": "export", "_exportId": "598••••••••••••••••••428", "responseMapping": { "lists": [], "fields": [ { "extract": "data", "generate": "numConnections" } ] } },
    { "type": "export", "_exportId": "598••••••••••••••••••d32", "responseMapping": { "lists": [], "fields": [ { "extract": "data", "generate": "dlSuccessAndErrors" } ] } },
    { "type": "export", "_exportId": "598••••••••••••••••••6c6", "responseMapping": { "lists": [], "fields": [ { "extract": "data", "generate": "lastSignIn" } ] } },
    { "type": "export", "_exportId": "598••••••••••••••••••638", "responseMapping": { "lists": [], "fields": [ { "extract": "data", "generate": "numDataLoaders" } ] } },
    { "type": "export", "_exportId": "598e0c", "responseMapping": { "lists": [], "fields": [ { "extract": "data", "generate": "numFlows" } ] } },
    { "type": "export", "_exportId": "598••••••••••••••••••54b", "responseMapping": { "lists": [], "fields": [ { "extract": "data", "generate": "numPasswordChanges" } ] } },
    { "type": "export", "_exportId": "598••••••••••••••••••e22", "responseMapping": { "lists": [], "fields": [ { "extract": "data", "generate": "numPasswordResets" } ] } },
    { "type": "export", "_exportId": "598e29", "responseMapping": { "lists": [], "fields": [ { "extract": "data", "generate": "numSignIns" } ] } },
    { "type": "export", "_exportId": "598e37", "responseMapping": { "lists": [], "fields": [ { "extract": "data", "generate": "successAndErrors" } ] } },
    { "type": "import", "_importId": "598••••••••••••••••••615", "proceedOnFailure": true, "responseMapping": { "lists": [], "fields": [] } },
    { "type": "import", "_importId": "598••••••••••••••••••9f9", "responseMapping": { "lists": [], "fields": [] } }
  ],
  "pageGenerators": [
    { "_exportId": "598••••••••••••••••••286", "_id": "599••••••••••••••••••b36" },
    { "_exportId": "598••••••••••••••••••6dd", "_id": "599••••••••••••••••••b35" }
  ],
  "createdAt": "2017-08-01T20:43:42.156Z"
}
```
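As the response above shows, a flow's sources live in `pageGenerators` and its downstream steps in `pageProcessors`. A small helper (my own, not part of any SDK) can enumerate the resource IDs behind each step:

```python
def list_flow_steps(flow: dict) -> list:
    """Return (step_type, resource_id) pairs for a flow's generators and processors."""
    steps = [("export", pg["_exportId"]) for pg in flow.get("pageGenerators", [])]
    for pp in flow.get("pageProcessors", []):
        # Processors reference either an export (lookup) or an import step.
        resource_id = pp.get("_exportId") or pp.get("_importId")
        steps.append((pp["type"], resource_id))
    return steps

# Trimmed-down version of the response above:
flow = {
    "pageGenerators": [{"_exportId": "genA"}],
    "pageProcessors": [
        {"type": "export", "_exportId": "expA"},
        {"type": "import", "_importId": "impA"},
    ],
}
print(list_flow_steps(flow))  # [('export', 'genA'), ('export', 'expA'), ('import', 'impA')]
```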
Create a flow

```
POST /v1/flows HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token
```
Replace a connection at the flow level

Send a JSON body payload that replaces existing connection details with new ones.

```
PUT /v1/flows/55e••••••••••••••••••367/replaceConnection HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token
```
Update a specific flow

Use the GET /flows/<_id> endpoint to retrieve the JSON representation of the flow you wish to modify. In the retrieved JSON, locate the existing import you want to replace and update it with the new import information. Then send the modified JSON as the body of a request to the PUT /flows/<_id> endpoint.
```
PUT /v1/flows/55e••••••••••••••••••367 HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token
```

```json
{
  "_id": "55e••••••••••••••••••367",
  "lastModified": "2021-12-14T12:39:24.138Z",
  "name": "Send list of all integrations",
  "disabled": true,
  "_integrationId": "61b••••••••••••••••a6c",
  "pageGenerators": [
    { "_exportId": "61b•••••••••••8adb", "skipRetries": false }
  ],
  "pageProcessors": [
    {
      "responseMapping": { "fields": [], "lists": [] },
      "type": "import",
      "_importId": "61b8••••••••••8ae0"
    }
  ],
  "createdAt": "2021-12-14T12:39:24.021Z",
  "free": false,
  "_templateId": "602c••••••••••••••7eb8",
  "_sourceId": "603•••••••••••••83d",
  "autoResolveMatchingTraceKeys": true
}
```
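The GET-modify-PUT sequence can also be scripted. The helper below is a sketch (not an official utility) that swaps one import ID inside the retrieved flow JSON before you PUT it back:

```python
import copy

def replace_import(flow: dict, old_import_id: str, new_import_id: str) -> dict:
    """Return a deep copy of the flow JSON with matching import steps re-pointed."""
    updated = copy.deepcopy(flow)
    for step in updated.get("pageProcessors", []):
        if step.get("type") == "import" and step.get("_importId") == old_import_id:
            step["_importId"] = new_import_id
    return updated

# Minimal flow shaped like the PUT body above (IDs are hypothetical):
flow = {"pageProcessors": [{"type": "import", "_importId": "old123",
                            "responseMapping": {"fields": [], "lists": []}}]}
updated = replace_import(flow, "old123", "new456")
print(updated["pageProcessors"][0]["_importId"])  # new456
```

The deep copy keeps the original response intact, which is handy if the PUT fails and you need to diff or retry.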
Run a specific flow on demand

```
POST /v1/flows/55e••••••••••••••••••367/run HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token
```
Clone a specific flow

```
POST /v1/flows/55e••••••••••••••••••367/clone HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token
```
Send a JSON body payload that maps existing connection IDs to the connection IDs in the new flow. You can map to the same connection ID if you don’t want to change it. The connectionMap must map every connection in the existing flow. For example, if you have 5 connections in your flow, you need to map each connection, even if the connections don’t all change.
```json
{
  "connectionMap": {
    "<existingConnectionId #1>": "<newConnectionId #1>",
    "<existingConnectionId #2>": "<newConnectionId #2>",
    ...
  },
  "sandbox": false,
  "name": "<Name Flow Here>",
  "_integrationId": "<integration id where the cloned flow should go>",
  "_flowGroupingId": "<flow grouping id where the cloned flow should go - optional>"
}
```
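Because the connectionMap must cover every connection in the source flow, it is worth validating the map before you POST. A minimal sketch (the helper name and argument names are hypothetical):

```python
def build_clone_payload(name: str, integration_id: str,
                        connection_map: dict, source_connection_ids: list) -> dict:
    """Build the clone body, failing fast if any source connection is unmapped."""
    missing = set(source_connection_ids) - set(connection_map)
    if missing:
        raise ValueError(f"connectionMap is missing entries for: {sorted(missing)}")
    return {
        "connectionMap": connection_map,
        "sandbox": False,
        "name": name,
        "_integrationId": integration_id,
    }

payload = build_clone_payload(
    "Cloned flow", "intg1",
    {"connA": "connA", "connB": "connNewB"},  # unchanged IDs map to themselves
    ["connA", "connB"],
)
print(sorted(payload))  # ['_integrationId', 'connectionMap', 'name', 'sandbox']
```

Serialize the returned dict with `json.dumps` as the body of the POST /flows/<_id>/clone request.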
Download a flow

```
POST /v1/flows/55e••••••••••••••••••367/template HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token
```
Comments
Hello,
What are the parameters (both required and optional) for `/flows/<_id>/clone` and response?
Thank you!
Pablo Gonzalez here you go. We'll get docs updated as well.
Hi Tyler Lamparter,
Awesome, thank you for the info. I saw the connectionMap and was wondering why it was being written twice and decided to try this another time. I looked at it again today and it makes sense! It's literally mapping the amount of connections from the source flow to the destination flow. I was able to successfully clone my flow via API, thank you again for your quick response!
- Pablo G.
Hi Tyler Lamparter. Am I able to send non date parameters using the Flow Run Endpoint?
Example:
I know the endDate and startDate map to currentExportDateTime and lastExportDateTime, respectively, but I'm not able to use additional parameters such as the sku in the example.
This behavior would be helpful for APIs that don't filter by dates.
Using the Run Export API, I'm able to send that SKU parameter, but not through the Flow API.
Emil Daniel Ordoñez Avila unfortunately not at this time. It is currently slated as an enhancement for summer/fall of this year.
Tyler Lamparter being able to run another flow and pass in custom data for it to reference would be HUGE for me. Is there any way to work around this by having a flow change a value in Integration settings?
Geoff Tipley yes you can update the integration settings prior to calling the flow to run. The only downside I see of doing this is that a flow can only have 1 job run at a time and I assume you need to run this for dozens, hundreds, or thousands of skus right? I'm curious what your workflow is here.
Could another solution be to have a webhook on flow #2, have flow #1 post data to that webhook, then have a lookup step that runs the report for that sku? I'm not sure how many results are expected for that one lookup.
My project is syncing data between two Salesforce environments. If I'm syncing an Opportunity and it has an Account that hasn't been synced yet, I want to sync that Account and then continue with my Opportunity flow. My current fix should avoid the issue you named where only one instance of a flow can run at a time. I'm going to include the three components of my Account flow inside the Opportunity flow. Using the same components should allow it to be maintainable since I shouldn't have to make the same changes to two different flows when revising Account flow components.
For a webhook approach, what would that take, incidentally? I imagine you create the Webhook listener in Celigo as a second source to the Account flow with Secret URL verification for instance, save that Public URL, and in the Account flow make an HTTP component to post data to that URL? I'm not sure it would help anything in this case because I believe the Account flow would run asynchronously and have no way of responding to the Opportunity flow with the IDs that were created.
Geoff Tipley yeah it sounds like webhook process wouldn't work in your case. You're basically wanting your opportunity syncing flow (flow #1) to call another flow for account syncing, wait on that flow to complete, then move on to create the opportunity record in flow 1 so that you only have a single place to maintain account mappings and logic, right?
Reusing the import/lookup steps like you are doing will work, but it would be nice to visually show that reuse differently (i.e. sub flow).
Adding another flow run option here in case anyone needs it. Since IO now has the ability to run single exports on a flow, the /run object can include a payload of the exportIds that you want to run. Use the same /flows/<_id>/run as above, but use the below body. Use your own exportIds of course.
Another one we're working to add into the docs is replacing a connection at the flow level. To do so, use the following endpoint with the following payload.
Tyler Lamparter Is it possible to perform a "Download Flow" via API call, similar to how the action is available in the Integration Flows list on the dashboard? If not, would the only solution be to make calls for all the connections, exports, imports, flows, and scripts separately? Thank you!
Bijan Samiee yes there is one available. It returns a pre-signed url that you can then use to download.
Request:
Response:
You can also download an integration via:
In this comment (https://docs.celigo.com/hc/en-us/articles/7707985934363/comments/15350161055899) from Tyler Lamparter it was mentioned a feature to allow non-start/end date fields to be sent to the flow/<id>/run end-point.
Was this ever completed? If so, how is it done?
The only other way I see to do this is using a webhook, or creating a My API that calls the flow's exports/imports, but it would be silly to create a flow only to have to rewrite it in JavaScript in an API just to pass in custom data.
Webhooks work but you don't get a JobID or anything returned.
Josh Braun, no this was not done. You'd need to:
Tyler Lamparter
Thanks for the response.
Ultimately the goal is to send IDs for the flow's export to use. For example, we'd send IDs 1, 2, and 3, and the export would use them to find records 1, 2, and 3 to export, instead of all records between a start and end date or since the last run.
Josh Braun curious why you need a response back? If a webhook is the source, then you do some lookups, you could send that to where you need? Maybe I don't fully understand your use case yet.
Tyler Lamparter
Our use case is that we have UI in our system that the user would use to select the records to send to whatever system. We'd call the celigo endpoint (MyAPI or webhook) to pass the IDs in to begin the process for the selected records and we would be able to provide status updates/errors reported from the Flow for those records.
Josh Braun that makes sense. For this use case, I assume you would prefer to get the status back in real time rather than have it processed asynchronously? A MyAPI is the only tool we currently have that lets you respond in real time with the outcome. If you used a webhook or passed settings to a flow, those would be asynchronous, and you would need the flow itself to write the status back to your application.