Flow API endpoints

Download Postman collection

Understand and create custom flows

Flow API endpoints

| Relative URI | Method | Success code | Description |
| --- | --- | --- | --- |
| /flows | GET | 200 | Get all flows. |
| /flows | POST | 201 | Create a flow. |
| /flows/<_id> | GET | 200 | Get a specific flow. |
| /flows/<_id> | PUT | 200 | Update a specific flow. |
| /flows/<_id> | PATCH | 204 | Update part of a specific flow. |
| /flows/<_id> | DELETE | 204 | Delete a specific flow. |
| /flows/<_id>/clone | POST | 201 | Clone a specific flow. |
| /flows/<_id>/run | POST | 201 | Run a specific flow on demand. |
| /flows/<_id>/replaceConnection | PUT | 200 | Replace a connection at the flow level. |
| /flows/<_id>/audit | GET | 200 | Get the audit log for a specific flow. |
| /flows/<_id>/template | GET | 200 | Download a flow. |
| /flows/<_id>/dependencies | GET | 200 | Get all resources using or used by this flow. |
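All of the examples below share the same host and bearer-token authorization header. As an illustrative Python sketch (the `flow_request` helper name is ours, not part of any SDK; the token is a placeholder), a request can be assembled like this:

```python
from urllib.request import Request

# Base URL taken from the Host header in the examples below.
API_BASE = "https://api.integrator.io/v1"

def flow_request(path, token, method="GET", body=None):
    """Build (but do not send) an authenticated Flow API request."""
    req = Request(API_BASE + path, data=body, method=method)
    req.add_header("Authorization", f"Bearer {token}")
    if body is not None:
        req.add_header("Content-Type", "application/json")
    return req

req = flow_request("/flows", "my_api_token")
```

Pass the returned request to `urllib.request.urlopen` (or use any HTTP client you prefer) to actually send it.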

The following fields can be updated using PATCH:

| Field | Sub-field | Data type |
| --- | --- | --- |
| name | | string |
| description | | string |
| schedule | | string |
| timezone | | string |
| disabled | | boolean |
| runPageGeneratorsInParallel | | boolean |
| pageGenerators[*] | schedule | string |
| pageGenerators[*] | skipRetries | boolean |
| routers[*] | name | string |
| routers[*] | branches[*].name | string |
| routers[*] | branches[*].description | string |
| aliases | | mixed |
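The article does not show a full PATCH payload, so the exact body shape should be confirmed against your account. As a hedged Python sketch (the helper name is hypothetical; the allow-list is copied from the table above, sub-fields omitted for brevity), you can at least guard against sending fields the endpoint does not accept:

```python
import json

# Top-level fields the table above lists as PATCH-able.
PATCHABLE = {"name", "description", "schedule", "timezone",
             "disabled", "runPageGeneratorsInParallel", "aliases"}

def build_patch_body(**changes):
    """Serialize only allow-listed fields into a PATCH body."""
    unknown = set(changes) - PATCHABLE
    if unknown:
        raise ValueError(f"not PATCH-able via this endpoint: {sorted(unknown)}")
    return json.dumps(changes)

body = build_patch_body(disabled=True, timezone="America/Los_Angeles")
```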

Flow API examples

Get a specific simple flow (with just a single export and import)

GET /v1/flows/55e••••••••••••••••••367 HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token

Sample response

{
  "_id":"55e••••••••••••••••••367",
  "lastModified":"2017-06-19T21:27:09.945Z",
  "name":"Send GitHub Events to Slack",
  "disabled":false,
  "timezone":"America/Los_Angeles",
  "_exportId":"55e••••••••••••••••••366",
  "_importId":"55e••••••••••••••••••d14",
  "_integrationId":"58f••••••••••••••••••1bc",
  "skipRetries":false,
  "createdAt":"2017-06-19T21:27:09.900Z"
}

Get a specific flow (with multiple exports and imports)

GET /v1/flows/598••••••••••••••••••d9a HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token

Sample response

{
  "_id":"598••••••••••••••••••d9a",
  "lastModified":"2017-08-19T17:04:54.005Z",
  "name":"Update Usage Stats for all Trialers",
  "disabled":false,
  "schedule":"? 0 2 ? * *",
  "timezone":"America/Los_Angeles",
  "_integrationId":"593••••••••••••••••••74e",
  "skipRetries":false,
  "pageProcessors":[
    {
      "type":"export",
      "_exportId":"598••••••••••••••••••428",
      "responseMapping":{
        "lists":[],
        "fields":[
          {
            "extract":"data",
            "generate":"numConnections"
          }
        ]
      }
    },
    {
      "type":"export",
      "_exportId":"598••••••••••••••••••d32",
      "responseMapping":{
        "lists":[],
        "fields":[
          {
            "extract":"data",
            "generate":"dlSuccessAndErrors"
          }
        ]
      }
    },
    {
      "type":"export",
      "_exportId":"598••••••••••••••••••6c6",
      "responseMapping":{
        "lists":[],
        "fields":[
          {
            "extract":"data",
            "generate":"lastSignIn"
          }
        ]
      }
    },
    {
      "type":"export",
      "_exportId":"598••••••••••••••••••638",
      "responseMapping":{
        "lists":[],
        "fields":[
          {
            "extract":"data",
            "generate":"numDataLoaders"
          }
        ]
      }
    },
    {
      "type":"export",
      "_exportId":"598e0c",
      "responseMapping":{
        "lists":[],
        "fields":[
          {
            "extract":"data",
            "generate":"numFlows"
          }
        ]
      }
    },
    {
      "type":"export",
      "_exportId":"598••••••••••••••••••54b",
      "responseMapping":{
        "lists":[],
        "fields":[
          {
             "extract":"data",
             "generate":"numPasswordChanges"
          }
        ]
      } 
    },
    {
      "type":"export",
      "_exportId":"598••••••••••••••••••e22",
      "responseMapping":{
        "lists":[],
        "fields":[
          { 
            "extract":"data",
            "generate":"numPasswordResets"
          }
        ]
      }
    },
    {
      "type":"export",
      "_exportId":"598e29",
      "responseMapping":{
        "lists":[],
        "fields":[
          {
            "extract":"data",
            "generate":"numSignIns"
          }
        ]
      }
    },
    {
      "type":"export",
      "_exportId":"598e37",
      "responseMapping":{
        "lists":[],
        "fields":[
          {
            "extract":"data",
            "generate":"successAndErrors"
          }
        ]
      }
    },
    {
      "type":"import",
      "_importId":"598••••••••••••••••••615",
      "proceedOnFailure":true,
      "responseMapping":{
        "lists":[],
        "fields":[]
      }
    },
    {
      "type":"import",
      "_importId":"598••••••••••••••••••9f9",
      "responseMapping":{
        "lists":[],
        "fields":[]
      }
    }
  ],
  "pageGenerators":[
    {
      "_exportId":"598••••••••••••••••••286",
      "_id":"599••••••••••••••••••b36"
    },
    {
      "_exportId":"598••••••••••••••••••6dd",
      "_id":"599••••••••••••••••••b35"
    }
  ],
  "createdAt":"2017-08-01T20:43:42.156Z"
}
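A response like this can get large. As a quick sketch (plain Python; the `summarize_flow` helper name is ours), you can summarize which fields each page processor's response mapping generates:

```python
def summarize_flow(flow):
    """List each pageProcessor's type plus the fields its response mapping generates."""
    steps = []
    for pp in flow.get("pageProcessors", []):
        generates = [f["generate"]
                     for f in pp.get("responseMapping", {}).get("fields", [])]
        steps.append((pp["type"], generates))
    return steps
```

Feed it the parsed JSON from GET /flows/<_id> to get a compact view of the flow's processing steps.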

Run a delta flow and specify the delta start date

POST /v1/flows/55e••••••••••••••••••367/run HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token

Sample body

Send a JSON body payload with the POST /flows/<_id>/run request, using the following syntax. In this example, the delta startDate is January 1, 2019.

{"export": {"startDate": "2019-01-01T00:00:00.000"}}
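If you script delta runs, the startDate string can be built rather than hand-typed. A minimal Python sketch (the helper name is illustrative; the timestamp format simply mirrors the example above):

```python
from datetime import datetime
import json

def delta_run_body(start):
    """Format a delta startDate the way the example above does
    (millisecond precision, no timezone suffix)."""
    return json.dumps(
        {"export": {"startDate": start.strftime("%Y-%m-%dT%H:%M:%S.000")}})

body = delta_run_body(datetime(2019, 1, 1))
```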

Replace a connection at the flow level

PUT /v1/flows/55e••••••••••••••••••367/replaceConnection HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token

Sample body

Send a JSON body payload that replaces existing connection details with new ones.

{
  "_connectionId": "<_existingConnectionId>",
  "_newConnectionId": "<_newConnectionId>"
}

Replace an import at the flow level

Use the GET /flows/<_id> endpoint to retrieve the JSON representation of the flow you wish to modify. In the retrieved JSON, locate the existing import you want to replace and update it with the new import information. Send the modified JSON as the body of a request to the PUT /flows/<_id> endpoint.

PUT /v1/flows/55e••••••••••••••••••367 HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token

Sample body

{
    "_id": "55e••••••••••••••••••367",
    "lastModified": "2021-12-14T12:39:24.138Z",
    "name": "Send list of all integrations",
    "disabled": true,
    "_integrationId": "61b••••••••••••••••a6c",
    "pageGenerators": [
        {
            "_exportId": "61b•••••••••••8adb",
            "skipRetries": false
        }
    ],
    "pageProcessors": [
        {
            "responseMapping": {
                "fields": [],
                "lists": []
            },
            "type": "import",
            "_importId": "61b8••••••••••8ae0"
        }
    ],
    "createdAt": "2021-12-14T12:39:24.021Z",
    "free": false,
    "_templateId": "602c••••••••••••••7eb8",
    "_sourceId": "603•••••••••••••83d",
    "autoResolveMatchingTraceKeys": true
}
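Since PUT replaces the whole flow document, a small helper that swaps just the _importId in the retrieved JSON leaves everything else intact. A Python sketch (the helper name is hypothetical; it operates on the parsed GET response before you PUT it back):

```python
def replace_import(flow, old_id, new_id):
    """Swap one import step's _importId in a flow document fetched via GET /flows/<_id>."""
    for pp in flow.get("pageProcessors", []):
        if pp.get("type") == "import" and pp.get("_importId") == old_id:
            pp["_importId"] = new_id
            return flow
    raise ValueError(f"no import step with _importId {old_id!r}")
```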

Run multiple exports from the same flow

POST /v1/flows/55e••••••••••••••••••367/run HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token

Sample body

Send a JSON body payload that runs the specified exports in a flow. If no JSON body is provided, all exports on the flow will run.

{
  "_exportIds": [
    "655bd•••••••••••••1bf2",
    "54gfa••••••••••••7bf94"
  ]
}

Clone a specific flow and map the connection IDs

POST /v1/flows/55e••••••••••••••••••367/clone HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token

Sample body

Send a JSON body payload that maps existing connection IDs to the connection IDs in the new flow. You can map to the same connection ID if you don’t want to change it. The connectionMap must map every connection in the existing flow. For example, if you have 5 connections in your flow, you need to map each connection, even if the connections don’t all change.

{
  "connectionMap": {
    "<existingConnectionId #1>": "<newConnectionId #1>",
    "<existingConnectionId #2>": "<newConnectionId #2>",
    …
  },
  "sandbox": false,
  "name": "<Name Flow Here>",
  "_integrationId": "<integration id where the cloned flow should go>",
  "_flowGroupingId": "<flow grouping id where the cloned flow should go - optional>"
}
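Because every existing connection must appear in connectionMap, it can help to build the map programmatically, defaulting unchanged connections to themselves. A Python sketch (helper name is ours):

```python
def build_connection_map(existing_ids, replacements=None):
    """Map every existing connection ID; IDs not being replaced map to themselves."""
    replacements = replacements or {}
    return {cid: replacements.get(cid, cid) for cid in existing_ids}

# Three connections in the source flow; only "c2" changes in the clone.
cmap = build_connection_map(["c1", "c2", "c3"], {"c2": "c9"})
```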

Download a flow

GET /v1/flows/55e••••••••••••••••••367/template HTTP/1.1
Host: api.integrator.io
Authorization: Bearer my_api_token

Sample response

{
  "signedURL": "https://integrator-templates.s3.amazonaws.com/55e••••••••••••••••••367.zip?AWSAccessKeyId=XXXXXXX&Expires=1707272837&Signature=XXXXXXXXX",
  "key": "<flowId>.zip"
}

Comments

20 comments
  • Hello,

    What are the parameters (both required and optional) for `/flows/<_id>/clone` and response?

    Thank you!

  • Pablo Gonzalez here you go. We'll get docs updated as well.

    {
      "connectionMap": {
          "<current connection id>": "<new connection id>",
          "<current connection id>": "<new connection id>"
      },
      "sandbox": false,
      "name": "Name Flow Here",
      "_integrationId": "<integration id where the cloned flow should go>",
      "_flowGroupingId": "<flow group id where the cloned flow should go - optional>"
    }
  • Hi Tyler Lamparter,

    Awesome, thank you for the info. I saw the connectionMap and was wondering why it was being written twice and decided to try this another time. I looked at it again today and it makes sense! It's literally mapping the amount of connections from the source flow to the destination flow. I was able to successfully clone my flow via API, thank you again for your quick response! 

    - Pablo G.

  • Hi Tyler Lamparter. Am I able to send non-date parameters using the flow run endpoint?

    Example: 

    {
       "export":{
          "sku":"XX-XXXX-SAMPLE-SKU",
          "startDate":"2023-01-05T00:00:00.000",
          "endDate":"2023-01-06T00:00:00.000"
        }
    }

    I know the endDate and startDate map to currentExportDateTime and lastExportDateTime respectively, but I'm not able to use additional parameters such as the sku in the example.

    This behavior would be helpful for APIs that are not based on Dates for filtering.

    Using the Run Export API, I'm able to send that SKU parameter, but not on the Flow API.
  • Emil Daniel Ordoñez Avila unfortunately not at this time. It is currently slated as an enhancement for summer/fall of this year.

  • Tyler Lamparter being able to run another flow and pass in custom data for it to reference would be HUGE for me. Is there any way to work around this by having a flow change a value in Integration settings?

  • Geoff Tipley yes you can update the integration settings prior to calling the flow to run. The only downside I see of doing this is that a flow can only have 1 job run at a time and I assume you need to run this for dozens, hundreds, or thousands of skus right? I'm curious what your workflow is here.

    Could another solution be to have a webhook on flow #2, have flow #1 post data to that webhook, then have a lookup step that runs the report for that sku? I'm not sure how many results are expected for that one lookup.

  • My project is syncing data between two Salesforce environments. If I'm syncing an Opportunity and it has an Account that hasn't been synced yet, I want to sync that Account and then continue with my Opportunity flow. My current fix should avoid the issue you named where only one instance of a flow can run at a time. I'm going to include the three components of my Account flow inside the Opportunity flow. Using the same components should allow it to be maintainable since I shouldn't have to make the same changes to two different flows when revising Account flow components.

    For a webhook approach, what would that take, incidentally? I imagine you create the Webhook listener in Celigo as a second source to the Account flow with Secret URL verification for instance, save that Public URL, and in the Account flow make an HTTP component to post data to that URL? I'm not sure it would help anything in this case because I believe the Account flow would run asynchronously and have no way of responding to the Opportunity flow with the IDs that were created.

  • Geoff Tipley yeah it sounds like webhook process wouldn't work in your case. You're basically wanting your opportunity syncing flow (flow #1) to call another flow for account syncing, wait on that flow to complete, then move on to create the opportunity record in flow 1 so that you only have a single place to maintain account mappings and logic, right?


    Reusing the import/lookup steps like you are doing will work, but it would be nice to visually show that reuse differently (i.e. sub flow).

  • Adding another flow run option here in case anyone needs it. Since IO now has the ability to run single exports on a flow, the /run object can include a payload of the exportIds that you want to run. Use the same /flows/<_id>/run as above, but use the below body. Use your own exportIds of course.

    {
       "_exportIds": [
           "655bd************1bf2",
           "655bdb1***********7bf94"
       ]
    }

  • Another one we're working to add into the docs is replacing a connection at the flow level. To do so, use the following endpoint with the following payload.

    PUT /v1/flows/<_flowId>/replaceConnection

    {
      "_connectionId": "<_existingConnectionId>",
      "_newConnectionId": "<_newConnectionId>"
    }

  • Tyler Lamparter Is it possible to perform a "Download Flow" via API call, similar to how the action is available in the Integration Flows list on the dashboard? If not, would the only solution be to make calls for all the connections, exports, imports, flows, and scripts separately? Thank you!

  • Bijan Samiee yes there is one available. It returns a pre-signed url that you can then use to download.

    Request:

    GET https://api.integrator.io/v1/flows/<flowId>/template


    Response:

    {
    "signedURL":"https://integrator-templates.s3.amazonaws.com/<flowId>.zip?AWSAccessKeyId=XXXXXXX&Expires=1707272837&Signature=XXXXXXXXX",
    "key":"<flowId>.zip"
    }
  • You can also download an integration via:

    GET https://api.integrator.io/v1/integrations/<integrationId>/template
  • In this comment (https://docs.celigo.com/hc/en-us/articles/7707985934363/comments/15350161055899) from Tyler Lamparter it was mentioned a feature to allow non-start/end date fields to be sent to the flow/<id>/run end-point. 

    Was this ever completed? If so, how is it done? 

    The only other way I see being able to do this is using a webhook or creating a My API that calls the flow's exports/imports but that would be silly to create a flow only to have to write it out in JavaScript in an API to pass in custom data. 

    Webhooks work but you don't get a JobID or anything returned. 

  • Josh Braun, no this was not done. You'd need to:

    1. Use a webhook as a source of a flow and then a lookup step to pass the date params in. This would be fine as long as the data returned is less than 5 MB.
    2. Use a MyAPI as a source where the JavaScript handles it and you set up a paging mechanism on the export and MyAPI.
    3. Update the settings on the flow and reference the flow settings within your export. This would be limited to 1 set of start/end dates per flow run so would be less ideal, but is the same behavior current flows have where it can only have 1 job running at a time.
  • Tyler Lamparter

    Thanks for the response. 

    1. Webhooks do work, however, nothing is returned from the request. Additional lookups would have to be done to find the job and get status/errors. (At least my webhook tests showed no response body.)
    2. Could you explain the MyAPI and paging solution a bit more? My understanding of the MyAPI solution is that I would have to recreate the Flow I already created but in the script file. So if my Flow has 1 export and 5 lookups/imports I need to script all those steps. The Flow would effectively only be used for reference but all the components are used in the script.
    3. Settings could work, or at least could work in the future. But if they are limited only to a start/end date and don't allow other settings (at least to be passed in externally) I'm not sure that would work for us here. 

    Ultimately the goal is to send IDs for the Flow's export to use. For example sending IDs 1, 2, and 3, and the export uses those IDs to find the records 1, 2, and 3 to export instead of all records between a start and end date or a since-last-run date. 

  • Josh Braun curious why you need a response back? If a webhook is the source, then you do some lookups, you could send that to where you need? Maybe I don't fully understand your use case yet.

  • Tyler Lamparter

    Our use case is that we have UI in our system that the user would use to select the records to send to whatever system. We'd call the celigo endpoint (MyAPI or webhook) to pass the IDs in to begin the process for the selected records and we would be able to provide status updates/errors reported from the Flow for those records. 

  • Josh Braun that makes sense. For this use case, I assume you would prefer to get the status back in real time vs being asynchronously processed? A MyAPI is the only tool we currently have that would allow you to respond back in real time with the outcome. If you used a webhook or passed settings to a flow, those would be asynchronous and you would need the flow itself to write back the status to your application.

