JavaScript hooks

Hooks are your place to write custom code that runs at different stages during the execution of your flow to modify the flow's functionality. Hooks are typically invoked individually for each page of data. A page is a set of records processed in smaller batches to improve performance. You can configure the number of records per page via the Page size property in the Advanced section of your export or listener.

Warning

Any hook that calls an export, lookup, or import must be in an integration tile rather than a standalone flow. You cannot call a resource inside a hook if your hook is in a standalone flow. To call a resource inside a hook, create a new integration tile and then add your flow to the tile.


preSavePage

The preSavePage hook is invoked on a page of records before the page is sent to subsequent steps in your flow. This hook can be used to add, update, or delete records. This hook is a great place to execute logic on batches of records at the same time. For example:

  • Filter or group records in batches

  • Apply calculations in batches

  • Perform complex data transformations in batches

  • Validate data and return errors for records

Function stub

/*
* preSavePageFunction stub:
*
* The name of the function can be changed to anything you like.
*
* The function will be passed one 'options' argument that has the following fields:
*   'data' - an array of records representing one page of data. A record can be an object {} or array [] depending on the data source.
*   'files' - file exports only. files[i] contains source file metadata for data[i]. i.e. files[i].fileMeta.fileName.
*   'errors' - an array of errors where each error has the structure {code: '', message: '', source: '', retryDataKey: ''}.
*   'retryData' - a dictionary object containing the retry data for all errors: {retryDataKey: { data: <record>, stage: '', traceKey: ''}}.
*   '_exportId' - the _exportId currently running.
*   '_connectionId' - the _connectionId currently running.
*   '_flowId' - the _flowId currently running.
*   '_integrationId' - the _integrationId currently running.
*   'pageIndex' - the 0-based index of the current page within the batch export currently running.
*   'lastExportDateTime' - delta exports only.
*   'currentExportDateTime' - delta exports only.
*   'settings' - all custom settings in scope for the export currently running.
*   'job' - the job currently running.
*   'testMode' - Boolean flag that executes script only on test mode and preview/send actions.
*
* The function needs to return an object that has the following fields:
*   'data' - your modified data.
*   'errors' - your modified errors.
*   'abort' - instruct the batch export currently running to stop generating new pages of data.
*   'newErrorsAndRetryData' - return brand new errors linked to retry data: [{retryData: <record>, errors: [<error>]}].
* Throwing an exception will signal a fatal error and stop the flow.
*/
function preSavePage (options) {
  // sample code that simply passes on what has been exported
  return {
    data: options.data,
    errors: options.errors,
    abort: false,
    newErrorsAndRetryData: []
  }
}

Example

This preSavePage hook function replaces a potentially large sub-object in each data record with an error message if its size exceeds 500,000 characters (approximately 1 MB).

function swapHugeSubObjectWithInlineErrorMsg (options) {
  options.data.forEach(function(d) {
    let size = JSON.stringify(d.potentiallyBigSubObject).length
    if (size > 500000) {
      d.potentiallyBigSubObject = {
        errorMessage: "potentiallyBigSubObject removed: size exceeded 500,000 characters (~1 MB)"
      }
    }
  })
  return {
    data: options.data,
    errors: options.errors,
    abort: false
  }
}
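
The batch-filtering pattern from the list above can be sketched as a preSavePage hook that drops records before they reach subsequent steps. This is a minimal sketch; the `status` field and the `'active'` value are hypothetical and depend on your source data:

```javascript
function filterActiveRecords(options) {
  // Keep only records whose hypothetical 'status' field is 'active';
  // filtered-out records are simply not passed to subsequent flow steps.
  const data = options.data.filter(record => record.status === 'active');
  return {
    data: data,
    errors: options.errors,
    abort: false,
    newErrorsAndRetryData: []
  };
}
```

Note that filtering here changes the page size seen by later steps, so response mappings and downstream lookups only run for the records that survive the filter.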

Other examples:

Restructure payload with a preSavePage hook

preMap

The preMap hook is invoked on a page of records before the records are mapped from source to destination structures. This hook can be used to validate, update, or ignore records before mapping rules are run. Changes made to records in this hook are localized and will not get passed along to subsequent steps in the flow. This hook is a great place to execute logic on batches of records to optimize visual field mapping. For example:

  • Reformat fields or object structures to avoid complex mappings

  • Perform calculations on lists to avoid tedious mapping expressions

  • Load the destination record to pre-populate data needed by mapping logic

Function stub

/*
* preMapFunction stub:
*
* The name of the function can be changed to anything you like.
*
* The function will be passed one 'options' argument that has the following fields:
*   'data' - an array of records representing the page of data before it has been mapped. A record can be an object {} or array [] depending on the data source.
*   '_importId' - the _importId currently running.
*   '_connectionId' - the _connectionId currently running.
*   '_flowId' - the _flowId currently running.
*   '_integrationId' - the _integrationId currently running.
*   'settings' - all custom settings in scope for the import currently running.
*   'job' - the job currently running.
*   'testMode' - Boolean flag that executes script only on test mode and preview/send actions.
*
* The function needs to return an array, and the length MUST match the options.data array length.
* Each element in the array represents the actions that should be taken on the record at that index.
* Each element in the array should have the following fields:
*   'data' - the modified/unmodified record that should be passed along for processing.
*   'errors' - used to report one or more errors for the specific record. Each error must have the following structure: {code: '', message: '', source: ''}
* Returning an empty object {} for a specific record will indicate that the record should be ignored.
* Returning both 'data' and 'errors' for a specific record will indicate that the record should be processed but errors should also be logged.
* Throwing an exception will fail the entire page of records.
*/
function preMap (options) {
  return options.data.map((d) => {
    return {
      data: d
    }
  })
}

Example

This function will calculate and set the total field on each record in the data array, before the mapping has been applied.

function calculateTotalsUsingPreMap(options) {
  return options.data.map(record => {
    // Initialize total to 0 for each record
    let total = 0;

    // Check if the record has items and iterate over them
    if (record.items && Array.isArray(record.items)) {
      record.items.forEach(item => {
        // Calculate total based on item properties, e.g., quantity and price
        total += (item.quantity || 0) * (item.price || 0);
      });
    }

    // Set the total on the record
    record.total = total;

    // Return the modified record
    return {
      data: record
    };
  });
}
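
The stub's contract also allows records to be ignored: returning an empty object {} for a given index tells the flow to skip that record before mapping runs. A minimal sketch, assuming a hypothetical required `email` field:

```javascript
function ignoreRecordsMissingEmail(options) {
  return options.data.map(record => {
    // Returning an empty object {} ignores this record entirely
    if (!record.email) {
      return {};
    }
    // Otherwise pass the record along unchanged
    return { data: record };
  });
}
```

The returned array stays the same length as `options.data`; the empty object at an index is what signals the skip, not a missing element.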

postMap

The postMap hook is invoked on a page of records after the records are mapped from source to destination structures. This hook can be used to validate, update, or ignore records before they are submitted to the destination application. Changes made to source records in this hook will persist only for the duration of the import and will not carry over to downstream applications in your flow. This hook is a great place to execute logic on batches of records to optimize the final payload-building experience in Flow builder. For example:

  • Reformat fields or object structures to avoid complex handlebars expressions

  • Perform calculations on lists to avoid tedious handlebars expressions

  • Load the destination record to dynamically change the fields being submitted

Function stub

/*
* postMapFunction stub:
*
* The name of the function can be changed to anything you like.
*
* The function will be passed one argument 'options' that has the following fields:
*   'preMapData' - an array of records representing the page of data before it was mapped. A record can be an object {} or array [] depending on the data source.
*   'postMapData' - an array of records representing the page of data after it was mapped. A record can be an object {} or array [] depending on the data source.
*   '_importId' - the _importId currently running.
*   '_connectionId' - the _connectionId currently running.
*   '_flowId' - the _flowId currently running.
*   '_integrationId' - the _integrationId currently running.
*   'settings' - all custom settings in scope for the import currently running.
*   'job' - the job currently running.
*   'testMode' - Boolean flag that executes script only on test mode and preview/send actions.
*
* The function needs to return an array, and the length MUST match the options.postMapData array length.
* Each element in the array represents the actions that should be taken on the record at that index.
* Each element in the array should have the following fields:
*   'data' - the modified/unmodified record that should be passed along for processing.
*   'errors' - used to report one or more errors for the specific record. Each error must have the following structure: {code: '', message: '', source: ''}
* Returning an empty object {} for a specific record will indicate that the record should be ignored.
* Returning both 'data' and 'errors' for a specific record will indicate that the record should be processed but errors should also be logged.
* Throwing an exception will fail the entire page of records.
*/
function postMap (options) {
  return options.postMapData.map((d) => {
    return {
      data: d
    }
  })
}

Example

This function will calculate and set the total field on each record in the postMapData array, after the mapping has been applied.

function calculateTotalsUsingPostMap(options) {
  return options.postMapData.map(record => {
    // Initialize total to 0 for each record
    let total = 0;

    // Check if the record has items and iterate over them
    if (record.items && Array.isArray(record.items)) {
      record.items.forEach(item => {
        // Calculate total based on item properties, e.g., quantity and price
        total += (item.quantity || 0) * (item.price || 0);
      });
    }

    // Set the total on the record
    record.total = total;

    // Return the modified record
    return {
      data: record
    };
  });
}
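
The stub also allows returning both 'data' and 'errors' for the same record, which processes the record while logging the error. A minimal sketch that flags a hypothetical negative `total` without dropping the record:

```javascript
function flagNegativeTotals(options) {
  return options.postMapData.map(record => {
    if (typeof record.total === 'number' && record.total < 0) {
      // Record is still submitted, but the error is logged alongside it
      return {
        data: record,
        errors: [{
          code: 'NEGATIVE_TOTAL',
          message: 'Record total is negative: ' + record.total,
          source: 'postMap hook'
        }]
      };
    }
    return { data: record };
  });
}
```
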

postSubmit

The postSubmit hook is invoked on a page of records after the records are submitted to the destination application. This hook can be used to enhance error messages and/or modify the response objects returned by the destination application. Changes made to the response object are localized and must be mapped back into the source record using a 'response mapping' to be visible in the flow. This hook is a great place to execute logic on batches of records to mitigate errors or to optimize response structures needed by subsequent steps in the flow. For example:

  • Ignore errors that have a specific code or message

  • Enhance confusing or cryptic error messages returned by external APIs

  • Delete sensitive data returned in the response _json

Function stub

/*
* postSubmitFunction stub:
*
* The name of the function can be changed to anything you like.
*
* The function will be passed one 'options' argument that has the following fields:
*  'preMapData' - an array of records representing the page of data before it was mapped. A record can be an object {} or array [] depending on the data source.
*  'postMapData' - an array of records representing the page of data after it was mapped. A record can be an object {} or array [] depending on the data source.
*  'responseData' - an array of responses for the page of data that was submitted to the import application. An individual response will have the following fields:
*       'statusCode' - 200 is a success. 422 is a data error. 403 means the connection went offline.
*       'errors' - [{code: '', message: '', source: ''}]
*       'ignored' - true if the record was filtered/skipped, false otherwise.
*       'id' - the id from the import application response.
*       '_json' - the complete response data from the import application.
*       'dataURI' - if possible, a URI for the data in the import application (populated only for errored records).
*  '_importId' - the _importId currently running.
*  '_connectionId' - the _connectionId currently running.
*  '_flowId' - the _flowId currently running.
*  '_integrationId' - the _integrationId currently running.
*  'settings' - all custom settings in scope for the import currently running.
*  'job' - the job currently running.
*  'testMode' - Boolean flag that executes script only on test mode and preview/send actions.
*
* The function needs to return the responseData array provided by options.responseData. The length of the responseData array MUST remain unchanged.  Elements within the responseData array can be modified to enhance error messages, modify the complete _json response data, etc...
* Throwing an exception will fail the entire page of records.
*/
function postSubmit (options) {
  return options.responseData
}

Example

This function enhances cryptic error messages returned by an external application.

function enhanceErrorMessages(options) {
  return options.responseData.map(response => {
    if (response.errors && response.errors.length > 0) {
      response.errors = response.errors.map(error => {
        // Example: Customize the error message for a specific error code
        if (error.code === 'SPECIFIC_ERROR_CODE') {
          error.message = 'A more descriptive and user-friendly error message.';
        }
        return error;
      });
    }
    return response;
  });
}
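
The bullet about deleting sensitive data returned in the response _json can be sketched in the same shape; the `taxId` field here is hypothetical and stands in for whatever sensitive field your destination application returns:

```javascript
function removeSensitiveResponseFields(options) {
  return options.responseData.map(response => {
    // Strip a hypothetical sensitive field from the raw response payload
    // so it never reaches response mappings or subsequent flow steps
    if (response._json && response._json.taxId) {
      delete response._json.taxId;
    }
    return response;
  });
}
```

Because the responseData array length must remain unchanged, fields are deleted in place rather than filtering responses out.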

Other examples:

Ignore specific errors in a postSubmit hook

postResponseMap

The postResponseMap hook is invoked on a page of records after response or results mapping. This hook is a great place to execute logic on batches of records to optimize the response/results data to be merged back into the source records. For example:

  • Perform calculations using the results of a lookup, and then discard the lookup data

  • Sort, group, and filter records returned by a lookup

  • Remove or restructure response fields returned in the _json response data object

Important

Important information about postResponseMap hook execution:

  • Due to a bug, postResponseMap hooks created prior to the 2023.9.1 release execute on a per-record basis, not per page of records.

  • If you clone a hook created prior to the 2023.9.1 release, the clone will also execute on a per-record basis, not per page of records.

  • If you want an older postResponseMap hook to run per page, create a new script and paste the old script's contents into it.

Function stub

/*
* postResponseMapFunction stub:
*
* The name of the function can be changed to anything you like.
*
* The function will be passed one 'options' argument that has the following fields:
*       'postResponseMapData' - an array of records representing the page of data after response mapping is completed. A record can be an object {} or array [] depending on the data source.
*       'responseData' - the array of responses for the page of data.  An individual response will have the following fields:
*       'statusCode' - 200 is a success.  422 is a data error.  403 means the connection went offline.
*       'errors' - [{code: '', message: '', source: ''}]
*       'ignored' - true if the record was filtered/skipped, false otherwise.
*       'data' - exports only.  the array of records returned by the export application.
*       'id' - imports only.  the id from the import application response.
*       '_json' - imports only.  the complete response data from the import application.
*       'dataURI' - imports only.  a URI for the data in the import application (populated only for errored records).
*       'oneToMany' - as configured on your export/import resource.
*       'pathToMany' - as configured on your export/import resource.
*       '_exportId' - the _exportId currently running.
*       '_importId' - the _importId currently running.
*       '_connectionId' - the _connectionId currently running.
*       '_flowId' - the _flowId currently running.
*       '_integrationId' - the _integrationId currently running.
*       'settings' - all custom settings in scope for the export/import currently running.
*       'job' - the job currently running.
*       'testMode' - Boolean flag that executes script only on test mode and preview/send actions.
*
* The function needs to return the postResponseMapData array provided by options.postResponseMapData. The length of postResponseMapData MUST remain unchanged. Elements within postResponseMapData can be changed however needed.
* Throwing an exception will signal a fatal error and fail the entire page of records.
*/

function postResponseMap (options) {
  return options.postResponseMapData
}

Example

This function segregates data from the lookup/export application into validGroups (where size is less than 150) and invalidGroups (where size is 150 or greater), and assigns these groups to their respective properties within each postResponseMapData record.

function segregateGroupsReturnedByLookup(options) {
  // Iterate over each record in postResponseMapData
  for (let i = 0; i < options.postResponseMapData.length; i++) {
    // Split data from export application into valid and invalid groups
    let validGroups = options.responseData[i].data.filter(d => d.size < 150);
    let invalidGroups = options.responseData[i].data.filter(d => d.size >= 150);

    // Assign the valid and invalid groups to their respective properties
    options.postResponseMapData[i].validGroups = validGroups;
    options.postResponseMapData[i].invalidGroups = invalidGroups;
  }
  return options.postResponseMapData;
}
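
Sorting records returned by a lookup (the second bullet above) can be sketched in the same pattern; the `name` field used as the sort key is hypothetical:

```javascript
function sortLookupResults(options) {
  options.postResponseMapData.forEach((record, i) => {
    const response = options.responseData[i];
    // 'data' is populated for exports/lookups; sort a copy so the
    // original response array is left untouched
    if (response && Array.isArray(response.data)) {
      record.sortedResults = [...response.data].sort((a, b) =>
        String(a.name).localeCompare(String(b.name)));
    }
  });
  return options.postResponseMapData;
}
```
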

postAggregate

The postAggregate hook is invoked after the final aggregated file is uploaded to the destination service. This hook is used to get information about the final file that was aggregated and uploaded to the external destination. This hook will not execute when the Skip aggregation field is set to true.

Function stub

/*
* postAggregateFunction stub:
*
* The name of the function can be changed to anything you like.
*
* The function will be passed one 'options' argument that has the following fields:
*   'bearerToken' - a one-time bearer token which can be used to invoke selected integrator.io API routes.
*   'postAggregateData' - a container object with the following fields:
*     'success' - true if data aggregation was successful, false otherwise.
*     '_json' - information about the aggregated data transfer.  For example, the name of the aggregated file on the FTP site.
*     'code' - error code if data aggregation failed.
*     'message' - error message if data aggregation failed.
*     'source' - error source if data aggregation failed.
*   '_importId' - the _importId currently running.
*   '_connectionId' - the _connectionId currently running.
*   '_flowId' - the _flowId currently running.
*   '_integrationId' - the _integrationId currently running.
*   '_parentIntegrationId' - the parent of the _integrationId currently running.
*   'settings' - all custom settings in scope for the import currently running.
*   'job' - the job currently running.
*   'testMode' - Boolean flag that executes script only on test mode and preview/send actions.
*
* The function doesn't need a return value.
* Throwing an exception will signal a fatal error.
*/
function postAggregate (options) {
}

Example

This function sends the id of a successfully uploaded file to a listener for subsequent processing.

import { exports as ioAPIExports } from 'integrator-api';

function sendSuccessfulFileIdToListener(options) {
  // Check if aggregation was successful and _json contains the 'id'
  if (options.postAggregateData.success && options.postAggregateData._json && options.postAggregateData._json.id) {
    // Extract the 'id' from the _json object
    const id = options.postAggregateData._json.id;

    // Send the 'id' to the listener
    ioAPIExports.run({_id: 'YOUR_LISTENER_ID', listenerData: [{id: id}]});
  }
}