Import data into MongoDB

Before you begin

Set up a connection to MongoDB.

Create an import

From the Tools menu, select Flow builder. For the Destination application, click MongoDB. Select your MongoDB connection from the connection list and click Next.

–– OR ––

From the Resources menu, select Imports. On the Imports page, click + New Import. From the application list, click MongoDB. Select your MongoDB connection, then add a name and description for your import.

One to many (required): Use this option for advanced cases where a parent record is passed through a flow but you actually need to process the child records contained within it. For example, if you export Sales Order records from NetSuite and import them into Salesforce as Opportunity and Opportunity Line Item records, you would import the Opportunity Line Item records using this option. In other words, One to many handles a single record that internally needs to create multiple records. This field cannot be used when importing a CSV file.

  • Path to many (required): If the records being processed are JSON objects, use this field to select or enter the JSON path to the child records (see the example below). This field does not need to be set for array/row-based data.
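
For instance, a sales order record exported as JSON might look like the sketch below (the field names are illustrative only, not required values); setting Path to many to items would cause each element of the items array to be imported as its own record.

    {
      "orderNumber": "SO-1001",
      "customerEmail": "jane@example.com",
      "items": [
        { "sku": "WIDGET-A", "quantity": 2 },
        { "sku": "WIDGET-B", "quantity": 1 }
      ]
    }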

How would you like your records imported?

Method (required): Enter the method to use for adding or updating documents in your MongoDB instance. For more information on the available methods, refer to the MongoDB documentation.

Insert many

Collection (required): Enter the name of the MongoDB collection in your database that you would like to insert documents into. For example: orders, items, users, customers, etc.

MongoDB document (optional): By default, integrator.io creates new documents in your MongoDB instance using the raw JSON data returned by the exports running in your flow (or the raw JSON data that you submitted via the integrator.io API). If you want to modify the data before it is added to MongoDB (for example, using handlebars to convert timestamps to Dates), enter a JSON string describing the expected document object structure in this field. The value of this field must be a valid JSON string describing a MongoDB document.
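
As a rough sketch, a template such as the following (the field names and handlebars placeholders are assumptions about your source data, not required values) would build each new document from selected fields of the exported record instead of inserting the raw data as-is. Whether extended JSON types such as $date are interpreted as BSON Dates depends on your configuration.

    {
      "orderId": "{{orderNumber}}",
      "email": "{{customerEmail}}",
      "status": "{{status}}",
      "createdAt": { "$date": "{{createdDate}}" }
    }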

Ignore existing records (optional): When importing new data, if the data being imported might already exist in the import application, or if you are worried that someone might accidentally re-import the same data twice, use this field to tell integrator.io to ignore existing data. Having some protection against duplicates is a best practice, and this field is a good solution for most use cases. The only downside is the slight performance cost of first checking whether a record already exists.

Update one

Collection (required): Enter the name of the MongoDB collection in your database that contains the documents you would like to update. For example: orders, items, users, customers, etc.

MongoDB filter (optional): To update documents in your MongoDB instance, enter a filter object here to find the existing documents. The value of this field must be a valid JSON string describing a MongoDB filter object with the correct syntax and operators. Refer to the MongoDB documentation for the list of valid query operators and the correct filter object syntax.
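
For example, the following filter (assuming the exported record carries an orderNumber field, which is an illustrative name) matches documents whose orderId equals the exported value; $eq is a standard MongoDB query operator.

    { "orderId": { "$eq": "{{orderNumber}}" } }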

MongoDB document (optional): Enter the update object that specifies the fields to modify when updating documents in your MongoDB instance. The value of this field must be a valid JSON string describing a MongoDB update object with the correct syntax and operators. Refer to the MongoDB documentation for the list of valid update operators and the correct update object syntax. If this field is left blank, a default update object using the $set operator ({ "$set": ... }) will be used.
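
A minimal sketch of an update object, using the standard MongoDB $set and $currentDate update operators (the field names and handlebars placeholders are assumptions about your source data):

    {
      "$set": {
        "status": "{{status}}",
        "email": "{{customerEmail}}"
      },
      "$currentDate": { "lastModified": true }
    }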

Upsert (optional): Set this field to true if you want MongoDB to dynamically create new documents when no document matches the provided filter. Set it to false (the default) if you want MongoDB to ignore documents that cannot be found with the provided filter.

Advanced

Concurrency ID lock template (optional): This field can be used to help prevent duplicate records from being submitted at the same time when the connection associated with this import is using a concurrency level greater than 1.

In other words, the connection record associated with this import has fields that limit the number of concurrent requests that can be made at any one time. If you allow more than one request at a time, imports can override each other (a race condition) when multiple messages or updates for the same record are processed simultaneously.

This field allows you to enforce an ordering across concurrent requests, so that imports for a specific record ID queue up and happen one at a time, while imports for different record IDs still run in parallel.

The value of this field should be a handlebars template that generates a unique ID for each exported record (note: the raw exported data is used when generating the IDs, before any import or mapping logic is invoked). With this ID, the integrator.io back end ensures that no two records with the same ID are submitted for import at the same time.

For example, if you are exporting Zendesk records and importing them into NetSuite, you would most likely use '{{id}}' (the field Zendesk uses to identify unique records); then no two records with the same Zendesk ID value would be imported into NetSuite at the same time.
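
If no single field is unique on its own, you can combine several fields in the template, for example (the field names here are assumptions about your source data):

    {{recordType}}-{{id}}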

Data URI template (optional): When your flow runs but has data errors, this field lets you ensure that every error in your job dashboard includes a link to the target data in the import application (where possible).

This field uses a handlebars template to generate the dynamic links based on the data being imported.

Note

The template you provide will run against your data after it has been mapped, and then again after it has been submitted to the import application, to maximize the ability to link to the right place.

For example, if you are updating a customer record in Shopify, you would most likely set this field to the following value: https://your-store.myshopify.com/admin/customers/{{{id}}}.

Map data in MongoDB

Map your data using Mapper 2.0. Mapper 2.0 provides a clear visual representation of both source and destination JSON structures, enabling you to reference sample input data from the source app while building the JSON structure to be sent to a destination app. You can easily build complex JSON structures that include nested arrays and make use of data type validation for required fields and incompatible data types.
