Error Management Recommendations

I am getting ready to do a batch upload of roughly 5,000 records. I plan to fail certain records (appropriately) if they are missing required information in our source system, but it's difficult for me to identify exactly how many records might fail.

My question is: what is the best way to fail these records, given that this could be a large number? It seems cumbersome to deal with them one by one in Celigo, so would it be advisable to fail them out to a Google Sheet, work off of that to update the source system, and then replay the failed records?
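
For context, here is roughly what I have in mind, as a minimal sketch (the required fields and file path are just placeholders for whatever our source system actually needs):

```python
import csv

# Hypothetical required fields; substitute whatever your source
# system actually requires for a record to import cleanly.
REQUIRED_FIELDS = ["id", "name", "email"]

def split_records(records):
    """Split records into (valid, failed) based on missing required fields."""
    valid, failed = [], []
    for rec in records:
        missing = [f for f in REQUIRED_FIELDS if not rec.get(f)]
        if missing:
            # Annotate the failure so it is obvious what to fix in the sheet.
            failed.append({**rec, "_missing_fields": ";".join(missing)})
        else:
            valid.append(rec)
    return valid, failed

def write_failures(failed, path="failed_records.csv"):
    """Write failed records to a CSV that can be imported into a Google Sheet."""
    if not failed:
        return
    fieldnames = sorted({key for rec in failed for key in rec})
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(failed)
```

The idea would be to correct the records in the sheet, update the source system, and then replay.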

Curious to know what others have done when they know ahead of time they could be dealing with a large number of failed records.


Comments

2 comments
  • 5,000 is not really a lot of records, and I would not do anything extra in my flow to process errors. The error management in integrator.io lets you bulk retry and bulk resolve, and you can also search and apply bulk actions to search results, or download all errors. If you want to rerun the 5k records without doing any error management at all, then integrator.io will auto resolve the errors for you, assuming you have a trace key for each record and the auto resolve matching trace keys setting enabled.

    That said, if you are ever processing much larger data sets where it is possible to have >20k errors, then it might make sense to build special logic into your flow to route errors to a file after they fail an import; see the sketch below.
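
    As a rough illustration (this is not Celigo-specific code, and the trace key here is just a stable hash of the record's unique ID, which is one generic way to make a rerun of the same record line up with its earlier error), routing failures to a file could look like this:

    ```python
    import csv
    import hashlib
    import os

    def trace_key(record):
        """Derive a stable trace key from the record's unique ID so a
        rerun of the same record matches its earlier error."""
        return hashlib.sha256(str(record["id"]).encode()).hexdigest()[:16]

    def route_errors_to_file(failed_records, path="import_errors.csv"):
        """Append (record, error_message) pairs that failed an import
        to a CSV error file for later correction and replay."""
        file_exists = os.path.exists(path)
        with open(path, "a", newline="") as fh:
            writer = csv.writer(fh)
            if not file_exists:
                writer.writerow(["trace_key", "record_id", "error_message"])
            for rec, err in failed_records:
                writer.writerow([trace_key(rec), rec.get("id"), err])
    ```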

  • Thanks, Scott, for your feedback. I appreciate the insight!

