Bulk Data Validation Using Integrations

Hi Team,

I have a requirement to upload an Excel file, read the entries, and populate the data into the Appian database. While reading the data I need to validate it via integrations: for each and every row I need to call two integrations to check whether the data is valid or not.

Since we are handling a huge amount of data, this per-row approach will have a significant impact on performance.

I am planning to create one process for the integrations and run it in batches using MNI (Multiple Node Instances), but I am not sure whether this approach is sound. Can anyone suggest the ideal way to handle this issue? A rough sketch of the batching I have in mind is shown below.
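
A minimal sketch only, assuming ri!rows holds the rows parsed from the Excel file and that each MNI instance would slice its own batch using a start/end index; the batch size of 50 is just a placeholder value I would tune after testing:

    /* Sketch: compute one (startIndex, endIndex) pair per batch.           */
    /* ri!rows          - the parsed Excel rows (assumed rule input)        */
    /* local!batchSize  - placeholder tuning value                          */
    a!localVariables(
      local!batchSize: 50,
      local!batchCount: ceiling(length(ri!rows) / local!batchSize),
      a!forEach(
        items: enumerate(local!batchCount),
        expression: a!map(
          startIndex: fv!item * local!batchSize + 1,
          endIndex: min(fv!item * local!batchSize + local!batchSize, length(ri!rows))
        )
      )
    )

The resulting list of maps would be the multiple-instance input for the MNI node, so each process instance only calls the integrations for its own slice of rows.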

Thanks in Advance!

  • From your description it sounds like you have a good handle on the sorts of issues that need to be addressed. The main thing is to ensure that you have a means of "throttling" the processing so that you don't flood the application server with process instances (which, again, it sounds like you've recognised and have a design in mind to pre-empt).

    Some questions about the nature of the validation integrations:

    • do they have to be integrations? That is, is there any way to bring the logic/data into Appian so that the validation can be conducted locally? That would definitely improve performance, but may bring other issues to solve (e.g. keeping the copy of the data in Appian in sync with the source system)
    • do they have to be called at the individual row level? That is, could the integrations take batches of data, and so reduce the number of calls you have to make? (see the sketch after this list)
    • similarly, is there any middleware in the architecture that could make the two calls for you, so that you reduce the number of calls from Appian to just one per row?
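
    On the second point, here is a purely illustrative sketch of what a batched call could look like on the Appian side, assuming the external service could be changed to accept an array of records in one request; rule!VAL_validateRows, ri!batch and the field names are all hypothetical:

        /* Hypothetical: rule!VAL_validateRows wraps an Integration object   */
        /* whose request body carries the whole batch, so one HTTP call      */
        /* covers every row in the batch instead of two calls per row.       */
        rule!VAL_validateRows(
          records: a!forEach(
            items: ri!batch,
            expression: a!map(
              /* placeholder field names - use whatever the service expects */
              id: fv!item.id,
              value: fv!item.value
            )
          )
        )

    Even if the service could only take, say, 100 rows per call, that would still cut the call volume by roughly two orders of magnitude compared with two calls per individual row.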