Processing 50,000+ entries

Hi all,

We have a requirement to read 50k+ entries from different databases, apply some validation rules, and generate an approval workflow for the entries that fail validation.

The recommendation we have given to the customer is that using Appian as an ETL tool is not a best practice. It is fine to use Appian for the approval workflow, but to process 50k instances they should use a dedicated ETL tool, or at least develop something in Java and wrap it as an Appian plugin. I don't see the benefit of using Appian for this, and I also see potential memory problems.

What is your opinion?


    1. How complex are the validation rules? Could they be applied in the initial data-set selection? That is, could the data be pre-filtered before it ever gets to Appian?
    2. If you can do 1. (and you can conduct some quite sophisticated logic in the database layer e.g. using a Stored Procedure) then what's the likely size of the result-set? 1%? 10%? 50%? 95%? It's worthwhile profiling the data to understand the magnitude of the problem and then you can design accordingly.
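As a sketch of the pre-filtering idea in point 1 (all table, column, and rule names here are hypothetical, and an in-memory SQLite database stands in for the real source database), the validation runs entirely in the database layer so that only the IDs of failing rows ever reach Appian:

```python
import sqlite3

# In-memory SQLite stands in for the source database; in production the
# filtering query below would live in a stored procedure or view.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE entries (id INTEGER PRIMARY KEY, amount REAL, status TEXT);
    INSERT INTO entries VALUES
        (1, 100.0, 'OK'),
        (2,  -5.0, 'OK'),      -- fails: negative amount
        (3, 250.0, NULL),      -- fails: missing status
        (4,  80.0, 'OK');
""")

# Hypothetical validation rules, applied in the database layer: only the
# ids of the failing rows leave the database.
failing_ids = [row[0] for row in conn.execute("""
    SELECT id FROM entries
    WHERE amount < 0 OR status IS NULL
""")]

print(failing_ids)  # only these ids would feed the Appian approval workflow
```

Even if the source table holds 50k+ rows, the result set handed to Appian is only as large as the failure rate, which is why profiling that rate up front matters.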
  • Hi Stewart. Option 1 is also our preferred approach. We still need to find out how complex the filtering logic is; if possible, we will use a stored procedure. I don't yet know the size of the filtered data, but even if it is large, since it is already filtered we will only need the list of IDs and can display them in some forms. Thanks for your answer.
