Processing 50,000+ entries

Hi all,

We have a requirement to read 50k+ entries from different databases, apply some validation rules, and generate an approval workflow for the entries that fail validation.

The recommendation we have given to the customer is that using Appian as an ETL tool is not a best practice. It is fine to use Appian for the approval workflow, but to process 50k instances they should use an ETL tool, or at least develop something in Java and wrap it as an Appian plugin. I don't see the benefit of using Appian for this, and I also see potential memory problems.
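
For reference, this is roughly what the "Java wrapped as an Appian plugin" option would look like on our side. It is only a minimal sketch, assuming a JDBC source, a hypothetical entries table, and a placeholder validation rule; the real rules and schema are more involved.

```java
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.ArrayList;
import java.util.List;

public class EntryValidator {

    // Hypothetical connection details and schema, for illustration only
    private static final String JDBC_URL = "jdbc:mysql://source-db:3306/entries_db";

    public static List<Long> findFailingEntries() throws Exception {
        List<Long> failures = new ArrayList<>();
        try (Connection conn = DriverManager.getConnection(JDBC_URL, "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT id, amount, status FROM entries")) {
            // Fetch in batches (driver hint) rather than materialising all 50k+ rows at once
            ps.setFetchSize(1000);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    if (!isValid(rs.getBigDecimal("amount"), rs.getString("status"))) {
                        failures.add(rs.getLong("id"));
                    }
                }
            }
        }
        return failures;
    }

    // Placeholder validation rule; the real rules come from the business requirements
    private static boolean isValid(BigDecimal amount, String status) {
        return amount != null && amount.signum() >= 0 && !"UNKNOWN".equals(status);
    }
}
```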

What is your opinion?

    Certified Lead Developer

    I would tend to agree that you probably don't want to do the ETL or validation rules directly in Appian processes, but context on data size, how often this is running, and the complexity of the validation rules is going to be important when deciding on a final solution design.

    How often is this validation process occurring? If this is a one-time validation and approval process, then you can probably get away with a less elegant solution. Are your databases all on-premise, or are you utilizing the Appian Cloud MySQL database as well as other on-premise databases? Can stored procedures be utilized in the databases to run some or all of the validation rules prior to moving any data? Processing the data before it is moved between systems will lighten the load.

    I would try to get the data processed before doing any sort of transfer, to minimize the amount of data that needs to be moved and subsequently processed within the transactional system. If you do need to process the data within the bounds of the transactional system, I would try to do it in stored procedures or Java. Hopefully that could be done off hours so you aren't consuming processor cycles or memory while users are actively working in the system. Once the data is processed, you probably want to clean up the data that won't be used and allow Appian to kick off approval workflows for only the subset of data that requires an approval.
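
    To illustrate that last point, here is a rough sketch only, assuming a hypothetical validate_entries stored procedure and an entry_failures staging table (the names and the scheduling mechanism are placeholders): the heavy validation runs in the database off hours, and Appian only ever queries the small failing subset to start approval processes.

    ```java
    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class NightlyValidationJob {

        public static void main(String[] args) throws Exception {
            // Hypothetical connection details; in practice this would run as a scheduled
            // off-hours job (cron, an ETL tool, or a database event), not inside Appian.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://source-db:3306/entries_db", "user", "password")) {

                // Hypothetical stored procedure that applies the validation rules in the
                // database and writes only the failing rows to an entry_failures table.
                try (CallableStatement cs = conn.prepareCall("{call validate_entries()}")) {
                    cs.execute();
                }
            }

            // Appian then only needs to query entry_failures (e.g. through a query rule
            // over a CDT/record backed by that table) and start one approval process per
            // failing row, so the process engine never touches the full 50k+ data set.
        }
    }
    ```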
