Use case for extracting 40+ MB of data from one schema to the Appian Business DB

Certified Associate Developer

I have the following use case:

1- Need to extract 40+ MB of data from a different system, triggered by an on-demand request from an Appian interface.

2- Dump this data into a staging table residing in the Appian Business Database schema and perform operations on it afterwards (a rough sketch of such a table is below).
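For illustration, the staging table could look something like this; it is only a sketch, and every name in it is a placeholder I made up rather than anything we have built:

```sql
-- Hypothetical staging table in the Appian Business DB schema
-- (MariaDB/MySQL syntax; all names are illustrative placeholders).
CREATE TABLE stg_source_data (
    id         BIGINT      NOT NULL AUTO_INCREMENT,
    source_key VARCHAR(64) NOT NULL,
    payload    TEXT,
    loaded_at  TIMESTAMP   NOT NULL DEFAULT CURRENT_TIMESTAMP,
    PRIMARY KEY (id),
    KEY idx_stg_source_key (source_key)
);
```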

Which solution will be the most useful in terms of scalability and availability?

1- Use the SQLtoExcel plugin? Will it support dumping the data to Excel from a user's on-demand request?

2- Export Data Store Entity to Excel? Will it support 40+ MB of data, or will it time out?

Is there any other approach we can adopt?

The major concern here is the on-demand request for processing the data via the Appian interface.


Certified Associate Developer
in reply to Mathieu Drouin

The user will not augment the data with this process. It just pulls the data and dumps it into our database schema.

The requirement here is to get the data from a table residing in a different schema not owned by us, and then dump that data into a staging table in our schema. This activity should be on demand.

Post staging, we will apply filters and logic to reduce the volume on the database side via a stored procedure (sketched below). The filtered data will then update the transactional data, and the user will be able to see the changes based on the latest on-demand request.
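To make that concrete, here is a minimal sketch of the kind of stored procedure I have in mind, assuming both schemas sit on the same MariaDB server and our DB user has SELECT access to the source schema. The schema, table, and column names (other_schema, appian_biz, stg_source_data, txn_data) are placeholders, and the WHERE clause stands in for the real filter logic:

```sql
-- Minimal sketch only; all names are placeholders and the filter is
-- illustrative. Assumes txn_data has a unique key on source_key.
DELIMITER //

CREATE PROCEDURE refresh_transactional_data()
BEGIN
    -- 1. Clear the staging table for this on-demand run.
    TRUNCATE TABLE appian_biz.stg_source_data;

    -- 2. Pull the raw rows from the schema we do not own
    --    (assumes SELECT privilege on other_schema).
    INSERT INTO appian_biz.stg_source_data (source_key, payload)
    SELECT s.source_key, s.payload
    FROM   other_schema.source_table s;

    -- 3. Reduce the volume and merge into the transactional table
    --    so the user sees the latest on-demand refresh.
    INSERT INTO appian_biz.txn_data (source_key, payload)
    SELECT source_key, payload
    FROM   appian_biz.stg_source_data
    WHERE  payload IS NOT NULL       -- placeholder filter
    ON DUPLICATE KEY UPDATE
           payload = VALUES(payload);
END //

DELIMITER ;
```

An Appian process model could then run this on each on-demand request (e.g., via the Execute Stored Procedure smart service), keeping the heavy lifting on the database side.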

After that, they will perform workflow activities on the updated data in an optimized way.
