I have the following use case:
1- Need to extract 40+ MB of data from a different system, on demand, triggered from an Appian interface.
2- Dump this data into a staging table residing in the Appian Business Database schema and perform operations on it afterwards.
Which solution will be most useful in terms of scalability and availability?
1- Use the SQLtoExcel plugin? Will this support dumping the data into Excel on a user's on-demand request?
2- Export Data Store Entity to Excel? Will it support 40+ MB of data, or will it time out?
Is there any other approach we can adopt?
The main concern here is that the data must be processed on demand, triggered from an Appian interface.
Not sure why you need to export to Excel. Will the user augment the data before re-importing it?
I'd say the Export Data Store Entity to Excel Smart Service is your best bet. It's supposed to handle large datasets, but you won't be able to run it synchronously.
https://docs.appian.com/suite/help/24.4/Export_To_Excel_Smart_Service.html
The user will not augment the data in this process. We are just pulling the data and dumping it into our database schema.
The requirement here is to get the data from a table residing in a different schema not owned by us and then dump that data into a staging table in our schema. This activity should happen on demand.
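To illustrate, the staging dump is essentially the following at the database level. This is only a rough sketch: the schema, table, and column names (source_schema.source_table, appian_biz.stg_source_data, id, col_a, col_b) are placeholders, and it assumes both schemas sit on the same MySQL/MariaDB server and the Appian data source user has SELECT grants on the source schema.

    -- Refresh the staging table from the external schema (placeholder names throughout)
    TRUNCATE TABLE appian_biz.stg_source_data;

    INSERT INTO appian_biz.stg_source_data (id, col_a, col_b, loaded_at)
    SELECT s.id, s.col_a, s.col_b, NOW()
    FROM source_schema.source_table s;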
Post staging, we will apply filters and other logic on the database side via stored procedures to reduce the volume. The filtered data will then update the transactional data, and the user will be able to see the changes based on the latest on-demand request.
After that, they will perform workflow activities on the updated data in an optimized way.
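The post-staging step would be along these lines. Again just a sketch with placeholder table, column, and filter names; it assumes a MySQL/MariaDB-style stored procedure that we would trigger on demand from the interface (for example via a stored-procedure smart service).

    DELIMITER //
    CREATE PROCEDURE refresh_transactional_data()
    BEGIN
        -- Drop staged rows we do not need (placeholder filter criteria)
        DELETE FROM appian_biz.stg_source_data
        WHERE status <> 'ACTIVE';

        -- Push the filtered snapshot into the transactional table
        UPDATE appian_biz.txn_data t
        JOIN appian_biz.stg_source_data s ON s.id = t.source_id
        SET t.col_a = s.col_a,
            t.col_b = s.col_b,
            t.refreshed_at = NOW();
    END //
    DELIMITER ;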
If the user will not augment the data, why not just write directly to the DB, or simply sync it as a service-backed record, as Stefan has suggested?