I have the following use case:
1- Need to extract 40+ MB of data from a different system, on demand, triggered from an Appian interface
2- Dump this data into a staging table residing in the Appian Business Database schema and perform operations on it afterwards
Which solution will be most useful in terms of scalability and availability?
1- Use the SQL to Excel plugin? Will this support getting the dump into Excel via a user's on-demand request?
2- Export Data Store Entity to Excel? Will it support 40+ MB of data, or will it time out?
Is there any other approach we can adopt?
The major concern here is the on-demand request for processing the data via an Appian interface.
Not sure why you need to export to Excel. Will the user augment the data before re-importing it?
I'd say that Export Data Store Entity to Excel Smart Service is your best bet. It's supposed to handle large datasets. But you won't be able to do it synchronously.
https://docs.appian.com/suite/help/24.4/Export_To_Excel_Smart_Service.html
This is not something I would want to implement directly in Appian. Triggering and monitoring the process from Appian is fine, but adding Excel into the mix does not improve anything. I suggest using an external ETL toolchain.
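For illustration only, here is a minimal sketch of the batched-copy pattern such a toolchain (or a small standalone job that Appian only triggers and monitors) would implement. It assumes Python with pyodbc; the connection strings, schema, table, and column names (other_schema.source_table, appian_schema.staging_table, col1..col3) are hypothetical placeholders, not anything from your environment:

```python
# Sketch only: batched copy from a source schema we don't own into an Appian
# staging table, using the Python DB-API (pyodbc). All names are placeholders.
import pyodbc

BATCH_SIZE = 5000  # tune to trade memory per batch against round trips

def copy_to_staging(source_dsn: str, target_dsn: str) -> None:
    src = pyodbc.connect(source_dsn)
    tgt = pyodbc.connect(target_dsn)
    src_cur = src.cursor()
    tgt_cur = tgt.cursor()
    try:
        # Stream rows from the source table instead of loading 40+ MB at once.
        src_cur.execute("SELECT col1, col2, col3 FROM other_schema.source_table")
        insert_sql = (
            "INSERT INTO appian_schema.staging_table (col1, col2, col3) "
            "VALUES (?, ?, ?)"
        )
        while True:
            rows = src_cur.fetchmany(BATCH_SIZE)  # keeps memory bounded
            if not rows:
                break
            tgt_cur.executemany(insert_sql, rows)
            tgt.commit()  # commit per batch, not per row
    finally:
        src.close()
        tgt.close()
```

The point is that Appian would only start this job on demand and poll its status, rather than pushing 40+ MB of rows (or Excel files) through the process engine itself.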
Unfortunately, we don't have any other feasible tool available, nor can we bring in other tooling expertise or resources.
What do you think of the exportSqltoExcel and then ImportexceltoDatabase smart services?
We are already using exceltodatabase for large datasets, but we need to know the thresholds for these smart services.
The user will not augment the data in this process. It is just pulling the data and dumping it into our database schema.
The requirement here is to get the data from a table residing in a different schema not owned by us, and then dump that data into a staging table in our schema. This activity should be on demand.
Post staging, we will apply filters and other logic on the database side (via stored procedures, etc.) to reduce the volume; a rough sketch follows below. The filtered data will then update the transactional data, and the user will be able to see the changes based on the latest on-demand request.
After that, they will perform workflow activities on the updated data in an optimized way.
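As a rough sketch of that post-staging step (not our actual implementation), assuming the same kind of Python/pyodbc job and a hypothetical stored procedure name:

```python
# Sketch only: once the staging load finishes, run the database-side filtering
# and apply the result to the transactional tables. The procedure name is a
# placeholder for whatever we implement in our schema.
import pyodbc

def apply_staging(target_dsn: str) -> None:
    conn = pyodbc.connect(target_dsn)
    cur = conn.cursor()
    # Heavy filtering/joining stays in the database rather than in Appian.
    cur.execute("{CALL appian_schema.filter_and_apply_staging}")
    conn.commit()
    conn.close()
```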
For what reason do you need/want to copy that data? Would pointing a synced record to that data source not fit your needs?
If the user will not augment the data, why not just write directly to the DB, or simply sync it as a Service-Backed Record as Stefan has suggested?
The DB table resides in another database/schema and has 150+ columns, but a record type with sync enabled can accommodate only 100 columns.
OK, and what about making it un-synced?
Yes. I am trying the same approach, with batch processing, to write into our staging table.
Though since we are dealing with a large volume here, we need to think about the memory threshold as well as the time it takes to complete the process.