I have the following use case:
1- Extract 40+ MB of data from a different system, per an on-demand request from an Appian interface.
2- Dump this data into a staging table residing in the Appian business database schema and perform operations on it afterwards.
Which solution would be best in terms of scalability and availability?
1- Use the SQL to Excel plugin? Will it support getting the dump into Excel via a user's on-demand request?
2- Use the Export Data Store Entity to Excel smart service? Will it support 40+ MB of data, or will it time out?
Is there any other approach we can adopt?
The major concern here is that the data processing is triggered on demand from an Appian interface.
This is not something I would want to implement directly in Appian. Triggering and monitoring the process from Appian is fine, but adding Excel into the mix does not improve anything. I suggest using an external ETL toolchain.
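For illustration only, here is a minimal sketch of what such an external ETL step could look like in Python, assuming both the source system and the Appian business database are reachable over ODBC. The DSNs, table, and column names below are placeholders, not anything from this thread:

```python
# Minimal sketch of an external ETL step: stream rows from the source
# system into an Appian business database staging table in fixed-size
# batches. All names are placeholders (DSNs "SourceDSN"/"AppianDSN",
# table "stg_extract", columns col1/col2) -- adapt to your environment.
import pyodbc

BATCH_SIZE = 5_000  # rows per round trip; tune to bound memory use

def copy_to_staging(select_sql: str, insert_sql: str) -> int:
    src = pyodbc.connect("DSN=SourceDSN")   # source system
    dst = pyodbc.connect("DSN=AppianDSN")   # Appian business database
    copied = 0
    try:
        src_cur = src.cursor()
        dst_cur = dst.cursor()
        src_cur.execute(select_sql)
        while True:
            rows = src_cur.fetchmany(BATCH_SIZE)  # stream; never fetchall()
            if not rows:
                break
            dst_cur.executemany(insert_sql, [tuple(r) for r in rows])
            dst.commit()  # commit per batch so memory stays flat
            copied += len(rows)
    finally:
        src.close()
        dst.close()
    return copied

if __name__ == "__main__":
    n = copy_to_staging(
        "SELECT col1, col2 FROM source_table",
        "INSERT INTO stg_extract (col1, col2) VALUES (?, ?)",
    )
    print(f"copied {n} rows")
```

Appian would only trigger this job (e.g., via an integration or scheduled call) and poll its status, which matches the "trigger and monitor from Appian" split suggested above.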
Unfortunately, we don't have any other feasible tool available to us, nor can we bring in additional expertise.
What do you think of the Export SQL to Excel and then Import Excel to Database smart services?
We are already using Import Excel to Database for large datasets, but we need to know the thresholds for these smart services.
For what reason do you need/want to copy that data? Would pointing a synced record to that data source not fit your needs?
The DB table resides in another database/schema and will have 150+ columns, but a record with sync enabled can accommodate only 100 columns.
OK, and making it un-synced?
Yes, I am trying exactly that, with batch processing to write into our staging table.
Since we are dealing with a large volume here, though, we need to think about the memory threshold as well as the time needed to complete the process.
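Not from this thread, but as a sketch of the batching trade-off you mention: one common pattern is to size the chunk explicitly so memory stays bounded while total runtime scales with row count. Assuming the extract lands as a CSV file and the staging table already exists (the file name, table name, and connection URL below are hypothetical):

```python
# Hedged sketch: load a large export into the staging table in chunks so
# that only CHUNK_ROWS rows are ever held in memory at once. The file
# "extract.csv", the table "stg_extract", and the connection URL are all
# placeholders; requires pandas, SQLAlchemy, and a MySQL driver (pymysql).
import pandas as pd
from sqlalchemy import create_engine

CHUNK_ROWS = 10_000  # rows in memory at a time; raise for speed, lower for memory

engine = create_engine("mysql+pymysql://user:pass@host/appian_business")

total = 0
for chunk in pd.read_csv("extract.csv", chunksize=CHUNK_ROWS):
    # append each chunk; the staging table is created up front with all columns
    chunk.to_sql("stg_extract", engine, if_exists="append", index=False)
    total += len(chunk)

print(f"loaded {total} rows")
```

A smaller chunk size lowers peak memory but lengthens the run through more round trips, so the threshold you pick is a direct trade between the two concerns you named.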