Export to Excel Functionality

Our system is hosted on Appian 16.2

We have a requirement to implement a report that fetches and displays ~30,000 records from the database.
The source data set has 40 fields in total, and the user can select which reporting fields to display in the grid using checkboxes.
They would also like the ability to export this data to Excel.

Since the user controls which fields to export, we were planning to use the Download Data Subset plugin.

Would it be feasible to download a large data subset (40 fields × 30,000 records) using this plugin? Could we run into performance issues?

If so, can anyone suggest a workaround for implementing the export-to-Excel functionality, given that a static base template cannot be used because the user controls which fields are exported to Excel?

OriginalPostID-235787


  • @vinayakab: Yes, the source of the data is a single view configured in Oracle.
  • @preetamb: I was thinking of an approach that executes via a stored procedure, passing the configured fields as an argument, which would help generate the report.
  • @preetamb To the best of my knowledge, 'Download Datasubset' isn't going to support such huge datasets. I would strongly suggest avoiding it, as the export of such a huge dataset will end up failing, and you will also run into trouble when passing a large amount of data (the chosen records and columns, in order to apply filters) as input to the plugin.

    The Export SQL data to CSV or Export SQL data to Excel smart services in the 'Excel Tools' plugin at https://forum.appian.com/suite/tempo/records/item/lMBCLGOdlMUpdGVqW3dQaIKmclBmvvNEj8vu_cjb7T-5YiPr4Fu8ly5Yj1s09uenE4RYzA8zKyx7eiUh-mhLnP-zRWuXMAWLSh_Ffkgey8jrCr-KQ/view/summary might be worth a try. They accomplish the job with a simple SQL statement (AFAIK you don't even need a CDT, and you can choose the columns and records selectively by dynamically forming the query), and they generate the file directly from the SQL query without storing any data in the process. If you plan to use this, make sure you have an entity that does everything for you, so you just need to run a simple native SQL statement over it. The disadvantage is that you need to trigger a process in the backend, which notifies the user on completion (via email or task) with a document link, and the user has to download the file separately.
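To illustrate the "dynamically forming the query" idea above, here is a minimal sketch (in Python, for illustration only; the table and column names are hypothetical) of building the SELECT statement from the user's chosen columns. Since column names cannot be bound as query parameters, validating against an allow-list is important to avoid SQL injection:

```python
# Hypothetical allow-list of the view's exportable columns.
ALLOWED_COLUMNS = {"order_id", "customer_name", "order_date", "amount"}

def build_export_query(selected_columns, table="REPORT_VIEW"):
    """Build a SELECT over only the user-selected columns.

    Column identifiers can't be passed as bind parameters, so each
    name is checked against the allow-list before being interpolated.
    """
    bad = [c for c in selected_columns if c not in ALLOWED_COLUMNS]
    if bad:
        raise ValueError(f"Unknown columns: {bad}")
    return f"SELECT {', '.join(selected_columns)} FROM {table}"

# build_export_query(["order_id", "amount"])
#   -> "SELECT order_id, amount FROM REPORT_VIEW"
```

The resulting SQL string would then be handed to the export smart service in place of a static query.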

    The other option might be 'RDBMS CSV Import Export', but I am not sure how far it helps you, as it seems to export the complete entity rather than the selected records and columns. Still, this is another way to export the dataset without storing it in the process. The disadvantage is the same: a backend process must notify the user (via email or task) on completion with a document link, and the user has to download the file separately.

    I recently saw a web API implementation for exporting a dataset to Excel in a project. It uses queryEntity and works fine so far, as the dataset isn't huge. I might be wrong, but this kind of implementation could fail with a timeout on huge datasets. It might be worth trying the same approach with the executestoredprocedure() function as well. Again, this doesn't bring any data into Appian, which keeps it clean. An added advantage is that the file is available for download immediately (similar to the Download Data Subset implementation), but check whether the approach is effective and fail-safe for huge datasets.

    As said above, Execute Stored Procedure can be an option, but the problem I can see is that you need to bring the entire dataset into Appian, store it in a PV, and then use a plugin that creates the Excel/CSV file from the data in the PV. An additional burden, as far as I know, is that you must write a stored procedure that parses the inputs (filter values) to query the data, and it has to be changed whenever the filtering logic changes.

    Refer to the documentation at https://forum.appian.com/suite/help/16.2/playbook/Transferring_Processing_Large_Data_Sets_(ETL).html for guidance on testing and on which smart services you should or shouldn't opt for.
  • Using the Download Data Subset plugin, you may run into the 1 MB cap on query results when trying to grab all records at once. You might want to consider batching: loop over the query rule / queryEntity in batches, then combine the data subsets.
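The batching suggestion above can be sketched as follows (Python for illustration; `query_entity` stands in for a paged query such as Appian's queryEntity, and the batch size is an assumption to be tuned against the 1 MB cap):

```python
def fetch_all(query_entity, total_count, batch_size=1000):
    """Fetch all rows one page at a time and combine them.

    query_entity(start_index, batch_size) is assumed to return the
    rows for one page, using 1-based start indices as Appian does.
    """
    rows = []
    for start in range(1, total_count + 1, batch_size):
        rows.extend(query_entity(start, batch_size))  # one page per call
    return rows
```

Each individual call stays under the per-query data cap, and the combined list is the full 30,000-record result.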
  • Hi All, thanks for your recommendations. We were able to achieve the functionality using the Export SQL to Excel plugin.
  • @preetamb Good choice. If possible, could you mention the time the smart service takes to complete for a query that returns 30,000 records × 40 fields? And if you run Health Check, could you let us know whether this operation is flagged as High/Medium Risk?