Export Data to Excel for more than 50k rows and 80 columns

Certified Lead Developer

Hello,

We need to export data from a database, based on user-selected filters, into an Excel sheet for reporting. Currently, our process queries the data in batches of 500 rows and uses the deprecated "Export CDT to Excel" smart service to generate the report, since we need to export 80 columns. Because we run in batches, we cannot move the smart service into a separate start process: each iteration appends the new values to the generated document and increments the start index.

However, as the data volume has increased, users are now exporting up to 50,000 rows, which leads to memory heap issues that bring the Appian server down. The latest export functions have a limit of 50 columns, which does not meet our requirements. We have marked the process variables as hidden, but memory usage still climbs to around 6k AMU.

We use MySQL as the database, hosted on AWS RDS, and currently query the data with a!executeStoredProcedureForQuery(). Previously we queried through a view, which caused database timeouts, so we created a stored procedure and now query the data through it.
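For reference, a paginated stored procedure along the lines described might look like the minimal sketch below; the procedure, table, and column names are assumptions, and the real procedure would select the 80 report columns and apply the user-selected filters:

```sql
-- Minimal sketch of a paginated export procedure; all names are assumptions.
DELIMITER //

CREATE PROCEDURE sp_get_export_batch(
    IN p_start_index INT,   -- offset maintained by the Appian process loop
    IN p_batch_size  INT    -- e.g. 500 rows per call
)
BEGIN
    SELECT *                -- in practice, the 80 report columns
    FROM report_data        -- assumed source table/joins
    ORDER BY id             -- a stable sort keeps the batches consistent
    LIMIT p_batch_size OFFSET p_start_index;
END //

DELIMITER ;
```

Each a!executeStoredProcedureForQuery() call would then pass the current start index and the batch size of 500 as inputs.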

What would be the best approach to optimize performance while ensuring all 80 columns can be exported efficiently?


    Certified Lead Developer
    in reply to Abhishek Karumuru

Can you please elaborate on temporary tables? My main issue comes from the iterations: we generate the Excel file while querying the data in batches of 500, so one process instance loops over the batches, and each iteration is slow because it appends to the generated document.

So how would creating temporary tables solve the issue, since we would still have to run the batched queries in a single session?
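For context, a temporary-table approach might look like the minimal sketch below (table and column names are assumptions): the filtered result set is materialized once, so each subsequent batch query pages over pre-computed rows instead of re-running the expensive joins and filters on every iteration.

```sql
-- Hypothetical sketch: materialize the filtered rows once, then page cheaply.
CREATE TEMPORARY TABLE tmp_export AS
SELECT col1, col2            -- in practice, the 80 report columns
FROM report_data             -- assumed source table/joins
WHERE status = 'ACTIVE';     -- assumed user-selected filters

-- Each batched query then reads from the pre-computed table:
SELECT *
FROM tmp_export
ORDER BY col1
LIMIT 500 OFFSET 0;          -- increment the offset on each iteration

DROP TEMPORARY TABLE tmp_export;
```

One caveat worth noting: MySQL temporary tables are visible only to the connection that created them, so this pattern helps only if every batch call reuses the same session; otherwise a regular staging table keyed by an export ID would serve the same purpose.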