Hi All,
We have a new project coming up that is expected to create around 600,000 records per year. We have an existing application with roughly 100,000 records that uses entity-backed synced records. My concern is that the existing application already performs very slowly at 100,000 records, and export to Excel frequently fails even with only 20,000 records to export. Given that, with 600,000 cases I am not confident about using Appian OOB records.
I know we can create different record types based on personas and default source filters to make it work, but what if the customer wants to see all the records?
What could be the alternative approaches to deal with this?
Thanks!
Hi Radhika, I would start by analyzing the root causes of the performance issues in the existing application: identify the inefficient queries or architectural issues that make it slow. Conduct performance testing with larger datasets, and based on the results start optimizing database queries, streamlining Appian expressions and rules, and improving UI rendering efficiency. You can also consider some alternative approaches, like data archiving (moving unused data into a different schema or database). You can also load data into the database incrementally, in smaller batches. These are a couple of thoughts on handling large datasets.
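To illustrate the batching idea above outside of Appian: the pattern for a large export is to stream rows in fixed-size pages rather than pulling the whole result set at once, which is typically what causes a 20,000-row Excel export to fail. A minimal generic sketch (the `cases` table name and batch size are illustrative, not from your application):

```python
import csv
import io
import sqlite3

def export_in_batches(conn, out, batch_size=1000):
    """Stream query results to CSV in fixed-size batches so memory
    use stays flat regardless of the total row count."""
    writer = csv.writer(out)
    writer.writerow(["id", "name"])
    offset = 0
    total = 0
    while True:
        rows = conn.execute(
            "SELECT id, name FROM cases ORDER BY id LIMIT ? OFFSET ?",
            (batch_size, offset),
        ).fetchall()
        if not rows:
            break
        writer.writerows(rows)
        total += len(rows)
        offset += batch_size
    return total

# Demo with an in-memory table standing in for the record source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO cases (name) VALUES (?)",
                 [(f"case-{i}",) for i in range(25000)])
buf = io.StringIO()
count = export_in_batches(conn, buf, batch_size=5000)
print(count)
```

In Appian terms the equivalent is paging the query (batch size on the query entity / record query) and appending to the output per batch, rather than one unbounded query feeding the export.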