Hello Community,
What is the best way to query one lakh records from the database?
Please suggest!
Thanks
FYI, a lot of readers here might not be familiar with the term "lakh", which refers to a hundred thousand (100,000) of something. I saw it mentioned here a few years ago and had to look it up myself at the time.
As for querying that much data -- we need more information to give you a meaningful / useful answer. What is the nature and depth of that data? What do you intend to do with it after it's been queried? In general, I've found that the best way to deal with bulky data like this is to create a process that steps through the data in batches, processing between 1,000 and 10,000 rows at a time (depending on the size of a given row). That will vary per use case, of course.
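To make the batching idea concrete, here's a minimal sketch in Python -- the pattern itself is database-agnostic, so it applies whether the consumer is Appian or anything else. The `orders` table and its columns are assumptions purely for illustration; it uses keyset pagination (`WHERE id > last_id`) rather than an OFFSET so later batches don't get progressively slower:

```python
import sqlite3

BATCH_SIZE = 5_000  # tune between ~1,000 and ~10,000 depending on row size

def process_in_batches(conn: sqlite3.Connection) -> None:
    """Step through the table in fixed-size batches via keyset pagination."""
    last_id = 0
    while True:
        rows = conn.execute(
            "SELECT id, customer, total FROM orders "
            "WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, BATCH_SIZE),
        ).fetchall()
        if not rows:
            break  # no more data
        for row in rows:
            pass  # ... process each row here ...
        last_id = rows[-1][0]  # resume after the last id seen

if __name__ == "__main__":
    with sqlite3.connect("example.db") as conn:
        process_in_batches(conn)
```

The keyset approach keeps each batch an index seek on `id`, whereas `LIMIT ... OFFSET` would force the database to re-scan and skip all previously read rows on every page.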
Something like a hundred thousand rows of data to be copied into an Excel sheet (all of the queried data needs to go into one single sheet).
The relatively new "Export Data Store Entity to Excel" smart service is probably your best bet, as it can be fed a very large data set and will internally handle paging so it doesn't break on a single query being too large.
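For intuition, here's roughly the pattern such a paged export applies for you -- to be clear, this is not the smart service's actual implementation, just a generic Python sketch (same assumed `orders` table as above) that streams query batches into one sheet using openpyxl's write-only mode:

```python
import sqlite3
from openpyxl import Workbook  # third-party: pip install openpyxl

PAGE_SIZE = 5_000

def export_to_single_sheet(conn: sqlite3.Connection, path: str) -> None:
    """Page through the table and append every batch to one Excel sheet."""
    wb = Workbook(write_only=True)  # streams rows instead of holding the sheet in memory
    ws = wb.create_sheet(title="Export")
    ws.append(["id", "customer", "total"])  # header row
    last_id = 0
    while True:
        rows = conn.execute(
            "SELECT id, customer, total FROM orders "
            "WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, PAGE_SIZE),
        ).fetchall()
        if not rows:
            break
        for row in rows:
            ws.append(row)  # all pages land in the same single sheet
        last_id = rows[-1][0]
    wb.save(path)

if __name__ == "__main__":
    with sqlite3.connect("example.db") as conn:
        export_to_single_sheet(conn, "export.xlsx")
```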
Thanks, Mike Schmitt!
Agree with Mike - we use this component for our standard report export functions and it works great. I just tested an Excel export of 150,000 rows - it ran in about 15 seconds without issues.
Working with any application (irrespective of Appian), if the requirement is to query a huge number of records from a database, below are a few suggestions:
1. Make sure you have proper indexes on the table, and write your query so that it actually uses those indexes to make retrieval of data faster (see the sketch after this list)
2. In the SELECT, keep only those columns which need to be displayed, rather than selecting everything
3. Ask your DBA to check the query execution plan, and make changes at the database level to optimize it
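A small sketch illustrating all three points, using SQLite syntax via Python -- the table and column names (`orders`, `status`) are assumptions, and `EXPLAIN QUERY PLAN` is SQLite's flavor (use `EXPLAIN` on MySQL/Postgres):

```python
import sqlite3

with sqlite3.connect("example.db") as conn:
    # 1. Proper indexing: index the column(s) your WHERE / ORDER BY clauses use.
    conn.execute("CREATE INDEX IF NOT EXISTS idx_orders_status ON orders (status)")

    # 2. Select only the columns that will actually be displayed, never SELECT *.
    query = "SELECT id, customer, total FROM orders WHERE status = ?"

    # 3. Inspect the execution plan to confirm the index is actually being used.
    for row in conn.execute("EXPLAIN QUERY PLAN " + query, ("OPEN",)):
        print(row)  # expect something like "SEARCH orders USING INDEX idx_orders_status"

    rows = conn.execute(query, ("OPEN",)).fetchall()
```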