Large Dataset Query

I am looking for suggestions on the best and fastest way to query a dataset of about 50,000 rows.  My use case is data collection for inventory purposes: an operator uses a handheld device to scan an item_id barcode, the application retrieves a description, and then they can scan a quantity.  The purpose of the retrieval is to verify a good part number and, occasionally, to verify that the barcode matches the product.  The data currently resides in a SQL database.  We are on Appian 22.3 Cloud.

  • These sound like standard use cases for a!queryEntity() - the only time you'd have issues is if you were trying to retrieve several thousand rows all at once, and the use cases you list here don't indicate any need to do that.

    For instance, querying a 50,000-row DB to match a single scanned barcode should not be any sort of difficulty for a normal query filtering on, for example, a barcode number field (a minimal sketch follows below).
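
    For illustration, a minimal sketch of that lookup - the entity constant (cons!INVENTORY_ITEM_ENTITY) and the field name (barcodeNumber) are hypothetical placeholders for whatever your data store actually defines:

    a!localVariables(
      /* local!scannedBarcode would be bound to the scanner's input field */
      local!scannedBarcode: "0012345678905",
      a!queryEntity(
        /* Hypothetical entity constant pointing at the inventory table */
        entity: cons!INVENTORY_ITEM_ENTITY,
        query: a!query(
          /* Equality filter on the (hypothetical) barcode field, so at
             most one row ever comes back */
          filter: a!queryFilter(
            field: "barcodeNumber",
            operator: "=",
            value: local!scannedBarcode
          ),
          pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 1)
        ),
        fetchTotalCount: true
      )
    )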

  • Agree with Mike - by filtering with a!queryEntity() you should only need to retrieve 1 or 0 rows to determine whether the item is present (and get its details); a sketch of that check follows below.  Aside from [most often] being unnecessary, retrieving all 50,000 rows in batches of < 1 MB on every scan would be a nightmare for performance.

    We have one MSSQL table with 7 million rows that is queried 2,000+ times a day for a subset of 0 to ~10 rows each time, and it runs like a champ.
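
    To make that 1-or-0-row check concrete, here is a hedged sketch reusing the hypothetical entity constant and field name from the sketch above; ri!scannedBarcode is an assumed rule input fed by the scanner:

    a!localVariables(
      local!match: a!queryEntity(
        /* Same hypothetical constant and field as in the earlier sketch */
        entity: cons!INVENTORY_ITEM_ENTITY,
        query: a!query(
          filter: a!queryFilter(
            field: "barcodeNumber",
            operator: "=",
            value: ri!scannedBarcode /* assumed rule input from the scanner */
          ),
          pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 1)
        ),
        /* fetchTotalCount makes totalCount reliable for the presence check */
        fetchTotalCount: true
      ),
      /* 0 rows = bad part number; 1 row = surface the description */
      if(
        local!match.totalCount = 0,
        "Unknown part number - please re-scan",
        index(local!match.data, 1, null).description
      )
    )

    Because the filter retrieves at most one row per scan, the payload stays tiny no matter how large the underlying table grows.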
