Appian Community
Web API Limitations
judym598
Certified Senior Developer
over 8 years ago
What's the best way to 'GET' a large amount of data via a Web API? I initially created Web APIs that used the a!queryEntity function to pull incremental changes from several tables, but have been asked to revise them to pull all records from those tables. Several tables have ~1,000 records, which is not a problem (yet), but others are quite large. In the testing I've done so far on one table, I can retrieve 15,000 records, but any more returns an error. Is there a way to include a looping function in the API to retrieve multiple extracts when the number of records exceeds a certain count?
OriginalPostID-267239
chetany
A Score Level 1
over 8 years ago
I think for 15,000 records queryEntity returns more than 1 MB of data and takes more than 10 seconds, so it times out, which may be what's causing the issue.
One solution would be to let the caller of the web API pass a startIndex and a batchSize. In your web API you can extract those from the query parameters and then return only the requested slice of data rather than all of it.
The caller can then make another web API call with startIndex = oldStartIndex + batchSize to fetch the next batch of records. You will have to decide what batch size is suitable.
So, in effect, you make multiple web API calls to get all the data.
If, on the other hand, you want to get all the data in one go, see whether you can optimize at the database level; indexes on the proper columns may help with the queryEntity timeout.
Another idea is to return the data in aggregate form: create a database view over the large table that aggregates rows based on some column. That reduces the number of rows returned, so your web API returns fewer than 15,000 rows. The caller will then need to apply a transformation to the returned data to recover the detail, since the data was grouped and aggregated in the view on the database side.
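The batched-call approach above can be sketched from the caller's side as follows. This is a minimal sketch, not Appian code: the startIndex/batchSize parameter names match the suggestion above, but the fetch function itself is an assumption you would implement against your actual Web API endpoint (e.g. with the requests library).

```python
# Sketch of the caller-side batching loop described above.
# fetch_batch is a stand-in for your real HTTP call; its parameter
# names (start_index, batch_size) mirror the query params suggested
# in this reply and are assumptions, not a fixed Appian contract.
from typing import Callable, List

def fetch_all(fetch_batch: Callable[[int, int], List[dict]],
              batch_size: int = 1000) -> List[dict]:
    """Repeatedly call the Web API until a short (or empty) batch
    signals that all records have been retrieved."""
    records: List[dict] = []
    start_index = 1  # a!queryEntity paging is 1-based
    while True:
        batch = fetch_batch(start_index, batch_size)
        records.extend(batch)
        if len(batch) < batch_size:  # last page reached
            break
        start_index += batch_size
    return records
```

In practice, fetch_batch would wrap something like requests.get(url, params={"startIndex": start_index, "batchSize": batch_size}) and return the parsed JSON list; the stopping rule (a batch shorter than batch_size) avoids one extra empty-page request when the total count is not known up front.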