Web API Limitations

Certified Senior Developer
What's the best way to 'GET' a large amount of data via Web API? I initially created Web APIs that used the a!queryEntity function to pull incremental changes from several tables, but have been asked to revise them to pull all records from these tables. Several tables have ~1,000 records, which is not a problem (yet), but others are quite large. In the testing I've done so far on one table, I am able to retrieve 15,000 records, but any more returns an error. Is there a way to include a looping function in the API to retrieve multiple extracts if the number of records exceeds a certain count?

OriginalPostID-267239

  • @judym598 you'll have to play around with this tactic for a bit to make sure it works, but I would start by creating the following variables in your process:

    1. counter
    2. startIndex
    3. batchSize

    For the 1st loop, set counter to 1 and startIndex/batchSize accordingly (i.e. startIndex = 1 and batchSize = 3000). After the web service call is done, append the data it retrieved to another variable and do the following in a script task:

    1. Increment pv!counter by 1 (this happens each time a loop finishes)
    2. Clear out pv!startIndex and set it to batchSize * (counter - 1) + 1, using the already-incremented counter (this is so each loop after the 1st knows to look at the next 3000 rows: 3001 for loop 2, 6001 for loop 3, and so on)

    In your query entity:
    3. Add logic that keeps pv!batchSize at 3000 (or 3000 - 1, depending on how you calculate startIndex, so consecutive batches don't overlap) once counter is equal to or greater than 2.
    4. Add logic so that startIndex is 1 UNLESS counter is equal to or greater than 2, in which case it uses the value computed in step 2.

    Exit this loop when counter = 5 (or greater, depending on how many rows there are)
    OR
    when startIndex = 15,001

    Here's an example of which rows should be retrieved on each loop:
    1st loop - 3000 rows (1-3,000)
    2nd loop - 3000 rows (3,001-6,000)
    3rd loop - 3000 rows (6,001-9,000)
    4th loop - 3000 rows (9,001-12,000)
    5th loop - 3000 rows (12,001-15,000)

    Hope this helps. This solution is open to critique if there's a more efficient method.
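
    To make the paging arithmetic concrete, here is a minimal sketch of the expression a query node could use on each pass, assuming pv!counter is 1 on the first loop and is incremented by the script task after each call; cons!MY_ENTITY and the "id" sort field are illustrative placeholders, not names from the original post:

        /* Sketch of the query expression for one loop iteration.
           Assumes pv!counter starts at 1 and is bumped after each call;
           cons!MY_ENTITY and the "id" sort field are hypothetical. */
        a!queryEntity(
          entity: cons!MY_ENTITY,
          query: a!query(
            pagingInfo: a!pagingInfo(
              /* loop 1 reads rows 1-3000, loop 2 reads 3001-6000, and so on */
              startIndex: (pv!counter - 1) * 3000 + 1,
              batchSize: 3000,
              /* a stable sort keeps batches from overlapping or skipping rows */
              sort: a!sortInfo(field: "id", ascending: true)
            )
          ),
          fetchTotalCount: true
        )

    Because fetchTotalCount is true, the returned DataSubset carries a totalCount, so the exit gateway could also test whether the next startIndex exceeds totalCount (or whether the last batch returned fewer than 3,000 rows) instead of hard-coding a stop at the 5th loop.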