Dynamic batch size for queryEntity

Certified Senior Developer

Can we calculate the batch size based on the table's data length (in bytes) and the amount of data we retrieve per Appian query? I have a requirement to send data from multiple tables to an API.

Example, for one table:

I have a table with 235,417 rows and a data length of 18,366,464 bytes. Since queryEntity can return at most 1 MB (1,000,000 bytes) of data per call, I expect the batch size to be decided dynamically from the data length (18,366,464 bytes) and the per-call limit (1,000,000 bytes), so the whole table can be sent in chunks. If I hardcode a value such as 10k or 5k, it may break for some other table with more rows or wider columns. So my thought is that it's better to derive the batch size dynamically from the table's data length.
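The arithmetic behind the idea can be sketched as follows: estimate the average row size from the table's total data length and row count, then pick the largest batch that stays under the 1 MB response limit. This is a minimal Python sketch, not Appian code — the function name, the safety factor, and the assumption that data length divided by row count is a usable average row size are all mine, not from Appian's documentation:

```python
MAX_QUERY_BYTES = 1_000_000  # per-call response limit stated in the question (~1 MB)

def dynamic_batch_size(row_count: int, data_length_bytes: int,
                       safety_factor: float = 0.8) -> int:
    """Estimate how many rows fit in one queryEntity call.

    safety_factor leaves headroom, since data length / row count is only
    an average -- individual rows (wide columns, long text) can be larger.
    """
    avg_row_bytes = data_length_bytes / row_count
    batch = int((MAX_QUERY_BYTES * safety_factor) // avg_row_bytes)
    return max(batch, 1)  # never return a zero batch size

# Figures from the question: 235,417 rows, 18,366,464 bytes total
print(dynamic_batch_size(235417, 18366464))
```

With the figures above this yields a batch size of roughly 10,000 rows, versus a guessed hardcoded 5k or 10k; for a table with wider rows the same formula would automatically shrink the batch. The same division could be written as a SAIL expression once the row count and data length are queried from the database's metadata.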

