Hi All,
I have a use case where I query a database table and get a set of 100 records via pagingInfo. Once I pass this data from my rule to the process model, I need to extract the records one by one, perform some action on each, write it to the DB, and then loop back within the same process model to fetch the second row, repeating this in batches. My question is: how do I take a batch of 100 records and process just one row at a time, from the 1st row to the 2nd and so on, without using MNI (multiple node instances)? Please advise.
Did you consider looping inside the process model? docs.appian.com/.../looping.html
Dusty said: I need to extract the records one by one, perform some action on them and write to DB and then loop back again in the same process model to fetch the 2nd row data and repeat this in batches.
Instead of writing records one by one during processing, you should use a!forEach() to perform calculations on each record individually and collect all the processed results. Then, after the forEach completes, you can write all the processed records to the database in a single batch operation. This approach gives you the benefit of individual record processing while maintaining optimal database performance through batch writing, eliminating the need for MNI or complex looping patterns in your Appian process model.
A better way to achieve this is to query the 100 records, manipulate them in an expression rule using a!forEach(), and then pass the rule's output to a Write Records node.
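A minimal sketch of that pattern, assuming a helper rule rule!SAC_getClientCases that wraps the query (the rule name and the "PROCESSED" status value are illustrative, not from the original posts):

```
a!localVariables(
  /* Query all 100 records in one batch (hypothetical wrapper rule) */
  local!cases: rule!SAC_getClientCases(pagingInfo: null).data,
  /* Transform every record in memory with a!forEach() */
  local!processedCases: a!forEach(
    items: local!cases,
    expression: a!update(
      data: fv!item,
      index: "caseStatus",
      value: "PROCESSED" /* illustrative manipulation */
    )
  ),
  /* Return the full transformed list so the process model can
     write it in one batch, instead of writing row by row */
  local!processedCases
)
```

The output of this rule can then be saved to a process variable and passed as-is to a single Write Records (or Write to Data Store Entity) node.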
Thank you, Harsha Sharma. Could you please advise how to write this using the a!forEach() function?
My expression rule is below, and I'm retrieving all 100 records in one batch to pass to the process model. Now I need to fetch the 1st row from this rule's output in the process model, perform some action, then loop back to fetch the 2nd row, perform some action, loop back again for the 3rd row, and so on until I reach the 100th record.
Rule:
a!localVariables(
  local!getData: a!queryEntity_22r2(
    entity: cons!SAC_ENTITY_CLIENTCASES,
    fetchTotalCount: true,
    query: a!query(
      logicalExpression: a!queryLogicalExpression(
        operator: "AND",
        filters: {
          a!queryFilter(field: "docsSent", operator: "is null")
        }
      ),
      selection: a!querySelection(
        columns: {
          a!queryColumn(field: "caseId"),
          a!queryColumn(field: "caseStatus"),
          a!queryColumn(field: "clientName"),
          a!queryColumn(field: "folderId")
        }
      ),
      pagingInfo: if(
        rule!APN_isBlank(ri!pagingInfo),
        a!pagingInfo(
          startIndex: 1,
          batchSize: 5,
          sort: a!sortInfo(field: "caseId", ascending: true)
        ),
        ri!pagingInfo
      )
    )
  ),
  local!getData
)
The output of the rule is saved into a PV of type Data Subset in the process model.
Hi. Try this inside a Script Task, and save the result in a map variable set to 'multiple'.
a!forEach(
  items: pv!data.data,
  /* adjust the logic based on your requirement */
  expression: a!update(
    data: fv!item,
    index: "caseStatus",
    value: "ACTIVE"
  )
)
Thanks, osanchea. In the expression part of a!forEach I don't want to perform any update; instead I just want to fetch the first row of data, and then in the next flow I will perform some actions. How do I fetch the first row of data? And then, on each loop, the same rule should fetch the second row, then the third row, and so on.
In that case, you can follow these steps:
1. Create a process variable pv!counter and set its default value to 1.
2. To reference the current row on each iteration, use this expression: index(pv!data.data, pv!counter, null) (indexing into the datasubset's .data property, since pv!data is a Data Subset).
3. Configure an XOR node with the condition pv!counter < 100; if yes, go to the "counter +1" node, if no, go to the end node.
4. In the "counter +1" node, increment the counter and save the result into pv!counter: pv!counter + 1
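Putting those steps together, the two Script Task expressions might look like this (pv!data, pv!counter, and pv!currentRow are assumed process variable names; this is a sketch of the pattern, not a definitive implementation):

```
/* Script Task 1: fetch the row for the current iteration.
   Save the result into pv!currentRow, then perform the
   downstream actions on pv!currentRow. */
index(pv!data.data, pv!counter, null)

/* Script Task 2, reached when the XOR gateway evaluates
   pv!counter < 100 to true: increment the counter and
   save the result back into pv!counter before looping. */
pv!counter + 1
```

The loop ends when the XOR condition is false, after the 100th row has been processed.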
Thank you so much, osanchea, really appreciate it.