An error occurred while evaluating expression: workflowDetails:rule!CNAM_GC_qeForAllTables(Flag:"WorkflowbyWorkFlowId",wfid:pv!workflowDetails.workflw_id) (Expression evaluation error in rule 'cnam_gc_qeforalltables' at function a!queryEntity [line 254]: An error occurred while retrieving the data. Details: Memory threshold reached during output conversion (rule: [queryentity expression], type: [CNAMGCWORKFLOWDT41204], threshold: [1,048,576 bytes], objects successfully converted: [12,000])) (Data Outputs)
The expression rule "CNAM_GC_qeForAllTables" fetches too much data from the database. You will need to limit that.
How can I limit that?
Limit your batch size (in the pagingInfo of your a!queryEntity) to the number of rows you actually show on the UI at a time.
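A minimal sketch of what that could look like, assuming the rule's query maps to a grid that shows 25 rows at a time (the constant name and filter field here are illustrative, not taken from your actual rule):

```
a!queryEntity(
  entity: cons!CNAM_GC_WORKFLOW_ENTITY, /* hypothetical constant for your data store entity */
  query: a!query(
    filter: a!queryFilter(
      field: "workflw_id",
      operator: "=",
      value: pv!workflowDetails.workflw_id
    ),
    pagingInfo: a!pagingInfo(
      startIndex: 1,
      batchSize: 25 /* fetch only what the UI displays, not the whole table */
    )
  )
)
```

With a batchSize of -1 (or a very large value), the query tries to convert every matching row at once, which is what trips the 1 MB memory threshold in your error.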
I suspect you're not using this in conjunction with a UI. It appears 12,000 rows were converted before the memory threshold was reached.
A common tactic is to do work like this in batches of 1,000, repeating until the job is done. This reduces the amount of data returned by any one query. You have several ways of doing this, including MNI or subprocesses. For instance, you could make a process model that processes 1,000 rows starting at some arbitrary start index, then create another process that calls it with start indexes 1, 1001, 2001, etc. until all rows are processed.
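A sketch of the query inside such a subprocess, assuming a hypothetical process variable pv!batchNumber (1, 2, 3, ...) set by the parent process or by the MNI instance index:

```
/* Fetch one batch of 1,000 rows; only startIndex changes per instance */
a!queryEntity(
  entity: cons!CNAM_GC_WORKFLOW_ENTITY, /* hypothetical entity constant */
  query: a!query(
    pagingInfo: a!pagingInfo(
      startIndex: (pv!batchNumber - 1) * 1000 + 1, /* 1, 1001, 2001, ... */
      batchSize: 1000
    )
  )
)
```

The parent process only needs to count total rows once (e.g. with fetchTotalCount) and spawn enough instances to cover them; each instance stays well under the conversion threshold.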