Interface Definition: Expression evaluation error [evaluation ID = c9564:63ac5] : An error occurred while executing a save: Expression evaluation error in rule 'TID_documentsquery' at function a!queryEntity_18r3 [line 2]: An error occurred while retrieving the data. Details: Memory threshold reached during output conversion (rule: [queryentity expression], type: [TIDContentTagsVwDT31557], threshold: [1,048,576 bytes], objects successfully converted: [3,228])
Your query call has reached the memory threshold limit (~1 MB).
docs.appian.com/.../Post-Install_Configurations.html
This usually happens when you call a query entity with the batch size set to -1 while pulling a large number of records from the database. Check whether you can reduce the output size by applying filters or a smaller batch size.
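As a rough sketch of what a bounded query looks like (entity constant and batch size here are illustrative, not a recommendation for your data volume):

```
a!queryEntity(
  entity: cons!TEST_ENTITY_CONTENT_VW,
  query: a!query(
    /* A bounded batch instead of -1 keeps the converted
       output under the 1 MB memory threshold */
    pagingInfo: a!pagingInfo(
      startIndex: 1,
      batchSize: 1000
    )
  )
)
```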
Actually, I need all the data to be fetched.
It is not recommended to query more than 1 MB of data, to avoid memory issues. If you still need all of it, query in batches. For example, if there are 50K records, query them 5 times with a batch size of 10K.
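A minimal sketch of that batching approach, assuming you wrap your existing query in a hypothetical helper rule (here called rule!TID_documentsQueryPaged) that takes the paging info as an input and returns the query's datasubset:

```
/* Sketch only: fetches 50,000 rows in 5 pages of 10,000 each.       */
/* rule!TID_documentsQueryPaged is an assumed wrapper that runs the  */
/* same a!queryEntity with the pagingInfo passed in.                 */
a!flatten(
  a!forEach(
    items: enumerate(5),  /* {0, 1, 2, 3, 4} */
    expression: rule!TID_documentsQueryPaged(
      pagingInfo: a!pagingInfo(
        startIndex: fv!item * 10000 + 1,
        batchSize: 10000
      )
    ).data
  )
)
```

Note that each page is still converted in memory, so this avoids the per-call threshold but not the overall cost of holding everything at once; batching is most useful when each page is processed or written out before the next is fetched.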
Also, if you can share your use case, community members can share some best practices around it.
For what purpose?
Can you please share the code for reference?
```
a!queryEntity_18r3(
  entity: cons!TEST_ENTITY_CONTENT_VW,
  query: a!query(
    selection: a!querySelection(
      columns: {
        a!queryColumn(field: "contentId"),
        a!queryColumn(field: "contentAreaId"),
        a!queryColumn(field: "categoryId"),
        a!queryColumn(field: "topicId"),
        a!queryColumn(field: "UnitId"),
        a!queryColumn(field: "Owner"),
        a!queryColumn(field: "docName"),
        a!queryColumn(field: "createdDt"),
        a!queryColumn(field: "docDescription"),
        a!queryColumn(field: "docLink")
      }
    ),
    logicalExpression: a!queryLogicalExpression(
      operator: "AND",
      filters: {
        a!queryFilter(field: "contentId", operator: "not null"),
        if(
          or(isnull(ri!contentAreaIds), rule!APN_isEmpty(ri!contentAreaIds)),
          {},
          a!queryFilter(field: "contentAreaId", operator: "in", value: ri!contentAreaIds)
        ),
        if(
          or(isnull(ri!categoryIds), rule!APN_isEmpty(ri!categoryIds)),
          {},
          a!queryFilter(field: "categoryId", operator: "in", value: ri!categoryIds)
        ),
        if(
          or(isnull(ri!topicIds), rule!APN_isEmpty(ri!topicIds)),
          {},
          a!queryFilter(field: "topicId", operator: "in", value: ri!topicIds)
        ),
        if(
          rule!APN_isBlank(ri!buId),
          {},
          a!queryFilter(field: "UnitId", operator: "=", value: ri!buId)
        ),
        if(
          rule!APN_isBlank(ri!Owner),
          {},
          a!queryFilter(field: "Owner", operator: "=", value: ri!Owner)
        ),
        if(
          rule!APN_isBlank(ri!title),
          {},
          a!queryFilter(field: "docName", operator: "includes", value: ri!title)
        ),
        if(
          or(rule!APN_isBlank(ri!startDate), rule!APN_isBlank(ri!endDate)),
          {},
          {
            a!queryFilter(field: "createdDt", operator: ">=", value: todatetime(ri!startDate)),
            a!queryFilter(field: "createdDt", operator: "<=", value: todatetime(ri!endDate))
          }
        )
      }
    ),
    pagingInfo: a!pagingInfo(startIndex: 1, batchSize: -1)
  )
)
```

I have applied the selection as well, but the issue still persists.
It seems you are fetching all the data by setting the batch size to -1 (even though you select only a few columns). Can you try increasing the batch size gradually to find the threshold point?
Yeah, I increased it and it works, but then it doesn't fetch the complete count of data.
I have provided a batch size of 5405.
Again, what is the purpose of fetching ALL of the data into memory?