Hi All,
I am running this a!queryEntity in a script task inside a process model, but I am getting an error:
ERROR: Expression evaluation error at function a!queryEntity [line 1]: An error occurred while retrieving the data
a!queryEntity(
  entity: cons!tableName,
  query: a!query(pagingInfo: a!pagingInfo(1, -1))
).data.columnName
Note: I have 20,000 rows in that table. Could that be the reason?
Could anyone help me with this issue?
Hello nikkheelm,
Most likely that kind of error message means exactly what it says. Just to be sure, could you please change the batch size to 5 or 10 and then try running the node? If that works, we can be confident it is a timeout error caused by fetching a large amount of data.
Also, what is the type of the pv! that you are trying to store the value in?
It should be of the same CDT type as the entity.
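For example, a quick way to test with a small batch (reusing the entity constant from the original post; the batch size of 10 is just for this test) might be:

```
a!queryEntity(
  entity: cons!tableName,
  /* startIndex 1, batchSize 10: fetch only the first 10 rows */
  query: a!query(pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 10))
).data
```

If this succeeds where batchSize -1 failed, the problem is volume, not the query itself.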
Hi ashwina0002, thanks for the answer.
I changed the batch size to 5 (and also tried 10) and it is working; I am getting data.
Ashwin said: Also, what is the type of the pv! that you are trying to store the value in?
I am using a Text pv type because, as per my requirement, I am getting all the values from one specific column in the above table and storing them in this pv variable.
touniformstring(a!queryEntity(...).data.column1) is what I am doing in that script task.
So the error is volume related. You'll need to either retrieve the data in batches, or, if you want to do some aggregation on the data, do so in your a!queryEntity().
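As a sketch of the aggregation approach (reusing cons!tableName and the column name "column1" from earlier in the thread; grouping on the column returns its distinct values, which may be much smaller than 20,000 rows):

```
a!queryEntity(
  entity: cons!tableName,
  query: a!query(
    aggregation: a!queryAggregation(
      aggregationColumns: {
        /* grouping on column1 returns its distinct values only */
        a!queryAggregationColumn(field: "column1", isGrouping: true)
      }
    ),
    pagingInfo: a!pagingInfo(startIndex: 1, batchSize: -1)
  )
).data.column1
```

Whether this fits depends on whether the requirement needs every row's value or only the distinct ones.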
Thank you, Stewart Burchell.
I think in this case I need to go with this solution:
Stewart Burchell said: You'll need to either retrieve the data in batches
I will get the data in batches and store them in different PVs (for example, 10 PVs), and at the end concatenate all 10 PVs into one PV. Would that be a best practice, so that I can use that PV in the entire process?
I am not sure that is a best practice Appian follows, because the size of the column will increase day by day, and every day you would need to create a new pv to fetch the new data. So I believe this is not the right approach. Moreover, you can try to aggregate the data, which might help you.
Can you suggest any other best practice?
Nikkheel said: as per my requirement i am getting all the values from one specific column
What is this requirement attempting to accomplish, exactly?
I believe we should spend more effort figuring out what goal they're trying to reach, and then find a better way to accomplish it, because the current approach will not scale and will not perform.
Nikkheel, create two PVs of type Text (array). Into the first PV, fetch one batch of data, then append it to the second PV. Repeat this until you have retrieved all the data; at the end, the second PV will hold the entire data set.
That said, you should first check whether the issue is really due to data volume or due to invalid data that Appian cannot process (one example was mentioned by Edward Anthony Allen), because 20,000 rows of a single column value reaching the threshold is a little unrealistic.
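A minimal sketch of one loop iteration in the script task, assuming hypothetical variable names pv!batchStart (Integer, initialized to 1), pv!allValues (Text array), and an arbitrary batch size of 1000, with cons!tableName and "column1" as in the original post:

```
/* Output expression mapped to pv!allValues:                  */
/* fetch the next 1000 rows and append them to the running    */
/* array held in the second PV.                               */
append(
  pv!allValues,
  touniformstring(
    a!queryEntity(
      entity: cons!tableName,
      query: a!query(
        pagingInfo: a!pagingInfo(
          startIndex: pv!batchStart,
          batchSize: 1000
        )
      )
    ).data.column1
  )
)
```

A second output on the same node would set pv!batchStart to pv!batchStart + 1000, and a gateway loops back while the last batch returned a full 1000 rows.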