Hi guys,
Is there a new limit to how many rows can be stored in a process variable that is marked as multiple? I made a query that returned 146 elements, and the length of the ac!variable is 146, but the pv!variable only contains 101 elements. Everything after the 101st element seems to have been trimmed off.
Hi saurabhrajnala, what type of data are you trying to store in the PV and AC variables, and what is your requirement? Storing that much data in a PV can cause performance and memory management issues.
I try to avoid doing such things. But I have to admit that I never tested that.
Okay, I was able to bypass this. Instead of storing the indexed data (146 rows) of the datasubset in the variable, I stored the datasubset itself, and that saved with no problem: the PV treats it as a single row of data, but it actually contains all 146 rows inside it. Strange how this particular thing works.
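To make the workaround concrete, here is a rough sketch of the two output expressions; cons!MY_ENTITY, pv!rows and pv!rowsSubset are placeholder names, and the query itself is assumed:

a!localVariables(
  /* Query the data once; batchSize -1 asks for all rows */
  local!subset: a!queryEntity(
    entity: cons!MY_ENTITY,   /* placeholder data store entity constant */
    query: a!query(
      pagingInfo: a!pagingInfo(startIndex: 1, batchSize: -1)
    )
  ),
  /* What got trimmed at 101 items: saving the indexed rows into a
     multiple PV of the CDT type, i.e. local!subset.data -> pv!rows */
  /* The workaround: save the whole datasubset into a single PV of type
     DataSubset, so all 146 rows travel inside it */
  local!subset   /* -> pv!rowsSubset (single, type DataSubset) */
)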
I think that's not related to the number of elements, but rather to the size of the elements returned.
Anyway, as Abhishek Karumuru and Stefan Helzle said, I would try to avoid that, because it can easily lead to performance issues.
No buddy, it isn't related to the size of the data; that's what I wondered too. I tried an array of a CDT with 2 fields and an array of a CDT with 20 fields, and both got trimmed at the 101st element. Also, as my reply above your message shows, size isn't the issue, since I was able to bypass it as described there. And the node that queried the data executed in less than a second, so performance and memory usage aren't a concern here.
Sorry! I didn't read your last answer. In that case, maybe there is some limit on the array size? Did you check the configuration files?
When I mentioned that this can lead to performance problems, I meant that it is not a good approach to store too much information in variables, because that consumes a lot of memory, not because of the time it may take to query the data.
I agree it could lead to performance problems if the CDT also has many fields. In my particular use case it's not very critical.
Not sure about the configuration files, but maybe something to look into. Does this same trimming at 101 elements not happen for you?
Can you check the process variable size (i.e. the number of items in the array) using the count() function, to make sure that elements are actually getting trimmed? I am not able to reproduce it.
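For example, something like this in a script task output (pv!rows being a placeholder for the multiple PV) would show whether rows are really being dropped:

count(pv!rows)   /* expected 146 here; you report seeing 101 */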
Which version are you using and what kind of installation?
I have one scenario from a previous project that I would like to share. We were fetching a large amount of data into a process variable of type Text, and it held more than 400,000 characters. When we fetched similar data into an array, the result was limited to 4 rows. So the point is that it depends on the threshold limit of the variable: different data types differ in how much data they can hold. Although I agree that it should be avoided, as it leads to performance issues while monitoring the instances.