Hi guys,
Is there a new limit to how many rows can be stored in a process variable that is marked as multiple? I ran a query that returned 146 elements, and the length of the ac!variable is 146, but the pv!variable only contains 101 elements. Everything after the 101st element seems to have been trimmed off.
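For context, a minimal sketch of the sort of query node output I mean (simplified, with placeholder entity and constant names, not my actual rule):

    a!queryEntity(
      entity: cons!MY_ENTITY, /* placeholder data store entity constant */
      query: a!query(
        pagingInfo: a!pagingInfo(startIndex: 1, batchSize: -1) /* -1 = fetch all rows */
      ),
      fetchTotalCount: true
    ).data

The .data of that datasubset is what gets saved into the multiple PV.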
I think that's not related to the number of elements, but to the size of the elements returned.
Anyway, as Abhishek Karumuru and Stefan Helzle said, I would try to avoid that, because it can easily lead to performance issues.
No buddy, it isn't related to the size of the data. That's what I wondered too, but I tried an array of a CDT with 2 fields and an array of a CDT with 20 fields, and both were trimmed to 101 elements. Also, if you look at my reply above your message, it's fairly clear that size isn't the issue, since I was able to bypass it by doing what I described there. On top of that, the node that queried the data executed in under a second, so performance and memory usage aren't a concern here.
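For what it's worth, the test arrays were built along these lines (sketch only - the CDT and field names are placeholders, and the 20-field version was the same idea with more fields):

    a!forEach(
      items: enumerate(146),
      expression: type!MyTestCdt( /* placeholder 2-field CDT */
        id: fv!index,
        label: "Row " & fv!index
      )
    )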
I agree it could lead to problems with performance if the CDT also has many fields. In my particular use case it’s not very critical.
I'm not sure about the configuration files, but that may be something to look into. Does the same trimming to 101 elements not happen for you?
Which version are you using and what kind of installation?
We are using 23.4 with 3 Appian engines and a single application server configuration.
Could you share the code snippet where you fetch the data?
It's not just that query. I also created a text array of 200 elements using enumerate(), and it was likewise trimmed to 101 elements.
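Roughly what I ran (the exact text produced doesn't matter):

    a!forEach(
      items: enumerate(200),
      expression: "Item " & fv!index /* 200-element text array */
    )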
Are you talking about the actual size of the data array stored in the PV, or the amount of information Appian will display in the UI before truncation? AFAIK these are two different things. Just because the information displayed in the GUI is truncated does not mean the data itself has been truncated - to determine that you'd need to call length() on the variable (in process) and see what the output is, and/or feed the PV's value into an interface and display it in a paragraph field / grid / etc. (something where you're able to fully judge the size of the data contained therein), or use another method along those lines. Viewing the value on the process instance's Variables tab, and/or the same in Process History, is not necessarily guaranteed to show you the full contents of a data-heavy variable.
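For example (placeholder PV / rule-input names, just to illustrate), a script task output like

    length(pv!queryResults)

saved into an integer PV will tell you the real element count, and a throwaway interface along the lines of

    a!paragraphField(
      label: "Raw value",
      value: joinarray(ri!values, char(10)), /* ri!values bound to the PV; works for a text array - a CDT array would need a grid or similar */
      readOnly: true
    )

lets you eyeball the full contents without relying on the monitoring screens.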
Posting at the same time as Mike here..
Yes, the Process History tab 'truncates' your large data sets, but the Variables tab (the actual data set) will not. That screenshot is the key - check the Variables tab as well.
Sample test in my environment, Process History:
Variables:
Not necessarily for an array, but even the Variables tab is not perfect in my experience - for example, a plain-text value that's fairly long (i.e. at least a thousand or so characters, just guesstimating from memory) will eventually be truncated in what's displayed on the Variables tab (though the value itself is not actually truncated).
I would agree we shouldn't rely on the Variables tab alone, especially for larger values. I'm assuming it will show that no truncation is occurring in this case, however.
Thanks Mike Schmitt and Chris, that was insightful. The reason I felt certain the entire variable was being truncated is that the process was supposed to write the 124th element to the DB, but it never did. I'm not quite sure why that happened, because, as you both suggested, when I check the variable it does seem to contain all the elements and not just the truncated 101 that Process History showed.
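One more way to sanity-check that specific element from a script task output (the variable name here is a placeholder):

    index(pv!queryResults, 124, "MISSING") /* returns the default if that position doesn't exist */

If that comes back with real data, the PV itself isn't truncated, and the missing DB write is likely caused by something else, e.g. the inputs mapped into the Write to Data Store Entity node.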