Appian 24.1
My project receives a JSON document that is normally somewhere between 50 and 100 KB in size. We jam it into a record type (as a JSON string), read it from a process model, convert it to a CDT (a hierarchy, really), and pass it to an interface (a hierarchy of them, really). So far, everything we have built runs fine.
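For context, the conversion step is essentially the pattern below (the type namespace, CDT name, and variable name are illustrative, not our real ones):

/* Convert the stored JSON string into the CDT hierarchy the interfaces expect. */
/* 'type!{urn:com:example}Order' is a placeholder for our actual CDT.           */
cast(
  typeof('type!{urn:com:example}Order'()),
  a!fromJson(pv!jsonText)
)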
Now we have hit a case where the JSON document was over 1 MB. This threw an error when we tried to write it to the record type. Apparently there is a 1 MB row size limit in Appian for tables that sync to record types, so we are now looking at redesigning the data storage.
Clearly, we need to break up/normalize the data into a set of relational tables (versus the single table we have today). But from there I seem to have some options, and I don't know which is best.
Option 1 - Read the record types within the process model and rebuild the JSON/CDT hierarchy so it can be passed to the interfaces as before. This moves the maximum-size question from the table row to a process variable. Can a process variable hold 1-2 MB of data? Should it? This option also minimizes changes to the current system, since the interfaces need no alteration.
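To make Option 1 concrete, this is the kind of expression rule I am imagining, called from a script task with the result written to a process variable. The record types, fields, and rule name are placeholders for whatever the normalized model ends up being:

/* Hypothetical rule!ORD_buildOrderHierarchy(orderId): re-assemble the full */
/* structure from the normalized, synced record types.                      */
a!localVariables(
  local!header: a!queryRecordType(
    recordType: recordType!Order,
    filters: a!queryFilter(
      field: recordType!Order.fields.id,
      operator: "=",
      value: ri!orderId
    ),
    pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 1)
  ).data,
  local!lines: a!queryRecordType(
    recordType: recordType!OrderLine,
    filters: a!queryFilter(
      field: recordType!OrderLine.fields.orderId,
      operator: "=",
      value: ri!orderId
    ),
    pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 5000)
  ).data,
  /* Re-attach the child rows under the same key the interfaces already read */
  a!update(
    data: index(local!header, 1, {}),
    index: "lines",
    value: local!lines
  )
)

If the record-type relationships are defined, it may also be possible to pull the parent and children in a single a!queryRecordType using relationship references, but I have not tried that at this data volume.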
Option 2 - Have the interfaces read the record types dynamically when displaying the data. Here the process model would pass only PK information to the interface, and the interface pulls the data when it renders. This trades the data-size concern for a UX performance concern. I already see a lag of a few seconds when the process model reads/writes the record type, and I would hate to see multiple multi-second lags as the various interfaces load data from different record types.
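A rough sketch of Option 2, where the form receives only the PK and queries on load (again, all names are placeholders). Since local variables evaluate once when the interface loads, each query should run once per form rather than on every user interaction:

/* Hypothetical interface: the process model passes only ri!orderId. */
a!localVariables(
  /* Reuse the same assembly rule so the display logic stays the same */
  local!order: rule!ORD_buildOrderHierarchy(orderId: ri!orderId),
  a!formLayout(
    label: "Order " & ri!orderId,
    contents: {
      a!textField(
        label: "Status",
        value: index(local!order, "status", null),
        readOnly: true
      )
      /* ...plus the existing sections/grids, now reading from local!order */
      /* instead of the CDT the process model used to pass in...           */
    }
  )
)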
Which of these should I use? Or is there some other way to handle large JSON data?
Thanks.