How to handle JSON data of unusual size

Appian 24.1

My project receives a JSON document that is normally somewhere between 50 and 100 KB in size. We jam it into a record type (as a JSON string), read it from a process model, convert it to a CDT (a hierarchy), and pass it to an interface (a hierarchy of them, really). So far, everything we have built runs fine.
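
For context, the conversion step today is roughly the following (a rough sketch only; the variable, record type, and CDT names are placeholders for ours):

a!localVariables(
  /* pv!jsonPayload holds the JSON string read from the record in the process model */
  local!parsed: a!fromJson(pv!jsonPayload),
  /* cast the parsed value into the CDT hierarchy the interfaces expect */
  cast('type!{urn:com:example:types}MyRequest', local!parsed)
)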

Recently we had a case where the JSON document was over 1 MB. This threw an error when we tried to write it to the record type. Apparently, Appian has a 1 MB row size limit for tables that sync to record types. So now we are looking at redesigning the data storage.

Clearly, we need to break up/normalize the data into a set of relational tables (versus the single table we have today). But from there I seem to have some options, and I don't know which is best.

Option 1 - Read the record types within the process model and rebuild the JSON object so it may be passed to the interfaces.
This moves the maximum-size question from the table row to a process variable. Can a process variable hold 1-2 MB of data? Should it? This also minimizes changes to the current system, as the interfaces need no alteration.
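
If I went this way, I picture the rebuild happening in a script task, roughly like this (a sketch only; the record type, field, and variable names are placeholders):

a!localVariables(
  /* parent row for this document */
  local!header: index(
    a!queryRecordType(
      recordType: recordType!DocHeader,
      filters: a!queryFilter(
        field: recordType!DocHeader.fields.id,
        operator: "=",
        value: pv!docId
      ),
      pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 1)
    ).data,
    1,
    null
  ),
  /* child rows that used to be nested inside the JSON */
  local!lines: a!queryRecordType(
    recordType: recordType!DocLine,
    filters: a!queryFilter(
      field: recordType!DocLine.fields.docId,
      operator: "=",
      value: pv!docId
    ),
    pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 5000)
  ).data,
  /* reassemble the hierarchy the interfaces already expect */
  a!map(
    header: local!header,
    lines: local!lines
  )
)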

Option 2 - Have the interfaces read the record types dynamically when displaying the data.
Here the process model would pass PK information to the interface, and the interface would pull the data when it renders. This changes the problem from a data size concern to a UX performance concern. I already see a lag of a few seconds when the process model reads/writes the record type. I would hate to see multiple few-second lags as various interfaces load data from different record types.
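
In that design the interface would take only the key as a rule input and do its own lookup when it renders, roughly like this (a sketch; names are placeholders):

a!localVariables(
  /* ri!docId is the primary key passed in from the process model */
  local!lines: a!queryRecordType(
    recordType: recordType!DocLine,
    filters: a!queryFilter(
      field: recordType!DocLine.fields.docId,
      operator: "=",
      value: ri!docId
    ),
    pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 100)
  ).data,
  /* ...display components that read local!lines go here... */
  {}
)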

Which of these should I use? Or is there some other way to handle large JSON data?

Thanks.

  • Certified Lead Developer

    Option 2, with document storage: store the raw JSON as a document (this bypasses the 1 MB record limit), normalize the data into relational tables, and have process models pass only primary keys to interfaces, which then query the records with pagination (this avoids the 5 MB process variable limit and memory issues).

  • To save the JSON, would I use the Text Doc From Template smart service?  (I'm receiving the entire JSON document from an integration call.)
    EDIT:  I was able to create the JSON document using Text Doc From Template.  That leads to another question: what limits apply to these documents?  Is it okay if I have, say, a quarter million JSON documents?

    Would I need both the raw JSON as a document AND the relational tables?  Currently, the JSON data is converted into a CDT structure, which is passed to the interfaces.  If I read the JSON from a document and then convert it to a CDT, couldn't I just pass the whole CDT (as I do today) to the interface?  (Roughly what I have in mind is sketched below.)

    Thanks.
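
    Roughly what I have in mind for the read-back (a sketch only; the document variable and CDT names are placeholders, and I would still need to confirm readtextfromfile() handles a 1-2 MB document cleanly):

    a!localVariables(
      /* pv!jsonDoc is the document created by Text Doc From Template */
      local!jsonText: readtextfromfile(pv!jsonDoc),
      local!parsed: a!fromJson(local!jsonText),
      /* cast into the same CDT hierarchy the interfaces use today */
      cast('type!{urn:com:example:types}MyRequest', local!parsed)
    )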

  • Certified Lead Developer, in reply to stevenh6272

    Working directly with external data structures is always a risk, as this data typically changes frequently.

    See my podcast episode: https://appian.rocks/2025/11/06/episode-31-dealing-with-external-data-models/

    Once you have made that decision, I would probably transform the incoming JSON into a record structure right after receiving it (there is a rough sketch at the end of this reply).

    The query time you observe in a process is not the actual query time; it reflects the end-to-end node execution time, which includes queuing and other overhead. Appian does not take a second just to fetch a few fields from the database.

    If you only need to display that data in an interface, pass only the PK, and fetch data as needed. If you need to modify the data, fetch the data in process, pass it to the interface, and write it back in a write records node.
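
    As a rough sketch of that transformation right after receiving the JSON (the field and type names below are placeholders, not your actual model):

    a!localVariables(
      local!parsed: a!fromJson(pv!rawJson),
      /* one row per line item; these rows would then be cast to the record type
         and written with a Write Records node */
      a!forEach(
        items: index(local!parsed, "lines", {}),
        expression: a!map(
          docId: index(local!parsed, "id", null),
          lineNumber: fv!index,
          description: index(fv!item, "description", null),
          amount: index(fv!item, "amount", null)
        )
      )
    )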

  • I think your podcast was a bit too broad and high-level for my purposes.  I do receive a bit of data from an external system, but it becomes a permanent copy of the source data within Appian.  There is no need to synchronize the data back to the source.  The sending system already formats the data specifically to accommodate our Appian solution's needs... thus there is no need for a DTE, as the coupling problem has already been addressed.

    Appian uses the received data mostly for read-only purposes, but it also adds to the data in the form of line-item approvals and some other data.  This additional data (mostly) remains in Appian... (We do make a couple of integration calls to send some of the new data back to the source system.)

    Mostly, what I am wrestling with is a data size issue... not really a data integration/passing issue.  We store this data, as I mentioned, in a DB table synced to a record type.  In the normal case, the data is well within the 1 MB limit for record type syncing.  By normal, I mean 99.9999% of the time... it is a very remote edge case where the data exceeds 1 MB.  In fact, it has only happened once.

    However, this one remote edge case will come back to us a year from now, and we will need to be ready to handle it.  Hence my investigation into other storage options.

    I both display the data and collect line-item approvals, so that falls under your second suggestion of fetching the data in the process and passing it to the interface.  This was my first thought as well... and it is the least intrusive to the existing solution.

    Thank you for your input.
