Hi,
We have a daily batch process that loads files from a remote FTP server into the Appian server as Documents and then processes each Document.
When we manipulate a Document with the readtextfromfile() function, Appian appears to load the entire Document into memory, as opposed to a stream-based approach where only the currently streamed portion of the Document occupies memory. So the question is: what is the maximum file/Document size allowed?
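For contrast, this is what a stream-based read looks like outside of Appian's expression layer, e.g. in a custom Java plug-in. This is only an illustrative sketch (the file and line-counting logic are made up for the example); the point is that a BufferedReader holds just its internal buffer in memory, not the whole file:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamRead {
    // Count lines without loading the whole file into memory:
    // BufferedReader keeps only its internal buffer (8 KB by default),
    // regardless of how large the underlying file is.
    static long countLines(Path file) throws IOException {
        long lines = 0;
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            while (reader.readLine() != null) {
                lines++;
            }
        }
        return lines;
    }

    public static void main(String[] args) throws IOException {
        // Illustrative input; a real batch job would point at the downloaded document.
        Path tmp = Files.createTempFile("batch", ".txt");
        Files.write(tmp, java.util.List.of("row1", "row2", "row3"));
        System.out.println(countLines(tmp));  // prints 3
        Files.delete(tmp);
    }
}
```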
I did search the existing Q&A about this but didn't find a definitive answer. Some say it's 1 GB and configurable. We are running on Appian Cloud for the moment, but will probably move to an on-premise installation on Amazon EC2 later. In either case, what is the maximum document size limit, relative to the host server's memory size, etc.?
Thanks for your help!
Thanks. Is my understanding correct that, in order to parse the file with Appian, the entire file string (up to 1 GB) needs to be loaded into memory? Is there a stream-based option?
Any update?
From my point of view, you can upload documents into Appian, but it is not a document management system.
The more documents you store, the fewer resources Appian has left to work with.
I wouldn't recommend loading a 1 GB file into memory with readtextfromfile().
I usually try to stay within the 75 MB limit imposed by Integrations.
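One practical pattern that follows from the advice above: check the document's size before attempting a whole-file read, and fall back to fixed-size chunks for anything over the threshold. A hedged Java sketch (the 75 MB ceiling comes from this thread; the helper names and the byte-counting "processing" are illustrative, not an Appian API):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class SizeGuard {
    // 75 MB ceiling mentioned above; treat it as a configurable threshold.
    static final long MAX_BYTES = 75L * 1024 * 1024;

    // True if the file is small enough to load in one piece;
    // larger files should go through the chunked path instead.
    static boolean canLoadWhole(Path file) throws IOException {
        return Files.size(file) <= MAX_BYTES;
    }

    // Chunked fallback: read 1 MB at a time so memory use stays bounded
    // no matter how large the file is.
    static long processInChunks(Path file) throws IOException {
        byte[] buf = new byte[1024 * 1024];
        long total = 0;
        try (InputStream in = Files.newInputStream(file)) {
            int n;
            while ((n = in.read(buf)) != -1) {
                total += n;  // stand-in for real per-chunk processing
            }
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Illustrative 4 KB file in place of a downloaded document.
        Path tmp = Files.createTempFile("doc", ".bin");
        Files.write(tmp, new byte[4096]);
        System.out.println(canLoadWhole(tmp));    // prints true
        System.out.println(processInChunks(tmp)); // prints 4096
        Files.delete(tmp);
    }
}
```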