Hi Experts,

We have a requirement to fetch files from an SFTP server and store them in the Appian document repository. Per the client, file sizes can vary from 1 MB to 1 GB, as some of them are multimedia files.

What is the recommended approach for meeting this requirement? Will it have any impact on server performance? And what if the total size of the files moved to the Appian document repository exceeds 100 GB?

My assumption is that we will have to review our existing cloud storage and request more if required. Is there any other server-level configuration that needs to be done, and how can we manage memory in this case?
Thanks.
My production system has approached 1 TB used (estimated), and there isn't really a performance hit as long as someone is paying for the storage. (For reference, we developed a system to do the opposite of what you're describing: taking bulky files out of Appian and offloading them to a bespoke SFTP server for space and cost savings. That also worked pretty well, and prior to running it there weren't any performance concerns per se.)
As long as the folks giving you this "requirement" are paying for the extra storage space, Appian will gladly sell it to you. Just make sure the client understands that the cost trade-off is heavy.
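On the memory question from the original post: the usual way to keep memory bounded with 1 GB files is to stream them in fixed-size chunks rather than reading the whole file at once. A minimal sketch of the pattern (the `stream_copy` name and chunk size are illustrative, not an Appian API; with a real SFTP library such as paramiko you would pass the remote file handle as `src`, then upload the local file to Appian afterwards):

```python
import io

CHUNK_SIZE = 64 * 1024  # 64 KiB per read; peak memory stays ~chunk size

def stream_copy(src, dst, chunk_size=CHUNK_SIZE):
    """Copy a file object in fixed-size chunks.

    Memory use is bounded by chunk_size regardless of file size,
    so a 1 GB transfer never holds the whole file in RAM.
    """
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:  # EOF
            break
        dst.write(chunk)
        total += len(chunk)
    return total

# Demo with in-memory buffers standing in for the remote SFTP file
# handle and the local destination file.
src = io.BytesIO(b"x" * (256 * 1024))  # pretend 256 KiB remote file
dst = io.BytesIO()
copied = stream_copy(src, dst)
```

The same chunked approach applies on the way out: hand Appian the completed local file rather than buffering the payload in process memory.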