Brandon
over 10 years ago
I am building a set of processes to generate documents in batches. I am importing and parsing a pipe delimited flat file generated by our Cobol mainframe and populating PVs with that data. This seems to work well. The problem I am now facing is that the number of records in each file can be quite large (could be 2 or 3 thousand in a single flat file) and I am wondering if there are any limitations as to how many records a process instance can handle (or limitations on the size of a single array)?
My next step will be to set up a subprocess that gets called to generate a document for each instance in the array.
I don't want to continue down this path if I am likely to hit problems with the number of records I am trying to work with.
Any suggestions would be appreciated. Thanks.
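For illustration, the parsing step described above might look roughly like this (a minimal Python sketch; the field names and record layout are invented here, since the actual Cobol file layout isn't shown):

```python
import io

def parse_flat_file(stream, field_names):
    """Split each pipe-delimited line into a dict keyed by field name."""
    records = []
    for line in stream:
        line = line.rstrip("\n")
        if not line:
            continue  # skip blank lines
        values = line.split("|")
        records.append(dict(zip(field_names, values)))
    return records

# Hypothetical sample data standing in for the mainframe export
sample = io.StringIO("1001|Smith|2014-01-15\n1002|Jones|2014-01-16\n")
records = parse_flat_file(sample, ["account_id", "name", "date"])
```

Each parsed record would then map onto one element of the process variable array.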
OriginalPostID-112345
Aleksi White
Appian Employee
over 10 years ago
The process instance should be able to handle as many records as you can throw at it, as long as there is sufficient hardware behind the Appian installation. As for the array size limit, the default is 1000 indexes. This can be configured to a larger number:
forum.appian.com/.../Configuring_Data_Capping
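As a rough illustration of staying under that default cap without reconfiguring it, the parsed records could be split into batches of at most 1000 before each batch is handed to a subprocess (a Python sketch; the batch size and record structure are assumptions, not part of the original answer):

```python
def chunk(records, batch_size=1000):
    """Yield successive slices no larger than batch_size, so no single
    array exceeds the default 1000-index cap."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

# e.g. 2500 parsed records from one flat file
batches = list(chunk(list(range(2500))))
# -> three batches: 1000, 1000, and 500 records
```

Whether batching or raising the cap is preferable depends on how much data each subprocess instance needs to carry.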
Brandon
over 10 years ago
Thanks for your help.