Jess
over 10 years ago
Any workaround for MNI limit of 1000?
I need to process 10,000+ records from a CSV file and save them to the database, but beforehand each record needs complex processing, e.g. generating/plotting a planned completion date based on the milestones (7) and the received date. I don't think looping functions can be used here, since there is a lot of database and query rule interaction as part of the process. The records from the CSV will be saved as the parent record, and the generated data from each record as the children (1:M). Please advise. TIA.
OriginalPostID-144874
Christine
Appian Employee
over 10 years ago
One idea to try:
Utilize the Read Excel Spreadsheet utility (from Shared Components). Put the data into a temp table, then execute a stored procedure to perform the calculations you need and populate the table(s) you need. You can then pull data back into the process as needed.
I'm not sure how much of this meets your use case, but hopefully it gives you a few ideas.
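The staging-table idea above can be sketched outside Appian too. This is a minimal illustration using Python's sqlite3 as a stand-in for the real database; the table names, the 7-day date arithmetic, and the set-based insert (standing in for the stored procedure) are all illustrative assumptions, not the actual schema.

```python
# Sketch of the staging-table pattern: bulk-load CSV rows into a temp
# table, then do the derivation set-based in the database rather than
# row-by-row in the process model. sqlite3 stands in for the real DB.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER, received_date TEXT)")

# bulk-load the parsed CSV rows into the staging table
rows = [(i, "2024-01-01") for i in range(3)]
conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)

# stand-in for the stored procedure: derive the child rows (here, a
# planned date 7 days after receipt) in a single set-based statement
conn.execute("CREATE TABLE planned (parent_id INTEGER, planned_date TEXT)")
conn.execute("""
    INSERT INTO planned
    SELECT id, date(received_date, '+7 day') FROM staging
""")

# pull data back into the process as needed
print(conn.execute("SELECT COUNT(*) FROM planned").fetchone()[0])  # prints 3
```

The point of the pattern is that the heavy per-record computation happens once, in the database, instead of in 10,000+ process node instances.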
ronp
over 10 years ago
Thanks for the reply, Christine! I already thought of that as well; however, it will only work for the first upload. For the second upload of status updates per record, it would be the same 10,000+ records with 3-9 fields each that need to be updated. There are 7 milestones, meaning the data will be updated 7x from those batch files. Any suggestion?
Christine
Appian Employee
over 10 years ago
If you don't need all rows for your calcs, after pulling into a CDT (via the shared component), utilize a counter pv and loop through the CDT by index (using the counter as the index number). Just make sure to utilize a subprocess and delete previously completed nodes.
Don't chain through. Delete the instances within a day.
My concern is that the CDT will consume memory due to its size, so you will really need to do performance testing.
Jess
over 10 years ago
What plugin do I need to install to get the counter pv (a counter for the index that works the same way tp!instanceindex does)? When I use tp!instanceindex while spawning the subprocess, the 1000-limit error appears. But if I use the pv counter, I think that will do, deleting the data once completed.
Christine
Appian Employee
over 10 years ago
No plugin. Just define an integer pv. Then, as you loop or do things in batches, just increment the pv. If you store the count (size of the CDT) and compare it to the int pv, you know which instances of the CDT to process and you know when to exit the loop.
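The counter-pv loop described above can be sketched in plain code. This is an illustration only, assuming a batch size of 500; `records` stands in for the CDT, `counter` for the integer pv, and `process_record` is a placeholder for the real per-record work (date calculations, DB writes), none of which are Appian APIs.

```python
# Sketch of the counter-pv batching pattern: store the total count,
# increment a counter per batch, and exit when counter reaches count.
BATCH_SIZE = 500  # assumed batch size, well under the 1000 MNI limit

def process_record(record):
    # placeholder for the per-record work done in the subprocess
    return {"id": record["id"], "planned_completion": "computed"}

def process_in_batches(records):
    counter = 0            # the integer pv
    total = len(records)   # stored count (size of the CDT)
    results = []
    while counter < total:  # compare counter to count to know when to exit
        batch = records[counter:counter + BATCH_SIZE]
        for record in batch:  # one subprocess worth of work
            results.append(process_record(record))
        counter += len(batch)  # increment the pv after each batch
    return results

rows = [{"id": i} for i in range(1200)]
print(len(process_in_batches(rows)))  # prints 1200
```

Because each pass only touches `BATCH_SIZE` rows, no single loop iteration ever spawns more than 500 instances, which is how the pattern stays under the MNI limit.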
PhilB
A Score Level 1
over 10 years ago
You could also split the processing into batches of 500 and run an MNI for each set of 500 results. Have a look at configuring the end node in a process to start an instance of itself. This is often useful when batching large amounts of data, as the process can call itself until the processing is done and then tell the parent process that it has completed by flowing to a different end (or terminate) node.
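The self-restarting pattern above can be sketched as recursion. This is an assumption-laden illustration: in Appian this would be an end node configured to start the same process model with an advanced offset, while here a function simply calls itself; `run_batch_process` and its parameters are made-up names.

```python
# Sketch of a process that handles one batch of 500, then "starts an
# instance of itself" with the next offset until nothing is left, at
# which point it flows to the "done" end node (here, just returns).
BATCH_SIZE = 500

def run_batch_process(records, start=0, processed=None):
    processed = processed if processed is not None else []
    batch = records[start:start + BATCH_SIZE]
    if not batch:
        # nothing left: signal the parent that processing has completed
        return processed
    for record in batch:
        processed.append(record)  # placeholder for the real MNI work
    # end node re-starts the process, advanced by one batch
    return run_batch_process(records, start + BATCH_SIZE, processed)

rows = list(range(10_500))
print(len(run_batch_process(rows)))  # prints 10500
```

Each "instance" only ever runs 500 node iterations, so the 1000-instance MNI limit is never hit no matter how large the CSV is.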
Jess
over 10 years ago
Thanks Christine and PhilB. @PhilB, just to confirm, the use case will be to use the Send Message event of the end node?