Parallel process and execution limit of 50 nodes

Certified Senior Developer

Hi,

I have a theoretical question:

Let's say I have a very complex process that contains 60 nodes, and it cannot reach the end because of the 50-node limit.

Let's assume the first 30 nodes can be executed in parallel with the remaining 30 nodes. Will the 50-node limit still apply nevertheless?

(using an AND gateway for the 2 paths)

  • The guardrails are pretty clear: 50 chained nodes is your limit, after which the synchronous experience your user was expecting comes to an end, and the next thing they'll see is either:

    • an asynchronous Task in their Task list (if a Task is assigned to them after the chain limit is exceeded)
    • or the data they see in the interface from which they launched the process isn't reflecting the changes that the process actually conducted (until they refresh that interface after a point in time at which the process comes to an end)
    • or they'll not see anything that they didn't expect (in which case the process isn't affecting their UX or the data they're seeing, which then begs the question as to why it was chained from the 50th node onwards in the first place)

    In general, any process model that reaches this size probably needs to be reviewed. You can achieve a lot with a combination of Expressions, SAIL interfaces, asynchronous patterns, and even "merging" nodes (e.g. by fetching data in the 'Input' tab and then post-processing it in the 'Output' tab before passing the result to a process variable), which gives you many opportunities to reduce the size to less than 50 nodes.
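    The guardrail behaviour described above can be sketched as a simple model. This is an illustrative sketch only, not Appian internals: `CHAIN_LIMIT` and `run_chain` are made-up names for the purpose of the example.

    ```python
    # Illustrative model (not Appian's implementation): a chain of process
    # nodes where the user's synchronous experience ends once the chained
    # node limit is exceeded, and the rest continues asynchronously.

    CHAIN_LIMIT = 50  # the documented activity-chaining limit

    def run_chain(node_count, chain_limit=CHAIN_LIMIT):
        """Return (nodes run synchronously, nodes continuing asynchronously)."""
        synchronous = min(node_count, chain_limit)
        asynchronous = node_count - synchronous
        return synchronous, asynchronous

    # A 60-node chain: the first 50 nodes run in the user's synchronous
    # chain; the remaining 10 continue in the background, so the user sees
    # either a Task in their Task list or stale data until they refresh.
    sync_nodes, async_nodes = run_chain(60)
    ```

    After "merging" nodes as described above (e.g. fetch plus post-processing in a single node), `node_count` drops below the limit and the whole chain stays synchronous.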

Reply
