Parallel process and execution limit of 50 nodes

Certified Senior Developer

Hi,

I have a theoretical question:

Let's say I have a very complex process that contains 60 nodes, and it cannot reach the end because of the 50-node limit.

Let's assume the first 30 nodes can be executed in parallel with the 30 remaining nodes. Will the 50-node limit still apply?

(using an AND gateway for the two paths)


  • 0
    Certified Lead Developer
    in reply to cedric01

    I think we need to take a step back. Can you please clarify in detail:

    • What is the expected user experience, specifically as pertains to the Activity Chaining involved in this process?
    • What exactly is the issue or error you're experiencing at this point?
  • 0
    Certified Senior Developer
    in reply to Mike Schmitt

    Mike, we have already experienced the 50-node issue in many interfaces.

    Here is an example:

    We work on an application that displays a rich interface with a lot of sections and fields.

    When the user submits the data (via a button), a big process is called (with more than 50 nodes, mainly composed of script tasks and DB updates), and after the process ends, the main screen does not display the updated data.

    When we look at the DB, all the data is correct (all new, deleted, or updated records are fine).

    Everything is nearly fine, except that the screen is not refreshed.

    When we press F5 to refresh the browser, the screen is fine with all the data.

    Two years ago, we asked an Appian architect about this; he explained the 50-node limit to us and advised us to reduce the number of nodes (or to group the data updates into a single multiple-entity write smart service).

    In our case, we reduced the number of nodes, and indeed, it solved the issue.

    The process in question is a big process that contains many sub-processes but no input task.

  • 0
    Appian Employee
    in reply to cedric01

    I think what Mike is getting at is: does every single node in the sub-process need to occur before you return to the previous page? For example, if some of the database writes update data in a related record but that information is not displayed on the page, then that doesn't need to be included in the activity chain.

    Another thing to consider is whether your large form should be broken up. A single SAIL form can contain many fields, but in scenarios like this it can actually be advantageous to have several different forms that simulate a wizard, each with DB writes between them, splitting the process into smaller chunks of fewer than 50 nodes.

    The last thing I'll mention is a little more about how activity chaining works. Any nodes that are activity chained are given a higher priority by Appian to try and execute as fast as possible. So while it's a good procedure for ensuring your data is updated, we intentionally limit this to 50 nodes to ensure that we have a limited number of nodes running at this higher priority. For the same reason, it's usually a good idea to see how activity chaining can be limited or worked around in your application to ensure the best performance and scalability.

  • 0
    Certified Senior Developer
    in reply to Peter Lewis

    Thank you for your reply Peter.

    OK, I understand... we could let some DB writes happen in the background (without activity chaining) while the user gets back to their form.

    But maybe it is risky if the user chooses to update data that is still being stored?

    Your other points are very interesting, and I have taken note of them.

    (Maybe one day the customer will ask us to rework that very big interface that contains many sub-interfaces and complex processes, but we are not the decision-makers ;-) )

  • 0
    Appian Employee
    in reply to cedric01

    One other thing I just thought of as well - is there any logic that you could move from the process model to the interface? I like to do a lot of data manipulations / saves as part of the saveInto logic for the button click. Depending on what you need to do, you might be able to move some logic to the form and save yourself a few nodes in the process model.
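
    A minimal sketch of this idea (the constants, local variables, and process parameter names here are hypothetical, not from the original application):

    ```sail
    a!buttonWidget(
      label: "Submit",
      saveInto: {
        /* Write some of the data directly from the interface so the
           chained process model needs fewer nodes */
        a!writeToDataStoreEntity(
          dataStoreEntity: cons!MY_APP_ORDER_ENTITY, /* hypothetical constant */
          valueToStore: local!order,
          onSuccess: {
            /* Start the (now smaller) process with the stored value */
            a!startProcess(
              processModel: cons!MY_APP_SUBMIT_PROCESS, /* hypothetical constant */
              processParameters: { order: fv!storedValues }
            )
          }
        )
      },
      submit: true
    )
    ```

    Each write performed from the saveInto is one fewer node that has to run inside the activity-chained process.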

  • 0
    Certified Senior Developer
    in reply to Peter Lewis

    Peter, do you mean having the interface call expression rules that directly invoke the "Write to Data Store Entity" smart service? If so, the answer is no: for this example, we let the process do all the DB writes, but upstream the interface performs a lot of saveInto logic to prepare the rule inputs for the process.

    For this particular example, an Appian architect advised us to call the "Write to Multiple Data Store Entities" smart service in place of "Write to Data Store Entity" to reduce the number of nodes a little.
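
    As a rough illustration of that node-count saving (the entity constants and local variables are hypothetical): three separate "Write to Data Store Entity" nodes can be collapsed into a single write. Using the saveInto-callable version of the same smart service, that looks something like:

    ```sail
    /* One write replacing three separate single-entity writes */
    a!writeToMultipleDataStoreEntities(
      dataToStore: {
        a!entityData(entity: cons!MY_APP_CUSTOMER_ENTITY, data: local!customer),
        a!entityData(entity: cons!MY_APP_ORDER_ENTITY, data: local!orders),
        a!entityData(entity: cons!MY_APP_AUDIT_ENTITY, data: local!auditRow)
      }
    )
    ```

    The writes happen in a single transaction, so this also helps keep related records consistent if one write fails.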

    Regarding my general question about parallel processes, if I have understood correctly... moving some sub-processes in parallel (like in my picture above) would not help, as it is the chaining that determines and triggers the 50-node limit?

  • 0
    Certified Senior Developer
    in reply to Peter Lewis

    Hi,

    Let's take an example: if I have a process with 55 nodes chained together and no input task between the nodes, the 50-node limit will be reached.

    Could you just confirm one thing for me, please?

    If I create a new sub-process containing 30 of the nodes, leaving the parent process with only 25 nodes, that will change nothing and the error will still be there. Is that right?

  • 0
    Certified Lead Developer
    in reply to cedric01
    the error will still be there

    Wait, are we talking about an error?  I thought before we were just talking about the chaining limit?  That is not an error, per se.

  • 0
    Certified Senior Developer
    in reply to Mike Schmitt

    Sorry, yes, I was talking about the chaining limit, which prevents the initial (or current) interface from which the parent process is started from refreshing when the last process nodes write data to the database and the interface tries to refresh some fields after the end of the process.

    I just need to be sure that breaking a process into sub-processes does not change the chaining limit.

  • 0
    Certified Lead Developer
    in reply to cedric01

    So in your question, is the subprocess with 30 nodes being run asynchronously?  Because asynchronous subprocesses will not count against the chaining limit (but as they're run concurrently, they won't necessarily be guaranteed to complete before your chained process flow ends).  Honestly at this point if it's causing such an impact that users are arriving back at a record page before data is being updated by the running process, in my opinion you should just implement a "capture" task that the user lands on which stalls them a bit from reaching the record listing again so the rest of the process can complete.  This is fairly easy to do and relatively transparent from the user's standpoint.