How do too many nodes in a process model increase the memory footprint?

Certified Associate Developer
Too many nodes in a process model reduce readability and maintainability. However, I am not sure how they increase the memory footprint.
Please provide your thoughts.

  • Certified Senior Developer

    Hi, with too many nodes there are many process variables that need to be stored and managed throughout the lifecycle of the process. If each node interacts with a large number of variables, or if there are many intermediate variables, this increases the memory footprint.

    Appian allocates threads to handle node execution. More nodes can lead to higher thread usage, which consumes memory.

    Managing the dependencies and interactions between the nodes may require additional memory.

    Each node also generates log entries, so a larger number of nodes means more logging data to be stored.

  • Certified Senior Developer

    Have you had a chance to look at this article?

    How to Create Memory Efficient Models

    Also, as a simple answer I would say: the more nodes a model has, the more memory is allocated to run it. Your instances consume memory to keep up with audit details such as variable history and other data logs, so the smaller the model (fewer nodes), the better the performance. Consider converting long models into smaller chunks and running them, based on their priority, synchronously or asynchronously (a minimal sketch follows below).
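
    As a rough illustration of the "smaller chunks" point, below is a minimal sketch (in Appian expression language) of starting a smaller, separate model from an interface using a!startProcess(). The constant cons!APP_ORDER_CHUNK_PM, the orderId parameter, and ri!orderId are hypothetical placeholders; whether the caller waits on the child process depends on how that model is configured (e.g. activity chaining).

        a!buttonWidget(
          label: "Submit",
          saveInto: {
            /* Start the heavy work in its own, smaller process model.
               cons!APP_ORDER_CHUNK_PM and orderId are placeholder names. */
            a!startProcess(
              processModel: cons!APP_ORDER_CHUNK_PM,
              processParameters: {
                orderId: ri!orderId
              }
            )
          }
        )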

  • Certified Associate Developer

    Thanks for your inputs.

    Yes, the number of process variables and the data size will impact the memory.

    With sequential processing, a new thread is spawned for each node one by one, i.e. once the execution of one node is completed, a thread is created for the next node. Hence, the number of nodes should not have a major impact on memory.

    As mentioned in the replies above, a greater number of nodes will increase process history and other logged data (logs, etc.). However, I feel that this should not have much impact on memory, since that data should not take much memory/space.

  • As long as the process instance is active, the PVs hold their data even if the previous nodes are completed. Unless the process clears the PVs that are no longer needed in the rest of the process by setting them to null, that data cannot be garbage collected. So I think that is how memory is consumed for as long as the process is active (see the sketch below).
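
    To make that concrete, here is a minimal sketch (assuming a hypothetical PV named pv!largeOrderPayload holding a large value) of a Script Task output expression that clears it:

        /* Output expression on a Script Task, saved into pv!largeOrderPayload
           (hypothetical PV). Writing null drops the current value once the node
           completes; note the reply below that earlier versions of the PV are
           still kept by the process. */
        null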

  • Certified Lead Developer
    in reply to gaurav_laturkar
    Unless the process clears the PVs that are no longer needed in the rest of the process by setting them to null, that data cannot be garbage collected

    PVs in a process keep their old versions, so setting them to null will not help much.