Transaction Manager

Overview

The Transaction Manager application provides a solution for processing transactions across multiple queues and priorities, with typical queue-management features such as throughput handling, parallel execution, and batch processing. If you have work items to be processed, or need to manage your resource utilisation, the Transaction Manager can help you.

Below are some use-case examples. Details about these use cases are available in the documentation provided in the download package.

  • Maintain a continuous backlog of user tasks
  • Cancel a scheduled transaction
  • Schedule an item for future execution
  • Configure a recurring schedule
  • Prioritise the order in which my transactions are processed
  • Limit the time window in which my transactions are processed
  • Retry X times before passing to a system administrator
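The last use case above can be sketched in code. The following is a hypothetical Python illustration of a retry-then-escalate policy, not TM's actual implementation; `handler` and `escalate` are illustrative names standing in for the configured process and the administrator hand-off.

```python
def process_with_retry(transaction, handler, max_retries=3):
    """Attempt a transaction up to max_retries times before escalating.

    `handler` is any callable that raises on failure; `escalate` stands in
    for routing the failed item to a system administrator's queue.
    """
    for attempt in range(1, max_retries + 1):
        try:
            return handler(transaction)
        except Exception as error:
            last_error = error
    # All attempts failed: hand the item to an administrator.
    escalate(transaction, last_error)

def escalate(transaction, error):
    # Placeholder for the real escalation step (e.g. an admin task queue).
    print(f"Escalating {transaction!r} after repeated failures: {error}")
```

In TM itself the retry count is configuration rather than code, but the control flow is the same: the item is only escalated after the configured number of attempts is exhausted.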

Key Features & Functionality

  • Multi-queue processing
  • Throughput and scaling of individual queues
  • Resource utilisation management
  • Parallel execution across process engines
  • Microbatch processing for high-throughput use cases
  • Hi! 

    You need to install the Pessimistic Locking application before installing TM. Since version 1.6.0, these two apps must be deployed separately.

    If you scroll down through the comments, you’ll find the release notes from a few years back mentioning this change and the same issue reported by other users.

    • v1.6.0 Release Notes
      Added support for Oracle.
      Refreshed look-and-feel and improved usability.
      Minimum Appian version increased to 21.1
      Removed the bundled Pessimistic Locking app (now to be deployed separately)
  • I tried installing the latest version of TM in a new environment and got this error. It seems like the Object Locking app is not included in this installer.

  • Issue on new job types

    When a new job type is added in the TM console while the "TM Transaction Manager" process model has a running process instance, transactions of the new job type will trigger errors, because the process variable holding the list of process models does not yet contain the newly configured process model in memory.

  • v2.0.2 Release Notes
    • Fix TM_GET_NEXT_TRANSACTIONS edge-case that caused the same transaction to be added to multiple micro-batches
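The invariant restored by the v2.0.2 fix is that each pending transaction lands in exactly one micro-batch. A minimal Python sketch of disjoint batch assignment (hypothetical; TM does this in TM_GET_NEXT_TRANSACTIONS, not in application code):

```python
import itertools
from typing import Iterable, Iterator, List

def build_micro_batches(transaction_ids: Iterable[int],
                        batch_size: int) -> Iterator[List[int]]:
    """Partition pending transaction ids into disjoint micro-batches.

    Each id is consumed exactly once from the underlying iterator, so no
    transaction can appear in more than one batch -- the property the
    v2.0.2 edge-case fix restores.
    """
    it = iter(transaction_ids)
    while True:
        batch = list(itertools.islice(it, batch_size))
        if not batch:
            return
        yield batch
```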
  • How can I download an older version? I'm trying to deploy on 23.4 and it complains about the plug-in version.

  • v2.0.1 Release Notes
    • Patch to fix adding a job type when the micro-batch size is blank
  • v2.0.0 Release Notes
    • Appian 24.1 as the minimum version
    • Major performance improvements. Tested with over 5 million transactions; the main dashboard and stored procedure load in under 1 s
    • New: Microbatch feature for use cases that require massive throughput
    • New: Ability to disable automatic context (process parameter) provisioning, for use cases with contexts that are large, complex to model, or not required by the process. The relevant parts of the context can instead be retrieved in-process
    • Exponential backoff for sleep, to reduce system load during low-usage periods. This allows a smaller sleep time to be configured where low latency is required
    • Removed dependency on Execute Stored Procedure plug-in. No longer relies on any plug-in
    • Fixed 1000 process model limit
    • Note: Oracle and SQL Server scripts will require updating (community contribution is welcome)
  • Hi, similar to the previous commenter: we have tried to use this application for a larger volume of transactions. The issue is that once the transaction count gets close to 1 million, the DB view used to generate transaction statistics becomes very slow, and the TM_if_Dashboard screen fails because of the DB timeout. What would be your recommendation for such situations?

  • Hi Team, TM_if_Dashboard is erroring out because tm_v_job_type_transaction_counts exceeds the query threshold limit on cloud. Any recommendations for improving the performance of the query?
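One common mitigation for an aggregate view that times out at high row counts (not an official TM feature) is to maintain an incremental summary instead of recomputing counts per query. A hypothetical Python sketch of the idea; in practice this would be a summary table updated on each status change rather than the view being aggregated on every dashboard load:

```python
from collections import Counter

class JobTypeStats:
    """Incrementally maintained per-job-type transaction counts.

    A stand-in for replacing an aggregate view such as
    tm_v_job_type_transaction_counts with a summary table that is
    updated on each status transition instead of scanned per query.
    """
    def __init__(self):
        self.counts = Counter()

    def on_status_change(self, job_type, old_status, new_status):
        # Decrement the old bucket (if any) and increment the new one,
        # keeping counts consistent with the transaction table.
        if old_status is not None:
            self.counts[(job_type, old_status)] -= 1
        self.counts[(job_type, new_status)] += 1

    def count(self, job_type, status):
        return self.counts[(job_type, status)]
```

The trade-off is write-time bookkeeping in exchange for constant-time dashboard reads, which stays flat as the transaction table grows.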

  • Hi!

    The events you consume from Kafka will be stored in the "tm_job_transaction" table in the database once you are able to subscribe to your topics. After that, create Job Types in the Transaction Manager site, where each Job Type corresponds to a Kafka topic. TM will then be triggered in 60-second batches (configurable), launching several instances of the process model(s) (via the Job Type in the TM site) that handle the event and its information.

    Hope this helps you!
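The batching behaviour described above (events accumulating in a table and being picked up in configurable 60-second windows) can be illustrated with a small Python sketch. This is a conceptual model only; the function and field names are assumptions, and TM's actual pickup is driven by its polling process, not this code.

```python
def group_into_windows(events, window_seconds=60):
    """Group (timestamp_seconds, payload) events into fixed time windows.

    Mirrors the idea of TM polling the stored events on a configurable
    interval and launching one set of process instances per batch.
    """
    windows = {}
    for ts, payload in events:
        bucket = int(ts // window_seconds)
        windows.setdefault(bucket, []).append(payload)
    # Return batches in chronological order of their window.
    return [windows[k] for k in sorted(windows)]
```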