Kafka Tools


KafkaTools provides functionality for publishing messages to and consuming messages from topics on Kafka servers.

Smart Services

  • Publish To Kafka
  • Consume From Kafka

In order to process the messages consumed from Kafka, it is recommended to use the Transaction Manager application. The process models designed to process messages are configured and assigned through the Transaction Manager job types. See the Transaction Manager documentation for details.

Key Features & Functionality

Please refer to the README for additional details.

  • Hi,

    We are trying to use this plug-in to consume and publish from Kafka with Schema Registry from Confluent Cloud.

    This plug-in does not support the AVRO format in Confluent's Schema Registry, so we are evaluating whether we could solve this by creating a smart service plug-in that serializes/deserializes with the AVRO Java libraries.

    The idea is to serialize the messages before using Publish to Kafka, and then deserialize the data after Consume from Kafka stores it in the database.
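    A minimal sketch of the round trip we have in mind, hand-rolling Avro's binary encoding for a toy record of one string field and one long field. This is purely illustrative: the record layout and function names here are made up, and a real plug-in would use the Apache Avro libraries instead.

    ```python
    # Toy Avro binary round trip: record = {body: string, id: long}.
    # Avro encodes a long as a zigzag varint and a string as a long
    # length followed by UTF-8 bytes; a record is just its fields in order.
    import base64

    def zigzag(n: int) -> int:
        return (n << 1) ^ (n >> 63)

    def unzigzag(n: int) -> int:
        return (n >> 1) ^ -(n & 1)

    def write_long(n: int) -> bytes:
        n = zigzag(n)
        out = bytearray()
        while True:
            b = n & 0x7F
            n >>= 7
            if n:
                out.append(b | 0x80)
            else:
                out.append(b)
                return bytes(out)

    def read_long(buf: bytes, pos: int):
        shift, acc = 0, 0
        while True:
            b = buf[pos]
            pos += 1
            acc |= (b & 0x7F) << shift
            if not b & 0x80:
                return unzigzag(acc), pos
            shift += 7

    def encode_record(body: str, msg_id: int) -> str:
        raw = body.encode("utf-8")
        payload = write_long(len(raw)) + raw + write_long(msg_id)
        # Base64-wrap so the binary payload survives a text-only input
        return base64.b64encode(payload).decode("ascii")

    def decode_record(wire: str):
        buf = base64.b64decode(wire)
        length, pos = read_long(buf, 0)
        body = buf[pos:pos + length].decode("utf-8")
        msg_id, _ = read_long(buf, pos + length)
        return body, msg_id
    ```

    Because Publish to Kafka takes text input, the Avro bytes would have to be Base64-wrapped as above, and the consumer side would reverse both steps.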

    Does anyone know if these smart services can send and receive binary data in AVRO format? Has anyone tried this or something similar?

    Thanks in advance

  • Hi,

    I think so; that is the expected behavior. When the smart service starts, it subscribes to the topic passed to it as input at that moment. In fact, I have found that if we force a topic subscription to be cut by cancelling the process model of that consumer, the subscription is NOT cut until the "Runtime in minutes" period ends (in my case, set to 58 minutes). You can verify this by checking the tomcat-stdOut.log log.
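    To illustrate the behaviour described above, here is a purely hypothetical sketch (not the plug-in's actual code) of a consume loop that only checks the clock. A loop shaped like this would explain why cancelling the process model has no effect until the configured runtime elapses:

    ```python
    import time

    def consume_loop(poll, runtime_minutes):
        # Hypothetical sketch: the loop exits only when the configured
        # "Runtime in minutes" deadline passes or polling runs dry, so
        # cancelling the calling process model cannot interrupt it.
        deadline = time.monotonic() + runtime_minutes * 60
        received = []
        while time.monotonic() < deadline:
            msg = poll()          # stand-in for the Kafka consumer poll
            if msg is None:       # no more messages in this sketch
                break
            received.append(msg)
        return received
    ```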

  • Hi,

    We are using version 1.4.1 and have set a 20-minute runtime.

    For the topics, we are passing an expression rule as input. The values returned by the expression rule may vary dynamically (due to some custom processes we have configured in the backend).

    But the already active "Consume from Kafka" smart service keeps consuming the topics that were set at the start; it does not pick up value changes in real time. Is this the expected behaviour?

  • We are using version 1.0.4, and it has some reported vulnerabilities. Which version is compatible with Appian 22.4, and how can we download a specific version of this plugin?

  • Hi Patricia,

    Other people on this thread and I have had this problem with our clients.

    In my case, the solution my client adopted was to develop an external microservice as an intermediate layer between Appian and Confluent. The microservice reads the message in AVRO format from Confluent's Schema Registry, transforms it to a plain string (with no Schema Registry involved), and publishes it to another Kafka topic in Confluent in string format. Reading from Appian then works correctly, because we end up reading a string. I hope this information helps you :)
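    For anyone building a similar bridge: unpacking the registry-framed payload is straightforward, since Confluent's wire format is just a magic byte and a 4-byte big-endian schema ID in front of the Avro body. A stdlib-only sketch (the function name is ours, not from the plug-in):

    ```python
    import struct

    def split_confluent_frame(payload: bytes):
        # Confluent Schema Registry wire format: magic byte 0, then a
        # 4-byte big-endian schema ID, then the raw Avro-encoded body.
        if not payload or payload[0] != 0:
            raise ValueError("not a Confluent-framed message")
        (schema_id,) = struct.unpack(">I", payload[1:5])
        return schema_id, payload[5:]
    ```

    The schema ID tells the bridge which schema to fetch from the registry before decoding the body and re-publishing it as a plain string.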

  • Hi, has anyone used this plug-in with Schema Registry from Confluent Cloud? I haven't seen any reference to this in the documentation.

    Thanks in advance

  • Given the implementation of this plugin, the "Consume from Kafka" smart service saves the events in a database table, "tm_job_transaction". You can try creating the same table yourself with the same structure as the "tm_job_transaction" table. I think that works too.

  • Hi,

    We are considering using this plug-in with a client, and we have seen that the Consume from Kafka smart service does not return information about the retrieved messages, so it is necessary to use the Transaction Manager application.

    Is there a way to retrieve and process the events from Kafka directly without the Transaction manager application? Has anyone tried something similar?

    Thanks in advance

  • Could someone help me find out which Kafka client version is used by the versions of Kafka Tools mentioned below?

    * Kafka-tools 1.0.2

    * Kafka-tools 1.4.1

  • Does anybody know what version of Kafka this plugin is intended to be used with? I don't see any Kafka version in the associated documentation. I'm attempting to connect to a Kafka server running version 2.7, but I keep getting an error.