Kafka Tools

Overview

KafkaTools provides the following functionality for publishing to and consuming from topics on Kafka servers.

Smart Services

  • Publish To Kafka
  • Consume From Kafka

In order to process the messages consumed from Kafka, it is recommended to use the Transaction Manager application. The models designed to process messages are configured and assigned through the Transaction Manager job types. See the Transaction Manager documentation for details.
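
As a point of reference, the sketch below shows roughly what the underlying publish and consume operations look like with the plain Kafka Java client. This is not the plug-in's internal code; the broker address, topic, and group id are hypothetical.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class KafkaRoundTrip {
        public static void main(String[] args) {
            // Publish a message (conceptually what "Publish To Kafka" does).
            Properties producerProps = new Properties();
            producerProps.put("bootstrap.servers", "kafka.example.com:9092"); // hypothetical broker
            producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
                producer.send(new ProducerRecord<>("orders", "order-123", "{\"status\":\"NEW\"}"));
            }

            // Consume messages (conceptually what "Consume From Kafka" does before
            // handing them to Transaction Manager job types for processing).
            Properties consumerProps = new Properties();
            consumerProps.put("bootstrap.servers", "kafka.example.com:9092");
            consumerProps.put("group.id", "appian-consumer"); // hypothetical consumer group
            consumerProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            consumerProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
                consumer.subscribe(Collections.singletonList("orders"));
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(5))) {
                    System.out.printf("offset=%d key=%s value=%s%n", record.offset(), record.key(), record.value());
                }
            }
        }
    }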

Key Features & Functionality

Please refer to the README for additional details.

Anonymous
  • Hi,

We are using version 1.4.1 and have set up a 20-minute runtime.

For the topics, we are passing an expression rule as input. The values returned by the expression rule may vary dynamically (due to some custom processes we have configured in the backend).

However, the already active "Consume from Kafka" smart service keeps consuming the topics that were set at the start; it does not pick up the value changes in real time. Is this the expected behaviour?

  • We are using version 1.0.4, which has some reported vulnerabilities. Which version is compatible with Appian 22.4, and how can we download a specific version of this plug-in?

  • Hi Patricia,

Other people in this thread and I have had this problem with our clients.

In my case, the solution my client adopted was to develop an external microservice as an intermediate layer between Appian and Confluent. The microservice reads the Avro-encoded message using Confluent's Schema Registry, converts it to a plain string, and publishes it to another Kafka topic in Confluent in string format. The read from Appian is then correct, because we are finally consuming a string. A minimal sketch of this bridge is shown below. I hope this information helps you :)
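
    As an illustration of that bridge, here is a minimal sketch using the plain Kafka Java client and Confluent's KafkaAvroDeserializer. The broker address, Schema Registry URL, and topic names are hypothetical, and error handling and offset management are omitted.

      import java.time.Duration;
      import java.util.Collections;
      import java.util.Properties;
      import io.confluent.kafka.serializers.KafkaAvroDeserializer;
      import org.apache.avro.generic.GenericRecord;
      import org.apache.kafka.clients.consumer.ConsumerRecord;
      import org.apache.kafka.clients.consumer.KafkaConsumer;
      import org.apache.kafka.clients.producer.KafkaProducer;
      import org.apache.kafka.clients.producer.ProducerRecord;
      import org.apache.kafka.common.serialization.StringDeserializer;
      import org.apache.kafka.common.serialization.StringSerializer;

      public class AvroToStringBridge {
          public static void main(String[] args) {
              // Consumer reads Avro records, resolving schemas from the Schema Registry.
              Properties consumerProps = new Properties();
              consumerProps.put("bootstrap.servers", "broker.confluent.example:9092");   // hypothetical broker
              consumerProps.put("schema.registry.url", "https://sr.confluent.example");  // hypothetical Schema Registry
              consumerProps.put("group.id", "avro-to-string-bridge");
              consumerProps.put("key.deserializer", StringDeserializer.class.getName());
              consumerProps.put("value.deserializer", KafkaAvroDeserializer.class.getName());

              // Producer republishes the payload as a plain string, so no Schema Registry
              // is needed to read the target topic.
              Properties producerProps = new Properties();
              producerProps.put("bootstrap.servers", "broker.confluent.example:9092");
              producerProps.put("key.serializer", StringSerializer.class.getName());
              producerProps.put("value.serializer", StringSerializer.class.getName());

              try (KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(consumerProps);
                   KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
                  consumer.subscribe(Collections.singletonList("orders-avro"));          // hypothetical source topic
                  while (true) {
                      for (ConsumerRecord<String, GenericRecord> record : consumer.poll(Duration.ofSeconds(1))) {
                          // GenericRecord.toString() renders the Avro record as a JSON-like string.
                          producer.send(new ProducerRecord<>("orders-string", record.key(), record.value().toString()));
                      }
                  }
              }
          }
      }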

  • Hi, has anyone used this plug-in with Schema Registry from Confluent Cloud? I haven't seen any reference to this in the documentation.

    Thanks in advance

  • Given the implementation of this plug-in, the "Consume from Kafka" smart service saves the events to a database table named "tm_job_transaction". You can try creating that table yourself with the same structure as the "tm_job_transaction" table. I think that works too.

  • Hi,

We are considering using this plug-in with a client, and we have seen that the Consume from Kafka smart service does not return information about the retrieved messages; it is necessary to use the Transaction Manager application.

Is there a way to retrieve and process the events from Kafka directly, without the Transaction Manager application? Has anyone tried something similar?

    Thanks in advance

  • Could someone help me find out which Kafka client version is used by the versions of Kafka Tools mentioned below?

    * Kafka-tools 1.0.2

    * Kafka-tools 1.4.1

  • Does anybody know which version of Kafka this plug-in is intended to be used with? I don't see any Kafka version in the associated documentation. I'm attempting to connect to a Kafka server running version 2.7, but I keep getting an error.

  • Yep that was the issue. Thank you very much.

  • Try the label with underscores: "SASL_SHA_512" might work for you.
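
    For reference, on the plain Kafka Java client the standard settings are security.protocol=SASL_SSL and sasl.mechanism=SCRAM-SHA-512, as in the sketch below; the plug-in's own configuration labels may be spelled differently, and the broker address and credentials here are hypothetical.

      import java.util.Properties;

      public class SaslConfigExample {
          public static Properties saslScramProps() {
              Properties props = new Properties();
              props.put("bootstrap.servers", "kafka.example.com:9093");  // hypothetical broker
              // Standard values for the plain Kafka Java client.
              props.put("security.protocol", "SASL_SSL");
              props.put("sasl.mechanism", "SCRAM-SHA-512");
              props.put("sasl.jaas.config",
                  "org.apache.kafka.common.security.scram.ScramLoginModule required "
                  + "username=\"demo-user\" password=\"demo-secret\";");  // hypothetical credentials
              return props;
          }
      }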