Kafka Tools

Overview

KafkaTools provides the following smart services for publishing messages to and consuming messages from topics on Kafka servers.

Smart Services

  • Publish To Kafka
  • Consume From Kafka

To process the messages consumed from Kafka, it is recommended to use the Transaction Manager application. The process models that handle the messages are configured and assigned through Transaction Manager job types. See the Transaction Manager documentation for details.

Key Features & Functionality

Please refer to the README for additional details.

Anonymous
  • Hi,

    We are currently using version 1.4.1 of this plug-in, and it is working as expected. We are planning to upgrade a separate Kafka cluster from 2.6.2 to 2.8.2.tiered (see Supported Apache Kafka versions - Amazon Managed Streaming for Apache Kafka). Do you have information on which Kafka versions are compatible with this plug-in?

  • Hi Miguel, 

    We have a few questions about how this solution is implemented, if you could please add more details. In particular, we would like to know whether the microservice you built also publishes to the Kafka topic that Appian reads, and whether the Appian plug-in connects to the microservice or directly to Confluent.

    Thank you

  • Hi, 

    Does anyone know how this plug-in works? That is, in what format does it publish messages to Kafka, and how does it save the data in the database?

    As my colleague Patricia says below, we need to use Schema Registry to serialize and deserialize messages in AVRO format. Can this be done with this plug-in?

    I've seen in the documentation that the Consume from Kafka smart service has an input parameter called "Deserializer Class Name", but the Publish to Kafka smart service has no input parameter related to serialization. Does that mean the plug-in serializes messages in some fixed way? You can provide the Java class to use for deserialization, but not control how messages are serialized before publishing? This is really confusing (my rough guess is sketched at the end of this comment).

    Thanks in advance. 
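
    (For context, I am assuming the "Deserializer Class Name" parameter expects a class implementing Kafka's standard org.apache.kafka.common.serialization.Deserializer interface, something like the minimal sketch below, but the documentation does not confirm this.)

    import java.nio.charset.StandardCharsets;
    import org.apache.kafka.common.serialization.Deserializer;

    // Hypothetical deserializer that turns raw record bytes into a UTF-8 string.
    public class PlainStringDeserializer implements Deserializer<String> {

        @Override
        public String deserialize(String topic, byte[] data) {
            return data == null ? null : new String(data, StandardCharsets.UTF_8);
        }
    }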

  • Hi,

    We are trying to use this plug-in to consume and publish from Kafka with Schema Registry from Confluent Cloud.

    This plug-in does not support the AVRO format with Confluent's Schema Registry, so we are evaluating whether we could solve this by creating a smart service plug-in that serializes/deserializes with the AVRO Java libraries.

    The idea is to serialize the messages before calling Publish to Kafka, and then deserialize the data after Consume from Kafka stores it in the database (rough sketch at the end of this comment).

    Does anyone know whether these smart services can send and receive binary AVRO payloads? Has anyone tried this or something similar?

    Thanks in advance
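
    Roughly, the serialization half we have in mind would use only the plain Avro Java library, with no Schema Registry. The sketch below is just that idea, not something tested against the plug-in; it assumes Publish to Kafka can accept the result as an ordinary string, which is why the Avro binary is Base64-encoded, and the schema and field names are made up:

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.util.Base64;

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.BinaryEncoder;
    import org.apache.avro.io.EncoderFactory;

    public class AvroPayloadHelper {

        // Serialize a record to Avro binary, then Base64-encode it so the
        // result can be passed to Publish to Kafka as an ordinary string.
        public static String toBase64Avro(Schema schema, GenericRecord record) throws IOException {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
            new GenericDatumWriter<GenericRecord>(schema).write(record, encoder);
            encoder.flush();
            return Base64.getEncoder().encodeToString(out.toByteArray());
        }

        public static void main(String[] args) throws IOException {
            // Made-up schema with a single string field, for illustration only.
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Order\",\"fields\":"
                + "[{\"name\":\"id\",\"type\":\"string\"}]}");
            GenericRecord record = new GenericData.Record(schema);
            record.put("id", "42");
            System.out.println(toBase64Avro(schema, record));
        }
    }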

  • Hi,

    I think so; that is the expected behaviour. It seems that, when the smart service starts, it subscribes to the topic that is passed to it as input at that moment. In fact, I have noticed that if we force a topic subscription to stop by cancelling the process model running that consumer, the subscription is NOT dropped until the "Runtime in minutes" period ends (in my case, set to 58 minutes). You can see this by checking the tomcat-stdOut.log log.
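
    The behaviour matches a time-bounded poll loop like the sketch below. This is not the plug-in's actual code, just an illustration (broker, group, and topic names are placeholders) of why the subscription only drops when the runtime window expires:

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class TimeBoundedConsumer {

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
            props.put("group.id", "demo-group");              // placeholder group
            props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

            long runtimeMinutes = 58; // mirrors the "Runtime in minutes" input
            long deadline = System.currentTimeMillis() + runtimeMinutes * 60_000;

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("my-topic")); // placeholder topic
                // The loop only checks the deadline, so the subscription stays
                // open until the runtime window ends, regardless of what happens
                // to the process that started it.
                while (System.currentTimeMillis() < deadline) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
                    }
                }
            } // close() here is what finally drops the subscription
        }
    }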

  • Hi,

    We are using version 1.4.1 and have set up a 20-minute runtime.

    For the topics input, we pass an expression rule. The values the expression rule returns can vary dynamically (due to some custom processes we have configured in the backend).

    But an already active Consume from Kafka smart service keeps consuming the topics that were set when it started; it does not pick up value changes in real time. Is this the expected behaviour?

  • We are using version 1.0.4, and it has some reported vulnerabilities. Which version is compatible with Appian 22.4, and how can we download a specific version of this plug-in?

  • Hi Patricia,

    Several other people in this thread and I have had this problem with our clients.

    In my case, the solution my client adopted was to develop an external microservice and put it as an intermediate layer between Appian and Confluent. The microservice reads the message in AVRO format against Confluent's Schema Registry, transforms it into a plain string that needs no Schema Registry, and publishes it to another Kafka topic in Confluent in string format. Reading from Appian then works correctly, because we are finally reading a string. I hope this information helps you :)
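
    A simplified sketch of the bridge, using Confluent's KafkaAvroDeserializer, is below. Broker addresses, registry URL, and topic names are placeholders, and the real microservice of course also needs authentication and error handling:

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class AvroToStringBridge {

        public static void main(String[] args) {
            // Consumer reads Avro records, resolving schemas against the registry.
            Properties consumerProps = new Properties();
            consumerProps.put("bootstrap.servers", "localhost:9092");          // placeholder
            consumerProps.put("group.id", "avro-bridge");                      // placeholder
            consumerProps.put("schema.registry.url", "http://localhost:8081"); // placeholder
            consumerProps.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
            consumerProps.put("value.deserializer",
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");

            // Producer writes plain strings to the topic the Appian plug-in consumes.
            Properties producerProps = new Properties();
            producerProps.put("bootstrap.servers", "localhost:9092");          // placeholder
            producerProps.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
            producerProps.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaConsumer<String, Object> consumer = new KafkaConsumer<>(consumerProps);
                 KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
                consumer.subscribe(Collections.singletonList("orders-avro"));  // placeholder
                while (true) {
                    for (ConsumerRecord<String, Object> record : consumer.poll(Duration.ofSeconds(1))) {
                        if (record.value() == null) {
                            continue; // skip tombstone records
                        }
                        // GenericRecord.toString() yields a JSON-like string that
                        // Appian can read with no Schema Registry awareness.
                        producer.send(new ProducerRecord<>("orders-string",    // placeholder
                            record.key(), record.value().toString()));
                    }
                }
            }
        }
    }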

  • Hi, has anyone used this plug-in with Schema Registry from Confluent Cloud? I haven't seen any reference to this in the documentation.

    Thanks in advance

  • Given the implementation of this plug-in, the Consume from Kafka smart service saves the events in a database table called "tm_job_transaction". You can try creating the same table yourself with the same structure as the "tm_job_transaction" table. I think that works too.