Kafka Tools

Overview

Kafka Tools provides functionality for publishing messages to, and consuming messages from, topics on Kafka servers.

Smart Services

  • Publish To Kafka
  • Consume From Kafka

To process the messages consumed from Kafka, it is recommended to use the Transaction Manager application. The models designed to process messages are configured and assigned through Transaction Manager job types. See the Transaction Manager documentation for details.
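
The plugin's internals aside, consuming from a Kafka topic at the Java client level looks roughly like the sketch below; the broker address, group id, and topic name are illustrative assumptions, not values taken from the plugin.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class ConsumeSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka.example.com:9092"); // assumed broker
            props.put("group.id", "appian-consumer");                 // assumed group id
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("orders")); // assumed topic name
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> record : records) {
                    // Each consumed value would then be handed off for processing,
                    // e.g. written to a transaction table picked up by Transaction Manager.
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }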

Key Features & Functionality

Please refer to the README for additional details.

  • How do I set up "Data Source Name" and "Transaction Table Name"?

    For "Data Source Name":

            -  I have created a dummy table along with a data type, data store, and record type. Since the input type is Text, I tried entering the table name, the data store name, and even the record type, but none of them worked.

    For "Transaction Table Name":

            -  Since it defaults to "tm_job_transaction", I tried leaving it as-is and also tried using the table I created; neither of them worked.

    When I run the process model, it reports success; however, nothing is received: no message arrives and no table is updated.

  • Our organisation uses Confluent Kafka, with events/messages sent as JSON with a schema (Avro).
    Will Kafka Tools support Avro serialization and deserialization in the near future?

  • The solution my client adopted for this is to develop an external microservice that sits as an intermediate layer between Appian and Confluent. The microservice reads the message in Avro format, transforms it to a String, and publishes it to a Kafka topic in Confluent in string format. Reading from Appian then works correctly, because we are ultimately reading a string. I hope this information helps you :)
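
    For illustration only, here is a minimal sketch of such a bridge. It assumes Confluent's KafkaAvroDeserializer and illustrative broker, registry, and topic names; none of these values come from the plugin itself.

        import java.time.Duration;
        import java.util.List;
        import java.util.Properties;

        import org.apache.kafka.clients.consumer.ConsumerRecord;
        import org.apache.kafka.clients.consumer.KafkaConsumer;
        import org.apache.kafka.clients.producer.KafkaProducer;
        import org.apache.kafka.clients.producer.ProducerRecord;

        public class AvroToStringBridge {
            public static void main(String[] args) {
                Properties consumerProps = new Properties();
                consumerProps.put("bootstrap.servers", "confluent.example.com:9092"); // assumed
                consumerProps.put("group.id", "avro-bridge");
                consumerProps.put("key.deserializer",
                        "org.apache.kafka.common.serialization.StringDeserializer");
                // Confluent's Avro deserializer resolves the writer schema via the Schema Registry.
                consumerProps.put("value.deserializer",
                        "io.confluent.kafka.serializers.KafkaAvroDeserializer");
                consumerProps.put("schema.registry.url", "http://registry.example.com:8081"); // assumed

                Properties producerProps = new Properties();
                producerProps.put("bootstrap.servers", "confluent.example.com:9092"); // assumed
                producerProps.put("key.serializer",
                        "org.apache.kafka.common.serialization.StringSerializer");
                producerProps.put("value.serializer",
                        "org.apache.kafka.common.serialization.StringSerializer");

                try (KafkaConsumer<String, Object> consumer = new KafkaConsumer<>(consumerProps);
                     KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
                    consumer.subscribe(List.of("orders-avro")); // assumed source topic
                    while (true) {
                        for (ConsumerRecord<String, Object> record : consumer.poll(Duration.ofSeconds(1))) {
                            // A deserialized GenericRecord renders as JSON via toString(),
                            // which Appian can then consume as a plain string.
                            producer.send(new ProducerRecord<>("orders-string", // assumed target topic
                                    record.key(), record.value().toString()));
                        }
                    }
                }
            }
        }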

  • My client and I are facing exactly the same problem. Besides byte-array deserialization (with ByteArrayDeserializer), we also need to deserialize the Avro in order to store a readable string in the TM database.

    Have you got any input, or have you come any further towards a solution to the problem?
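
    One approach, sketched below under two assumptions: the consumed bytes are plain Avro binary, and the writer schema is known in advance. Apache Avro's GenericDatumReader then decodes the bytes into a GenericRecord, whose toString() is readable JSON suitable for the TM database. (Messages framed by Confluent's Schema Registry carry an extra 5-byte prefix; see the sketch at the end of this page.)

        import java.io.IOException;

        import org.apache.avro.Schema;
        import org.apache.avro.generic.GenericDatumReader;
        import org.apache.avro.generic.GenericRecord;
        import org.apache.avro.io.BinaryDecoder;
        import org.apache.avro.io.DecoderFactory;

        public class AvroBytesToString {

            // Assumed writer schema; in practice it would be loaded from a file or a registry.
            private static final Schema SCHEMA = new Schema.Parser().parse(
                    "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
                  + "{\"name\":\"id\",\"type\":\"string\"},"
                  + "{\"name\":\"amount\",\"type\":\"double\"}]}");

            /** Decodes raw Avro binary into the record's JSON text representation. */
            public static String decode(byte[] avroBytes) throws IOException {
                GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(SCHEMA);
                BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(avroBytes, null);
                GenericRecord record = reader.read(null, decoder);
                return record.toString(); // GenericRecord renders as JSON text
            }
        }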

  • v1.4.1 Release Notes
    • Adds support for consuming from multiple topics

  • v1.4.0 Release Notes
    • Adds support for setting the Key and Partition when Publishing to Kafka (see the sketch after these release notes)

  • v1.3.2 Release Notes
    • Updated the jackson-core and jackson-databind libraries

  • v1.3.1 Release Notes
    • Updated the snappy-java and jackson-databind libraries

  • v1.3.0 Release Notes
    • Updated API usage and license file
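
    Referenced from the v1.4.0 note above: the smart service inputs themselves are not documented here, but at the Java client level, setting a key and an explicit partition when publishing corresponds to the following sketch (broker, topic, key, and partition values are illustrative).

        import java.util.Properties;

        import org.apache.kafka.clients.producer.KafkaProducer;
        import org.apache.kafka.clients.producer.ProducerRecord;

        public class PublishWithKeyAndPartition {
            public static void main(String[] args) {
                Properties props = new Properties();
                props.put("bootstrap.servers", "kafka.example.com:9092"); // assumed broker
                props.put("key.serializer",
                        "org.apache.kafka.common.serialization.StringSerializer");
                props.put("value.serializer",
                        "org.apache.kafka.common.serialization.StringSerializer");

                try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                    // Explicit partition (0) and key ("order-42"); if the partition were
                    // null, Kafka would choose one by hashing the key instead.
                    producer.send(new ProducerRecord<>("orders", 0, "order-42", "payload"));
                }
            }
        }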

  • Our client uses the Confluent platform, which encompasses Kafka.
    From Appian, with this plugin, we can publish and consume messages on a topic. However, because everything goes through Confluent, the messages are negotiated against a schema: the message format is Avro (JSON with headers), and every message published or consumed has to go through serialization/deserialization. Consequently, consumed messages arrive in Appian in an unreadable format.

    This is a message to the author of the "Kafka Tools" plugin: is there an update in progress for this Appian plugin that would include the necessary Java libraries for Confluent, as well as the classes needed to serialize/deserialize the messages?

    If anyone has had a situation similar to the one I am describing, please contact me; I will greatly appreciate it.

    Thank you so much,
    I am looking forward to your response.
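
    To make the framing concrete: messages produced through Confluent's Avro serializer are prefixed with a magic byte (0) and a 4-byte schema id, followed by the Avro-encoded payload. Below is a minimal decoding sketch; it assumes the writer schema has already been fetched from the Schema Registry, and the registry lookup itself is omitted.

        import java.io.IOException;
        import java.nio.ByteBuffer;

        import org.apache.avro.Schema;
        import org.apache.avro.generic.GenericDatumReader;
        import org.apache.avro.generic.GenericRecord;
        import org.apache.avro.io.DecoderFactory;

        public class ConfluentWireFormat {
            /**
             * Decodes a Confluent-framed Avro message to JSON text.
             * Layout: magic byte (0) | 4-byte schema id | Avro binary payload.
             */
            public static String decode(byte[] message, Schema writerSchema) throws IOException {
                ByteBuffer buf = ByteBuffer.wrap(message);
                if (buf.get() != 0) {
                    throw new IOException("Not a Confluent-framed Avro message");
                }
                int schemaId = buf.getInt(); // in practice, used to fetch writerSchema from the registry
                GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(writerSchema);
                GenericRecord record = reader.read(null,
                        DecoderFactory.get().binaryDecoder(message, 5, message.length - 5, null));
                return record.toString(); // GenericRecord renders as JSON text
            }
        }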