Kafka Tools

Overview

KafkaTools provides the following smart services for publishing messages to and consuming messages from topics on Kafka servers.

Smart Services

  • Publish To Kafka
  • Consume From Kafka

To process the messages consumed from Kafka, it is recommended to use the Transaction Manager application. The models designed to process messages are configured and assigned through Transaction Manager job types. See the Transaction Manager documentation for details.

Key Features & Functionality

Please refer to the README for additional details.

Anonymous
  • We are able to successfully produce messages to Kafka, but we need the messages to be serialized as JSON. Is it possible to add support for a serializer other than the String serializer when producing messages?
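
    For reference, the underlying Kafka client API lets a producer use any class implementing org.apache.kafka.common.serialization.Serializer, so a JSON serializer is straightforward at that level. A minimal sketch, assuming the Jackson library is on the classpath (the plugin itself does not currently expose this extension point):

    ```java
    // Hypothetical JSON value serializer for a Kafka producer.
    // Assumes Jackson (com.fasterxml.jackson) is available on the classpath.
    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.kafka.common.errors.SerializationException;
    import org.apache.kafka.common.serialization.Serializer;

    public class JsonSerializer<T> implements Serializer<T> {
        private final ObjectMapper mapper = new ObjectMapper();

        @Override
        public byte[] serialize(String topic, T data) {
            if (data == null) {
                return null; // Kafka treats a null value as absent/tombstone
            }
            try {
                return mapper.writeValueAsBytes(data);
            } catch (Exception e) {
                throw new SerializationException("Failed to serialize value as JSON", e);
            }
        }
    }
    ```

    A producer would pick it up via the standard property, e.g. props.put("value.serializer", JsonSerializer.class.getName()).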

  • We have an issue consuming messages from one topic.

    We use 2 consumers with the same configuration (TEST and ACC) to consume from the same topic. TEST consumed 2 messages and ACC consumed 3 messages. Both consumers are alive on the topic and have a lag of 0. Any ideas what could have gone wrong? Also, are there any logs where I could check what is being consumed?

    We've been using this plugin successfully for the past 2 years to consume from other topics, but this has happened twice in the past month with a new topic.

    v1.4.2

    Thanks
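
    For what it's worth, committed and end offsets can be checked independently of the plugin with Kafka's AdminClient, which helps confirm whether the group really has zero lag. A minimal sketch; the broker address and group ID are placeholders:

    ```java
    import java.util.HashMap;
    import java.util.Map;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.ListOffsetsResult;
    import org.apache.kafka.clients.admin.OffsetSpec;
    import org.apache.kafka.clients.consumer.OffsetAndMetadata;
    import org.apache.kafka.common.TopicPartition;

    public class LagCheck {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker:9092"); // placeholder

            try (AdminClient admin = AdminClient.create(props)) {
                // Committed offsets for the consumer group ("my-group" is a placeholder).
                Map<TopicPartition, OffsetAndMetadata> committed =
                    admin.listConsumerGroupOffsets("my-group")
                         .partitionsToOffsetAndMetadata().get();

                // Latest (end) offsets for the same partitions.
                Map<TopicPartition, OffsetSpec> request = new HashMap<>();
                committed.keySet().forEach(tp -> request.put(tp, OffsetSpec.latest()));
                Map<TopicPartition, ListOffsetsResult.ListOffsetsResultInfo> ends =
                    admin.listOffsets(request).all().get();

                // Lag per partition = end offset minus committed offset.
                committed.forEach((tp, om) -> System.out.printf(
                    "%s lag=%d%n", tp, ends.get(tp).offset() - om.offset()));
            }
        }
    }
    ```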

  • We successfully integrated the Kafka tool in our dev environment and are now looking to integrate it into our other environments. The Kafka tool requires third-party credentials to be set up. We use SaltStack to provision all our environments. Is it possible to set up the Kafka Tool third-party credentials using Salt? Thanks

  • As a somewhat unusual workaround, you may be able to upload the Kafka certificate into the Admin Console: log in as an administrator -> go to the Admin Console -> Integration/Certificates -> Trusted Server Certificates -> New Trusted Server Certificate, and upload the PEM there.

    If for some reason that doesn't work, you can try disabling the feature toggle `ae.databases-and-search.trust-manager-factory-provider.enabled`.
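
    For context, uploading a trusted server certificate effectively adds the broker's certificate to the trust material used during the TLS handshake. A minimal sketch of the equivalent in plain Java (the PEM path is a placeholder; this illustrates the mechanism, not the plugin's internals):

    ```java
    import java.io.FileInputStream;
    import java.security.KeyStore;
    import java.security.cert.Certificate;
    import java.security.cert.CertificateFactory;
    import javax.net.ssl.SSLContext;
    import javax.net.ssl.TrustManagerFactory;

    public class TrustPem {
        public static SSLContext contextTrusting(String pemPath) throws Exception {
            // Parse the PEM-encoded broker certificate.
            Certificate cert;
            try (FileInputStream in = new FileInputStream(pemPath)) {
                cert = CertificateFactory.getInstance("X.509").generateCertificate(in);
            }

            // Build an in-memory truststore containing only that certificate.
            KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
            trustStore.load(null, null);
            trustStore.setCertificateEntry("kafka-broker", cert);

            // Create an SSLContext whose trust managers accept that chain.
            TrustManagerFactory tmf =
                TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
            tmf.init(trustStore);
            SSLContext ctx = SSLContext.getInstance("TLS");
            ctx.init(null, tmf.getTrustManagers(), null);
            return ctx;
        }
    }
    ```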

  • v1.4.3 Release Notes
    • Upgraded kafka-clients library
  • Hello,

    We are using version 1.2.1 of the plugin. Recently our Appian Cloud site was upgraded to version 24.4, and since then it has been throwing the error below.

    ERROR com.appiancorp.cs.plugin.kafka.consumer.AppianKafkaConsumerThread - Received an exception from consumer: 1177940 - SSL handshake failed

    Could anyone please help with this urgently?

    Thank you in advance!

  • Hi,

    We are currently using version 1.4.1 of this plugin, and it's working as expected. We are planning to upgrade a separate Kafka cluster from 2.6.2 to 2.8.2.tiered (see Supported Apache Kafka versions - Amazon Managed Streaming for Apache Kafka). Do you have information on which Kafka versions are compatible with this plugin?

  • Hi Miguel, 

    We have a few questions about how this solution is implemented, if you could please add more details. In particular, we'd like to know whether the microservice you built also publishes to the Kafka topic for Appian, and whether the Appian plug-in connects to the microservice or directly to Confluent.

    Thank you

  • Hi, 

    Does anyone know how this plug-in works? I mean, in what format does it publish messages to Kafka, and how does it save the data in the database?

    As my colleague Patricia says below, we need to use Schema Registry to serialize and deserialize messages in AVRO format. Can this be done with this plug-in?

    I've seen in the documentation that the Consume from Kafka smart service has an input parameter called "Deserializer Class Name", but the Publish to Kafka smart service has no input parameter related to serialization. Does this mean that the plugin serializes messages in some fixed way? You can provide the Java class to use for deserialization, but not control how messages are serialized when publishing to Kafka? This is really confusing.

    Thanks in advance.
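
    For reference, in the underlying Kafka client API the two directions are symmetric: a producer is configured with key.serializer/value.serializer properties and a consumer with key.deserializer/value.deserializer, which is presumably what the "Deserializer Class Name" input maps to. A minimal sketch of both configurations (broker address, topic, and group ID are placeholders; whether the plugin exposes the producer-side setting is exactly the open question above):

    ```java
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class SerdeConfig {
        public static void main(String[] args) {
            // Producer side: serializer classes are ordinary config properties.
            Properties prod = new Properties();
            prod.put("bootstrap.servers", "broker:9092"); // placeholder
            prod.put("key.serializer",
                     "org.apache.kafka.common.serialization.StringSerializer");
            prod.put("value.serializer",
                     "org.apache.kafka.common.serialization.StringSerializer");
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(prod)) {
                producer.send(new ProducerRecord<>("my-topic", "key", "{\"hello\":\"world\"}"));
            }

            // Consumer side: the mirror-image deserializer properties.
            Properties cons = new Properties();
            cons.put("bootstrap.servers", "broker:9092"); // placeholder
            cons.put("group.id", "my-group");             // placeholder
            cons.put("key.deserializer",
                     "org.apache.kafka.common.serialization.StringDeserializer");
            cons.put("value.deserializer",
                     "org.apache.kafka.common.serialization.StringDeserializer");
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(cons)) {
                // consumer.subscribe(...) and poll(...) would go here; omitted for brevity.
            }
        }
    }
    ```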