Kafka Tools

Overview

KafkaTools provides the following smart services for publishing to and consuming from topics on Kafka servers.

Smart Services

  • Publish To Kafka
  • Consume From Kafka

To process the messages consumed from Kafka, it is recommended to use the Transaction Manager application. The models designed to process messages are configured and assigned through the Transaction Manager job types. See the Transaction Manager documentation for details.

Key Features & Functionality

Please refer to the README for additional details.

Anonymous
  • Bharat - it seems that the code is pretty straightforward to update.

    My only observation would be to change the header parameter from required to optional.

    I would recommend for you to make the updates and submit a new version of the plug-in to community. As you may know, plug-ins are not owned by any individual. They are community supported so feel free to update as needed.

  • v1.0.5 Release Notes
    • Security Updates
  • Hi Sylvain,

To continue Ketan's thread: I have checked plug-in version kafka-tools-1.0.4, and we still do not have the option to pass a 'Header' parameter along with the 'Topic' details.

    The changes below are required in the plug-in code to fulfill our requirement.

    Please add them to the plug-in and share the latest version.

    1) PublishKafkaSmartService.java 

    private String _header;

    @Order({
        // inputs
        "SecureCredentialsStoreKey", "Servers", "Topic", "Header", "Payload", "SecurityProtocol", "SASLMechanism",
        "Truststore", "Keystore",
        // outputs
        "Success", "ErrorMessage" })

    kafkaProducer.publish(_topic, _payload, _header);

    @Input(required = Required.ALWAYS)
    public void setHeader(String val) {
        this._header = val;
    }

    2) AppianKafkaProducer.java

    // ProducerRecord has no (topic, value, header) constructor; a String header
    // must be attached through the five-argument constructor
    // (topic, partition, key, value, headers) as a RecordHeader:
    ProducerRecord<String, String> record = new ProducerRecord<>(topic, null, null, payload,
        Collections.singletonList(new RecordHeader("header", header.getBytes(StandardCharsets.UTF_8))));

    Thanks,

    Bharat

  • How do you resolve the error "Topic appianTopic not present in metadata after 60000 ms" when publishing to a Kafka topic?
    We have configured the SCS key
    and then configured the smart service as follows:

    SCS key: used the API key as the username and the API secret as the password

    topic: topic created on Confluent Kafka

    servers: bootstrap server provided by Confluent Kafka

    security protocol: SASL_PLAINTEXT

    sasl mechanism: PLAIN

    Where do you use the cluster ID?
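    In case it helps: Confluent Cloud endpoints are TLS-only, so clients generally need security protocol SASL_SSL rather than SASL_PLAINTEXT, and the "not present in metadata" timeout is a common symptom of a connection that never completes the security handshake. The cluster ID does not appear in standard Kafka client properties, so it would not be entered in the smart service inputs. A minimal client-configuration sketch in plain Java (all values are placeholders, not the plug-in's actual parameter names):

    ```java
    import java.util.Properties;

    public class ConfluentConfigSketch {
        public static Properties clientProps() {
            Properties props = new Properties();
            // Bootstrap server from the Confluent cluster settings page (placeholder)
            props.setProperty("bootstrap.servers", "pkc-xxxxx.region.provider.confluent.cloud:9092");
            // Confluent Cloud listeners are TLS-only, so SASL_SSL is required,
            // not SASL_PLAINTEXT
            props.setProperty("security.protocol", "SASL_SSL");
            props.setProperty("sasl.mechanism", "PLAIN");
            // The API key/secret map to the SASL username/password (placeholders)
            props.setProperty("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
            return props;
        }

        public static void main(String[] args) {
            System.out.println(clientProps().getProperty("security.protocol"));
        }
    }
    ```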

  • I am trying to use this plug-in with the SASL mechanism SCRAM_SHA_512, but it is failing. I created the truststore using amazonca1.cer, which I downloaded from https://www.amazontrust.com/repository/. In my secure credentials store I have configured username, password, and truststorepwd. In my process model, on the call to Consume From Kafka, I pass my secure credentials name. We are on Appian Cloud and I am getting the error INVALID SASL_MECHANISM. I am able to consume from Kafka directly on my machine, but I am unable to get it working in the Appian environment. What could be the possible solution?
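    One thing worth double-checking here: Kafka expects the hyphenated mechanism name "SCRAM-SHA-512"; an underscore spelling such as "SCRAM_SHA_512" is rejected as an invalid SASL mechanism. A sketch of the corresponding client properties in plain Java (placeholder paths and passwords, not the plug-in's actual parameter names):

    ```java
    import java.util.Properties;

    public class ScramConfigSketch {
        public static Properties clientProps() {
            Properties props = new Properties();
            props.setProperty("security.protocol", "SASL_SSL");
            // Kafka's mechanism names are hyphenated: "SCRAM-SHA-512",
            // not "SCRAM_SHA_512"
            props.setProperty("sasl.mechanism", "SCRAM-SHA-512");
            // Truststore built from the downloaded CA certificate (placeholder path)
            props.setProperty("ssl.truststore.location", "/path/to/truststore.jks");
            props.setProperty("ssl.truststore.password", "<TRUSTSTORE_PASSWORD>");
            return props;
        }

        public static void main(String[] args) {
            System.out.println(clientProps().getProperty("sasl.mechanism"));
        }
    }
    ```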

  • Hi borisb170, the Readme.pdf is now part of the deliverable. Please download the new version; you will find it inside the zip.

  • v1.0.4 Release Notes
    • Added support for reading compressed messages with Snappy
  • Hi Sylvain,

    Thanks for providing this plug-in. In the description of the plug-in you refer to a README file for additional details, but in the download I can only find the .jar file.
    Where can we find the README file?

    Thanks!