Kafka Tools

Overview

Kafka Tools provides the following smart services for publishing messages to and consuming messages from topics on Kafka servers.

Smart Services

  • Publish To Kafka
  • Consume From Kafka
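
Under the hood, these smart services presumably wrap the standard Kafka Java client. As a point of reference, a minimal client-level publish looks roughly like the sketch below; the broker address, topic name, and payload are placeholder assumptions, not plugin defaults:

    // Minimal publish sketch with the standard Kafka Java client.
    // Broker, topic, key, and value are illustrative placeholders.
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import java.util.Properties;

    public class PublishSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("orders", "order-123", "{\"status\":\"NEW\"}"));
                producer.flush(); // block until the record is actually sent
            }
        }
    }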

To process the messages consumed from Kafka, it is recommended to use the Transaction Manager application. The process models that handle messages are configured and assigned through Transaction Manager job types. See the Transaction Manager documentation for details.

Key Features & Functionality

Please refer to the README for additional details.

Anonymous
  • Yep that was the issue. Thank you very much.

  • Try the label with underscores: "SASL_SHA_512" could work for you.
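
    For reference, at the Kafka client level SCRAM-SHA-512 is configured through the standard SASL properties. A minimal sketch, assuming the plugin's "SASL_SHA_512" label maps to the standard mechanism name (broker and credentials are placeholders):

        // Standard client-side SASL settings for SCRAM-SHA-512.
        import java.util.Properties;

        public class SaslConfigSketch {
            static Properties saslProps() {
                Properties props = new Properties();
                props.put("bootstrap.servers", "broker:9093");   // placeholder broker
                props.put("security.protocol", "SASL_SSL");
                props.put("sasl.mechanism", "SCRAM-SHA-512");    // standard mechanism label
                props.put("sasl.jaas.config",
                    "org.apache.kafka.common.security.scram.ScramLoginModule required "
                    + "username=\"user\" password=\"secret\";"); // placeholder credentials
                return props;
            }
        }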

  • Hi,

    I am facing an issue when publishing to a Kafka queue. When running the node, it returns the error [Invalid sasl mechanism]. I have double-checked the mechanism value in the server's config file and it is SCRAM-SHA-512, which is what I configured in the smart service node.

  • Hi, since those JKS files are generally not portable between environments, we keep them inside "XXX_Infra" applications that exist in every environment, then use a constant of type Integer whose value depends on the environment and reference it when configuring the Smart Service. As this constant is environment specific, treat it in your CI/CD the same way you already treat other customization files.
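
    At the Kafka client level, a JKS truststore is wired in through the standard SSL properties. A minimal sketch, assuming the plugin materializes the Appian document to a local path and passes it through (path and password are placeholders):

        // Standard client-side SSL truststore settings.
        import java.util.Properties;

        public class TruststoreSketch {
            static Properties sslProps(String jksPath, String password) {
                Properties props = new Properties();
                props.put("security.protocol", "SSL");
                props.put("ssl.truststore.location", jksPath);   // local path of the JKS file
                props.put("ssl.truststore.password", password);  // placeholder password
                return props;
            }
        }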

  • Hi Team,

    Use case: a process model with the "Consume From Kafka" smart service configured with the "Runtime in minutes" parameter set to 58 minutes.

    We have noticed that if we manually cancel the live process instance (which stays alive for the entire 58-minute subscription) after, for example, 30 minutes of execution, events keep arriving in Appian. That is, the subscription remains alive even after cancelling the process instance that contains the "Consume From Kafka" smart service.

    Once the 58 minutes have passed, topic events stop arriving in Appian, even though the process instance was cancelled about 30 minutes earlier.

    We don't see any reference to this behavior in the documentation, but it is happening to us, and we assume it affects everyone who uses this plugin.

    Can someone confirm this behavior?
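
    For what it's worth, the behavior described above is consistent with a time-bounded poll loop running on the plugin side that checks only its own deadline, not the state of the launching process instance. A conceptual sketch with the standard Kafka Java client, assuming that design (this is not the plugin's actual code; broker, group, and topic are placeholders):

        import org.apache.kafka.clients.consumer.ConsumerRecord;
        import org.apache.kafka.clients.consumer.ConsumerRecords;
        import org.apache.kafka.clients.consumer.KafkaConsumer;
        import java.time.Duration;
        import java.time.Instant;
        import java.util.List;
        import java.util.Properties;

        public class BoundedConsumeSketch {
            public static void main(String[] args) {
                Properties props = new Properties();
                props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
                props.put("group.id", "appian-consumer");         // placeholder group
                props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
                props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
                Instant deadline = Instant.now().plus(Duration.ofMinutes(58)); // "Runtime in minutes"
                try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                    consumer.subscribe(List.of("my-topic"));      // placeholder topic
                    // Nothing in this loop observes whether the launching process
                    // still exists, which would explain why cancelling the instance
                    // does not stop the subscription before the deadline.
                    while (Instant.now().isBefore(deadline)) {
                        ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                        for (ConsumerRecord<String, String> r : records) {
                            System.out.printf("offset=%d key=%s%n", r.offset(), r.key());
                        }
                    }
                }
            }
        }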

  • Hi,

    Does this Appian plugin community.appian.com/.../kafka-tools use the Kafka REST API protocol?

    Thanks

  • Hi Team,

    We are facing an issue while producing a message using the Kafka Tools plugin (Publish to Kafka). We are getting an error related to log4j. The error is:

    "This repository is not sync with any central repository." Please help us solve the issue.

  • v1.4.2 Release Notes
    • Upgraded json-path library

  • Hi Sylvian/Team,

    As you know, the Kafka plugin provides a smart service that allows a workflow to send a Kafka message to a designated topic. This smart service is configured as one of the workflow steps based on the business requirement. Since the smart service is provided by Appian, its internal functioning is abstracted from consumers. In its current form it does not support retrieving certificates from AWS Secrets Manager at runtime; instead, it expects a .jks file, pre-stored in the Appian internal document store, to be provided at runtime.

    Issue description: we currently export the application and push it to Bitbucket as part of our CI/CD application deployment. We reviewed our code with the client security team; they found a .jks file in the content of the application package, and per our industry best practices we should not store any .jks files in the Bitbucket repository.

    1: Is there an alternative solution that avoids using a .jks file?

    2: Are there any other methods to ensure that the .jks files are stored in the Appian secured document center but not in the Bitbucket repository?

    Appreciate your response.
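
    For context, the runtime retrieval being requested would look roughly like the sketch below with the AWS SDK for Java v2. The secret name is a placeholder, and this illustrates the requested enhancement rather than an existing plugin capability:

        import software.amazon.awssdk.services.secretsmanager.SecretsManagerClient;
        import software.amazon.awssdk.services.secretsmanager.model.GetSecretValueRequest;
        import software.amazon.awssdk.services.secretsmanager.model.GetSecretValueResponse;

        public class SecretsManagerSketch {
            public static void main(String[] args) {
                try (SecretsManagerClient client = SecretsManagerClient.create()) {
                    GetSecretValueRequest request = GetSecretValueRequest.builder()
                            .secretId("kafka/truststore")  // placeholder secret name
                            .build();
                    GetSecretValueResponse response = client.getSecretValue(request);
                    // Binary material (e.g. keystore bytes) comes back as secretBinary();
                    // text material (e.g. a password or PEM) as secretString().
                    String secret = response.secretString();
                    System.out.println("Retrieved secret of length " + secret.length());
                }
            }
        }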

  • Hi, 

    For the "Data Source Name", it's indeed the name of your business data source (the same name from Data source in Admin Console).

    For the "Transaction Table Name", if you use a custom table instead of the tm_job_transaction, your table must have the same structure.

    Oracle example:

    ID (NUMBER)
    JOB_TYPE_ID (NUMBER) - controlled by a sequence for auto-increment
    CONTEXT_JSON (VARCHAR2)
    SCHEDULED_DATE (TIMESTAMP)
    STATUS_ID (NUMBER)
    TOPIC (VARCHAR2)
    PARTITION (NUMBER)
    OFFSET (NUMBER)
    KEY (VARCHAR2)

    And yes, we successfully retrieve Kafka messages with this plug-in.

    Regards