Overview
KafkaTools provides functionality for publishing to and consuming from topics on Kafka servers.
Smart Services
To process the messages consumed from Kafka, it is recommended to use the Transaction Manager application. The models designed to process messages are configured and assigned through Transaction Manager job types. See the Transaction Manager documentation.
Key Features & Functionality
Please refer to the README for additional details.
Sylvain Furt, we are experiencing the following error when trying to connect to Kafka. Could you please check and let me know?
Hi all, my client uses mTLS instead of plain TLS. Does this component support mTLS?
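For context: on a standard Kafka client, mutual TLS is the SSL protocol plus a client keystore alongside the truststore, and this plug-in does expose both Truststore and Keystore inputs, so it may already cover this. Whether the Keystore input is wired through this way would need verifying against the plug-in source. A minimal sketch of the equivalent raw client settings (all paths and passwords are illustrative placeholders):

```java
import java.util.Properties;

public class MutualTlsConfig {
    // Standard Kafka client settings for mutual TLS: the client verifies the
    // broker via the truststore, and the broker verifies the client via the
    // keystore. All values below are placeholders.
    static Properties mtlsProps() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "<broker-host>:9093");
        p.put("security.protocol", "SSL");
        p.put("ssl.truststore.location", "<path-to-truststore>.jks");
        p.put("ssl.truststore.password", "<truststore-password>");
        // The client keystore is what makes the TLS handshake mutual.
        p.put("ssl.keystore.location", "<path-to-keystore>.jks");
        p.put("ssl.keystore.password", "<keystore-password>");
        p.put("ssl.key.password", "<key-password>");
        return p;
    }

    public static void main(String[] args) {
        System.out.println(mtlsProps().getProperty("security.protocol"));
    }
}
```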
No progress so far
Bharat - it seems that the code is pretty straightforward to update.
My only observation would be to change the header parameter from required to optional.
I would recommend that you make the updates and submit a new version of the plug-in to the Community. As you may know, plug-ins are not owned by any individual; they are community supported, so feel free to update as needed.
Did you solve that?
Hi Sylvain,
Continuing Ketan's thread: I have checked plug-in version kafka-tools-1.0.4, and we still do not have an option to pass a 'Header' parameter along with the 'Topic' details.
The changes below are required in the plug-in code to fulfill our requirement.
Please add them to the plug-in and share the latest version.
1) PublishKafkaSmartService.java
private String _header;
@Order({
    // inputs
    "SecureCredentialsStoreKey", "Servers", "Topic", "Header", "Payload",
    "SecurityProtocol", "SASLMechanism", "Truststore", "Keystore",
    // outputs
    "Success", "ErrorMessage"
})
kafkaProducer.publish(_topic, _payload, _header);
@Input(required = Required.ALWAYS) public void setHeader(String val) { this._header = val; }
2) AppianKafkaProducer.java
// _producer.send(new ProducerRecord<>(payload, header));
ProducerRecord<String, String> record = new ProducerRecord<>(topic, payload, header);
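A sketch of how the proposed header change could look using the standard Kafka Headers API, which avoids overloading the ProducerRecord constructor (in `new ProducerRecord<>(topic, payload, header)` the three-argument form treats the second argument as the record key, which is probably not what is intended here). The class shape and the "X-Appian-Header" key are illustrative assumptions, not the plug-in's actual names, and this fragment requires the kafka-clients library:

```java
import java.nio.charset.StandardCharsets;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

// Hypothetical shape of AppianKafkaProducer.publish with an optional header.
public class AppianKafkaProducerSketch {
    private final KafkaProducer<String, String> _producer;

    public AppianKafkaProducerSketch(KafkaProducer<String, String> producer) {
        this._producer = producer;
    }

    public void publish(String topic, String payload, String header) {
        ProducerRecord<String, String> record = new ProducerRecord<>(topic, payload);
        if (header != null && !header.isEmpty()) {
            // Kafka headers are key/byte[] pairs; the header key is illustrative.
            record.headers().add("X-Appian-Header", header.getBytes(StandardCharsets.UTF_8));
        }
        _producer.send(record);
    }
}
```

Leaving the header optional (as suggested earlier in the thread) keeps the smart service backward compatible for existing process models.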
Thanks,
Bharat
How do you resolve the error "Topic appianTopic not present in metadata after 60000 ms" when publishing to a Kafka topic? We have configured the SCS key and then configured the smart service as follows:
scs key: used the API key as username and API secret as password
topic: topic created on Confluent Kafka
servers: bootstrap server provided by Confluent Kafka
security protocol: SASL_PLAINTEXT
sasl mechanism: PLAIN
Where do you use the cluster id?
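For comparison, the raw Kafka client settings matching the values listed above might look like the sketch below (the endpoint and credentials are placeholders). One thing worth checking: Confluent Cloud brokers are TLS-terminated and generally require SASL_SSL rather than SASL_PLAINTEXT, and a protocol mismatch commonly surfaces as exactly this "not present in metadata after 60000 ms" timeout:

```java
import java.util.Properties;

public class ConfluentProducerConfig {
    // Client properties corresponding to the smart-service inputs above.
    // Endpoint and credential values are placeholders, not real ones.
    static Properties producerProps() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "<bootstrap-server-from-confluent>:9092");
        // Confluent Cloud listeners use TLS, so SASL_SSL is typically
        // required; SASL_PLAINTEXT fails to connect, which often appears
        // as a metadata timeout rather than an explicit auth error.
        p.put("security.protocol", "SASL_SSL");
        p.put("sasl.mechanism", "PLAIN");
        p.put("sasl.jaas.config",
              "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        return p;
    }

    public static void main(String[] args) {
        System.out.println(producerProps().getProperty("security.protocol"));
    }
}
```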
I am trying to use this plug-in with SASL_MECHANISM set to SCRAM_SHA_512, but it is failing. I created the truststore using amazonca1.cer, which I downloaded from https://www.amazontrust.com/repository/. In my secure credentials store I have configured username, password, and truststorepwd, and in my process model I pass the secure credentials name to the Consume from Kafka call. We are on Appian Cloud, and I am getting the error INVALID SASL_MECHANISM. I can consume from Kafka directly on my machine, but not from the Appian environment. What could be the possible solution?
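One detail that may be relevant: the Kafka client expects the mechanism name with hyphens, SCRAM-SHA-512, not SCRAM_SHA_512, and SCRAM also needs the ScramLoginModule in the JAAS config; a mismatch on either can produce an invalid-mechanism error. The plug-in may translate the underscore form internally, so this would need confirming against its source. A sketch of the equivalent raw client settings (hosts, paths, and credentials are placeholders):

```java
import java.util.Properties;

public class ScramConsumerConfig {
    // Raw Kafka client settings for SCRAM over TLS. All locations and
    // credentials below are illustrative placeholders.
    static Properties consumerProps() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "<broker-host>:9096");
        p.put("security.protocol", "SASL_SSL");
        // Note the hyphens: the client-side mechanism name is SCRAM-SHA-512.
        p.put("sasl.mechanism", "SCRAM-SHA-512");
        p.put("sasl.jaas.config",
              "org.apache.kafka.common.security.scram.ScramLoginModule required "
            + "username=\"<username>\" password=\"<password>\";");
        p.put("ssl.truststore.location", "<path-to-truststore>.jks");
        p.put("ssl.truststore.password", "<truststorepwd>");
        return p;
    }

    public static void main(String[] args) {
        System.out.println(consumerProps().getProperty("sasl.mechanism"));
    }
}
```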