Overview
KafkaTools provides the following functionality for publishing to and consuming from topics on Kafka servers.
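For orientation, the plug-in covers the same ground as the standard Kafka Java client. Below is a minimal, generic publish sketch using kafka-clients; it is illustrative only (not the plug-in's internal code), and the broker address, topic name, and payload are placeholders.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PublishSketch {
    public static void main(String[] args) {
        // Placeholder broker address; use your cluster's bootstrap servers.
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // "appianTopic" and the JSON payload are placeholders.
            producer.send(new ProducerRecord<>("appianTopic", "{\"id\": 1}"));
            producer.flush();
        }
    }
}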
Smart Services
To process the messages consumed from Kafka, it is recommended to use the Transaction Manager application. The models designed to process messages are configured and assigned through Transaction Manager job types. See the Transaction Manager documentation for details.
Key Features & Functionality
Please refer to the README for additional details.
How is the keystore configured/created for the cloud/SaaS environment?
Sylvain Furt, we are experiencing the following error when trying to connect to Kafka. Could you please check and let me know.
Hi all, my client uses mTLS (mutual TLS) instead of one-way TLS. Does this component support mTLS?
No progress so far
Bharat - it seems that the code is pretty straightforward to update.
My only observation would be to change the header parameter from required to optional.
I would recommend that you make the updates and submit a new version of the plug-in to the community. As you may know, plug-ins are not owned by any individual. They are community supported, so feel free to update as needed.
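For illustration, making the header optional would look roughly like the snippet below. This is a sketch, not the shipped code, and it assumes the plug-in SDK's Required enum offers an OPTIONAL value; verify against the SDK version you build with.

// Sketch: marking the Header input as optional rather than required
// (Required.OPTIONAL is an assumption; confirm it exists in your SDK version).
@Input(required = Required.OPTIONAL)
public void setHeader(String val) {
    this._header = val;
}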
Did you solve that?
Hi Sylvain,
Continuing Ketan's thread: I have checked plug-in version kafka-tools-1.0.4, and we still do not have an option to pass the 'Header' as a parameter along with the 'Topic' details.
The changes below are required in the plug-in code in order to fulfil our requirement.
Please add them to the plug-in and share the latest version.
1) PublishKafkaSmartService.java
private String _header;

@Order({
    // inputs
    "SecureCredentialsStoreKey", "Servers", "Topic", "Header", "Payload",
    "SecurityProtocol", "SASLMechanism", "Truststore", "Keystore",
    // outputs
    "Success", "ErrorMessage"
})

kafkaProducer.publish(_topic, _payload, _header);

@Input(required = Required.ALWAYS)
public void setHeader(String val) {
    this._header = val;
}
2) AppianKafkaProducer.java
// _producer.send(new ProducerRecord<>(payload, header));
ProducerRecord<String, String> record = new ProducerRecord<>(topic, payload, header);
_producer.send(record);
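For reference, with the standard kafka-clients API the three-argument ProducerRecord constructor is (topic, key, value), so passing the header string as the third argument would make the payload the record key. Below is a sketch of an alternative that attaches the value to the record's headers collection instead; it is illustrative only, not the plug-in's actual code, and the header key name "Header" is an assumption.

import java.nio.charset.StandardCharsets;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class HeaderPublishSketch {
    public static void publish(KafkaProducer<String, String> producer,
                               String topic, String payload, String header) {
        ProducerRecord<String, String> record = new ProducerRecord<>(topic, payload);
        if (header != null && !header.isEmpty()) {
            // Attach the header via the record's headers collection; the key name
            // "Header" is illustrative only.
            record.headers().add("Header", header.getBytes(StandardCharsets.UTF_8));
        }
        producer.send(record);
    }
}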
Thanks,
Bharat
How do you resolve the issue "Topic appianTopic not present in metadata after 60000 ms" when publishing to a Kafka topic? We have configured the SCS key and then configured the smart service as follows (see the configuration sketch after this list):
scs key: used the API key as username and the API secret as password
topic: topic created on Confluent Kafka
servers: bootstrap server provided by Confluent Kafka
security protocol: SASL_PLAINTEXT
sasl mechanism: PLAIN
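For reference, Confluent Cloud endpoints are TLS-only, so the security protocol typically needs to be SASL_SSL rather than SASL_PLAINTEXT, with the PLAIN mechanism and the API key/secret as the SASL username/password. Below is a sketch of the equivalent Java client properties; the bootstrap address is a placeholder, so verify the values against your cluster's client configuration.

import java.util.Properties;

public class ConfluentCloudPropsSketch {
    // Sketch of typical Confluent Cloud client settings; all values are placeholders.
    public static Properties build(String apiKey, String apiSecret) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "pkc-xxxxx.region.provider.confluent.cloud:9092");
        props.put("security.protocol", "SASL_SSL");   // Confluent Cloud requires TLS
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required"
                + " username=\"" + apiKey + "\" password=\"" + apiSecret + "\";");
        return props;
    }
}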
Where do you use the cluster id?