Overview
KafkaTools provides smart services for publishing messages to and consuming messages from topics on Kafka servers.
Smart Services
To process the messages consumed from Kafka, it is recommended to use the Transaction Manager application. The process models designed to process messages are configured and assigned through Transaction Manager job types. See the Transaction Manager documentation for details.
Key Features & Functionality
Please refer to the README for additional details.
Hi team,
We need this plug-in to support AVRO Confluent Cloud Schema Registry for our use case. We’ve seen that in the latest release this functionality was added to the Consume From Kafka smart service via the avroSchemaRegistryUrl parameter.
Is there any plan to add the same functionality to the Publish To Kafka smart service in the near future, so Schema Registry is supported for both consuming and publishing?
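For reference, the producer-side equivalent we are hoping for would roughly be Confluent's KafkaAvroSerializer pointed at the registry, along these lines (a minimal sketch; the broker address, registry URL, topic, and schema are placeholders):

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroPublishSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent's Avro serializer registers/looks up schemas in the registry
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // Producer-side counterpart of the consumer's avroSchemaRegistryUrl parameter
        props.put("schema.registry.url", "https://schema-registry.example.com"); // placeholder

        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Example\","
                + "\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}");
        GenericRecord value = new GenericData.Record(schema);
        value.put("id", "42");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("example-topic", "key", value)); // placeholder topic
        }
    }
}
```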
Thanks in advance!
Good afternoon -
We are frequently receiving this error from the Consumer process for this plugin:
The incoming tabular data stream (TDS) protocol stream is incorrect. The TDS headers contained errors.;
It just errors out, but then everything picks up just fine on the next run. Our process runs every 10 minutes, and we see 10-15 runs a day where this error pops up. Please advise how we can fix this error, thanks!
Hello everyone! I am Gokulnath, PM for the Native Kafka Integration initiative at Appian. The beta for the native event consumption capability is now live until December of this year, so I encourage you to try out this capability and share feedback. Please follow the instructions in this community link to participate in the beta: Kafka Integration. Happy to answer any questions, thank you.
Are there plans to support non-Text values like Avro messages?
One option is to define a custom (de)serializer class, but it does not seem to be used even when configured. The code seems to assume the value is always a String. Is a custom Deserializer class with non-text values really supported?
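For reference, the kind of custom Deserializer we are trying to plug in follows the standard org.apache.kafka.common.serialization.Deserializer contract, e.g. the sketch below (the class name and the decoded long payload are made up for illustration):

```java
import java.nio.ByteBuffer;
import org.apache.kafka.common.serialization.Deserializer;

// Hypothetical example: decode an 8-byte big-endian long payload
// instead of assuming the value is a UTF-8 String.
public class LongValueDeserializer implements Deserializer<Long> {
    @Override
    public Long deserialize(String topic, byte[] data) {
        if (data == null) {
            return null; // tombstone or empty value
        }
        return ByteBuffer.wrap(data).getLong();
    }
}
```

We would expect a class like this to be picked up via the consumer's value.deserializer setting, but as far as we can tell, the plugin never uses it.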
Appian AppMarket: in this plugin's Third-Party Credentials configuration, the Scope field is missing and cannot be configured, and because of that, token generation is failing. Would it be possible to upgrade the plugin with a Scope field?
We have the same issue. Are there any updates on this?
We are able to successfully produce messages to Kafka, but we need the messages to use a JSON serializer. Is it possible to add the ability to use a serializer other than the String serializer when producing messages?
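For illustration, the JSON serializer we have in mind is a thin Jackson wrapper over the standard Serializer interface, something like this sketch (the class name is hypothetical):

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Serializer;

// Hypothetical sketch: serialize any POJO to JSON bytes with Jackson.
public class JsonSerializer<T> implements Serializer<T> {
    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public byte[] serialize(String topic, T data) {
        if (data == null) {
            return null;
        }
        try {
            return mapper.writeValueAsBytes(data);
        } catch (Exception e) {
            throw new SerializationException("Failed to serialize value for topic " + topic, e);
        }
    }
}
```

Being able to pass a class like this as the producer's value.serializer, instead of the hard-coded String serializer, would cover our use case.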
We have an issue consuming messages from one topic.
We use 2 consumers with the same configuration (test and acc) to consume from the same topic. TEST consumed 2 messages and ACC consumed 3 messages. Both consumers are alive on the topic and have a lag of 0. Any ideas what could have gone wrong? Also, are there any logs where I could check what was consumed?
We've been using this plugin successfully for the past 2 years to consume from other topics, but this has happened twice in the past month with a new topic.
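So far we have been verifying the lag by comparing committed offsets against end offsets per partition with the Kafka AdminClient, along these lines (a sketch; the group id and broker address are placeholders):

```java
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ListOffsetsResult;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class ConsumerLagCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // placeholder

        try (Admin admin = Admin.create(props)) {
            // Offsets the consumer group has committed, per partition
            Map<TopicPartition, OffsetAndMetadata> committed =
                    admin.listConsumerGroupOffsets("acc-consumer-group") // placeholder group id
                         .partitionsToOffsetAndMetadata().get();

            for (Map.Entry<TopicPartition, OffsetAndMetadata> e : committed.entrySet()) {
                // Latest offset on the broker for the same partition
                ListOffsetsResult.ListOffsetsResultInfo end = admin
                        .listOffsets(Map.of(e.getKey(), OffsetSpec.latest()))
                        .partitionResult(e.getKey()).get();
                System.out.printf("%s committed=%d end=%d lag=%d%n",
                        e.getKey(), e.getValue().offset(), end.offset(),
                        end.offset() - e.getValue().offset());
            }
        }
    }
}
```

Both environments report zero lag this way, which is why the missing message is confusing.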
v1.4.2
Thanks