<?xml-stylesheet type="text/xsl" href="https://community.appian.com/cfs-file/__key/system/syndication/rss.xsl" media="screen"?><rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:slash="http://purl.org/rss/1.0/modules/slash/" xmlns:wfw="http://wellformedweb.org/CommentAPI/"><channel><title>Kafka Tools</title><link>/b/appmarket/posts/kafka-tools</link><description>Overview 
 KafkaTools provides the following functionality for publishing to and consuming from topics on Kafka servers. 
 Smart Services 
 
 Publish To Kafka 
 Consume From Kafka 
 Consume From Kafka JWT Auth Grant 
 
 In order to process the</description><dc:language>en-US</dc:language><generator>Telligent Community 12</generator><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Fri, 27 Feb 2026 23:00:00 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>Appian AppMarket</dc:creator><slash:comments>0</slash:comments><description>&lt;div&gt;&lt;strong&gt;1.7.0 &lt;/strong&gt;&lt;strong&gt;Release Notes&lt;/strong&gt;&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;ul&gt;&lt;li&gt;&lt;strong&gt;New Features: &lt;/strong&gt;New Smart Service Consume From Kafka JWT Auth Grant. This Smart Service implements a Kafka Consumer that authenticates using the OAuth 2.0 JWT Bearer Authorization Grant as defined in RFC 7523.&lt;/li&gt;&lt;li&gt;&lt;strong&gt;Bug Fixes: &lt;/strong&gt;Ensure connection rollback upon encountering exception&lt;/li&gt;&lt;/ul&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;img src="https://community.appian.com/aggbug?PostID=3248&amp;AppID=50&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Thu, 18 Dec 2025 14:35:15 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>Patricia</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;Hi team,&lt;/p&gt;
&lt;p&gt;We need this plug-in to support AVRO Confluent Cloud Schema Registry for our use case. We&amp;rsquo;ve seen that in the latest release this functionality was added to the Consume From Kafka smart service via the avroSchemaRegistryUrl parameter.&lt;/p&gt;
&lt;p&gt;Is there any plan to add the same functionality to the Publish To Kafka smart service in the near future, so Schema Registry is supported for both consuming and publishing?&lt;/p&gt;
&lt;p&gt;Thanks in advance!&lt;/p&gt;&lt;img src="https://community.appian.com/aggbug?PostID=3248&amp;AppID=50&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Wed, 10 Dec 2025 18:16:35 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>mollyfarnam</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;Good afternoon -&amp;nbsp;&lt;/p&gt;
&lt;p&gt;We are frequently receiving this error from the Consumer process for this plugin:&amp;nbsp;&lt;/p&gt;
&lt;p&gt;The incoming tabular data stream (TDS) protocol stream is incorrect. The TDS headers contained errors.;&lt;/p&gt;
&lt;p&gt;It just errors out but then everything picks up just fine on the next run. Our process runs every 10 minutes and we get 10-15 runs each day where this error pops up. Please advise how we can fix this error, thanks!&lt;/p&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Thu, 04 Dec 2025 23:00:00 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>Appian AppMarket</dc:creator><slash:comments>0</slash:comments><description>&lt;div&gt;&lt;strong&gt;v1.6.0 Release Notes&lt;/strong&gt;&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;ul&gt;&lt;li&gt;avroSchemaRegistryUrl: URL for the API endpoint for the AVRO schema registry. If a certificate is needed, it must be contained in the truststore and keystore configuration. If specified, all topics configured to decode (consume) must be in AVRO format (no mixing of topic formats). 
To receive mixed-format topics, two process model (PM) nodes need to be configured: one for String-based topics and one for Avro-based topics.&lt;/li&gt;&lt;/ul&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Fri, 07 Nov 2025 23:00:01 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>Appian AppMarket</dc:creator><slash:comments>0</slash:comments><description>&lt;div&gt;&lt;strong&gt;v1.5.2 Release Notes&lt;/strong&gt;&lt;/div&gt;&lt;div&gt;&lt;br /&gt;&lt;/div&gt;&lt;ul&gt;&lt;li&gt;Support a new parameter `scope` on both Smart Services to enable OAuth with IdPs that require a scope parameter to be provided (e.g. EntraID).&lt;/li&gt;&lt;li&gt;Updated kafka-clients library.&lt;/li&gt;&lt;/ul&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Fri, 12 Sep 2025 10:24:10 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>gokulnath.sudharsun</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;Hello everyone! I am Gokulnath,&amp;nbsp;PM for the Native Kafka Integration initiative in Appian. The beta for the native event consumption capability is now live until December of this year, so I encourage you to try out this capability and share feedback. 
Please follow the instructions in this community link to participate in the beta:&amp;nbsp;[mention:33c7337cb9784429bcdbf4105df90226:f7d226abd59f475c9d224a79e3f0ec07]&amp;nbsp;Happy to answer any questions, thank you.&lt;/p&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Wed, 20 Aug 2025 06:22:56 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>klaaspieterb0001</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;Are there plans to support non-Text values like Avro messages?&amp;nbsp;&lt;/p&gt;
&lt;p&gt;One option is to define a custom (De)serializer class, but this does not seem to be used when you define it. The code seems to assume the value is always a String.&amp;nbsp;Is a custom Deserializer class with non-text values really supported?&amp;nbsp;&amp;nbsp;&lt;/p&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Fri, 01 Aug 2025 08:02:59 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>sundarrajj0474</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;[mention:92ac9d4559ff43edb28eaa01ed77691b:e9ed411860ed4f2ba0265705b8793d05]&amp;nbsp;, in the plugin&amp;nbsp;Third-Party Credentials the&amp;nbsp;&lt;strong&gt;Scope&lt;/strong&gt; field is missing / not possible to configure; because of that, token generation is failing. Is it possible to upgrade the plugin with a Scope field?&lt;/p&gt;
&lt;p&gt;&lt;img src="/resized-image/__size/640x480/__key/commentfiles/f7d226abd59f475c9d224a79e3f0ec07-b2fbd503-9213-405c-a0a3-ba2d2441eda2/pastedimage1754035146557v1.png" alt=" " /&gt;&lt;/p&gt;
&lt;p&gt;&lt;img src="/resized-image/__size/640x480/__key/commentfiles/f7d226abd59f475c9d224a79e3f0ec07-b2fbd503-9213-405c-a0a3-ba2d2441eda2/pastedimage1754035309147v2.png" alt=" " /&gt;&lt;/p&gt;
&lt;p&gt;&lt;img src="/resized-image/__size/640x480/__key/commentfiles/f7d226abd59f475c9d224a79e3f0ec07-b2fbd503-9213-405c-a0a3-ba2d2441eda2/pastedimage1754035329368v3.png" alt=" " /&gt;&lt;/p&gt;&lt;img src="https://community.appian.com/aggbug?PostID=3248&amp;AppID=50&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Wed, 28 May 2025 20:09:46 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>mollyfarnam</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;We are able to successfully produce messages to Kafka, but have need for the message to use a Json Serializer. Is it possible to add the ability to use a different serializer than just String serializer when producing messages?&lt;/p&gt;&lt;img src="https://community.appian.com/aggbug?PostID=3248&amp;AppID=50&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Wed, 21 May 2025 07:55:53 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>mirunas306530</dc:creator><slash:comments>1</slash:comments><description>&lt;p&gt;We have an issue consuming messages from one topic.&lt;/p&gt;
&lt;p&gt;We use 2 consumers with the same configuration (test and acc) to consume from the same topic. TEST consumed 2 messages and ACC consumed 3 messages. Both consumers are alive on the topic and have lag 0. Any ideas what could have gone wrong? Also, are there any logs where I could check what was consumed?&lt;/p&gt;
&lt;p&gt;We&amp;#39;ve been using this plugin successfully for the past 2 years to consume from other topics, but this has happened twice in the past month with a new topic.&lt;/p&gt;
&lt;p&gt;v1.4.2&lt;/p&gt;
&lt;p&gt;Thanks&lt;/p&gt;&lt;img src="https://community.appian.com/aggbug?PostID=3248&amp;AppID=50&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Fri, 04 Apr 2025 14:54:01 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>joeys987</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;We successfully integrated the Kafka tool in our dev environment. We&amp;#39;re now looking to integrate it in our other environments. The Kafka tool requires the setup of third-party credentials. We use Salt Stack to set up all our environments. Is it possible to set up the Kafka Tool third party credentials using Salt? Thanks&lt;/p&gt;&lt;img src="https://community.appian.com/aggbug?PostID=3248&amp;AppID=50&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Thu, 13 Feb 2025 23:00:00 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>Appian AppMarket</dc:creator><slash:comments>0</slash:comments><description>&lt;div&gt;&lt;strong&gt;v1.4.3 Release Notes&lt;/strong&gt;&lt;/div&gt;&lt;ul&gt;&lt;li&gt;Upgraded kafka-clients library&lt;/li&gt;&lt;/ul&gt;&lt;img src="https://community.appian.com/aggbug?PostID=3248&amp;AppID=50&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Mon, 13 Jan 2025 19:50:08 GMT</pubDate><guid 
isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>Shanmukha</dc:creator><slash:comments>1</slash:comments><description>&lt;p&gt;Hello,&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;br /&gt;We are using version 1.2.1 of the plugin. Recently our Appian Cloud environment was updated to version 24.4. Since then it has been throwing the error below.&lt;/p&gt;
&lt;p&gt;&lt;span&gt; ERROR com.appiancorp.cs.plugin.kafka.consumer.AppianKafkaConsumerThread - Received an exception from consumer: 1177940 - SSL handshake failed&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Could anyone please help with this urgently?&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Thank you in advance!&lt;/span&gt;&lt;/p&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Tue, 20 Aug 2024 09:40:35 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>Jones Prakash Selvaraj</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;Hi,&lt;/p&gt;
&lt;p&gt;We are currently using the 1.4.1 version of this plugin, and it&amp;#39;s working as expected. We are planning on upgrading a separate Kafka cluster from 2.6.2 to 2.8.2.tiered version (&lt;a href="https://docs.aws.amazon.com/msk/latest/developerguide/supported-kafka-versions.html"&gt;Supported Apache Kafka versions - Amazon Managed Streaming for Apache Kafka&lt;/a&gt;). Do you have information on the compatible versions of the Kafka components that support this plugin?&lt;/p&gt;&lt;img src="https://community.appian.com/aggbug?PostID=3248&amp;AppID=50&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>RE: Kafka Tools</title><link>https://community.appian.com/b/appmarket/posts/kafka-tools</link><pubDate>Tue, 30 Jul 2024 15:02:32 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:b2fbd503-9213-405c-a0a3-ba2d2441eda2</guid><dc:creator>JJ Ca&amp;#241;as</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;Hi,&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Does anyone know how this plug-in works? I mean, &lt;span&gt;in what format does it publish messages to Kafka, and how does it save the data in the database?&amp;nbsp;&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;As my colleague Patricia says below, we need to use Schema Registry to serialize and deserialize messages in AVRO format. Can this be done with this plug-in? &lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;I&amp;#39;ve seen in the documentation that the Consume from Kafka Smart Service has an input parameter called &amp;quot;Deserializer Class Name&amp;quot;, but the Publish to Kafka Smart Service doesn&amp;#39;t have any input parameter related to serialization.&amp;nbsp;Does this mean that the plugin serializes the messages in some way? You can provide the Java class to use for deserialization, but not specify how it should serialize the messages to publish to Kafka? This is really confusing.&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Thanks in advance.&amp;nbsp;&lt;/span&gt;&lt;/p&gt;</description></item></channel></rss>