Overview
KafkaTools provides functionality for publishing messages to, and consuming messages from, topics on Kafka servers.
Smart Services
To process the messages consumed from Kafka, it is recommended to use the Transaction Manager application. The process models designed to process messages are configured and assigned through Transaction Manager job types. See the Transaction Manager documentation for details.
Key Features & Functionality
Please refer to the README for additional details.
How do I set up "Data Source Name" and "Transaction Table Name"?
For "Data Source Name":
- I have created a dummy table along with a data type, a data store, and a record type. Since the input type is Text, I tried entering the table name, the data store name, and even the record type name, but none of them worked.
For "Transaction Table Name":
- Since it defaults to "tm_job_transaction", I tried leaving that value and also tried using the table I created; neither of them worked.
When I run the process model, it reports success; however, nothing is received: no messages, and no table gets updated.
Hi gavins5922,
You have to enter the JDBC data source name of your database as Text, not the table name. Example: "jdbc/Appian" (if you have no other data source configured, this is the default data source).
I haven't tried a custom table; I use the "tm_job_transaction" table. Does the table you are setting up have the same columns as tm_job_transaction? If it has the same structure, it should work.
Hope this helps you.
Best regards
Hi,
For the "Data Source Name", it's indeed the name of your business data source (the same name from Data source in Admin Console).
For the "Transaction Table Name", if you use a custom table instead of the tm_job_transaction, your table must have the same structure.
Oracle example:
ID (Number)
JOB_TYPE_ID (Number) - controlled by a sequence for auto-increment
CONTEXT_JSON (varchar2)
SCHEDULED_DATE (Timestamp)
STATUS_ID (Number)
TOPIC (varchar2)
PARTITION (Number)
OFFSET (Number)
KEY (Varchar2)
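The column list above can be sketched as Oracle DDL. This is only an illustration: the table name, sequence name, and VARCHAR2 sizes are assumptions, and the sequence is attached to JOB_TYPE_ID as the post describes. Names that may collide with Oracle keywords are quoted defensively.

```sql
-- Hypothetical custom transaction table mirroring tm_job_transaction.
-- Table name, sequence name, and column sizes are placeholders.
CREATE TABLE KT_JOB_TRANSACTION (
  ID             NUMBER NOT NULL PRIMARY KEY,
  JOB_TYPE_ID    NUMBER,          -- populated via the sequence below, per the post
  CONTEXT_JSON   VARCHAR2(4000),
  SCHEDULED_DATE TIMESTAMP,
  STATUS_ID      NUMBER,
  TOPIC          VARCHAR2(255),
  "PARTITION"    NUMBER,          -- quoted: may collide with an Oracle keyword
  "OFFSET"       NUMBER,          -- quoted: may collide with an Oracle keyword
  KEY            VARCHAR2(255)
);

-- Sequence providing the auto-increment value mentioned in the column notes.
CREATE SEQUENCE KT_JOB_TRANSACTION_SEQ START WITH 1 INCREMENT BY 1;
```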
And yes, we have successfully retrieved Kafka messages with this plug-in.
Regards
Hi Miguel Galán,
Much appreciated for the help! Yes, after I updated the value to "jdbc/Appian", at least it gives me an error message now.
Right now it says "Failed to construct kafka consumer".
Did you successfully subscribe to Kafka via this plug-in?
Thank you so much!
Best Regards