Consume Kafka service logs

Certified Associate Developer

Hi Team,

We are using the Kafka Tools plugin for the Consume Kafka and Publish Kafka smart services. The Publish Kafka service works without any issues, but we have the following questions about the Consume Kafka service:

  1. The Consume Kafka service returns isSuccess as true, but no message was actually received from the Kafka server. Which log should we check to find the reason?
  2. What is the difference between Group ID and Topics? Do we have to pass the same value for both?
  3. Is the Deserializer Class Name field mandatory? If we pass "io.confluent.kafka.serializers.json.KafkaJsonSchemaDeserializer" as the deserializer, it throws an error like: io.confluent.kafka.serializers.json.KafkaJsonSchemaDeserializer not found by com.appiancorp.cs.plugin.kafka [50]
  4. In the process model, why is the TM_Transaction variable not populated with a value even though the Consume Kafka service reports success?


  • Hi ,

    1. If you are using an on-premise server, you can view the Kafka logs here: <APPIAN_HOME>/services/data/kafka-logs
    If it's a cloud server, look into the server logs instead.

    2. A Topic is a named stream of records, whereas the Group ID is a unique string that identifies a consumer group. They are independent settings: you subscribe to one or more topics, and the Group ID only names the group of consumers sharing the work, so the two values do not need to match.

    3. Kafka consumers receive data from brokers as byte arrays. A deserializer is essential to convert these bytes back into a usable format, such as a string or a complex object. The "not found by com.appiancorp.cs.plugin.kafka" error indicates that the Confluent deserializer class is not on the plugin's classpath; you can only reference deserializer classes that are bundled with the plugin.

    4. As far as I know, the TM_Transaction variable in your process model is not being populated with the consumed message data because the Kafka Tools plugin is designed to write consumed messages directly to a database table, not to a process variable.
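To make the Topic vs. Group ID distinction in (2) concrete, here is a minimal sketch of a consumer configuration using the standard Kafka consumer property names; the broker address, topic, and group name are made up for illustration:

```python
# Sketch only: standard Kafka consumer property names, with
# illustrative (not real) broker, group, and topic values.
consumer_config = {
    "bootstrap.servers": "broker1:9092",   # where to connect
    "group.id": "appian-consumer-group",   # identifies the consumer group
}
subscribed_topics = ["payments.transactions"]  # named streams of records to read

# The group id and the topic list are independent settings: consumers
# sharing one group.id split a topic's partitions between them, so the
# two values serve different purposes and do not need to match.
assert consumer_config["group.id"] not in subscribed_topics
```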
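For (3), this is essentially what a JSON deserializer does with the raw bytes a consumer receives — plain Python, no Kafka client needed, and the payload is invented for illustration:

```python
import json

# Kafka delivers each record's value as raw bytes.
raw_value = b'{"transactionId": 42, "status": "POSTED"}'  # illustrative payload

# A JSON deserializer's job is essentially: bytes -> str -> object.
record = json.loads(raw_value.decode("utf-8"))

assert record["transactionId"] == 42
```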
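Regarding (4), if the plugin persists consumed messages to a database table, the process model reads them back with a query rather than from a process variable. A hedged sketch using an in-memory SQLite table — the table and column names here are hypothetical, not the plugin's actual schema:

```python
import sqlite3

# Hypothetical table/column names -- check the plugin docs for the real schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE kafka_consumed_messages ("
    "  id INTEGER PRIMARY KEY,"
    "  topic TEXT,"
    "  message_value TEXT)"
)

# Simulate a message the Consume Kafka service would have persisted.
conn.execute(
    "INSERT INTO kafka_consumed_messages (topic, message_value) VALUES (?, ?)",
    ("payments.transactions", '{"transactionId": 42}'),
)

# The process model (e.g. via a query rule) then reads it from the table.
rows = conn.execute(
    "SELECT message_value FROM kafka_consumed_messages WHERE topic = ?",
    ("payments.transactions",),
).fetchall()
assert rows[0][0] == '{"transactionId": 42}'
```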

    Hope it helps.

