Configuration Properties for the Kafka Clients

    The following table lists all configuration properties for the Kafka clients (see also http://kafka.apache.org/documentation/#consumerconfigs):

    clientId
    Description: Identifies your program to Kafka. It is used purely for naming.
    Value: Set the value as required. We recommend a meaningful name that identifies the purpose of the consumer, so that any problems you experience as a developer can be troubleshot easily. Do not use any personally identifiable information (PII) or anything that would leak sensitive information about your infrastructure.

    groupId
    Description: The Kafka consumer group of which your application is a part.
    Value: Provided by Mapp.

    endpoints
    Description: The Kafka endpoint(s) from which your application consumes data.
    Value: Provided by Mapp.

    topic
    Description: The Kafka topic from which your application consumes data.
    Value: Provided by Mapp.

    autoOffsetResetPolicy
    Description: The policy the consumer adopts when there is no committed offset (its position in the stream) to resume from.
    Value: Possible values:
    • earliest = Recommended. Your consumer will not miss any data (but it is more likely to process some records more than once).
    • latest = Your consumer starts with the most recent data on the stream (but may skip data that was already there).
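    As a sketch, the connection settings above map onto the standard Kafka property names (client.id, group.id, bootstrap.servers, auto.offset.reset). All concrete values below are illustrative placeholders, not Mapp-issued ones:

```java
import java.util.Properties;

/** Minimal sketch of the basic consumer configuration.
    Every value here is a placeholder; use the values Mapp provides. */
public class BasicConsumerConfig {
    public static Properties build() {
        Properties props = new Properties();
        props.put("client.id", "my-team-export-consumer");        // clientId: naming only
        props.put("group.id", "mapp-provided-group-id");          // groupId: provided by Mapp
        props.put("bootstrap.servers", "kafka.example.com:9093"); // endpoints: provided by Mapp
        props.put("auto.offset.reset", "earliest");               // recommended: no missed data
        return props;
    }
}
```

    Note that the topic is not part of these properties; it is the argument you pass to the consumer's subscribe call.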

    securityProtocol
    Description: The security protocol enforced when connecting to our Kafka.
    Value: SASL_SSL

    securitySaslMechanism
    Description: The SASL mechanism used to perform authentication.
    Value: SCRAM-SHA-256
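    In practice these two settings are usually accompanied by a sasl.jaas.config entry carrying the SCRAM credentials. The JAAS line below uses Kafka's standard SCRAM login module; the username and password are placeholders for the credentials you receive:

```java
import java.util.Properties;

/** Sketch of the security-related settings. Credentials are placeholders. */
public class SecurityConfig {
    public static Properties build() {
        Properties props = new Properties();
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-256");
        // Standard Kafka SCRAM login module; replace the placeholder credentials.
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.scram.ScramLoginModule required "
            + "username=\"YOUR-USERNAME\" password=\"YOUR-PASSWORD\";");
        return props;
    }
}
```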

    schemaRegistryUrl
    Description: The URL of the Mapp schema registry.
    Value: Provided by Mapp.

    keyDeserializer / keySerde
    Description: The class that the Kafka plain consumer / Kafka Streams application uses to deserialize the record key.
    Value:
    • Kafka plain consumer: org.apache.kafka.common.serialization.ByteArrayDeserializer
    • Kafka Streams applications: org.apache.kafka.common.serialization.Serdes$ByteArraySerde

    valueDeserializer / valueSerde
    Description: The class that the Kafka plain consumer / Kafka Streams application uses to deserialize the record value.
    Value:
    • Kafka plain consumer, JSON topics: org.apache.kafka.common.serialization.StringDeserializer
    • Kafka plain consumer, AVRO topics: io.confluent.kafka.serializers.KafkaAvroDeserializer
    • Kafka Streams applications, JSON topics: org.apache.kafka.common.serialization.Serdes$StringSerde
    • Kafka Streams applications, AVRO topics: io.confluent.kafka.streams.serdes.avro.GenericAvroSerde
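    For the plain consumer, these choices can be sketched as two property sets, one per topic format. The schema registry URL shown is a placeholder for the value Mapp provides:

```java
import java.util.Properties;

/** Sketch: deserializer settings for the plain consumer. Pick the value
    deserializer matching the topic format (JSON vs. AVRO). */
public class DeserializerConfig {
    public static Properties forJsonTopic() {
        Properties props = new Properties();
        props.put("key.deserializer",
            "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }

    public static Properties forAvroTopic() {
        Properties props = new Properties();
        props.put("key.deserializer",
            "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer",
            "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        // AVRO topics need the schema registry; placeholder URL, provided by Mapp.
        props.put("schema.registry.url", "https://schema-registry.example.com");
        return props;
    }
}
```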

    valueSubjectNameStrategy
    Description: The class the consumer uses to determine where to get the name of the AVRO schema from.
    Value:
    • When consuming a Root Stream: io.confluent.kafka.serializers.subject.RecordNameStrategy
    • When consuming a Custom Stream: io.confluent.kafka.serializers.subject.TopicNameStrategy
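    This corresponds to the value.subject.name.strategy property on the consumer, as in this sketch:

```java
import java.util.Properties;

/** Sketch: subject name strategy selection per stream type. */
public class SubjectNameStrategyConfig {
    public static Properties forRootStream() {
        Properties props = new Properties();
        props.put("value.subject.name.strategy",
            "io.confluent.kafka.serializers.subject.RecordNameStrategy");
        return props;
    }

    public static Properties forCustomStream() {
        Properties props = new Properties();
        props.put("value.subject.name.strategy",
            "io.confluent.kafka.serializers.subject.TopicNameStrategy");
        return props;
    }
}
```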

    streamsNumOfThreads
    Description: The number of threads your Kafka Streams application uses for processing.
    Value: We recommend at least 3.
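    Putting the Streams-side properties together, a minimal sketch for an application consuming an AVRO topic might look as follows. All concrete values are placeholders; note that in Kafka Streams the application.id also serves as the consumer group ID:

```java
import java.util.Properties;

/** Sketch: Kafka Streams configuration for an AVRO topic.
    Every value is a placeholder; use the values Mapp provides. */
public class StreamsConfigSketch {
    public static Properties build() {
        Properties props = new Properties();
        props.put("application.id", "mapp-provided-group-id");    // doubles as the consumer group
        props.put("bootstrap.servers", "kafka.example.com:9093"); // endpoints: provided by Mapp
        props.put("default.key.serde",
            "org.apache.kafka.common.serialization.Serdes$ByteArraySerde");
        props.put("default.value.serde",
            "io.confluent.kafka.streams.serdes.avro.GenericAvroSerde");
        props.put("schema.registry.url", "https://schema-registry.example.com"); // provided by Mapp
        props.put("num.stream.threads", "3"); // streamsNumOfThreads: at least 3 recommended
        return props;
    }
}
```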

