github confluentinc/confluent-kafka-python v1.4.0


Confluent's Python client for Apache Kafka

v1.4.0 is a feature release:

  • KIP-98: Transactional Producer API
  • KIP-345: Static consumer group membership (by @rnpridgeon)
  • KIP-511: Report client software name and version to broker
  • Generic Serde API (experimental)
  • New AvroSerializer and AvroDeserializer implementations including configurable subject name strategies.
  • JSON Schema support (For Schema Registry)
  • Protobuf support (For Schema Registry)

confluent-kafka-python is based on librdkafka v1.4.0, see the librdkafka v1.4.0 release notes for a complete list of changes, enhancements, fixes and upgrade considerations.

Transactional Producer API

Release v1.4.0 of confluent-kafka-python completes the client's Exactly-Once Semantics (EOS) functionality: it combines the idempotent producer (available since v1.0.0) and the transaction-aware consumer (since v1.2.0) with full producer transaction support (new in v1.4.0).

This enables developers to create Exactly-Once applications with Apache Kafka.

See the Transactions in Apache Kafka page for an introduction and check the transactions example.
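
The transactional flow described above can be sketched as follows. The broker address, transactional.id, and topic name are illustrative placeholders, and error handling is reduced to the minimum; see the transactions example for a complete treatment.

```python
# Sketch of a transactional produce loop. The broker address,
# transactional.id and topic name below are placeholders.
TXN_CONFIG = {
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "transactional.id": "my-app-txn-1",     # must be stable and unique per producer
}

def produce_atomically(messages):
    # Imported here so this sketch can be loaded even where
    # confluent-kafka is not installed.
    from confluent_kafka import Producer

    producer = Producer(TXN_CONFIG)
    producer.init_transactions()   # register the transactional.id, fence zombies
    producer.begin_transaction()
    try:
        for key, value in messages:
            producer.produce("output-topic", key=key, value=value)
        producer.commit_transaction()  # all messages become visible atomically
    except Exception:
        producer.abort_transaction()   # none of the messages become visible
        raise
```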

Generic Serializer API

Release v1.4.0 introduces a new, experimental API that adds serialization capabilities to the Kafka Producer and Consumer. Key and value serializers/deserializers can now be configured independently on the Producer and Consumer. Previously, all serialization had to be handled before calling Producer.produce and after Consumer.poll.
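
A minimal sketch of configuring key and value serializers independently is shown below. The broker address and topic are placeholders, and the imports are deferred so the snippet loads even without the client installed.

```python
def make_producer():
    # Deferred imports: lets this sketch be loaded without confluent-kafka.
    from confluent_kafka import SerializingProducer
    from confluent_kafka.serialization import IntegerSerializer, StringSerializer

    # Key and value serializers are configured independently; produce()
    # can then be called with plain Python objects.
    return SerializingProducer({
        "bootstrap.servers": "localhost:9092",   # assumed broker address
        "key.serializer": IntegerSerializer(),
        "value.serializer": StringSerializer("utf_8"),
    })

# Usage (requires a running broker):
#   producer = make_producer()
#   producer.produce("events", key=42, value="hello")
#   producer.flush()
```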

This release ships with three built-in, Java-compatible, standard serializer and deserializer classes:

| Name    | Type    | Format            |
|---------|---------|-------------------|
| Double  | float   | IEEE 754 binary64 |
| Integer | int     | int32             |
| String  | Unicode | bytes*            |

* The StringSerializer codec is configurable and supports any of Python's standard encodings. If left unspecified, 'UTF-8' is used.
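
Because these classes are Java-compatible, their wire formats can be illustrated with the standard library alone. The helper names below are hypothetical; the actual classes live in confluent_kafka.serialization.

```python
import struct

def integer_wire_format(value: int) -> bytes:
    # Java-compatible int32: 4 bytes, big-endian.
    return struct.pack(">i", value)

def double_wire_format(value: float) -> bytes:
    # Java-compatible IEEE 754 binary64: 8 bytes, big-endian.
    return struct.pack(">d", value)

def string_wire_format(value: str, codec: str = "utf_8") -> bytes:
    # Encoded bytes; UTF-8 unless another codec is configured.
    return value.encode(codec)

print(integer_wire_format(1).hex())  # 00000001
```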

Additional serialization implementations are possible through the extension of the Serializer and Deserializer base classes.

See avro_producer.py and avro_consumer.py for example usage.
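
A custom implementation can be as small as one method. The JSON serializer below is a hypothetical sketch; it falls back to a plain class when confluent-kafka is not installed so the encoding logic can be exercised on its own.

```python
import json

try:
    from confluent_kafka.serialization import Serializer
except ImportError:
    Serializer = object  # stdlib-only fallback for this sketch

class JSONSerializer(Serializer):
    """Hypothetical serializer: any JSON-compatible object -> UTF-8 bytes."""

    def __call__(self, obj, ctx=None):
        if obj is None:
            return None  # pass tombstones / absent keys through untouched
        return json.dumps(obj).encode("utf-8")
```

Configured as a value.serializer, a class like this would let produce() accept dicts directly.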

Avro, Protobuf and JSON Schema Serializers

Release v1.4.0 of confluent-kafka-python adds support for two new Schema Registry serialization formats through the Generic Serialization API: JSON Schema and Protobuf. A new set of Avro serialization classes has also been added to conform to the new API.

| Format   | Serializer Example   | Deserializer Example |
|----------|----------------------|----------------------|
| Avro     | avro_producer.py     | avro_consumer.py     |
| JSON     | json_producer.py     | json_consumer.py     |
| Protobuf | protobuf_producer.py | protobuf_consumer.py |
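
A condensed version of the Avro flow might look like the following. The Schema Registry URL, broker address, topic, and record schema are all illustrative, and keyword arguments are used for AvroSerializer to stay independent of its positional parameter order.

```python
import json

# Illustrative Avro record schema.
USER_SCHEMA = json.dumps({
    "type": "record",
    "name": "User",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "favorite_number", "type": "long"},
    ],
})

def produce_user(name, favorite_number):
    # Deferred imports: lets this sketch be loaded without confluent-kafka.
    from confluent_kafka import SerializingProducer
    from confluent_kafka.schema_registry import SchemaRegistryClient
    from confluent_kafka.schema_registry.avro import AvroSerializer

    # Assumed Schema Registry and broker addresses.
    sr_client = SchemaRegistryClient({"url": "http://localhost:8081"})
    producer = SerializingProducer({
        "bootstrap.servers": "localhost:9092",
        "value.serializer": AvroSerializer(
            schema_str=USER_SCHEMA, schema_registry_client=sr_client),
    })
    producer.produce("users", value={"name": name,
                                     "favorite_number": favorite_number})
    producer.flush()
```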

Security fixes

Two security issues have been identified in the SASL SCRAM protocol handler:

  • The client nonce, which is expected to be a random string, was a static string.
  • If sasl.username or sasl.password contained characters that needed escaping, a buffer overflow and heap corruption would occur. An assertion guarded against this, but it triggered too late to prevent the corruption.

Both of these issues are fixed in this release.

Enhancements

  • Bump OpenSSL to v1.0.2u
  • Bump monitoring-interceptors to v0.11.3

Fixes

Also see the librdkafka v1.4.0 release notes for fixes to the underlying client implementation.
