confluentinc/confluent-kafka-python v0.11.4


Simplified installation

This release adds binary wheels containing all required dependencies (librdkafka, OpenSSL, zlib, etc.) for Linux and macOS.
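
With the wheels in place, a plain pip install is all that is needed on supported platforms:

    pip install confluent-kafka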

Should these wheels not work on your platform, please file an issue outlining what is failing, and fall back to the previous method: install librdkafka manually, then run pip install --no-binary :all: confluent-kafka

Message header support

Support for Kafka message headers has been added (requires broker version >= v0.11.0).

When producing messages, simply provide a list of (key, value) tuples as the headers= argument:

    myproducer.produce(topic, 'A message payload',
                       headers=[('hdr1', 'val1'),
                                ('another', 'one'),
                                ('hdr1', 'duplicates are supported and ordering is retained')])

Message headers are returned as a list of tuples for consumed messages:

    msg = myconsumer.poll(1)
    if msg is not None and not msg.error():
        headers = msg.headers()
        if headers is not None:
            # convert to dict, collapsing duplicate header keys
            headers_dict = dict(headers)

Enhancements

  • Message header support (@johnistan)
  • Added Consumer.seek() (example below)
  • Added Consumer.pause()/resume() support (closes #120, @dangra) (example below)
  • Added Consumer.store_offsets() API (#245, @ctrochalakis) (example below)
  • Support for passing librdkafka logs to the standard logging module (see the logger kwarg in the constructors) (#148) (example below)
  • Enable produce.offset.report by default (#266) (#267)
  • Expose offsets_for_times() consumer method (closes #224) (#268, @johnistan) (example below)
  • Add batch consume() API (closes #252, #282, @tburmeister) (example below)
  • Add hash func for UnionSchema (#228, @fyndiq)
  • Use schemaless reader to handle complex schema (#251, @fpietka)
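
Consumer.seek() and pause()/resume() operate on TopicPartition objects. A minimal sketch, assuming a broker on localhost and a hypothetical topic 'mytopic'; seek() only works on partitions that are actively being fetched, hence the initial poll():

    from confluent_kafka import Consumer, TopicPartition

    c = Consumer({'bootstrap.servers': 'localhost:9092',  # assumed broker address
                  'group.id': 'example-group'})
    c.subscribe(['mytopic'])
    c.poll(1)  # wait for the partition assignment to become active

    # rewind partition 0 to offset 10
    c.seek(TopicPartition('mytopic', 0, 10))

    # temporarily stop fetching from partition 0, then resume it
    c.pause([TopicPartition('mytopic', 0)])
    c.resume([TopicPartition('mytopic', 0)])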
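
Consumer.store_offsets() is meant to be combined with enable.auto.offset.store=False so that only offsets of fully processed messages are stored for the next auto commit. A minimal sketch; process() stands in for a hypothetical application function:

    from confluent_kafka import Consumer

    c = Consumer({'bootstrap.servers': 'localhost:9092',
                  'group.id': 'example-group',
                  'enable.auto.offset.store': False})  # store offsets manually
    c.subscribe(['mytopic'])

    while True:
        msg = c.poll(1.0)
        if msg is None or msg.error():
            continue
        process(msg)                  # hypothetical processing step
        c.store_offsets(message=msg)  # picked up by the next (auto) commit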
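
The logger kwarg takes a standard logging.Logger instance; librdkafka log messages are then forwarded to it instead of being printed to stderr. A minimal sketch:

    import logging
    from confluent_kafka import Consumer

    logger = logging.getLogger('kafka.consumer')
    logger.addHandler(logging.StreamHandler())
    logger.setLevel(logging.DEBUG)

    c = Consumer({'bootstrap.servers': 'localhost:9092',
                  'group.id': 'example-group'},
                 logger=logger)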
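
offsets_for_times() takes TopicPartition objects whose offset field holds a timestamp in milliseconds and returns, per partition, the earliest offset whose timestamp is greater than or equal to it. A minimal sketch:

    import time
    from confluent_kafka import Consumer, TopicPartition

    c = Consumer({'bootstrap.servers': 'localhost:9092',
                  'group.id': 'example-group'})

    # look up the offset in partition 0 corresponding to "one hour ago"
    one_hour_ago_ms = int((time.time() - 3600) * 1000)
    offsets = c.offsets_for_times([TopicPartition('mytopic', 0, one_hour_ago_ms)],
                                  timeout=10.0)
    print(offsets)  # TopicPartition list with .offset set to the looked-up offsets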
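
consume() returns a batch of up to num_messages messages in a single call instead of one message per poll(). A minimal sketch:

    from confluent_kafka import Consumer

    c = Consumer({'bootstrap.servers': 'localhost:9092',
                  'group.id': 'example-group'})
    c.subscribe(['mytopic'])

    # fetch up to 100 messages, waiting at most 1 second
    msgs = c.consume(num_messages=100, timeout=1.0)
    for msg in msgs:
        if not msg.error():
            print(msg.value())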

Fixes

  • Fix librdkafka install command for macOS (#281, @vkroz)
  • Constructors now support both dict and kwargs
  • Add __version__ to __init__.py (@mrocklin)
  • Fix message leak/loss when an exception was raised from a callback triggered by poll()
  • Make Consumer.commit(.., asynchronous=False) return the offset commit results (example below)
  • Raise a runtime error if the consumer is accessed after close (#262, @johnistan)
  • Pass py.test arguments from tox (@ctrochalakis)
  • Rename the async kwarg to asynchronous (async will continue to work until the 1.0 API bump)
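
With asynchronous=False, commit() blocks until the offsets have been committed and returns them as a list of TopicPartition (a failed commit raises a KafkaException). A minimal sketch, reusing the myconsumer instance from the header example above:

    # synchronous commit: returns the committed offsets,
    # with any per-partition error in tp.error
    committed = myconsumer.commit(asynchronous=False)
    for tp in committed:
        print(tp.topic, tp.partition, tp.offset, tp.error)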
