Releases: confluentinc/confluent-kafka-python

v2.12.0

10 Oct 00:12
3a25bfa

confluent-kafka-python v2.12.0

v2.12.0 is a feature release with the following enhancements:

KIP-848 – General Availability

Starting with confluent-kafka-python 2.12.0, the next generation consumer group rebalance protocol defined in KIP-848 is production-ready. Refer to the migration guide for moving from the classic protocol to the new consumer protocol.

Note: The new consumer group protocol defined in KIP-848 is not enabled by default. The new protocol introduces a few contract changes that may be breaking. The group.protocol configuration property dictates whether to use the new consumer protocol or the older classic protocol; it defaults to classic if not provided.
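As a minimal sketch, opting a consumer into the new protocol is a one-line configuration change. The broker address, group id, and topic below are placeholders:

```python
# Minimal consumer configuration sketch for the KIP-848 "consumer" protocol.
# "localhost:9092" and "demo-group" are placeholder values.
conf = {
    "bootstrap.servers": "localhost:9092",
    "group.id": "demo-group",
    # "consumer" selects the next-generation KIP-848 protocol; omitting this
    # (or setting "classic") keeps the old rebalance behavior.
    "group.protocol": "consumer",
}

# from confluent_kafka import Consumer   # requires a reachable broker
# consumer = Consumer(conf)
# consumer.subscribe(["demo-topic"])
```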

AsyncIO Producer (experimental)

Introduces the beta class AIOProducer for asynchronous message production in asyncio applications.

Added

  • AsyncIO Producer (experimental): Introduces beta class AIOProducer for
    asynchronous message production in asyncio applications. This API offloads
    blocking librdkafka calls to a thread pool and schedules common callbacks
    (error_cb, throttle_cb, stats_cb, oauth_cb, logger) onto the event
    loop for safe usage inside async frameworks.

Features

  • Batched async produce: await AIOProducer(...).produce(topic, value=...)
    buffers messages and flushes when the buffer threshold or timeout is reached.
  • Async lifecycle: await producer.flush(), await producer.purge(), and
    transactional operations (init_transactions, begin_transaction,
    commit_transaction, abort_transaction).
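The calls listed above can be sketched as follows. This is a hedged sketch, not a definitive usage pattern: the import path is an assumption (check your installed version's documentation), and the broker address and topic are placeholders.

```python
import asyncio

# Hypothetical import path for the experimental class; verify against your
# installed version's documentation before use.
# from confluent_kafka.aio import AIOProducer

async def produce_one(producer_cls, bootstrap):
    # producer_cls stands in for AIOProducer; bootstrap is a placeholder address.
    producer = producer_cls({"bootstrap.servers": bootstrap})
    # Batched async produce: messages are buffered and flushed when the
    # buffer threshold or timeout is reached.
    await producer.produce("demo-topic", value=b"hello")
    # Async lifecycle: flush any outstanding messages before shutdown.
    await producer.flush()
    return producer

# asyncio.run(produce_one(AIOProducer, "localhost:9092"))  # needs a broker
```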

Limitations

  • Per-message headers are not supported in the current batched async produce
    path. If headers are required, use the synchronous Producer.produce(...) or
    offload a sync produce call to a thread executor within your async app.
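A sketch of the thread-executor workaround described above, assuming a synchronous confluent_kafka.Producer created elsewhere; the topic, value, and header contents are placeholders:

```python
import asyncio

def produce_with_headers(producer, topic, value, headers):
    # producer is a synchronous confluent_kafka.Producer created elsewhere.
    producer.produce(topic, value=value, headers=headers)
    producer.poll(0)  # serve delivery callbacks without blocking

async def send(producer):
    # Run the blocking sync produce in the default thread executor so the
    # event loop stays responsive.
    loop = asyncio.get_running_loop()
    await loop.run_in_executor(
        None,
        produce_with_headers,
        producer, "demo-topic", b"payload", [("trace-id", b"abc123")],
    )
```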

Guidance

  • Use the AsyncIO Producer inside async apps/servers (FastAPI/Starlette, aiohttp,
    asyncio tasks) to avoid blocking the event loop.
  • For batch jobs, scripts, or highest-throughput pipelines without an event
    loop, the synchronous Producer remains recommended.

Enhancements and Fixes

  • Kafka OAuth/OIDC metadata-based authentication examples with Azure IMDS (#2083).

confluent-kafka-python v2.12.0 is based on librdkafka v2.12.0, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.

v2.11.1

18 Aug 21:31
866b970

confluent-kafka-python v2.11.1

v2.11.1 is a maintenance release.

confluent-kafka-python v2.11.1 is based on librdkafka v2.11.1, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.

v2.11.0

03 Jul 17:53
0d81d29

confluent-kafka-python v2.11.0

v2.11.0 is a feature release.

confluent-kafka-python v2.11.0 is based on librdkafka v2.11.0, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.

v2.10.1

11 Jun 14:04
0926dd6

confluent-kafka-python v2.10.1

v2.10.1 is a maintenance release with the following fixes:

  • Handled None value for optional ctx parameter in ProtobufDeserializer (#1939)
  • Handled None value for optional ctx parameter in AvroDeserializer (#1973)

confluent-kafka-python v2.10.1 is based on librdkafka v2.10.1, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.

v2.10.0

17 Apr 20:39
2dd7b99

confluent-kafka-python v2.10.0

v2.10.0 is a feature release with the following fixes and enhancements:

  • [KIP-848] Group Config is now supported in AlterConfigs, IncrementalAlterConfigs and DescribeConfigs. (#1856)
  • [KIP-848] describe_consumer_groups() now supports consumer groups introduced by KIP-848. Two new fields, consumer group type and target assignment, have also been added. Type defines whether the group is a classic or consumer group. Target assignment is only valid for the consumer protocol and defaults to NULL. (#1873)
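A hedged sketch of reading the new group type field. The attribute name on the result object is an assumption based on the description above; the commented-out admin client usage requires a reachable broker, and the group id is a placeholder:

```python
def summarize_group_types(futures):
    # futures: dict mapping group id -> future, as returned by
    # AdminClient.describe_consumer_groups([...]); each result describes a
    # group whose new "type" field reports whether the group uses the
    # classic or the consumer protocol.
    return {gid: getattr(fut.result(), "type", None)
            for gid, fut in futures.items()}

# from confluent_kafka.admin import AdminClient
# admin = AdminClient({"bootstrap.servers": "localhost:9092"})
# print(summarize_group_types(admin.describe_consumer_groups(["demo-group"])))
```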

confluent-kafka-python v2.10.0 is based on librdkafka v2.10.0, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.

v2.9.0

28 Mar 22:11
9fe730c

confluent-kafka-python v2.9.0

v2.9.0 is a feature release with the following fixes and enhancements:

  • Add Client Credentials OAuth support for Schema Registry (#1919)
  • Add custom OAuth support for Schema Registry (#1925)

confluent-kafka-python v2.9.0 is based on librdkafka v2.8.0, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.
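A configuration sketch for the client-credentials OAuth flow with the Schema Registry client. All values are placeholders, and the exact property names are assumptions; consult the Schema Registry client documentation for your installed version:

```python
# Hedged Schema Registry OAuth client-credentials configuration sketch.
# Every value is a placeholder; property names are assumptions.
sr_conf = {
    "url": "https://psrc-xxxxx.us-east-1.aws.confluent.cloud",
    "bearer.auth.credentials.source": "OAUTHBEARER",
    "bearer.auth.client.id": "my-client-id",
    "bearer.auth.client.secret": "my-client-secret",
    "bearer.auth.scope": "schema_registry",
    "bearer.auth.issuer.endpoint.url": "https://login.example.com/oauth2/token",
    "bearer.auth.logical.cluster": "lsrc-xxxxx",
    "bearer.auth.identity.pool.id": "pool-xxxx",
}

# from confluent_kafka.schema_registry import SchemaRegistryClient
# client = SchemaRegistryClient(sr_conf)
```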

v2.8.2

01 Mar 03:18
d6a4e08

confluent-kafka-python v2.8.2

v2.8.2 is a maintenance release with the following fixes and enhancements:

  • Fixed caching to ensure cached schema matches input. (#1922)
  • Fix handling of named Avro schemas (#1928)

confluent-kafka-python v2.8.2 is based on librdkafka v2.8.0, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.

Note: Version v2.8.1 is skipped due to a breaking change in that release.
Do not run software with v2.8.1 installed.

v2.8.0

07 Jan 21:31
92c83e7

confluent-kafka-python v2.8.0

v2.8.0 is a feature release with the following features, fixes and enhancements:

  • Ensure algorithm query param is passed for CSFLE (#1889)
  • DGS-19492 Handle records nested in arrays/maps when searching for tags (#1890)

confluent-kafka-python v2.8.0 is based on librdkafka v2.8.0, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.

v2.7.0

21 Dec 00:44
4f25c8c

confluent-kafka-python v2.7.0

Note: As part of this release, we are deprecating the v2.6.2 release and yanking it from PyPI. Please refrain from using v2.6.2; use v2.7.0 instead.

Note: This release modifies the dependencies of the Schema Registry client.
If you are using the Schema Registry client, please ensure that you install the
extra dependencies using the following syntax:

pip install confluent-kafka[schemaregistry] 

or

pip install confluent-kafka[avro,schemaregistry] 

Please see the README.md for more information related to installing protobuf, jsonschema or rules dependencies.

v2.7.0 is a feature release with the following features, fixes and enhancements:

  • Support for Data Contracts with Schema Registry, including
    • Data Quality rules
    • Data Transformation rules
    • Client-Side Field Level Encryption (CSFLE)
    • Schema Migration rules (requires Python 3.9+)
  • Migrated the Schema Registry client from requests to httpx
  • Add support for multiple URLs (#409)
  • Allow configuring timeout (#622)
  • Fix deletion semantics (#1127)
  • Python deserializer can take SR client (#1174)
  • Fix handling of Avro unions (#1562)
  • Remove deprecated RefResolver for JSON (#1840)
  • Support delete of subject version (#1851)
  • Added missing dependency on googleapis-common-protos when using protobufs. (#1881, @Tenzer)

confluent-kafka-python v2.7.0 is based on librdkafka v2.6.1, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.

v2.6.2

18 Dec 00:47
51ab802

confluent-kafka-python v2.6.2

Warning

Due to an error in which dependency changes were included in a recent patch release, Confluent recommends that users refrain from upgrading to v2.6.2 of confluent-kafka-python. Confluent will release a new minor version, v2.7.0, in which the dependency changes will be appropriately included. Users who have already upgraded to v2.6.2 and made the required dependency changes may remain on that version, and are encouraged to upgrade to v2.7.0 when it is available. Upon the release of v2.7.0, v2.6.2 will be marked deprecated.
We apologize for the inconvenience and appreciate the feedback we have received from the community.

Note: This version is yanked from PyPI. Use 2.7.0 instead.

Note: This release modifies the dependencies of the Schema Registry client.
If you are using the Schema Registry client, please ensure that you install the
extra dependencies using the following syntax:

pip install confluent-kafka[schemaregistry] 

or

pip install confluent-kafka[avro,schemaregistry] 

Please see the README.md for more information related to installing protobuf, jsonschema or rules dependencies.

v2.6.2 is a feature release with the following features, fixes and enhancements:

  • Support for Data Contracts with Schema Registry, including
    • Data Quality rules
    • Data Transformation rules
    • Client-Side Field Level Encryption (CSFLE)
    • Schema Migration rules (requires Python 3.9+)
  • Migrated the Schema Registry client from requests to httpx
  • Add support for multiple URLs (#409)
  • Allow configuring timeout (#622)
  • Fix deletion semantics (#1127)
  • Python deserializer can take SR client (#1174)
  • Fix handling of Avro unions (#1562)
  • Remove deprecated RefResolver for JSON (#1840)
  • Support delete of subject version (#1851)

confluent-kafka-python is based on librdkafka v2.6.1, see the
librdkafka release notes
for a complete list of changes, enhancements, fixes and upgrade considerations.