Describe the bug
Trying to consume messages with AIOKafkaConsumer consistently raises a corrupted-message exception, which prevents any messages from being processed. The messages themselves are valid: they can be consumed successfully with the console consumer and with Java clients, and I also had success consuming them with the confluent-kafka Python client library. The application runs on s390x (zLinux), which uses big-endian byte order, and I suspect that this is what causes the CRC verification to fail.
Here's the error that comes up:
    result = await self._consumer.getmany(
  File "/lib/python3.9/site-packages/aiokafka/consumer/consumer.py", line 1172, in getmany
    records = await self._fetcher.fetched_records(
  File "/lib/python3.9/site-packages/aiokafka/consumer/fetcher.py", line 1072, in fetched_records
    records = res_or_error.getall(max_records)
  File "/lib/python3.9/site-packages/aiokafka/consumer/fetcher.py", line 134, in getall
    for msg in self._partition_records:
  File "/lib/python3.9/site-packages/aiokafka/consumer/fetcher.py", line 201, in __next__
    return next(self._records_iterator)
  File "/lib/python3.9/site-packages/aiokafka/consumer/fetcher.py", line 217, in _unpack_records
    raise Errors.CorruptRecordException(
kafka.errors.CorruptRecordException: [Error 2] CorruptRecordException: Invalid CRC - TopicPartition(topic='testtopic', partition=0)
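To illustrate the suspected failure mode (a minimal, hypothetical sketch, not aiokafka's actual code): any checksum computed over bytes serialized in native byte order differs between a little-endian x86-64 host and a big-endian s390x host, while explicit big-endian (network order) serialization produces identical bytes, and therefore identical checksums, everywhere. zlib.crc32 stands in here for the crc32c used by Kafka record batches.

import sys
import zlib  # crc32 used as a stand-in for Kafka's crc32c

value = 0x12345678

# Native-order bytes differ between x86-64 and s390x, so any CRC
# computed over them is platform-dependent.
native = value.to_bytes(4, sys.byteorder)

# Explicit big-endian (network order) bytes are the same on every host.
portable = value.to_bytes(4, "big")

print(sys.byteorder, hex(zlib.crc32(native)), hex(zlib.crc32(portable)))

If any step of the record decoding or CRC comparison relies on native byte order, it would work on little-endian hosts and fail exactly like this on s390x.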
Expected behaviour
Valid messages are consumed without exceptions.
Environment (please complete the following information):
aiokafka version (python -c "import aiokafka; print(aiokafka.__version__)"): 0.12.0
Kafka Broker version (kafka-topics.sh --version): 3.6.1
Linux kernel: 5.14.0-427.49.1.el9_4.s390x
Reproducible example
Running on a big-endian operating system:
import asyncio
import logging

from aiokafka import AIOKafkaConsumer
from aiokafka.helpers import create_ssl_context

logger = logging.getLogger(__name__)

async def consume(*topics):
    ssl_context = create_ssl_context(cafile='/path to ca file')
    consumer = AIOKafkaConsumer(
        *topics,
        client_id="client_id",
        group_id="group_id",
        sasl_plain_username="user",
        sasl_plain_password="pw",
        security_protocol="SASL_SSL",
        sasl_mechanism="SCRAM-SHA-512",
        ssl_context=ssl_context,
        bootstrap_servers="bootstrap_servers")
    await consumer.start()
    try:
        while True:
            result = await consumer.getmany(timeout_ms=10 * 1000)  # error raised here
            for tp, messages in result.items():
                if messages:
                    print('wow, consumed some messages')
                    print(messages)
    except Exception as e:
        logger.fatal("Cannot start Kafka consumer! Service is unusable without it. Exiting now! Details: %s",
                     str(e),
                     exc_info=True)
    finally:
        if not consumer._closed:
            await consumer.stop()

asyncio.run(consume('testtopic'))
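For comparison, a consumer along the lines of the confluent-kafka client that worked against the same topic on the same host (the configuration values below are placeholders mirroring the aiokafka example, not the exact settings used):

from confluent_kafka import Consumer

conf = {
    "bootstrap.servers": "bootstrap_servers",
    "group.id": "group_id",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "SCRAM-SHA-512",
    "sasl.username": "user",
    "sasl.password": "pw",
    "ssl.ca.location": "/path to ca file",
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
consumer.subscribe(["testtopic"])
try:
    while True:
        msg = consumer.poll(10.0)
        if msg is None:
            continue
        if msg.error():
            print(msg.error())
            continue
        print(msg.value())
finally:
    consumer.close()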