AIOKafkaConsumer throws corrupted message exceptions on big-endian OS #1097

Open
ChupaCabre opened this issue Mar 10, 2025 · 0 comments
Describe the bug
Trying to consume messages with AIOKafkaConsumer raises corrupted-message exceptions. The error happens consistently and prevents any messages from being processed. The messages themselves are valid: they can be consumed successfully with the console consumer, with Java clients, and with the confluent-kafka Python client library. The application runs on s390x (zLinux), a big-endian platform, which I suspect is causing the CRC checksum verification to fail.
Here's the error that comes up:

    result = await self._consumer.getmany(
  File "/lib/python3.9/site-packages/aiokafka/consumer/consumer.py", line 1172, in getmany
    records = await self._fetcher.fetched_records(
  File "/lib/python3.9/site-packages/aiokafka/consumer/fetcher.py", line 1072, in fetched_records
    records = res_or_error.getall(max_records)
  File "/lib/python3.9/site-packages/aiokafka/consumer/fetcher.py", line 134, in getall
    for msg in self._partition_records:
  File "/lib/python3.9/site-packages/aiokafka/consumer/fetcher.py", line 201, in __next__
    return next(self._records_iterator)
  File "/lib/python3.9/site-packages/aiokafka/consumer/fetcher.py", line 217, in _unpack_records
    raise Errors.CorruptRecordException(
kafka.errors.CorruptRecordException: [Error 2] CorruptRecordException: Invalid CRC - TopicPartition(topic='testtopic', partition=0) 
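If the root cause really is byte-order sensitivity somewhere in the CRC path, the failure mode can be sketched like this (an illustration of the suspected class of bug, not aiokafka's actual decoding code):

```python
import struct
import sys

# Kafka's wire format stores integers (including the record-batch CRC)
# in big-endian ("network") order. A decoder that unpacks them with the
# platform's *native* byte order only agrees with the wire format on
# big-endian hosts, and an explicitly little-endian unpack only agrees
# on little-endian data layouts -- either way, one platform breaks.

wire_crc = (0xDEADBEEF).to_bytes(4, "big")  # CRC as it appears on the wire

portable = struct.unpack(">I", wire_crc)[0]  # explicit big-endian: same everywhere
native = struct.unpack("=I", wire_crc)[0]    # native order: platform-dependent

assert portable == 0xDEADBEEF
# On s390x (big-endian) native == 0xDEADBEEF, but on x86-64 it is
# 0xEFBEADDE -- so any code path that mixes native-order and
# network-order reads will report a CRC mismatch on exactly one
# of the two platforms.
print(sys.byteorder, hex(native))
```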

Expected behaviour
Valid messages are consumed without exceptions.

Environment (please complete the following information):

  • aiokafka version (python -c "import aiokafka; print(aiokafka.__version__)"): 0.12.0
  • Kafka Broker version (kafka-topics.sh --version): 3.6.1
  • Linux kernel: 5.14.0-427.49.1.el9_4.s390x

Reproducible example
Running on a big-endian operating system:

ssl_context = create_ssl_context('/path to ca file')
consumer = AIOKafkaConsumer(
    *topics,
    client_id="client_id",
    group_id="group_id",
    sasl_plain_username="user",
    sasl_plain_password="pw",
    security_protocol="SASL_SSL",
    sasl_mechanism="SCRAM-SHA-512",
    ssl_context=ssl_context,
    bootstrap_servers="bootstrap_servers")
await consumer.start()
try:
    while True:
        result = await consumer.getmany(timeout_ms=10 * 1000)  # error raised here
        for tp, messages in result.items():
            if messages:
                print('wow, consumed some messages')
                print(messages)
except Exception as e:
    logger.fatal("Cannot start Kafka consumer! Service is unusable without it. Exiting now! Details: %s",
                 str(e),
                 exc_info=True)
finally:
    if not consumer._closed:
        await consumer.stop()
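As a possible temporary workaround (assuming the message bytes are actually intact and only the client-side verification is at fault), CRC validation can be switched off via the consumer's `check_crcs` constructor argument, which defaults to `True`. A minimal config sketch:

```python
from aiokafka import AIOKafkaConsumer

# Workaround sketch: skip client-side per-record CRC validation so a
# byte-order bug in the verification path cannot reject valid messages.
# Only appropriate while debugging -- it also masks genuine corruption.
consumer = AIOKafkaConsumer(
    "testtopic",
    bootstrap_servers="bootstrap_servers",
    check_crcs=False,
)
```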