
Issue hitting max records in a batch #3119

Closed
@EvanJGunn

Description

The Issue

My team is getting this invalid array length error in our consumer:

```
{"level":"info","msg":"consumer/broker/1395569342 disconnecting due to error processing FetchRequest: kafka: error decoding packet: invalid array length","time":"2025-02-27 20:17:36"}
```

I tracked the error down, and it looks like it's happening here:

```go
numRecs, err := pd.getArrayLength()
```

which fails here:

```go
return -1, errInvalidArrayLength
```

Unless I am mistaken, it appears that the maximum number of records in a batch is capped at 131070 (2 × math.MaxUint16).
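
For context, here is a minimal, self-contained paraphrase of that guard. The names mirror sarama's `real_decoder.go` internals, but this is a sketch, not the library's actual code, and the details may vary by version:

```go
package main

import (
	"encoding/binary"
	"errors"
	"fmt"
	"math"
)

// errInvalidArrayLength mirrors sarama's sentinel error.
var errInvalidArrayLength = errors.New("kafka: error decoding packet: invalid array length")

// getArrayLength paraphrases the decode-time sanity check: any array
// claiming more than 2*math.MaxUint16 (131070) elements is rejected.
func getArrayLength(raw []byte, off int) (int, int, error) {
	if len(raw)-off < 4 {
		return -1, off, errors.New("kafka: insufficient data to decode packet")
	}
	n := int(int32(binary.BigEndian.Uint32(raw[off:])))
	off += 4
	if n > 2*math.MaxUint16 { // 131070: the cap in question
		return -1, off, errInvalidArrayLength
	}
	return n, off, nil
}

func main() {
	buf := make([]byte, 4)
	binary.BigEndian.PutUint32(buf, 131071) // one over the cap
	_, _, err := getArrayLength(buf, 0)
	fmt.Println(err) // kafka: error decoding packet: invalid array length
}
```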

The Question

Is there a reason the maximum number of records in a batch is capped at 131070?

More Background

My team is consuming a lot of small records from Confluent's WarpStream product. For whatever reason, the fetch max setting seems to be ignored by WarpStream, so unfortunately that is not a solution for us (see the sketch below for roughly what we tried).
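
For reference, this is roughly the configuration we attempted. The broker address is hypothetical, the 1 MiB cap is illustrative, and the import path assumes the current `github.com/IBM/sarama` module:

```go
package main

import (
	"log"

	"github.com/IBM/sarama"
)

func main() {
	// Hypothetical broker address for illustration.
	brokers := []string{"warpstream-agent:9092"}

	cfg := sarama.NewConfig()
	// Ask the broker for at most ~1 MiB of message bytes per fetch request;
	// with small records this should keep each batch far below 131070
	// entries, but WarpStream appears to ignore this request-level cap.
	cfg.Consumer.Fetch.Max = 1 << 20

	consumer, err := sarama.NewConsumer(brokers, cfg)
	if err != nil {
		log.Fatal(err)
	}
	defer consumer.Close()
}
```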
