kgo: return decompression errors while consuming #883
Kafka can return partial batches, so decompression errors are common. If I ask for at most 100 bytes and the broker has two 60-byte batches, I will receive one valid 60-byte batch and then a partial 40-byte batch. The second, partial batch will fail to decompress. This is the reason I previously never returned decompression errors.
However, if a client truly does produce somewhat-valid compressed data that some decompressors can process but others (Go's) cannot, then the first batch received could fail to decompress. The client would fail processing, return an empty batch, and retry consuming at the same offset, spin-looping forever without the end user ever becoming aware.
Now, if the first error received is a decompression error, we bubble it up to the end user.
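As a rough illustration of the first-error-only rule (this is a hedged sketch, not the actual kgo code: `processBatches`, `decompress`, and `compress` are hypothetical names, and gzip stands in for whatever codec the batches use):

```go
package main

import (
	"bytes"
	"compress/gzip"
	"fmt"
	"io"
)

// compress gzips a payload; used here only to fabricate example batches.
func compress(b []byte) []byte {
	var buf bytes.Buffer
	w := gzip.NewWriter(&buf)
	w.Write(b)
	w.Close()
	return buf.Bytes()
}

// decompress attempts to gunzip one raw batch payload.
func decompress(raw []byte) ([]byte, error) {
	r, err := gzip.NewReader(bytes.NewReader(raw))
	if err != nil {
		return nil, err
	}
	defer r.Close()
	return io.ReadAll(r)
}

// processBatches sketches the fix: a decompression error on the FIRST
// batch is bubbled up to the caller, while an error on a later batch is
// assumed to be a truncated partial batch and is silently dropped.
func processBatches(batches [][]byte) ([][]byte, error) {
	var out [][]byte
	for i, raw := range batches {
		decoded, err := decompress(raw)
		if err != nil {
			if i == 0 {
				// First batch failed: likely genuinely bad data,
				// so return the error instead of spin-looping.
				return nil, fmt.Errorf("decompressing batch: %w", err)
			}
			// Later batch failed: likely a partial batch from a
			// size-limited fetch; stop and return what we have.
			break
		}
		out = append(out, decoded)
	}
	return out, nil
}

func main() {
	valid := compress([]byte("batch-1"))
	partial := valid[:len(valid)/2] // simulate a truncated partial batch

	// A trailing partial batch is swallowed; one batch is returned.
	out, err := processBatches([][]byte{valid, partial})
	fmt.Println(len(out), err)

	// The same partial data as the FIRST batch surfaces an error.
	_, err = processBatches([][]byte{partial})
	fmt.Println(err != nil)
}
```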
This is hard to test internally, so it was tested manually with a hack.
Scenario one:
Scenario two:
Closes #854.