GH-49502: [Parquet][C++] Fix missing overflow check for dictionary encoder indices count #49513

Open

aryansri05 wants to merge 1 commit into apache:main
Conversation
Thanks for opening a pull request! If this is not a minor pull request, could you open an issue for this pull request on GitHub? https://github.com/apache/arrow/issues/new/choose Opening GitHub issues ahead of time contributes to the Openness of the Apache Arrow project. Then could you also rename the pull request title in the following format?
Closes #49502
Rationale for this change
When writing large dictionary-encoded Parquet data with `ARROW_LARGE_MEMORY_TESTS=ON`, two tests were failing:

- `TestColumnWriter.WriteLargeDictEncodedPage`: expected 2 pages, got 7501
- `TestColumnWriter.ThrowsOnDictIndicesTooLarge`: expected `ParquetException`, got nothing thrown

The root cause is that `PutIndicesTyped()` in `DictEncoderImpl` had no check for when the total number of buffered dictionary indices exceeds `INT32_MAX`. The existing overflow check in `FlushValues()` only checks the buffer size in bytes, not the index count, so it never triggered for this case.
What changes are included in this PR?
Added an overflow check in `DictEncoderImpl::PutIndicesTyped()` immediately after `buffered_indices_.resize()`:

```cpp
if (buffered_indices_.size() >
    static_cast<size_t>(std::numeric_limits<int32_t>::max())) {
  throw ParquetException("Total dictionary indices count (",
                         buffered_indices_.size(),
                         ") exceeds maximum int value");
}
```
This makes the encoder throw a `ParquetException` with a message containing "exceeds maximum int value" when the index count overflows, which is exactly what `ThrowsOnDictIndicesTooLarge` expects.

Are these changes tested?
Yes, the existing tests in `column_writer_test.cc` cover this fix:

- `TestColumnWriter.ThrowsOnDictIndicesTooLarge`
- `TestColumnWriter.WriteLargeDictEncodedPage`

Both tests were failing before this fix and should pass after. Tests require building with `ARROW_LARGE_MEMORY_TESTS=ON`.

This PR contains a "Critical Fix": previously, writing dictionary-encoded data with more than INT32_MAX indices would silently produce incorrect output (wrong page count) instead of raising an error. This fix makes the encoder correctly throw a `ParquetException` in that scenario.