fix: Fix to_json handling of NaN and Infinity values (#3016) #3018
Conversation
Pull request overview

This PR fixes invalid JSON generation when `to_json` encounters special floating-point values (`NaN`, `Infinity`, `-Infinity`). These values are now normalized to `null` for Spark compatibility, ensuring valid JSON output is always produced.

Key Changes:
- Introduced `normalize_special_floats` function to convert special float string representations to null
- Modified `array_to_json_string` to apply normalization after casting non-struct arrays to strings
- Added comprehensive test coverage for NaN and Infinity handling
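As a rough illustration of the normalization step, a simplified, hypothetical stand-in for `normalize_special_floats` (the real function in this PR operates on Arrow string arrays; the scalar helper below is an assumption for readability) maps the three special string forms to null:

```rust
/// Hypothetical scalar sketch of the PR's normalization: map the string
/// forms of special floats to `None`, which the caller renders as JSON null.
fn normalize_special_float(s: &str) -> Option<&str> {
    match s {
        "NaN" | "Infinity" | "-Infinity" => None,
        other => Some(other),
    }
}

fn main() {
    assert_eq!(normalize_special_float("NaN"), None);
    assert_eq!(normalize_special_float("-Infinity"), None);
    assert_eq!(normalize_special_float("3.14"), Some("3.14"));
    println!("ok");
}
```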
```rust
builder.append_null();
} else {
    match arr.value(i) {
        "Infinity" | "-Infinity" | "NaN" => builder.append_null(),
```
I haven't reviewed this in detail yet, but it seems odd to handle these values after they have already been converted to strings. Could the check not happen when converting float to string instead?
I agree that handling this earlier would be preferable in general. In this case, `to_json` delegates primitive type handling to `spark_cast`, and the goal here was to avoid changing `spark_cast` behavior globally, since it is used by other expressions where preserving `"NaN"` / `"Infinity"` string output may be expected.
Normalizing the values at the `to_json` layer keeps the change scoped specifically to JSON semantics while still aligning the output with Spark's behavior.
That said, I'm happy to move the check earlier or adjust the approach if you think handling this during float-to-string conversion would be more appropriate for Comet.
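For reference, the earlier check being discussed, intercepting the values while they are still floats rather than after string conversion, could look roughly like this (a sketch only; the function name and shape are assumptions, not code from this PR):

```rust
// Sketch of the alternative: decide on null while the value is still an
// f64, before any float-to-string conversion happens.
fn float_to_json_token(v: f64) -> Option<String> {
    if v.is_nan() || v.is_infinite() {
        None // caller emits a JSON null instead of "NaN"/"Infinity"
    } else {
        Some(v.to_string())
    }
}

fn main() {
    assert_eq!(float_to_json_token(f64::NAN), None);
    assert_eq!(float_to_json_token(f64::NEG_INFINITY), None);
    assert_eq!(float_to_json_token(2.5), Some("2.5".to_string()));
    println!("ok");
}
```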
@andygrove for more details, run the unit tests from this pull request: #3011
Thanks @Brijesh-Thakkar and @kazantsev-maksim. I will review this PR more carefully today.
I checked out this branch locally, then merged the changes from #3011, and saw different results from Spark and Comet for infinity:

```
Spark: [{"c0":false, ... "c5":"-Infinity", ...}]
Comet: [{"c0":false, ... "c5":-Infinity, ...}]
```
Let's get the unit tests merged first and then pull them into this PR.
Spark doesn't follow the JSON spec:

```scala
scala> spark.sql("""SELECT to_json(struct(cast('Infinity' as double) as inf,
     |   cast('NaN' as double) as nan))""").show(false)
+----------------------------------------------------------------------------+
|to_json(struct(CAST(Infinity AS DOUBLE) AS inf, CAST(NaN AS DOUBLE) AS nan))|
+----------------------------------------------------------------------------+
|{"inf":"Infinity","nan":"NaN"}                                              |
+----------------------------------------------------------------------------+
```
Codecov Report

✅ All modified and coverable lines are covered by tests.

```
@@            Coverage Diff             @@
##             main    #3018      +/-  ##
============================================
+ Coverage     56.12%   59.55%   +3.43%
- Complexity      976     1368     +392
============================================
  Files           119      167      +48
  Lines         11743    15496    +3753
  Branches       2251     2569     +318
============================================
+ Hits           6591     9229    +2638
- Misses         4012     4970     +958
- Partials       1140     1297     +157
```
Which issue does this PR close?
Closes #3016
Rationale for this change

`to_json` could emit invalid JSON when encountering special floating-point values such as `NaN`, `+Infinity`, or `-Infinity`. These values are not valid in JSON and resulted in incorrect or unparsable output.

Apache Spark normalizes such values to `null` when converting to JSON. Since DataFusion-Comet aims to be Spark-compatible, this behavior needed to be aligned.

What changes are included in this PR?
- Normalize `NaN`, `+Infinity`, and `-Infinity` values to `null` during `to_json` conversion
- Ensure `to_json` always produces valid JSON output

How are these changes tested?
- Added tests covering `to_json` behavior for `NaN`, `+Infinity`, and `-Infinity` values
- All tests in the `native/spark-expr` crate pass
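The intended end-to-end semantics can be sketched as follows (a simplified illustration with a hypothetical helper; the actual implementation works on Arrow arrays inside `to_json`):

```rust
// Hedged sketch of the intended to_json semantics: non-finite floats are
// emitted as JSON null, so the resulting document is always valid JSON.
fn json_number(v: f64) -> String {
    if v.is_finite() { v.to_string() } else { "null".to_string() }
}

fn main() {
    let row = format!(
        "{{\"nan\":{},\"inf\":{},\"x\":{}}}",
        json_number(f64::NAN),
        json_number(f64::INFINITY),
        json_number(1.5)
    );
    assert_eq!(row, r#"{"nan":null,"inf":null,"x":1.5}"#);
    println!("{row}");
}
```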