Correctly reading empty fields in as null rather than throwing exception #1831


Merged
merged 1 commit into from
Dec 20, 2021

Conversation

masseyke
Copy link
Member

By default we intend to treat empty fields as nulls when they are read in through Spark SQL. However, we actually turn them into None objects, which causes Spark SQL to blow up in Spark 2 and 3. This commit treats them as nulls, which works for all versions of Spark we currently support.
Closes #1635
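The gist of the bug is that Spark SQL's row encoders accept a plain null for a missing value but cannot serialize a wrapper object (such as Scala's `None`, an `Option`). The actual fix is in the connector's Scala value reader; the following is only a minimal Python sketch of the idea, with hypothetical names (`read_field`, `NoValue`) invented for illustration:

```python
class NoValue:
    """Stand-in for a wrapper object like Scala's Option.None.
    Spark SQL's encoders reject such objects in row fields."""


def read_field_buggy(raw):
    # Before the fix (sketched): empty fields became a wrapper object,
    # which blows up when Spark tries to encode the row.
    return NoValue() if raw == "" else raw


def read_field_fixed(raw):
    # After the fix (sketched): empty fields become a plain null
    # (Python's None here, standing in for a SQL NULL), which every
    # supported Spark version handles.
    return None if raw == "" else raw


row = {"name": "kibana", "tags": ""}
parsed = {k: read_field_fixed(v) for k, v in row.items()}
# parsed["tags"] is now a plain null rather than a wrapper object
```

The design point is simply that "no value" must be represented by the encoder's native null, not by a domain-specific sentinel.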

…ion (elastic#1816)

@masseyke masseyke merged commit 0b7cefa into elastic:8.0 Dec 20, 2021
@masseyke masseyke deleted the backport/1816-None-error-8.0 branch December 20, 2021 16:50