[Spark] Recognize java8api date format in query optimizations #4201 #4228

Closed
wants to merge 1 commit

Conversation

@mikoloay commented on Mar 5, 2025

Which Delta project/connector is this regarding?

- [x] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other (fill in here)

Description

Resolves #4201
The convertValueIfRequired function casts DateType columns to java.sql.Date. Optimizing subqueries involving date columns with spark.sql.datetime.java8API.enabled set to true currently results in ClassCastException errors. This fix correctly converts dates regardless of the java type of the date value.

How was this patch tested?

A unit test was added to verify that the optimizations now work regardless of the value of the spark.sql.datetime.java8API.enabled configuration; without this fix, the unit test fails.
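
As a rough illustration of the scenario the test exercises (hypothetical table path and column name, assuming delta-spark on the classpath; this is not the PR's actual test code):

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical repro: with the java8API flag enabled, a date-filtered read
// of a Delta table previously failed in the optimizer with ClassCastException.
val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.sql.datetime.java8API.enabled", "true")
  .getOrCreate()

spark.read.format("delta")
  .load("/tmp/dates")                     // hypothetical table path
  .where("event_date = DATE'2025-03-05'") // hypothetical date column
  .count()
```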

Does this PR introduce any user-facing changes?

No

@mikoloay closed this by deleting the head repository on Mar 5, 2025