[SPARK-40497][BUILD] Re-upgrade Scala to 2.13.11 #42918
Conversation
.github/workflows/build_and_test.yml
```diff
@@ -207,6 +207,7 @@ jobs:
       GITHUB_PREV_SHA: ${{ github.event.before }}
       SPARK_LOCAL_IP: localhost
       SKIP_PACKAGING: true
+      SCALA_PROFILE: scala2.13
```
will revert after test
https://github.com/LuciferYang/spark/runs/16776579478 Tested Scala 2.13 + Java 17, only the
cc @dongjoon-hyun FYI
Test building 2.13 with both Java 11 and Java 17 worked as expected. Thanks for fixing this!
2.13.12 has also been released (https://github.com/scala/scala/releases/tag/v2.13.12), so we might want to look into that in the future.
Thanks @dongjoon-hyun and @eejbyfeldt
We can test it once Ammonite releases a new version supporting Scala 2.13.12.
Could you re-trigger the failed pipelines?
Triggered
Merged to master for Apache Spark 4.0.0. Thank you, @LuciferYang and all.
What changes were proposed in this pull request?
This PR aims to re-upgrade Scala to 2.13.11. After SPARK-45144 was merged, the build issues mentioned in #41943 should no longer exist.
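For context, a Scala version bump in Spark's Maven build is expressed through the version properties in the parent `pom.xml`; a minimal sketch of the relevant properties (not the PR's exact diff):

```xml
<properties>
  <!-- Patch-level Scala version used by scala-maven-plugin -->
  <scala.version>2.13.11</scala.version>
  <!-- Binary version used as the artifact suffix, e.g. spark-core_2.13 -->
  <scala.binary.version>2.13</scala.binary.version>
</properties>
```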
Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`.
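For illustration, a minimal sketch of the kind of definition that triggers this warning, and the fix (names are illustrative, not from the Spark codebase):

```scala
import scala.language.implicitConversions

object ImplicitExample {
  // Without the explicit `: String` result type, Scala 2.13.11 warns:
  //   "Implicit definition should have explicit type"
  //     implicit def intToLabel(i: Int) = s"value=$i"
  // Annotating the result type resolves the warning:
  implicit def intToLabel(i: Int): String = s"value=$i"

  def describe(label: String): String = label

  def main(args: Array[String]): Unit =
    // The implicit conversion Int => String applies here
    println(describe(42))
}
```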
This is a new compile check introduced by scala/scala#10083; we must fix these warnings when we upgrade to Scala 3.

Why are the changes needed?
This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed`:

- `sealed` in Java sources (scala/scala#10348)
- `PermittedSubclasses` in classfiles (scala/scala#10105)

There are 2 known issues in this version:

- a false-positive `match may not be exhaustive` warning (2.13.11 regression)
- `@Deprecated` annotations, when extending a Java interface with a deprecated default method, cause `java.lang.annotation.AnnotationFormatError` when accessed via Java reflection (2.13.11 regression, scala/bug#12799)

For the first one, there are no compilation warnings related to `match may not be exhaustive` in the Spark compile log. For the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in Spark code; the only similar reflective check is on `@javax.annotation.Nonnull` in `spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala` (line 130 in 8c84d2c), and I checked that it does not have this issue. So I think these two issues will not affect Spark itself, but that does not mean they won't affect code written by end users.
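For illustration, a sketch of the reflective access pattern the Spark codebase was searched for. Note this stand-in uses Java's `@Deprecated` directly on a plain Scala class, so it does not itself trigger scala/bug#12799 (which requires extending a Java interface with a deprecated default method); the names are illustrative:

```scala
object DeprecatedReflectionDemo {
  // scala/bug#12799: under 2.13.11, reading annotations via Java reflection
  // on a forwarder inherited from a Java interface with a deprecated default
  // method can throw java.lang.annotation.AnnotationFormatError.
  // This pure-Scala stand-in only shows the access pattern in question.
  class Impl {
    @Deprecated
    def oldMethod(): String = "old"
  }

  def main(args: Array[String]): Unit = {
    val m = classOf[Impl].getMethod("oldMethod")
    // The method.isAnnotationPresent(Deprecated.class) pattern from the PR text:
    println(m.isAnnotationPresent(classOf[Deprecated]))
  }
}
```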
The full release notes are as follows:
Does this PR introduce any user-facing change?
Yes, this is a Scala version change.
How was this patch tested?
Was this patch authored or co-authored using generative AI tooling?
No