Warn that implicit should have explicit type [ci: last-only] #10083
Conversation
Force-pushed from 5ff8926 to f636f20
Force-pushed from f636f20 to a87d504
import scala.language.reflectiveCalls
import scala.language.implicitConversions

object Test extends App {
  @nowarn // the inferred type includes the default arg, which can't be written explicitly
@dwijnand inviting your thoughts on your example: implicits should have explicit types, but the compiler is willing to infer a type that can't be written explicitly. In this case, the default arg is not allowed in the refinement. (One could introduce a trait, but that would negate the test.) Probably you will say: "Rules are made to be broken."
Nice find!
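For context, a minimal sketch of the situation under discussion (names and the exact inferred type are illustrative):

```scala
import scala.language.reflectiveCalls
import scala.annotation.nowarn

object Sketch {
  // The inferred type is a structural refinement whose method has a
  // default argument, roughly AnyRef { def f(i: Int = 1): Int }.
  // Default arguments are not legal in a refinement, so the type
  // cannot be written explicitly; the warning is suppressed instead.
  @nowarn
  implicit def c = new AnyRef { def f(i: Int = 1): Int = i }
}
```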
Force-pushed from a87d504 to 72e65b7
Fields are not marked implicit, and the accessor can't be reached without cycles, so mark the field and check it when typing the getter.
Force-pushed from d3d304d to 9522547
I didn't switch to draft after handling
Force-pushed from 9522547 to ac2e6ad
Hung again. I don't see the red X for canceling the Jenkins job. Travis:
Locally,
Travis says both
I should have asked @SethTisue about cancel culture.
Force-pushed from ac2e6ad to f9395e1
Force-pushed from ac5a670 to 1659fdc
LGTM, thank you!
@SethTisue what's your view on making the warning on by default, with `-Wconf` / `@nowarn` as the opt-out?
Maybe if we introduce a new warning category and publish the opt-out flag in the release notes, it's OK?
Yes, I think that's sufficient.
Note that this impacts macro authors too; see scala/bug#12798
### What changes were proposed in this pull request?

This PR aims to upgrade Scala to 2.13.11:

- https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`. This is a new compile-time check introduced by scala/scala#10083; we must fix these warnings before upgrading to Scala 3.

### Why are the changes needed?

This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed`:

- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:

- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warnings related to `match may not be exhaustive` in the Spark compile log. For the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in the Spark code; the only related use is https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130, and I checked that `javax.annotation.Nonnull` does not have this issue. So I think these two issues will not affect Spark itself, but that doesn't mean they won't affect code written by end users.

The full release notes are as follows:

- https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?

Yes, this is a Scala version change.

### How was this patch tested?

- Existing tests
- Checked Java 8/17 + Scala 2.13.11 using GA; all tests passed
  - Java 8 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14337279564
  - Java 17 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14343012195

Closes #41626 from LuciferYang/SPARK-40497.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
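The suppression rule mentioned above could, for example, take the shape of an sbt setting like the following (a sketch only; the exact filter Spark uses may differ):

```scala
// Hypothetical sbt fragment: silence the new warning while migrating,
// using a message-based -Wconf filter ("s" = silent).
Compile / scalacOptions +=
  "-Wconf:msg=Implicit definition should have explicit type:s"
```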
Scala 2.13 raises compilation warnings (which we have elevated to errors) for implicits without explicit types. This is to mirror Scala 3 which requires this. See: * scala/scala#10083 * https://github.com/scala/scala/releases/tag/v2.13.11
we treat warnings as fatal, so due to scala/scala#10083 we need to make sure that all implicits have an explicit type
### What changes were proposed in this pull request?

This PR aims to re-upgrade Scala to 2.13.11. After SPARK-45144 was merged, the build issues mentioned in #41943 should no longer exist.

- https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`. This is a new compile-time check introduced by scala/scala#10083; we must fix these warnings before upgrading to Scala 3.

### Why are the changes needed?

This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed`:

- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:

- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warnings related to `match may not be exhaustive` in the Spark compile log. For the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in the Spark code; the only related use is https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130, and I checked that `javax.annotation.Nonnull` does not have this issue. So I think these two issues will not affect Spark itself, but that doesn't mean they won't affect code written by end users.

The full release notes are as follows:

- https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?

Yes, this is a Scala version change.

### How was this patch tested?

- Existing tests

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #42918 from LuciferYang/SPARK-40497-2.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
Implicit definitions are required to have an explicit type ascribed in Scala 3. Now Scala 2 warns if the type is missing (and therefore inferred). Under `-Xsource:3`, the warning becomes an error. Local implicits are exempt because they are not available before they are defined.

The warnings may be suppressed globally with `-Wconf:cat=other-implicit-type:s` or locally using `@annotation.nowarn("cat=other-implicit-type")`.

Note that a whitebox macro may be declared of type `Any` and arbitrarily narrowed by its expansion.

Fixes scala/bug#5265
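As a sketch of the behaviour described in these notes (names are illustrative):

```scala
import scala.language.implicitConversions
import scala.annotation.nowarn

object Demo {
  // Warns under 2.13.11 (error under -Xsource:3):
  // the implicit's result type is inferred.
  implicit def intToString(i: Int) = i.toString

  // No warning: the result type is ascribed explicitly.
  implicit def longToString(l: Long): String = l.toString

  // Or suppress the warning locally.
  @nowarn("cat=other-implicit-type")
  implicit def charToString(c: Char) = c.toString

  def joined: String = {
    // Local implicits are exempt from the warning.
    implicit val sep = ", "
    implicitly[String]
  }
}
```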