
Exhaustivity checker emits spurious warning when matching on Java enum type (2.13.11 regression) #12800

Closed
vasilmkd opened this issue Jun 6, 2023 · 8 comments · Fixed by scala/scala#10424


vasilmkd commented Jun 6, 2023

Reproduction steps

```
git clone git@github.com:vasilmkd/scala-java-enum-regression.git
cd scala-java-enum-regression
sbt compile
```

https://github.com/vasilmkd/scala-java-enum-regression

Scala version: 2.13.11

Problem (-Werror flag is on)

```
[error] <path>/scala-java-enum-regression/src/main/scala/main.scala:5:23: match may not be exhaustive.
[error] It would fail on the following inputs: (_ : com.intellij.openapi.compiler.CompilerMessageCategory$1), (_ : com.intellij.openapi.compiler.CompilerMessageCategory$2), (_ : com.intellij.openapi.compiler.CompilerMessageCategory$3), (_ : com.intellij.openapi.compiler.CompilerMessageCategory$4)
[error]     val messageType = category match {
[error]                       ^
[error] one error found
```

Further context

  1. Scala 2.13.10 does not exhibit this behavior.
  2. Compiling using Scala 3.3.0 and Scala 2.12.18 works fine.
  3. The warning/error is not issued if the same Java source code is copied into a .java file and compiled using sbt. It is necessary for this enum to be published in a library. (Might be a problem with classfile decompilation?)
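For readers without the repo handy, the failing code has roughly this shape (a sketch only: `java.time.DayOfWeek` stands in for `CompilerMessageCategory` here, since the real enum is an IntelliJ library type; note that `DayOfWeek` is a *simple* enum, whereas the `CompilerMessageCategory$1`..`$4` classes in the error above come from constant-specific class bodies):

```scala
import java.time.DayOfWeek
import java.time.DayOfWeek._

object EnumMatch {
  // An exhaustive match on a Java enum, covering every constant.
  // With -Werror, the 2.13.11 regression flags a match like the one
  // in the repro's main.scala as non-exhaustive even though all
  // cases are handled.
  def isWeekend(d: DayOfWeek): Boolean = d match {
    case SATURDAY | SUNDAY                                => true
    case MONDAY | TUESDAY | WEDNESDAY | THURSDAY | FRIDAY => false
  }
}
```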
@som-snytt

A more complete repro would be helpful to me. At least a link to the Java source? Maybe a simple enum suffices? Or maybe not? Let me remember how to write an enum in Java, etc.

@SethTisue
Member

> It is necessary for this enum to be published in a library

Perhaps it would be sufficient to use CompileOrder.JavaThenScala instead of CompileOrder.Mixed?
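A minimal sketch of that workaround, assuming the repro's single-module sbt layout with the enum's `.java` source copied into `src/main/java`:

```scala
// build.sbt sketch (assumption: single-module layout from the repro repo).
// Compiling the Java enum before the Scala sources means the Scala compiler
// reads it from a classfile, the same situation as a library dependency.
Compile / compileOrder := CompileOrder.JavaThenScala
```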

@SethTisue SethTisue added this to the 2.13.12 milestone Jun 6, 2023
@SethTisue
Member

@som-snytt scala/scala#10105 seems like an obvious place to look? any other ideas of what PR might be involved?

@vasilmkd
Author

vasilmkd commented Jun 6, 2023

Sorry, here's the source of the enum: https://github.com/JetBrains/intellij-community/blob/master/java/compiler/openapi/src/com/intellij/openapi/compiler/CompilerMessageCategory.java

@som-snytt

som-snytt commented Jun 6, 2023

@SethTisue yes, with a sinking heart, that was also my first thought.

@vasilmkd thanks, I couldn't find it right away on GitHub, and this is after my second coffee.

I see it's an "interesting" enum. So much to explore today, I hardly miss being at Scala Days. 😿

@SethTisue SethTisue changed the title Java enum exhaustivity check regression Java enum exhaustivity check false positive (2.13.11 regression) Jun 6, 2023
@vasilmkd
Author

vasilmkd commented Jun 6, 2023

I confirm that this is compiled using JDK 17; I haven't tried changing the JDK yet.

CompilerMessageCategory is not a "simple" Java enum: it defines an abstract method, and each enum case implements it.

An example of a "simple" Java enum is com.intellij.compiler.ModuleSourceSet. The same matching code compiles just fine against that enum. I have pushed a new commit to the reproduction repo with this enum.

@vasilmkd
Author

vasilmkd commented Jun 6, 2023

I tried JDK 11 and JDK 8 with the same result: the warning/error reproduces just as on JDK 17.

@vasilmkd
Author

vasilmkd commented Jun 6, 2023

@SethTisue You're right! Copying the enum source code into a Java file and then using compileOrder := CompileOrder.JavaThenScala does reproduce the exhaustivity warning. I'm pushing a commit to the reproduction repo with that change; the library dependency is no longer necessary.

@SethTisue SethTisue changed the title Java enum exhaustivity check false positive (2.13.11 regression) Exhaustivity checker emits spurious warning when matching on Java enum type (2.13.11 regression) Jun 6, 2023
dongjoon-hyun pushed a commit to apache/spark that referenced this issue Jun 19, 2023
### What changes were proposed in this pull request?
This PR aims to upgrade Scala to 2.13.11
- https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`. This is a new compile check introduced by scala/scala#10083; we must fix it when we upgrade to Scala 3.
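A suppression of the kind described can be sketched as a scalac `-Wconf` message filter (a hypothetical sbt-style form; the exact rule Spark added lives in its own build files, which use a different mechanism):

```scala
// Silence ("s") any warning whose message matches the given regex.
// The filter text is an assumption based on the warning quoted above.
scalacOptions += "-Wconf:msg=Implicit definition should have explicit type:s"
```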

### Why are the changes needed?
This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed` classes:
- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:

- scala/bug#12800
- scala/bug#12799

For the first one, there are no `match may not be exhaustive` compilation warnings in the Spark compile log. For the second one, Spark code contains no use of `method.isAnnotationPresent(Deprecated.class)`; the only comparable check is

https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130

and I checked that `javax.annotation.Nonnull` does not hit this issue.

So I think these two issues will not affect Spark itself, but that doesn't mean they won't affect code written by end users.

The full release notes:

- https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?
Yes, this is a Scala version change.

### How was this patch tested?
- Existing Test
- Checked Java 8/17 + Scala 2.13.11 using GA, all test passed

Java 8 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14337279564
Java 17 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14343012195

Closes #41626 from LuciferYang/SPARK-40497.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
dongjoon-hyun pushed a commit to apache/spark that referenced this issue Sep 16, 2023
### What changes were proposed in this pull request?
This PR aims to re-upgrade Scala to 2.13.11, after SPARK-45144 was merged, the build issues mentioned in #41943 should no longer exist.

- https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`. This is a new compile check introduced by scala/scala#10083; we must fix it when we upgrade to Scala 3.

### Why are the changes needed?
This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed` classes:
- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:

- scala/bug#12800
- scala/bug#12799

For the first one, there are no `match may not be exhaustive` compilation warnings in the Spark compile log. For the second one, Spark code contains no use of `method.isAnnotationPresent(Deprecated.class)`; the only comparable check is

https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130

and I checked that `javax.annotation.Nonnull` does not hit this issue.

So I think these two issues will not affect Spark itself, but that doesn't mean they won't affect code written by end users.

The full release notes:

- https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?
Yes, this is a Scala version change.

### How was this patch tested?
- Existing Test

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #42918 from LuciferYang/SPARK-40497-2.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>