[SPARK-47645][BUILD][CORE][SQL][YARN] Make Spark build with `-release` instead of `-target` #45716

Conversation
.github/workflows/build_and_test.yml (outdated)

```diff
 # It uses Maven's 'install' intentionally, see https://github.com/apache/spark/pull/26414.
-./build/mvn $MAVEN_CLI_OPTS -DskipTests -Pyarn -Pkubernetes -Pvolcano -Phive -Phive-thriftserver -Phadoop-cloud install
+./build/mvn $MAVEN_CLI_OPTS -DskipTests -Pyarn -Pkubernetes -Pvolcano -Phive -Phive-thriftserver -Phadoop-cloud -Djava.version=${JAVA_VERSION/-ea} install
```
This is mainly to verify Maven's cross-compilation scenario: using Java 21 to compile with `-release 17`.
```diff
@@ -215,8 +224,10 @@ trait SparkDateTimeUtils {
   val rebasedDays = rebaseGregorianToJulianDays(days)
   val localMillis = Math.multiplyExact(rebasedDays, MILLIS_PER_DAY)
   val timeZoneOffset = TimeZone.getDefault match {
-    case zoneInfo: ZoneInfo => zoneInfo.getOffsetsByWall(localMillis, null)
+    case zoneInfo: TimeZone if zoneInfo.getClass.getName == zoneInfoClassName =>
     case timeZone: TimeZone => timeZone.getOffset(localMillis - timeZone.getRawOffset)
```
This fixes:

```
Error: ] /home/runner/work/spark/spark/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:27: object util is not a member of package sun
Error: ] /home/runner/work/spark/spark/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:218: not found: type ZoneInfo
Error: ] /home/runner/work/spark/spark/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:218: value getOffsetsByWall is not a member of java.util.TimeZone
```

Maybe we could just use `val timeZoneOffset = TimeZone.getDefault.getOffset(localMillis)`, but I'm not sure which case would verify the compatibility issue with 2.4.
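As a quick sanity check on the public `TimeZone` API, here is a minimal Java sketch; the `wallOffset` helper is hypothetical (not Spark code) and mirrors the `getOffset(localMillis - getRawOffset)` fallback branch discussed above:

```java
import java.util.TimeZone;

public class WallOffsetSketch {
    // Hypothetical helper: resolve the wall-clock offset for a local epoch
    // millisecond using only public TimeZone API. Subtracting the raw offset
    // first converts the local wall time to an approximate UTC instant.
    public static int wallOffset(TimeZone tz, long localMillis) {
        return tz.getOffset(localMillis - tz.getRawOffset());
    }

    public static void main(String[] args) {
        // For fixed-offset zones (no DST transitions) the fallback agrees
        // with a plain getOffset call on the same instant.
        TimeZone utc = TimeZone.getTimeZone("UTC");
        System.out.println(wallOffset(utc, 0L));  // prints 0
        System.out.println(utc.getOffset(0L));    // prints 0
    }
}
```

The two variants can differ for instants near a DST transition, which is presumably the 2.4-compatibility case the comment is unsure about.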
Let me investigate whether there are other public APIs that could be used as a substitute here.
```diff
     val bais = new ByteArrayInputStream(in, 0, length)
     val byteChannel = Channels.newChannel(bais)
     val decodingBufferSize = Math.min(length, 8192)
     val decoder = Charset.forName(enc).newDecoder()

-    StreamDecoder.forDecoder(byteChannel, decoder, decodingBufferSize)
+    Channels.newReader(byteChannel, decoder, decodingBufferSize)
```
This fixes:

```
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/CreateJacksonParser.scala:27: object nio is not a member of package sun
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/CreateJacksonParser.scala:61: not found: type StreamDecoder
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/CreateJacksonParser.scala:67: not found: value StreamDecoder
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/CreateJacksonParser.scala:27: Unused import
```
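For reference, a self-contained Java sketch of the substitution (the `readerFor` helper name is ours, not Spark's): `Channels.newReader` is a public API that internally delegates to `StreamDecoder.forDecoder`, so the decoded output is unchanged.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.Reader;
import java.nio.channels.Channels;
import java.nio.channels.ReadableByteChannel;
import java.nio.charset.Charset;
import java.nio.charset.CharsetDecoder;
import java.nio.charset.StandardCharsets;

public class ReaderSketch {
    // Illustrative helper: build a Reader over a byte range with an explicit
    // charset decoder, using only public JDK API.
    public static Reader readerFor(byte[] in, int length, String enc) {
        ByteArrayInputStream bais = new ByteArrayInputStream(in, 0, length);
        ReadableByteChannel byteChannel = Channels.newChannel(bais);
        int decodingBufferSize = Math.min(length, 8192);
        CharsetDecoder decoder = Charset.forName(enc).newDecoder();
        // Public replacement for the internal StreamDecoder.forDecoder call.
        return Channels.newReader(byteChannel, decoder, decodingBufferSize);
    }

    public static void main(String[] args) throws IOException {
        byte[] bytes = "hello".getBytes(StandardCharsets.UTF_8);
        Reader r = readerFor(bytes, bytes.length, "UTF-8");
        StringBuilder sb = new StringBuilder();
        int c;
        while ((c = r.read()) != -1) {
            sb.append((char) c);
        }
        System.out.println(sb); // prints "hello"
    }
}
```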
```diff
     val bais = new ByteArrayInputStream(in, 0, length)
     val byteChannel = Channels.newChannel(bais)
     val decodingBufferSize = Math.min(length, 8192)
     val decoder = Charset.forName(enc).newDecoder()

-    StreamDecoder.forDecoder(byteChannel, decoder, decodingBufferSize)
+    Channels.newReader(byteChannel, decoder, decodingBufferSize)
```
This fixes:

```
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/xml/CreateXmlParser.scala:27: object nio is not a member of package sun
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/xml/CreateXmlParser.scala:78: not found: type StreamDecoder
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/xml/CreateXmlParser.scala:84: not found: value StreamDecoder
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/xml/CreateXmlParser.scala:27: Unused import
```
```diff
@@ -17,7 +17,7 @@

 package org.apache.spark.deploy.yarn

-import java.io.{FileSystem => _, _}
+import java.io.{File, FileFilter, FileNotFoundException, FileOutputStream, InterruptedIOException, IOException, OutputStreamWriter}
```
This fixes the following error when compiling with `-release 17`:

```
Error: ] /home/runner/work/spark/spark/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala:20: object FileSystem is not a member of package java.io
```
```scala
val mh = lookup.unreflectConstructor(constructor)
val action = mh.invoke("sun.io.serialization.extendedDebugInfo")
  .asInstanceOf[PrivilegedAction[Boolean]]
!AccessController.doPrivileged(action).booleanValue()
```
This uses `MethodHandle` to keep the original logic intact. Since `sun.security` is not an exported package, we could also simply use `!JBoolean.getBoolean("sun.io.serialization.extendedDebugInfo")`.
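The simpler alternative mentioned above relies only on public API (`JBoolean` is the common Scala alias for `java.lang.Boolean`); a minimal Java sketch:

```java
public class DebugInfoFlagSketch {
    public static void main(String[] args) {
        // Boolean.getBoolean returns true only if the named system property
        // exists and equals "true"; an unset property yields false. This is
        // the public-API equivalent of sun.security.action.GetBooleanAction.
        boolean extendedDebugInfo =
            Boolean.getBoolean("sun.io.serialization.extendedDebugInfo");
        // The original expression negates the flag.
        System.out.println(!extendedDebugInfo); // prints "true" when the property is unset
    }
}
```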
```diff
@@ -175,8 +174,7 @@
   <scala.version>2.13.13</scala.version>
   <scala.binary.version>2.13</scala.binary.version>
   <scalatest-maven-plugin.version>2.2.0</scalatest-maven-plugin.version>
-  <!-- don't upgrade scala-maven-plugin to version 4.7.2 or higher, see SPARK-45144 for details -->
-  <scala-maven-plugin.version>4.7.1</scala-maven-plugin.version>
+  <scala-maven-plugin.version>4.8.1</scala-maven-plugin.version>
```
To fix `[ERROR] -release cannot be less than -target`, it seems necessary to upgrade to 4.8.1.
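For context, a minimal sketch of how the upgraded plugin can be told to emit `-release` (the coordinates are the plugin's usual ones, but the exact configuration Spark uses may differ; this simply passes the flag through the generic `<args>` list):

```xml
<!-- A minimal sketch, not Spark's actual pom.xml configuration. -->
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>4.8.1</version>
  <configuration>
    <args>
      <arg>-release</arg>
      <arg>17</arg>
    </args>
  </configuration>
</plugin>
```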
Could you make the CI happy, @LuciferYang ?

Re-triggered the failed task; let's see if it passes.
+1, LGTM.
cc @srowen , @JoshRosen , @HyukjinKwon , too.
Merged to master. Thank you, @LuciferYang and all.

Thanks @dongjoon-hyun ~
This PR seems to break the local Maven environment in IntelliJ. A couple of us are getting this response when trying to run anything with Maven, even after rebuilding.

@mihailom-db Could you provide a specific build command? The Maven compilation in the GA test has passed.

This seems to have solved the problem. Thanks.
### What changes were proposed in this pull request?

This PR makes the following changes to allow Spark to build with `-release` instead of `-target`:

1. Use `MethodHandle` instead of direct calls to `sun.security.action.GetBooleanAction` and `sun.util.calendar.ZoneInfo`, because they are not `exports` APIs.

2. Use `Channels.newReader` instead of `StreamDecoder.forDecoder`, because `StreamDecoder.forDecoder` is also not an `exports` API; `Channels.newReader` delegates to it internally:

```java
public static Reader newReader(ReadableByteChannel ch, CharsetDecoder dec, int minBufferCap) {
    Objects.requireNonNull(ch, "ch");
    return StreamDecoder.forDecoder(ch, dec.reset(), minBufferCap);
}
```

3. Adjust the import of `java.io._` in `yarn/Client.scala` to fix the compilation error:

```
Error: ] /home/runner/work/spark/spark/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala:20: object FileSystem is not a member of package java.io
```

4. Replace `-target` with `-release` in `pom.xml` and `SparkBuild.scala`, and remove the `-source` option, because using `-release` is sufficient.

5. Upgrade `scala-maven-plugin` from 4.7.1 to 4.8.1 to fix the error `[ERROR] -release cannot be less than -target` when executing `build/mvn clean install -DskipTests -Djava.version=21`.

### Why are the changes needed?

Since Scala 2.13.9, the compile option `-target` has been deprecated and `-release` is recommended instead:

- Make `-release` more useful, deprecate `-target`, align with Scala 3: scala/scala#9982

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Pass GitHub Actions

### Was this patch authored or co-authored using generative AI tooling?

No

Closes apache#45716 from LuciferYang/scala-maven-plugin-491.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>