
Upgrade gRPC to support Apple Silicon #13

Merged
3 commits merged on Jul 8, 2022
@@ -20,6 +20,8 @@
 import java.util.Map;
 import java.util.Set;
 
+import org.apache.commons.lang3.SystemUtils;
+
 import org.apache.spark.SparkConf;
 import org.apache.spark.util.EventLoop;
 
@@ -36,4 +38,8 @@ public static RssShuffleManager createShuffleManager(
       Map<String, Set<Long>> failBlockIds) {
     return new RssShuffleManager(conf, isDriver, loop, successBlockIds, failBlockIds);
   }
+
+  public static boolean isMacOnAppleSilicon() {
+    return SystemUtils.IS_OS_MAC_OSX && SystemUtils.OS_ARCH.equals("aarch64");
+  }
 }
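The helper above keys off commons-lang3's view of `os.name` and `os.arch`. A self-contained sketch of the same predicate (a hypothetical, JDK-only variant written as a pure function so it can be exercised on any platform; the class and parameter names are illustrative, not from the PR):

```java
public class PlatformCheck {
    // Hypothetical testable variant of TestUtils.isMacOnAppleSilicon():
    // the OS name and architecture are passed in instead of being read
    // from SystemUtils, so the predicate can be checked on any machine.
    public static boolean isMacOnAppleSilicon(String osName, String osArch) {
        // On Apple Silicon a native JVM reports os.arch "aarch64";
        // an x86_64 JVM running under Rosetta 2 reports "x86_64"
        // and therefore takes the non-Apple-Silicon branch.
        return osName.startsWith("Mac OS X") && "aarch64".equals(osArch);
    }

    public static boolean isMacOnAppleSilicon() {
        return isMacOnAppleSilicon(System.getProperty("os.name"),
                                   System.getProperty("os.arch"));
    }
}
```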
@@ -229,11 +229,25 @@ public void onError(Throwable e) {
     assertTrue(shuffleWriteMetrics.writeTime() > 0);
     assertEquals(6, shuffleWriteMetrics.recordsWritten());
     // Spark3 and Spark2 use different version lz4, their length is different
-    assertEquals(120, shuffleWriteMetrics.bytesWritten());
+    // it can happen that 2 different platforms compress the same data differently,
+    // yet the decoded outcome remains identical to original.
+    // https://github.com/lz4/lz4/issues/812
+    if (TestUtils.isMacOnAppleSilicon()) {
+      assertEquals(144, shuffleWriteMetrics.bytesWritten());
+    } else {
+      assertEquals(120, shuffleWriteMetrics.bytesWritten());
+    }
 
     assertEquals(6, shuffleBlockInfos.size());
     for (ShuffleBlockInfo shuffleBlockInfo : shuffleBlockInfos) {
-      assertEquals(20, shuffleBlockInfo.getLength());
+      // it can happen that 2 different platforms compress the same data differently,
+      // yet the decoded outcome remains identical to original.
+      // https://github.com/lz4/lz4/issues/812
+      if (TestUtils.isMacOnAppleSilicon()) {
+        assertEquals(24, shuffleBlockInfo.getLength());
+      } else {
+        assertEquals(20, shuffleBlockInfo.getLength());
+      }
       assertEquals(22, shuffleBlockInfo.getUncompressLength());
       assertEquals(0, shuffleBlockInfo.getShuffleId());
       if (shuffleBlockInfo.getPartitionId() == 0) {
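The new assertions encode the point made in the linked lz4 issue: the compressed byte count is an implementation detail that may differ across platforms and codec builds, while the decompressed bytes must always match the original. The same effect can be illustrated with the JDK's built-in `Deflater`/`Inflater` (a stand-in for lz4, which is not part of the JDK; class and method names here are illustrative):

```java
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;
import java.util.Arrays;

public class RoundTrip {
    // Compress a byte array; the size of the result depends on the codec
    // implementation and is not a stable contract, only the round trip is.
    public static byte[] compress(byte[] input) {
        Deflater deflater = new Deflater();
        deflater.setInput(input);
        deflater.finish();
        // Worst-case deflate output is slightly larger than the input.
        byte[] buf = new byte[input.length * 2 + 64];
        int n = deflater.deflate(buf);
        deflater.end();
        return Arrays.copyOf(buf, n);
    }

    // Decompress back to exactly originalLength bytes.
    public static byte[] decompress(byte[] compressed, int originalLength) {
        try {
            Inflater inflater = new Inflater();
            inflater.setInput(compressed);
            byte[] out = new byte[originalLength];
            inflater.inflate(out);
            inflater.end();
            return out;
        } catch (DataFormatException e) {
            throw new RuntimeException(e);
        }
    }
}
```

The test above applies the same principle: it asserts a platform-dependent compressed length (144 vs 120 bytes) but a fixed uncompressed length (22 bytes).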
19 changes: 13 additions & 6 deletions pom.xml
@@ -40,11 +40,12 @@
     <commons-lang3.version>3.10</commons-lang3.version>
     <commons-codec.version>1.9</commons-codec.version>
     <codehaus.jackson.version>1.9.13</codehaus.jackson.version>
-    <error_prone_annotations.version>2.3.4</error_prone_annotations.version>
+    <error_prone_annotations.version>2.10.0</error_prone_annotations.version>
     <execution.root>${user.dir}</execution.root>
     <fasterxml.jackson.version>2.10.0</fasterxml.jackson.version>
-    <grpc.version>1.33.0</grpc.version>
-    <guava.version>30.0-jre</guava.version>
+    <grpc.version>1.47.0</grpc.version>
+    <gson.version>2.9.0</gson.version>
+    <guava.version>31.0.1-jre</guava.version>
Contributor: Why should we also bump the guava and gson versions?

Member Author: Without upgrading gson and guava, the build will violate the rules defined by maven-enforcer-plugin.

Member Author: grpc and protobuf should work well with the previous versions; the dependency upgrade is trivial.

Contributor: I would suggest making the changes minimal to avoid unexpected issues.

Member Author (@pan3793, Jul 4, 2022): Then we should remove the requireUpperBoundDeps rule of maven-enforcer-plugin.

Contributor: A minimal change was tried in Tencent/Firestorm#185, but there were some issues.

Member Author: Found and fixed some issues in the Spark3 IT.

Member Author:

> I would suggest making the changes minimal to avoid unexpected issues.

I think this is actually the minimal change that meets the requireUpperBoundDeps requirements.

Contributor:

> Found and fixed some issues in the Spark3 IT.

Thanks for digging out this issue; I remember the same issue being encountered during previous attempts.
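For context on the exchange above: requireUpperBoundDeps fails the build whenever a managed dependency version is lower than a version some transitive dependency requires, which is why bumping grpc drags guava and gson along with it. A minimal sketch of such an enforcer rule (illustrative configuration, not copied from this repository's pom.xml):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- Fail if any resolved dependency version is lower than a
               version required transitively, e.g. a pinned guava older
               than the one the upgraded grpc depends on. -->
          <requireUpperBoundDeps/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```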

     <hadoop.scope>provided</hadoop.scope>
     <hadoop.version>2.8.5</hadoop.version>
     <httpclient.version>4.5.3</httpclient.version>
@@ -64,8 +65,7 @@
     <picocli.version>4.5.2</picocli.version>
     <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
     <prometheus.simpleclient.version>0.9.0</prometheus.simpleclient.version>
-    <protobuf.version>3.12.0</protobuf.version>
-    <protoc.version>3.12.0</protoc.version>
+    <protobuf.version>3.19.2</protobuf.version>
Contributor: Seems native M1 support was introduced in v3.20.1.

Member Author: Not exactly; it is from v3.17.3.

Contributor: 3.17.3 is still Rosetta: there is only protoc-3.17.3-osx-x86_64.zip. In 3.20.1, there is protoc-3.20.1-osx-aarch_64.zip.

Member Author: You should check the Maven repo instead of the GitHub release.

Contributor: Thanks for the link; it seems they both require Rosetta 2.

$ file protoc-3.17.3-osx-aarch_64.exe protoc-3.20.1-osx-aarch_64.exe
protoc-3.17.3-osx-aarch_64.exe: Mach-O 64-bit executable x86_64
protoc-3.20.1-osx-aarch_64.exe: Mach-O 64-bit executable x86_64

Member Author (@pan3793, Jul 5, 2022): It's a funny thing: they just copied and renamed the file. Both files have the same md5.

Member Author: The latest version is right:

➜  apache-uniffle git:(master) file ~/Downloads/protoc-3.21.2-osx-aarch_64.exe
/Users/chengpan/Downloads/protoc-3.21.2-osx-aarch_64.exe: Mach-O 64-bit executable arm64
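The osx-aarch_64 classifier discussed in this thread is the value that os-maven-plugin publishes as `${os.detected.classifier}`, composed from a normalized OS name and architecture. A rough sketch of that mapping (heavily simplified; the real plugin normalizes many more platforms, and the class and method names here are illustrative):

```java
public class Classifier {
    // Simplified sketch of how os-maven-plugin's os.detected.classifier
    // is composed: normalized OS name + "-" + normalized architecture.
    public static String detectedClassifier(String osName, String osArch) {
        String os;
        String name = osName.toLowerCase();
        if (name.startsWith("mac")) {
            os = "osx";
        } else if (name.startsWith("linux")) {
            os = "linux";
        } else if (name.startsWith("windows")) {
            os = "windows";
        } else {
            os = "unknown";
        }
        String arch;
        switch (osArch.toLowerCase()) {
            case "x86_64":
            case "amd64":
                arch = "x86_64";
                break;
            case "aarch64":
            case "arm64":
                arch = "aarch_64";
                break;
            default:
                arch = "unknown";
        }
        return os + "-" + arch;
    }
}
```

On an Apple Silicon Mac this resolves protocArtifact to the osx-aarch_64 protoc binary, which is why the artifact's actual architecture (checked with `file` above) matters.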

     <roaring.bitmap.version>0.9.15</roaring.bitmap.version>
     <rss.shade.packageName>org.apache.uniffle</rss.shade.packageName>
     <skipDeploy>false</skipDeploy>
@@ -184,6 +184,13 @@
         </exclusion>
       </exclusions>
     </dependency>
+
+    <dependency>
+      <groupId>com.google.code.gson</groupId>
+      <artifactId>gson</artifactId>
+      <version>${gson.version}</version>
+    </dependency>
+
     <dependency>
       <groupId>io.netty</groupId>
       <artifactId>netty-all</artifactId>
@@ -588,7 +595,7 @@
         <version>0.6.1</version>
         <configuration>
           <protocArtifact>
-            com.google.protobuf:protoc:${protoc.version}:exe:${os.detected.classifier}
+            com.google.protobuf:protoc:${protobuf.version}:exe:${os.detected.classifier}
           </protocArtifact>
           <pluginId>grpc-java</pluginId>
           <pluginArtifact>
2 changes: 1 addition & 1 deletion proto/pom.xml
@@ -98,7 +98,7 @@
         <artifactId>protobuf-maven-plugin</artifactId>
         <configuration>
           <protocArtifact>
-            com.google.protobuf:protoc:3.12.0:exe:${os.detected.classifier}
+            com.google.protobuf:protoc:${protobuf.version}:exe:${os.detected.classifier}
           </protocArtifact>
           <!-- Place these in a location that compiler-plugin is already looking -->
           <outputDirectory>${project.build.directory}/generated-sources</outputDirectory>