Update py-pyspark and py-py4j #44263
Conversation
…s where a system doesn't have Java and py4j expects it
Hi @teaguesterling! I noticed that the following package(s) don't yet have maintainers:
Are you interested in adopting any of these package(s)? If so, simply add the following to the package class:

maintainers("teaguesterling")

If not, could you contact the developers of this package and see if they are interested? You can quickly see who has worked on a package with:

$ spack blame py-py4j

Thank you for your help! Please don't add maintainers without their consent. You don't have to be a Spack expert or package developer in order to be a "maintainer"; it just gives us a list of users willing to review PRs or debug issues relating to this package. A package can have multiple maintainers; just add a list of GitHub handles of anyone who wants to volunteer.
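For context, the directive the bot mentions goes directly in the package class body of the recipe file. A minimal, hedged sketch of what that looks like (the class body below is illustrative, not the actual py-py4j recipe):

```python
# Illustrative fragment only; the real recipe lives in
# var/spack/repos/builtin/packages/py-py4j/package.py.
from spack.package import *


class PyPy4j(PythonPackage):
    """Enables Python programs to access Java objects in a JVM."""

    # Opting in as a maintainer is just adding a GitHub handle
    # (only with that person's consent); multiple handles are allowed.
    maintainers("teaguesterling")
```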
Just to make sure it's not missed: I added a few newer versions for packages where I had originally only added older versions.
Looking good. I have a few additional tweaks.
var/spack/repos/builtin/packages/py-googleapis-common-protos/package.py (two review comments, since resolved)
…ackage.py Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
@spackbot fix style
Let me see if I can fix that for you!
I was able to run spack style --fix:
==> Running style checks on spack
selected: isort, black, flake8, mypy
==> Modified files
var/spack/repos/builtin/packages/arrow/package.py
var/spack/repos/builtin/packages/py-googleapis-common-protos/package.py
var/spack/repos/builtin/packages/py-grpcio-status/package.py
var/spack/repos/builtin/packages/py-grpcio/package.py
var/spack/repos/builtin/packages/py-protobuf/package.py
var/spack/repos/builtin/packages/py-py4j/package.py
var/spack/repos/builtin/packages/py-pyarrow/package.py
var/spack/repos/builtin/packages/py-pyspark/package.py
==> Running isort checks
isort checks were clean
==> Running black checks
reformatted var/spack/repos/builtin/packages/py-pyspark/package.py
reformatted var/spack/repos/builtin/packages/py-pyarrow/package.py
All done! ✨ 🍰 ✨
2 files reformatted, 6 files left unchanged.
black checks were clean
==> Running flake8 checks
flake8 checks were clean
==> Running mypy checks
lib/spack/spack/version/version_types.py:145: error: Argument 2 to "StandardVersion" has incompatible type "*Tuple[Tuple[Any, ...], Tuple[Any, ...]]"; expected "Tuple[Tuple[Any, ...], Tuple[Any, ...]]" [arg-type]
lib/spack/spack/version/version_types.py:452: error: Argument 2 to "StandardVersion" has incompatible type "*Tuple[Tuple[Any, ...], Tuple[Any, ...]]"; expected "Tuple[Tuple[Any, ...], Tuple[Any, ...]]" [arg-type]
lib/spack/spack/version/version_types.py:481: error: Argument 2 to "StandardVersion" has incompatible type "*Tuple[Tuple[Any, ...], Tuple[Any, ...]]"; expected "Tuple[Tuple[Any, ...], Tuple[Any, ...]]" [arg-type]
Found 3 errors in 1 file (checked 625 source files)
mypy found errors
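The three mypy errors above all share one pattern: a tuple is star-unpacked into an argument position whose annotation expects the tuple itself (`*Tuple[...]` passed where `Tuple[...]` is declared). A hedged, generic illustration of that mismatch — the names here are invented for the example, not Spack's actual StandardVersion signature:

```python
from typing import Any, Tuple

# The parameter is annotated as ONE tuple-of-two-tuples...
VersionTuple = Tuple[Tuple[Any, ...], Tuple[Any, ...]]


def make_version(string: str, version: VersionTuple) -> tuple:
    """Toy stand-in: expects the pair as a single argument."""
    return (string, version)


t: VersionTuple = ((1, 2), ("a",))

ok = make_version("1.2a", t)  # passes type checking: one tuple argument

# ...so star-unpacking it spreads the pair into TWO arguments, which is
# what mypy flags as: Argument 2 has incompatible type "*Tuple[...]":
# bad = make_version("1.2a", *t)
```

Removing the `*` (or changing the signature to `*version` if spreading is intended) is the usual fix for this class of error.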
I've updated the branch with style fixes.
…sions: 1.16 is old, built for python ~3.6, and does not build for 3.8. 1.52.0 was removed from pypi
…ncy that is called out for some other packages
Looking at some of the CI checks, it seems at least version 1.48 had unresolved issues. After digging around a bit, I was able to add patches and dependencies to resolve them.
Not included (but probably should be): allowing py-spark to be built from source (or provided as a virtual package by spark built from source).
I'm not sure about the best way to set defaults for py4j & java.
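One possible way to express those defaults, sketched under the assumption that the Java requirement should be an opt-out variant on the pyspark recipe (the variant name and structure below are illustrative, not a proposal that was actually merged):

```python
# Hypothetical sketch, not the actual py-pyspark recipe.
from spack.package import *


class PyPyspark(PythonPackage):
    """Python bindings for Apache Spark (illustrative fragment)."""

    # Default to pulling in a JDK, since py4j expects one at run time,
    # but let users on Java-equipped systems opt out with ~java.
    variant("java", default=True, description="Depend on a JDK for py4j")

    depends_on("py-py4j", type=("build", "run"))
    # Java is only needed at run time, and only when the variant is on:
    depends_on("java", type="run", when="+java")
```

Whether the dependency belongs on py-pyspark or on py-py4j itself (since py4j is the component that actually launches the JVM) is exactly the open question raised above.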