How to install Python dependencies #157
This is a great question! A quick stopgap is to silence the check inline, e.g. `import click  # pylint: disable=import-error`. But I think the problem may stem from the directory the linter actually runs in.
I came here looking for an answer to this same question, so I just wanted to bump it for visibility. I had tried the following: […] But I still get import errors.
I attempted to handle this by disabling import errors using a custom pylint configuration. Prepending […] However, this project runs pylint using the […]. The one workaround I found was to use […].
I finally found a workaround (works with […]). For the first step, we can install our dependencies in the container running the action, but the super-linter runs in a separate container. So we have to share the dependencies with the super-linter container, which we can do with container volumes. We cannot declare new volumes, though, so let's look at the volumes already mounted when the super-linter runs: […]

We could put our dependencies in […]. Then we can tell […].

Hope this helps!
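Concretely, the approach above can be sketched as workflow steps. This is a sketch, not the exact solution from the comment: the step names, the `.venv` location, and the `requirements.txt` file are assumptions; the host path is the one GitHub mounts into the super-linter container as `/github/workflow`.

```yaml
- name: Install project dependencies into a local venv
  run: |
    python -m venv .venv
    ./.venv/bin/pip install -r requirements.txt
# The host directory below is mounted as /github/workflow inside the
# super-linter container, so dependencies copied here become visible to it.
- name: Share dependencies with the super-linter container
  run: cp -r .venv /home/runner/work/_temp/_github_workflow/.venv
```

The linters can then find the packages via `PYTHONPATH`, pointing at something like `/github/workflow/.venv/lib/python3.12/site-packages` (adjust the Python version to match your venv).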
The other workaround is to copy the template into `.github/linters/.python-lint` and then add the missing modules to `ignored-modules` at line 246. For example: […]
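A sketch of that edit (the module names are placeholders; `ignored-modules` lives in the `[TYPECHECK]` section of pylint's configuration):

```ini
[TYPECHECK]
# Modules pylint should not try to import during analysis;
# replace with the dependencies your project actually uses.
ignored-modules=click,requests
```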
@jgaffiot I'm struggling to get your workaround to work. I suspect it is because some of the dependencies I need have binary components (`.so` libraries), and the super-linter Docker image uses Alpine Python, which is built against musl libc, in contrast to the Ubuntu host where the venv is created, which uses glibc. Do you have any ideas on how we could make it work? 🤔 Thanks!
If the problem really comes from incompatibilities between glibc and musl, I suppose you would have to build your Python dependencies in an Alpine-based image. Unfortunately, it seems that GitHub does not provide Alpine-based images for actions, either natively or with self-hosted runners. Perhaps you could build your own action based on an image from Docker Hub?
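One way to sketch that suggestion (the image tag and the `requirements.txt` file are assumptions, not something from this thread): build musl-linked wheels inside an Alpine Python image, so the binaries match super-linter's runtime environment.

```dockerfile
# Hypothetical sketch: build musl-compatible wheels in an Alpine image.
FROM python:3.12-alpine
# build-base provides gcc/make for packages with C extensions
RUN apk add --no-cache build-base
WORKDIR /build
COPY requirements.txt .
# Build wheels for all dependencies into /wheels
RUN pip wheel -r requirements.txt -w /wheels
```

The resulting `/wheels` directory could then be installed into the shared volume in place of the glibc-built venv.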
This issue has been automatically marked as stale because it has not had recent activity. If you think this issue should stay open, please remove the […] label.
@github-actions The problem is not fixed, ergo the issue should not be closed. |
I'm a bit surprised there is not more mention in the documentation of needing this workaround for linting Python code. As a Python newbie, it took me a while to track down why linting with the super-linter was producing errors when linting locally was working fine. Thanks to @jgaffiot's solution, I was able to put together something that works for my use case:
Here is what my workflow looks like:

```yaml
jobs:
  lint:
    name: Lint Codebase
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        id: checkout
        uses: actions/checkout@v4
        with:
          # super-linter needs the full git history to get the
          # list of files that changed across commits
          fetch-depth: 0
      - name: Set up Python, including cache for pipenv virtual environment
        uses: actions/setup-python@v5
        with:
          python-version-file: .python-version
          cache: pipenv
      - name: Install pipenv
        run: |
          pip install --upgrade pip
          pip install pipenv
      - name: Install project dependencies
        run: pipenv install --deploy --dev
      - name: Get virtual environment path
        id: get-venv-path
        run: echo "venv-path=$(pipenv --venv)" >> "$GITHUB_OUTPUT"
      # Copy python dependencies to a location that the super-linter will be
      # able to access when running inside its Docker container.
      # '/home/runner/work/_temp/_github_workflow' maps to '/github/workflow'
      # in the Docker container.
      - name: Copy python dependencies
        run: cp -r "${{ steps.get-venv-path.outputs.venv-path }}" /home/runner/work/_temp/_github_workflow/.venv
      # Extract MAJOR.MINOR version from .python-version file to be used in
      # the Python folder name when setting PYTHONPATH for super-linter
      - name: Get Python version from .python-version file
        id: get-python-version
        run: echo "python-version=$(cut -d '.' -f 1,2 .python-version)" >> "$GITHUB_OUTPUT"
      - name: Lint Codebase
        id: super-linter
        uses: super-linter/super-linter/slim@v6
        env:
          DEFAULT_BRANCH: main
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          LINTER_RULES_PATH: . # Set linter rules directory to repo root
          MARKDOWN_CONFIG_FILE: .github/linters/.markdown-lint.yml
          PYTHONPATH: "/github/workspace:/github/workflow/.venv/lib/python${{ steps.get-python-version.outputs.python-version }}/site-packages"
          PYTHON_MYPY_CONFIG_FILE: pyproject.toml
          PYTHON_RUFF_CONFIG_FILE: pyproject.toml
          VALIDATE_ALL_CODEBASE: true
          VALIDATE_PYTHON_BLACK: false
          VALIDATE_PYTHON_FLAKE8: false
          VALIDATE_PYTHON_PYLINT: false
          VALIDATE_JSCPD: false
          YAML_CONFIG_FILE: .github/linters/.yaml-lint.yml
```
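To see why the `cut` step in the workflow above yields a usable folder name for `PYTHONPATH`, here is the extraction in isolation (the version string is just an example):

```shell
# Reproduce the MAJOR.MINOR extraction used in the workflow above.
# Assumes .python-version holds a full version string like 3.12.4.
printf '3.12.4\n' > .python-version
cut -d '.' -f 1,2 .python-version  # prints 3.12
```

The result plugs into the site-packages path, e.g. `.../lib/python3.12/site-packages`.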
In https://github.com/yhoiseth/python-prediction-scorer/runs/786096970, pylint is complaining about missing imports. They are already installed in a previous step, but it doesn't seem like they are kept. How can I run e.g.

```shell
python -m pip install --upgrade pip pip-tools setuptools wheel && pip-sync
```

so that it works with Super-Linter?