Improve CI jobs speed #13600
Comments
It also looks like, due to the number of caches per branch, the cache is invalidated too quickly for this repository.
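One common mitigation (a sketch, not necessarily what this repository should adopt) is to give the cache step a `restore-keys` fallback, so a branch without an exact cache hit can reuse the most recent matching cache (e.g. from the default branch) instead of starting cold. The paths and key parts below are illustrative assumptions:

```yaml
# Hypothetical dependency-cache step; paths and key parts are assumptions.
- uses: actions/cache@v3
  with:
    path: |
      ~/.cache/pip
      ~/.cache/yarn
    # Exact key: only hits when the lock/config files are unchanged.
    key: deps-${{ runner.os }}-${{ hashFiles('**/setup.cfg', '**/yarn.lock') }}
    # Fallback prefix: reuse the newest partial match rather than
    # rebuilding the cache from scratch.
    restore-keys: |
      deps-${{ runner.os }}-
```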
This sounds like a good first step that could maybe be implemented as an action local to the repository, and reused in other workflows.
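For illustration, such a repository-local action could be a composite action bundling the common setup steps; the path, Python version, and install command here are hypothetical:

```yaml
# .github/actions/setup/action.yml (hypothetical path and contents)
name: Set up build environment
description: Shared setup steps reused across workflows
runs:
  using: composite
  steps:
    - uses: actions/setup-python@v4
      with:
        python-version: '3.11'
    - name: Install dependencies
      shell: bash
      run: python -m pip install -e ".[test]"
```

Workflows would then call it with `uses: ./.github/actions/setup` after checkout.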
Among the slow jobs are the integration tests; the Playwright documentation mentions the ability to split them across multiple jobs using test sharding: https://playwright.dev/docs/ci#sharding
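As a sketch of what the Playwright docs describe, the integration job could fan out through a matrix, with each job running one shard of the suite (the shard count and test command are assumptions):

```yaml
jobs:
  integration-tests:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        shard: [1, 2, 3, 4]
    steps:
      - uses: actions/checkout@v4
      # Each job runs one quarter of the Playwright test suite.
      - run: npx playwright test --shard=${{ matrix.shard }}/4
```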
For reference, GitHub upgraded its runners for open source projects, which should already help a little:
https://github.blog/2024-01-17-github-hosted-runners-double-the-power-for-open-source/
Problem
The CI jobs take a long time, and their sheer number easily triggers queuing when multiple contributors are active simultaneously.
Proposed Solution
Ideas to speed up the CI:
The current jobs all install the dependencies, and most of them then install JupyterLab. Caching the dependency packages helps, but only when the cache is used and valid; it often happens on a new PR that a new upstream Python dependency is downloaded because a newer version is available, but since the job fails, the cache is not updated.
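One way around the failed-job case (a sketch, assuming the split `restore`/`save` sub-actions of `actions/cache`; the paths and key are illustrative) is to save the cache in a step that runs even when the tests fail:

```yaml
steps:
  - uses: actions/cache/restore@v3
    with:
      path: ~/.cache/pip
      key: pip-${{ hashFiles('**/setup.cfg') }}  # key parts are illustrative
  - run: python -m pip install -e ".[test]"
  - run: python -m pytest
  # Save even if the test step failed, so the downloaded packages
  # are not lost for the next run on this PR.
  - if: always()
    uses: actions/cache/save@v3
    with:
      path: ~/.cache/pip
      key: pip-${{ hashFiles('**/setup.cfg') }}
```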
We could move to a Docker stack (see "Sharing Docker containers between jobs in a workflow", docker/build-push-action#225).
But as this is not a technology we really use, we could also simply share a large artifact built by a single initial job.
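A sketch of that approach, using the standard upload/download artifact actions (the artifact name, build command, and paths are illustrative):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: python -m pip install build && python -m build
      # Publish the built packages once for all downstream jobs.
      - uses: actions/upload-artifact@v3
        with:
          name: jupyterlab-build
          path: dist/
  test:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v3
        with:
          name: jupyterlab-build
          path: dist/
      # Install from the prebuilt wheel instead of rebuilding.
      - run: python -m pip install dist/*.whl
```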
Additional context