
Parallelizing across machines in CI #6270

Closed
jamiebuilds opened this issue May 25, 2018 · 11 comments · Fixed by #12546

Comments

@jamiebuilds
Contributor

jamiebuilds commented May 25, 2018

Many CI systems support splitting a single task across multiple machines. This can drastically speed up CI times, even though each machine duplicates some of the setup work such as installing dependencies and compiling.

From what I've seen, they mostly work like this:

steps:
  - command: yarn install && yarn test
    parallelism: 3

# on machine 1
CI_NODE_TOTAL=3 CI_NODE_INDEX=0 yarn install && yarn test

# on machine 2
CI_NODE_TOTAL=3 CI_NODE_INDEX=1 yarn install && yarn test

# on machine 3
CI_NODE_TOTAL=3 CI_NODE_INDEX=2 yarn install && yarn test

For example: https://buildkite.com/docs/builds/parallel-builds

It would be awesome if there were an easy way to integrate this with Jest so that tests could automatically be chunked up and run across machines.

I imagine you'd have to discover all the tests and then have some stable way of splitting them up, either by counting the tests or by splitting the files up.

I think you could even automatically do this without any additional setup from developers. I created a module to help you do just that: https://github.com/jamiebuilds/ci-parallel-vars


We do this in our project today by starting 4 separate Jest runs on different machines:

steps:
  - CI_NODE_TOTAL=4 CI_NODE_INDEX=0 JEST_TESTS=$(jest --listTests --json) jest
  - CI_NODE_TOTAL=4 CI_NODE_INDEX=1 JEST_TESTS=$(jest --listTests --json) jest
  - CI_NODE_TOTAL=4 CI_NODE_INDEX=2 JEST_TESTS=$(jest --listTests --json) jest
  - CI_NODE_TOTAL=4 CI_NODE_INDEX=3 JEST_TESTS=$(jest --listTests --json) jest

Then in jest.config.js we split up the tests:

// jest.config.js
// ci-parallel-vars detects the CI provider's parallelism env vars
// (e.g. CI_NODE_INDEX / CI_NODE_TOTAL) and exposes them as { index, total }.
let parallelism = require('ci-parallel-vars')
let chunkd = require('chunkd')

// JEST_TESTS holds the output of `jest --listTests --json`: an array of test file paths.
// Sorting makes the list deterministic, so every machine computes the same chunks.
let tests = JSON.parse(process.env.JEST_TESTS).sort((a, b) => {
  return b.localeCompare(a)
})

// Keep only this machine's slice of the test files.
if (parallelism) {
  tests = chunkd(tests, parallelism.index, parallelism.total)
}

module.exports = {
  testMatch: tests,
}

This sped up our builds significantly.

@SimenB
Member

SimenB commented May 27, 2018

@aaronabramov you talked about something like this at FB. But I don't remember if you said it worked for you, or if you wanted something like it?

@aaronabramov
Contributor

I was talking about something else (a host process and a bunch of remote clients), but I think it would be nice to do jest --chunk=2/10 kind of stuff in Jest.
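
No such flag existed at the time; purely as an illustration of the idea, here is a user-land sketch that slices the sorted output of jest --listTests into equal chunks. The helper name and file are hypothetical.

// chunkTests.js — hypothetical helper, not an actual Jest flag.
// Mimics a --chunk=index/total option by slicing a deterministic,
// sorted list of test files into `total` roughly equal chunks.
function chunkTests(testFiles, index, total) {
  const sorted = [...testFiles].sort()
  const perChunk = Math.ceil(sorted.length / total)
  return sorted.slice(index * perChunk, (index + 1) * perChunk)
}

// e.g. --chunk=2/10 would roughly correspond to chunkTests(allTests, 2, 10)
module.exports = chunkTests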

@KevinGrandon

I would like to see this to make CI faster, and it feels like a common request.

Also reported in: #2330

@slikts
Contributor

slikts commented Nov 6, 2020

Cypress has a feature to slice or load-balance tests, and it's really neat, since it can be set up just by adding a parallel execution strategy to the CI pipeline. I'm surprised there's been no traction for this feature.

@kpelelis

We have been doing something similar at Skroutz with RSpec, using multiple processes, one of which gets promoted to the orchestrator. We also use a Redis instance to measure the runtime of the tests and distribute them with a roughly equal load. I was thinking of something similar for Jest; if anyone is interested, I can set up a doc on how to achieve this, as I am not really familiar with the internals of Jest.
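
A minimal sketch of that kind of duration-aware split (greedy: longest test first onto the least-loaded machine). The helper below is hypothetical and assumes durations from a previous run are already at hand; the setup described above additionally uses Redis and an orchestrator process.

// Hypothetical helper: assign test files to `total` machines using durations
// recorded on a previous run, so each machine gets roughly the same amount of work.
// `durations` maps a test file path to its last measured runtime in seconds.
function assignByDuration(testFiles, durations, total) {
  const buckets = Array.from({ length: total }, () => ({ time: 0, tests: [] }))

  // Longest tests first, then always place the next test on the least-loaded machine.
  const sorted = [...testFiles].sort(
    (a, b) => (durations[b] || 0) - (durations[a] || 0)
  )
  for (const file of sorted) {
    const bucket = buckets.reduce((min, b) => (b.time < min.time ? b : min))
    bucket.tests.push(file)
    bucket.time += durations[file] || 0
  }
  return buckets.map(b => b.tests)
}

// e.g. pick this machine's slice with the same env vars used earlier in the thread:
// const slices = assignByDuration(allTests, recordedDurations, Number(process.env.CI_NODE_TOTAL))
// module.exports = { testMatch: slices[Number(process.env.CI_NODE_INDEX)] }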

billyvg added a commit to getsentry/sentry that referenced this issue Apr 19, 2021
Frontend test duration has been creeping up and builds have started failing. There are some optimizations we could do, but I think this needs to be split for now.

Took the logic from this post jestjs/jest#6270 (comment) and applied it here.
@thiamsantos

I would love to see this feature be native to Jest. Right now we have been using the approach suggested at #11252 (comment), but it seems to be common enough functionality to be worth adding to Jest itself. When a project reaches a certain size, it is nice to be able to split tests across different jobs/machines in CI.

It seems we have a few suggestions for what the API could look like:

  • Parallel vars: CI_NODE_TOTAL=3 CI_NODE_INDEX=0 yarn test
  • Chunk: yarn test --chunk=0/3
  • Shard (from Built-in shard support #11252): jest --shard-from=0 --shard-to=0.2

I would like to add another suggestion, based on the Elixir test framework:

JEST_PARTITION=2 yarn test --partitions 3 or yarn test --total-partitions 4 --partition 1


I would be happy to send a PR if the feature is welcome.

@nickofthyme
Contributor

nickofthyme commented Feb 25, 2022

For added context, the Playwright test runner has a built-in API for sharding tests across multiple machines (see the docs).

npx playwright test --shard=1/3
npx playwright test --shard=2/3
npx playwright test --shard=3/3

This would be amazing to have in Jest out of the box as well.

@SimenB
Member

SimenB commented Feb 26, 2022

PR very much welcome! 🙂

@marionebl
Contributor

PR very much welcome! 🙂

Went ahead and implemented this as --shard=n/m here #12546

@SimenB
Member

SimenB commented Mar 6, 2022

https://github.com/facebook/jest/releases/tag/v28.0.0-alpha.7
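
With the --shard=n/m flag from #12546, shipped in that release, each CI machine runs one shard, much like the Playwright example above:

# split the suite into three shards, one per machine
npx jest --shard=1/3
npx jest --shard=2/3
npx jest --shard=3/3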

@github-actions

github-actions bot commented Apr 6, 2022

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Apr 6, 2022