
Jest much slower in Travis CI than locally #7647

Closed
milesj opened this issue Jan 18, 2019 · 24 comments
milesj commented Jan 18, 2019

🐛 Bug Report

When running Jest locally, all tests finish in 12 seconds. When running in Travis CI, they finish in 448 seconds. Here's the local output:

 PASS  packages/core/tests/reporters/CIReporter.test.ts (60 MB heap size)
 PASS  packages/core/tests/executors/Pool.test.ts (57 MB heap size)
 PASS  packages/core/tests/reporters/BoostReporter.test.ts (59 MB heap size)
 PASS  packages/core/tests/Console.test.ts (77 MB heap size)
 PASS  packages/core/tests/Tool.test.ts (5.307s, 68 MB heap size)
 PASS  packages/core/tests/Reporter.test.ts (75 MB heap size)
 PASS  packages/core/tests/Pipeline.test.ts (6.854s, 56 MB heap size)
 PASS  packages/core/tests/ConfigLoader.test.ts (83 MB heap size)
 PASS  packages/core/tests/Executor.test.ts (68 MB heap size)
 PASS  packages/reporter-nyan/tests/NyanReporter.test.ts (68 MB heap size)
 PASS  packages/core/tests/executors/Sync.test.ts (67 MB heap size)
 PASS  packages/core/tests/ModuleLoader.test.ts (70 MB heap size)
 PASS  packages/core/tests/executors/Serial.test.ts (71 MB heap size)
 PASS  packages/core/tests/CLI.test.ts (78 MB heap size)
 PASS  packages/core/tests/Output.test.ts (69 MB heap size)
 PASS  packages/core/tests/CrashLogger.test.ts (8.9s, 55 MB heap size)
 PASS  packages/core/tests/i18n/LanguageDetector.test.ts (69 MB heap size)
 PASS  packages/core/tests/Task.test.ts (81 MB heap size)
 PASS  packages/core/tests/Emitter.test.ts (81 MB heap size)
 PASS  packages/core/tests/helpers/formatModuleName.test.ts (65 MB heap size)
 PASS  packages/core/tests/executors/Parallel.test.ts (77 MB heap size)
 PASS  packages/core/tests/reporters/ErrorReporter.test.ts (81 MB heap size)
 PASS  packages/core/tests/Plugin.test.ts (77 MB heap size)
 PASS  packages/core/tests/helpers/wrapWithPromise.test.ts (80 MB heap size)
 PASS  packages/core/tests/Routine.test.ts (9.575s, 75 MB heap size)
 PASS  packages/core/tests/i18n/FileBackend.test.ts (81 MB heap size)

Test Suites: 26 passed, 26 total
Tests:       527 passed, 527 total
Snapshots:   69 passed, 69 total
Time:        12.008s
Ran all test suites.

And the CI output:

 PASS  packages/core/tests/Routine.test.ts (5.131s, 74 MB heap size)
 PASS  packages/core/tests/ConfigLoader.test.ts (97 MB heap size)
 PASS  packages/core/tests/Tool.test.ts (93 MB heap size)
 PASS  packages/core/tests/Console.test.ts (90.95s, 107 MB heap size)
 PASS  packages/core/tests/reporters/BoostReporter.test.ts (161.036s, 100 MB heap size)
 PASS  packages/core/tests/Reporter.test.ts (10.715s, 82 MB heap size)
 PASS  packages/core/tests/ModuleLoader.test.ts (95 MB heap size)
 PASS  packages/core/tests/Executor.test.ts (105 MB heap size)
 PASS  packages/core/tests/Task.test.ts (113 MB heap size)
 PASS  packages/reporter-nyan/tests/NyanReporter.test.ts (170.847s, 115 MB heap size)
 PASS  packages/core/tests/Emitter.test.ts (71 MB heap size)
 PASS  packages/core/tests/Output.test.ts (71 MB heap size)
 PASS  packages/core/tests/executors/Serial.test.ts (80 MB heap size)
 PASS  packages/core/tests/reporters/CIReporter.test.ts (85 MB heap size)
 PASS  packages/core/tests/Pipeline.test.ts (95 MB heap size)
 PASS  packages/core/tests/executors/Pool.test.ts (108 MB heap size)
 PASS  packages/core/tests/i18n/FileBackend.test.ts (110 MB heap size)
 PASS  packages/core/tests/CrashLogger.test.ts (110 MB heap size)
 PASS  packages/core/tests/reporters/ErrorReporter.test.ts (119 MB heap size)
 PASS  packages/core/tests/helpers/formatModuleName.test.ts (120 MB heap size)
 PASS  packages/core/tests/executors/Sync.test.ts (88 MB heap size)
 PASS  packages/core/tests/executors/Parallel.test.ts (95 MB heap size)
 PASS  packages/core/tests/Plugin.test.ts (96 MB heap size)
 PASS  packages/core/tests/CLI.test.ts (106 MB heap size)
 PASS  packages/core/tests/helpers/wrapWithPromise.test.ts (105 MB heap size)
 PASS  packages/core/tests/i18n/LanguageDetector.test.ts (114 MB heap size)
Test Suites: 26 passed, 26 total
Tests:       527 passed, 527 total
Snapshots:   69 passed, 69 total
Time:        448.1s
Ran all test suites.

Notice that in CI, BoostReporter.test.ts takes 161.036s, NyanReporter.test.ts takes 170.847s, and Console.test.ts takes 90.95s, while the local run never even hits Jest's slow-test threshold. Heap usage is roughly the same in both environments.

The one common denominator between the reporters is that they use chalk and have many tests that compare chalk output.

However, I've tried disabling chalk using chalk.enabled = false, but the timings did not change.

Another possible culprit is the console mocking that I do here:

But I'm still at a loss as to why that would cause these issues.

To Reproduce

Steps to reproduce the behavior:

Expected behavior

CI tests should not be this much slower than local runs.

Link to repl or repo (highly encouraged)

Example CI builds can be found here: https://travis-ci.org/milesj/boost/builds/481201444

Run npx envinfo --preset jest

Local:

  System:
    OS: macOS 10.14.2
    CPU: (8) x64 Intel(R) Core(TM) i7-4870HQ CPU @ 2.50GHz
  Binaries:
    Node: 10.14.1 - ~/.nvm/versions/node/v10.14.1/bin/node
    Yarn: 1.13.0 - /usr/local/bin/yarn
    npm: 6.4.1 - ~/.nvm/versions/node/v10.14.1/bin/npm

CI:

Node: 8.15 / 10.15 / 11.6
Yarn: 1.13

milesj commented Jan 18, 2019

This is using Jest 24 beta btw.


SimenB commented Jan 18, 2019


milesj commented Jan 19, 2019

With 4 workers it's still 190.84s, but it seemed to crash on Node 11. https://travis-ci.org/milesj/boost/jobs/481648087 Gonna try 2 workers.

With -i it's 450.86s.

Not sure either of those help me right now.


SimenB commented Jan 19, 2019

Unless you're paying, 2 is the max. However, runInBand should be faster... Odd that it only makes things horribly worse.


SimenB commented Jan 19, 2019

Can you try to remove detectOpenHandles? It uses async_hooks, which has quite the atrocious performance overhead


milesj commented Jan 21, 2019

Turned off detectOpenHandles and using 2 workers. Still pretty slow on Node 10 at 284.33s. I wonder if it's because I'm running Jest in a child process.

I also noticed a ton of stray `7` characters in the output, which is super weird. It's definitely not coming from my code, but from the Jest process. I have yet to figure out where it's coming from.

// Received
..............................�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7�7..

// Should be
................................


milesj commented Jan 21, 2019

Running not in a child process is the same speed.

@mattfysh

@milesj are you using the jsdom env? I'm thinking this may be the cause


milesj commented Jan 22, 2019

@mattfysh Good call, that slipped my mind. Since I haven't defined it, it's defaulting to jsdom. I'll try it without.
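For anyone following along, opting out of the jsdom default is a one-line config change. This is a sketch; the repo's actual config may differ:

```javascript
// jest.config.js — sketch. 'node' skips jsdom setup entirely, which is
// appropriate for a CLI/tooling project that never touches the DOM.
module.exports = {
  testEnvironment: 'node',
};
```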


milesj commented Jan 22, 2019

Using node env crashed Travis. Using node env + 2 workers took about 282.66s. Still seems rather slow. This is weird.


milesj commented Jan 26, 2019

Any other diagnostics I could run to try and debug this?

Tests taking 764.268s now. https://travis-ci.org/milesj/boost/jobs/484611089

Possible memory leak or something else?


SimenB commented Jan 26, 2019

I wonder if it's because I'm running Jest in a child process.

Hmm, maybe. Would you be able to reduce it to something that's slow locally as well?

Make sure to pass the --max-workers and/or --run-in-band down to the spawned child process


SimenB commented Jan 26, 2019

Possible memory leak or something else?

You can run with --detect-leaks and see if Jest itself thinks something is leaking


milesj commented Jan 26, 2019

Trying that, will report back. In the meantime, I've been going through my CI builds, and this commit seemed to double the time it takes.

milesj/boost@cd1c500

But nothing seems out of the ordinary. This is roughly when I started using the Jest 24 alpha, but I'm not sure if that's a red herring or not.

Also this one, which changes the istanbul comments. But that also seems like a stretch.

milesj/boost@fa5690c


milesj commented Jan 27, 2019


SimenB commented Jan 27, 2019

Can you reproduce the error/slowness in a smaller repo? There's so much code there, it's hard to tell what's going on.

Potentially, just a branch in the same repo which deletes most of the code but still shows it being slow


milesj commented Jan 27, 2019

Yeah I'll try and narrow it down if I can.


milesj commented Jan 29, 2019

I wonder if this is also relevant: #7732


SimenB commented Jan 29, 2019

Then it should be slow locally as well. Might very well be the reason, of course - if you have a bajillion files, the relatively weak CPU of Travis will be slow 🙂 Shouldn't be that slow though.

@milesj
Copy link
Author

milesj commented Feb 9, 2019

I've noticed that imports to execa trigger the detect leaks error. Here's an example:

[20:32:25] Miles:boost > NODE_ENV=test BOOST_ENV=test npx jest -i --detectLeaks --detectOpenHandles ./packages/core/tests/i18n/
 PASS  packages/core/tests/i18n/FileBackend.test.ts
 PASS  packages/core/tests/i18n/LanguageDetector.test.ts

Test Suites: 2 passed, 2 total
Tests:       4 passed, 4 total
Snapshots:   0 total
Time:        0.965s, estimated 1s
Ran all test suites matching /.\/packages\/core\/tests\/i18n\//i.
[20:33:02] Miles:boost > NODE_ENV=test BOOST_ENV=test npx jest -i --detectLeaks --detectOpenHandles ./packages/core/tests/i18n/
 PASS  packages/core/tests/i18n/FileBackend.test.ts
 FAIL  packages/core/tests/i18n/LanguageDetector.test.ts
  ● Test suite failed to run

    EXPERIMENTAL FEATURE!
    Your test suite is leaking memory. Please ensure all references are cleaned.

    There is a number of things that can leak memory:
      - Async operations that have not finished (e.g. fs.readFile).
      - Timers not properly mocked (e.g. setInterval, setTimeout).
      - Keeping references to the global scope.

      at node_modules/jest-cli/build/TestScheduler.js:240:24

Test Suites: 1 failed, 1 passed, 2 total
Tests:       3 passed, 3 total
Snapshots:   0 total
Time:        0.99s, estimated 1s
Ran all test suites matching /.\/packages\/core\/tests\/i18n\//i.

Going through execa now. Maybe the leak there is causing issues.


milesj commented Feb 9, 2019

Found some timers that weren't faked and it brought it down to 370s. Also fixed the open handle issue.

https://github.com/milesj/boost/pull/32/files

Think this is as good as it'll get for now.


milesj commented Feb 11, 2019

I'm using the local Node + Chrome debugger to profile Jest. Just gonna post some screenshots.

Jasmine promises / queue runner seem to take the most time.
(screenshots attached)

execa & env-ci take a lot of time with their child processes (I can probably mock these).
(screenshot attached)

An async call in my Tools constructor may be the big issue. I'll try and lazy-load this.
(screenshot attached)


milesj commented Jan 22, 2020

Gonna close this.

@milesj milesj closed this as completed Jan 22, 2020

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators May 11, 2021