
[Bug]: Generating a coverage report with --runInBand, collectCoverageFrom, and a transformer can mask a failing exit code #13233

Closed
slifty opened this issue Sep 9, 2022 · 33 comments

Comments

@slifty

slifty commented Sep 9, 2022

Version

29.0.2

Steps to reproduce

  1. Create a new project that uses a jest transformer (e.g. ts-jest).

  2. Specify collectCoverageFrom in jest.config.js in a way that will invoke the transformer when collecting coverage, e.g.:

jest.config.js

module.exports = {
  collectCoverageFrom: ["src/**/*.ts"],
  preset: 'ts-jest',
};
  3. Create a foo.test.js file that will fail to run (e.g. one containing a syntax error). Note that this file does NOT need to pass through the transformer, e.g.:

foo.test.js

syntaxError!;
  4. Create a source file whose name matches the collectCoverageFrom pattern and that will pass through the jest transformer (e.g. bar.ts), and write something in that file that will cause an error, e.g. a syntax error.

bar.ts

anotherSyntaxErrror!;
  5. Run npx jest --runInBand --coverage

  6. Run echo $? to see the exit code, which will be 0

Expected behavior

I expect:

  1. the coverage report to generate and
  2. for the exit code to be 1.

Actual behavior

The tests fail (due to the failure to run), but the coverage report generation errors silently and the exit code for the entire process is, incorrectly, 0.

(This, for instance, means that CI marks the tests as succeeding.)

Additional context

Running npx jest --coverage without --runInBand does not cause this bug and instead renders output such as:

Running coverage on untested files...Failed to collect coverage from /Users/slifty/Maestral/Code/personal/jesttest/src/foo.ts
ERROR: Jest worker encountered 3 child process exceptions, exceeding retry limit
STACK: Error: Jest worker encountered 3 child process exceptions, exceeding retry limit
    at ChildProcessWorker.initialize (/Users/slifty/Maestral/Code/personal/jesttest/node_modules/jest-worker/build/workers/ChildProcessWorker.js:211:21)
    at ChildProcessWorker._onExit (/Users/slifty/Maestral/Code/personal/jesttest/node_modules/jest-worker/build/workers/ChildProcessWorker.js:396:12)
    at ChildProcess.emit (node:events:513:28)
    at Process.ChildProcess._handle.onexit (node:internal/child_process:291:12)
----------|---------|----------|---------|---------|-------------------
File      | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
----------|---------|----------|---------|---------|-------------------
All files |       0 |        0 |       0 |       0 |
----------|---------|----------|---------|---------|-------------------
Test Suites: 1 failed, 1 total
Tests:       0 total
Snapshots:   0 total
Time:        4 s
Ran all test suites.

Running npx jest --runInBand --coverage only renders:

Running coverage on untested files...

and then the process exits with code 0.

Some other interesting "alternative outcomes":

  • If collectCoverageFrom is not specified then coverage is generated as expected and the process returns correct exit codes.
  • If the tests are able to run, then coverage is STILL not generated and the exit code is always 1 regardless of test outcomes.
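
Because the masked exit code is what lets CI report success, one defensive option (an editor's sketch, not part of the original report) is to drive Jest through its Node API and fail explicitly whenever the aggregated result does not report a successful run with at least one executed test. Whether runCLI surfaces this particular worker crash is an assumption; the sketch only shows the general guard.

run-tests.js

// run-tests.js -- hypothetical CI wrapper, not from the original report.
// Drives Jest through its Node API and exits non-zero unless the
// aggregated result explicitly reports success, instead of trusting the
// process exit code alone.
const { runCLI } = require('jest');

async function main() {
  const { results } = await runCLI(
    { coverage: true, runInBand: true, _: [], $0: 'jest' },
    [process.cwd()],
  );

  // Treat "no tests ran" the same as a failure, since a crashed run can
  // otherwise look like a silent pass.
  if (!results.success || results.numTotalTests === 0) {
    console.error('Jest run did not complete successfully.');
    process.exit(1);
  }
}

main().catch((error) => {
  console.error(error);
  process.exit(1);
});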

Environment

System:
    OS: macOS 12.4
    CPU: (8) x64 Intel(R) Core(TM) i7-8559U CPU @ 2.70GHz
  Binaries:
    Node: 18.9.0 - ~/.nvm/versions/node/v18.9.0/bin/node
    Yarn: 1.22.19 - /usr/local/bin/yarn
    npm: 8.19.1 - ~/.nvm/versions/node/v18.9.0/bin/npm
  npmPackages:
    jest: ^29.0.2 => 29.0.2
@github-actions

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 30 days.

@github-actions github-actions bot added the Stale label Oct 11, 2022
@slifty
Author

slifty commented Oct 11, 2022

I believe this is still an issue -- I can try to dig into the code base to evaluate what's causing the problem.

@github-actions github-actions bot removed the Stale label Oct 11, 2022
@Fakerhardcore

Fakerhardcore commented Oct 25, 2022

Same problem here when using --runInBand.

Does someone have a workaround? I can't run in parallel on our server :(
And I don't want to downgrade because of other bugs.

@rmarku

rmarku commented Nov 10, 2022

Same problem here. This is still a bug.

@ThomasGHenry

I'm also hitting this now. (I know "+1 isn't helpful", but stalebot is watching the clock! 😉 )

@Aure77

Aure77 commented Dec 14, 2022

Same issue (hi stalebot 🙄)

@slifty
Author

slifty commented Dec 14, 2022

It has been a few months of reproduction across a bunch of folks and the issue is still marked as needing triage -- does anybody know if there is a mechanism to escalate?

@ThomasGHenry

Having looked at no code, here's my hopefully-not-red-herring pet theory in lieu of triage.

I noticed similar behavior setting --maxWorkers=1.

I believe the intention (I feel like this was implied in the docs somewhere) is for there to be the main "thread" plus one or more workers. I assume the trouble comes in when the main "thread" is the only one: it thinks it's a worker, and ends for whatever reason with no one left to do the clean-up, reporting, whatever.

If my imagination is in the ballpark, a solution might be to enforce the correct minimum worker count (guessing 2), and/or to ensure that when the main "thread" is working as a solitary worker, it's resilient enough to pick back up and do main-thread stuff whether its worker work was successful or not (think a try/catch/finally sort of thing).

@github-actions

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 30 days.

@github-actions github-actions bot added the Stale label Jan 15, 2023
@Fakerhardcore

As far as I know, this is still an issue.

@github-actions github-actions bot removed the Stale label Jan 16, 2023
@IntusFacultas

Ran into this as well.

@nmarklund10

Ran into this as well.

@lightenup

I have the same issue. Hi stalebot! 👋

@sandijs-private

Reproducing this error is easy: create several tests in NestJS that write and delete MongoDB data. When several tests run in parallel, they create strange database states and the tests fail all the time. This suggests that in coverage mode --runInBand and --maxWorkers=1 are being ignored and the tests run in parallel.

@github-actions

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 30 days.

@github-actions github-actions bot added the Stale label Apr 27, 2023
@slifty
Author

slifty commented Apr 27, 2023

I suppose this warrants another bump.

@github-actions github-actions bot removed the Stale label Apr 27, 2023
@github-actions

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 30 days.

@github-actions github-actions bot added the Stale label May 27, 2023
@slifty
Author

slifty commented May 27, 2023

It's unclear to me whether the maintainers of this project end up seeing issues like this, but in case they do, I suppose I'll still bump.

@github-actions github-actions bot removed the Stale label May 27, 2023
james-hu added a commit to handy-common-utils/dev-dependencies that referenced this issue May 28, 2023
@james-hu

james-hu commented Jun 2, 2023

I can confirm that the bug is still there with the latest versions of jest and ts-jest as of May 2023. I also found that --maxWorkers=2 (without runInBand) fixes the problem.

I have a test case covering this issue: https://github.com/handy-common-utils/dev-dependencies/blob/08fc16a45db1e22882f084f14c3be4acaca1e956/jest/test/fs-utils.spec.ts#LL51C7-L51C67

And the test case would fail in GitHub Actions if I remove --maxWorkers=2: https://github.com/handy-common-utils/dev-dependencies/blob/08fc16a45db1e22882f084f14c3be4acaca1e956/jest/test/fixtures/fs-utils/package.json#L7

I suspect that GitHub gives the actions 2 virtual CPU cores, and that triggers the problem if you don't tell Jest to use 2 workers.
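
Spelling that workaround out as config (a sketch based on the comments above, not a confirmed fix):

jest.config.js

// Workaround sketch based on the observations above: keep at least two
// workers so coverage collection does not fall into the in-band path.
module.exports = {
  preset: 'ts-jest',
  collectCoverageFrom: ['src/**/*.ts'],
  maxWorkers: 2, // instead of --runInBand / --maxWorkers=1
};

Note that this does not help when the tests themselves must run serially, as mentioned earlier in the thread.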

@morris

morris commented Jun 10, 2023

In my case it fails silently, as described, if there is at least one uncovered file that has compile errors. I noticed this while having an incomplete but unused file (a draft) locally. The workaround is obviously to make sure there are no compile errors, but an error message would of course be helpful :)
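
One way to apply that workaround mechanically (a helper sketch, not from the comment above; the script name and commands are assumptions) is to type-check the whole tree before invoking Jest, so a compile error in an uncovered file fails the build with a real error message:

precheck.js

// precheck.js -- hypothetical helper: type-check first, then run Jest,
// so compile errors in files that no test imports still fail the build
// loudly instead of being swallowed during coverage collection.
const { execSync } = require('child_process');

try {
  // tsc exits non-zero on any compile error; --noEmit skips output files.
  execSync('npx tsc --noEmit', { stdio: 'inherit' });
  execSync('npx jest --runInBand --coverage', { stdio: 'inherit' });
} catch (error) {
  process.exit(typeof error.status === 'number' ? error.status : 1);
}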

@dep

dep commented Jul 6, 2023

Same problem here. Setting maxWorkers to 2 did not help.

 PASS  src/components/SummaryApp.test.tsx
 PASS  src/components/SummaryApp/CallList.test.tsx
 PASS  src/components/AudioPlayer/AudioPlayer.test.tsx


Failed to collect coverage from /Users/dpeck/Foo/Box/applications/react/modern/call-review-flow/src/components/DetailApp/CallInfo/index.tsx
ERROR: Jest worker encountered 3 child process exceptions, exceeding retry limit
STACK: Error: Jest worker encountered 3 child process exceptions, exceeding retry limit
    at ChildProcessWorker.initialize (/Users/dpeck/foo/Box/applications/react/modern/call-review-flow/node_modules/jest/node_modules/jest-worker/build/workers/ChildProcessWorker.js:170:21)
    at ChildProcessWorker._onExit (/Users/dpeck/foo/Box/applications/react/modern/call-review-flow/node_modules/jest/node_modules/jest-worker/build/workers/ChildProcessWorker.js:254:12)
    at ChildProcess.emit (node:events:527:28)
    at Process.ChildProcess._handle.onexit (node:internal/child_process:291:12)

@danduh

danduh commented Aug 2, 2023

I have the same issue. Any updates on this?

@VladisB

VladisB commented Aug 9, 2023

Any updates?

@slifty
Author

slifty commented Aug 10, 2023

I don't believe anybody in the jest team has seen this issue, unfortunately. If anybody knows anybody that can help escalate it would probably be useful!

@mrazauskas
Contributor

mrazauskas commented Aug 10, 2023

  1. Create a new project that uses a jest transformer (e.g. ts-jest).

Is the babel-jest transformer causing this problem as well?

I am asking this because babel-jest is the only transformer in the Jest repo. If only some other transformers are causing this issue, that could mean this bug is on their side.

@mrazauskas
Contributor

Does not reproduce with babel-jest. For me it errors loud and clear, and the exit code is 1.

So this is either a setup issue or a problem in a transformer you are using. Simply report the issue in their repos.

If someone is able to reproduce the problem using babel-jest, please provide a full reproduction repo.
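
For anyone attempting that reproduction, a minimal babel-jest setup sketch (the file contents are assumptions, not a confirmed repro): drop the ts-jest preset so Jest falls back to its default babel-jest transform, and let Babel strip the TypeScript.

babel.config.js

// Minimal Babel config so babel-jest (Jest's default transform) can
// handle the .ts source file from the original reproduction steps.
module.exports = {
  presets: [
    ['@babel/preset-env', { targets: { node: 'current' } }],
    '@babel/preset-typescript',
  ],
};

jest.config.js

// No preset here: with no transform configured, Jest uses babel-jest.
module.exports = {
  collectCoverageFrom: ['src/**/*.ts'],
};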

@slifty
Author

slifty commented Aug 11, 2023

Thank you for taking a look at this! I'll see if I can create a reproduction case with babel-jest and otherwise will follow up in the appropriate places as advised! (I'll report back here either way)

@slifty
Author

slifty commented Aug 11, 2023

I was not able to reproduce this with babel-jest so I've opened an issue in ts-jest directly at: kulshekhar/ts-jest#4193

@FranciscoLagorio

I have the same problem!!!

@slifty
Author

slifty commented Sep 23, 2023

@FranciscoLagorio would you be up for also commenting at the ts-jest repository's issue? I believe that's the place where the fix would have to be made.

@github-actions

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 30 days.

@github-actions github-actions bot added the Stale label Oct 23, 2023

This issue was closed because it has been stalled for 30 days with no activity. Please open a new issue if the issue is still relevant, linking to this one.

@github-actions github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) Nov 22, 2023

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Dec 23, 2023
17 participants