[Bug]: Tests appear to stall quietly with V28 #12972

Closed

phawxby opened this issue Jun 27, 2022 · 4 comments
@phawxby
Contributor

phawxby commented Jun 27, 2022

Version

28.1.1

Steps to reproduce

I'm not really sure how to provide a repro case for this; it's a private repository, and I'm fairly confident at this point that it's due to the size of our repository. I'm hoping this issue provides a place for others to report if they're experiencing the same thing.

My gut feeling at this point is that it's down to us approaching the memory limit on the machine, which is obviously a problem in itself, but Jest should probably output some kind of more helpful message instead of just stalling and doing nothing until the step times out. I'm assuming this because the tests complete without coverage, so memory limits aren't a problem when coverage is disabled.

Node 16.10.0 is used because of this issue.
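
As a diagnostic sketch (the exact CI command isn't shown in this issue, so the invocation below is assumed), Jest's --logHeapUsage flag prints heap usage after every test file, which can make memory growth visible before a run stalls:

```sh
# Assumed invocation, not the project's actual CI command.
# --logHeapUsage logs heap usage after every test file, so a steadily
# climbing number points at memory pressure as the cause of the stall.
yarn jest --ci --coverage --logHeapUsage
```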

Jest 27.4.1 with coverage

Test Suites: 850 passed, 850 total
Tests:       1 skipped, 6707 passed, 6708 total
Snapshots:   1895 passed, 1895 total
Time:        249.302 s
Ran all test suites.
CircleCI received exit code 0


Jest 28.1.1 with coverage.

 PASS   unit  src/common/helpers/tokens/xrx/product/__tests__/ProductSupportPhoneToken.test.ts
 PASS   unit  src/common/helpers/sitemap/__tests__/segment.test.ts (14.47 s)

Too long with no output (exceeded 5m0s): context deadline exceeded


The test step timed out at 9m 18s. Time in the job up until the start of the test step was about 1m, so it was roughly 5m from the start of the job until it stopped outputting.

Jest 28.1.1 without coverage.
(Ignoring the test failure; it's a known flaky test we need to fix.)

Test Suites: 1 failed, 849 passed, 850 total
Tests:       1 failed, 1 skipped, 6691 passed, 6693 total
Snapshots:   1 failed, 1895 passed, 1896 total
Time:        221.092 s
Ran all test suites.

Exited with code exit status 1
CircleCI received exit code 1


Expected behavior

Some kind of error message regarding the lack of memory, rather than stalling silently.

Actual behavior

Tests appear to stall with no output.

Additional context

No response

Environment

System:
    OS: macOS 12.4
    CPU: (12) x64 Intel(R) Core(TM) i7-9750H CPU @ 2.60GHz
  Binaries:
    Node: 16.10.0 - ~/.nvm/versions/node/v16.10.0/bin/node
    Yarn: 1.22.10 - /usr/local/bin/yarn
    npm: 6.14.6 - ~/Documents/GitHub/brand-engine/node_modules/.bin/npm
  npmPackages:
    jest: ^28.1.1 => 28.1.1
@phawxby
Contributor Author

phawxby commented Jun 27, 2022

I've been far more aggressive with my coveragePathIgnorePatterns and I've finally managed to get it to pass on Jest V28 with coverage enabled.
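
For reference, this is roughly what that kind of change looks like; a minimal sketch with hypothetical paths, not our actual config:

```ts
// jest.config.ts — illustrative sketch only; the ignore patterns are made up.
import type { Config } from '@jest/types';

const config: Config.InitialOptions = {
  collectCoverage: true,
  // Every path excluded here is one less file Jest has to instrument and
  // hold coverage data for in memory.
  coveragePathIgnorePatterns: [
    '/node_modules/',
    '<rootDir>/src/generated/',
    '<rootDir>/src/.*/__fixtures__/',
  ],
};

export default config;
```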

With sharding enabled and the run split over 2 instances, you would expect lower memory usage and an even higher chance of passing.
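
The split uses Jest 28's --shard option; the commands below are a sketch of a 2-way split (the actual CircleCI wiring isn't shown here):

```sh
# Assumed commands for a 2-way shard; each instance only executes its
# share of the test suites.
yarn jest --ci --coverage --shard=1/2   # instance 1
yarn jest --ci --coverage --shard=2/2   # instance 2
```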

Instance 1

 PASS   unit  src/common/helpers/tokens/xrx/locale/__tests__/SiteContextTokens.test.ts
 PASS   unit  src/features/product/handlers/compare/__tests__/formatSpec.test.ts

Too long with no output (exceeded 5m0s): context deadline exceeded

Instance 2

 PASS   unit  src/common/helpers/sitemap/__tests__/segment.test.ts (25.583 s)
 PASS   unit  src/common/log/__tests__/logBuffer.test.ts (93.319 s)

Too long with no output (exceeded 5m0s): context deadline exceeded

logBuffer is not a test that should take 93 seconds; it's a very basic test suite that should be pretty much instant.


Memory usage is significantly worse with sharding!?

Edit: I'm going to take this right back and split apart the jobs of upgrading Jest, improving coverage output isolation, Node 18+, etc.

@phawxby phawxby closed this as completed Jul 21, 2022
@skovhus
Contributor

skovhus commented Jul 21, 2022

@phawxby why was this closed? Did you find a solution? 🙏

We observe the same issue of silently hanging tests on a fairly large test suite.

@phawxby
Contributor Author

phawxby commented Jul 21, 2022

> @phawxby why was this closed? Did you find a solution? 🙏
>
> We observe the same issue of silently hanging tests on a fairly large test suite.

Not exactly, but I've spent some time this week trying to isolate what the hell is going on and break it into its constituent parts. This is the first bit:
#13054

@github-actions

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Aug 21, 2022