JavaScript heap out of memory after upgrade to Jest 26 #9980

Open
simkessy opened this issue May 5, 2020 · 62 comments

simkessy commented May 5, 2020

🐛 Bug Report

I upgraded from 24.x to 26.0.0, and a test that was previously passing now fails.
Running the test takes a long time to complete, and then I get this error:
[screenshot: JavaScript heap out of memory]

To Reproduce

My test:

  describe('when item ids are in sessionStorage', () => {
    const itemIds = [333, 222, 111];

    beforeEach(() => {
      parseLocationToQueries.mockImplementation(() => ({
        queue_id: testQueueId
      }));
      isAdHocReviewByItemId.mockReturnValue(false);
      isAdHocReviewByObjId.mockReturnValue(false);
      setItemsToBeReviewed(itemIds);
    });

    it('initial fetch', () => {
      const wrapper = tf.render();
      expect(wrapper.state('itemIds')).toEqual([]);
      expect(axios.post).toBeCalledWith('/review/items', { item_ids: itemIds });
    });

    it('fetch more while no more', () => {
      const wrapper = tf.render();
      axios.post.mockClear();
      wrapper.instance().fetchMoreItems();
      expect(axios.post).not.toBeCalled();
    });

    it('fetch more while more', () => {
      const wrapper = tf.render();
      axios.post.mockClear();
      wrapper.setState({ itemIds: [555] });
      wrapper.instance().fetchMoreItems();
      expect(axios.post).toBeCalledWith('/review/items', { item_ids: [555] });
    });
  });

The code under test:

export function setItemsToBeReviewed(itemIds) {
  sessionStorage.setItem(ITEMS_TO_BE_REVIEWED_KEY, JSON.stringify(itemIds));
}


  fetchMoreItems = () => {
    this.setState({ loadingMoreItems: true });
    return this.fetchItems(true)
      .then(res => {
        this.loadData(res.data);
      })
      .catch(error => {
        console.log('FetchmoreError', error);
      });
  };

  fetchItems = (excludeAssigned: boolean = false) => {
    let request;
    if (this.state.itemIds) {
      request = this.fetchItemsByIds();
    } else {
      request = this.fetchItemsFIFO(excludeAssigned);
    }
    return request;
  };

  fetchItemsFIFO = (excludeAssigned: boolean = false) => {
    const { isAlignment, queueIdFromURL } = this.state;
    const url = '/review/assign';
    const params = {
      alignment: isAlignment,
      queue_id: queueIdFromURL,
      exclude_assigned: excludeAssigned
    };
    return axios.get<any>(url, { params });
  };

  fetchItemsByIds = () => {
    if (_.isEmpty(this.state.itemIds)) {
      return Promise.resolve({ data: [] });
    }
    const url = '/review/items';
    const data = {
      item_ids: _.slice(this.state.itemIds, 0, FETCH_BATCH_SIZE)
    };
    this.setState(state => ({
      itemIds: _.slice(state.itemIds, FETCH_BATCH_SIZE)
    }));
    return axios.post<any, any>(url, data);
  };

jest.config:

module.exports = {
  timers: 'fake',
  moduleDirectories: ['node_modules'],
  moduleFileExtensions: ['js', 'jsx'],
  moduleNameMapper: {
    '\\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga)$':
      '<rootDir>/__mocks__/fileMock.js',
    '\\.(css|less)$': '<rootDir>/__mocks__/styleMock.js',
    '^Root(.*)$': '<rootDir>$1',
    '^Utils(.*)$': '<rootDir>/src/utils$1',
    '^Hoc(.*)$': '<rootDir>/src/hoc$1',
    '^Components(.*)$': '<rootDir>/src/components$1'
  },
  testRegex: 'test\\.jsx?$',
  testURL: 'http://localhost:3000',
  collectCoverageFrom: [
    'src/**/*.js',
    'src/**/*.jsx',
    '!**/node_modules/**',
    '!src/components/bulk_review/columns/**',
    '!src/components/v2/**'
  ],
  coverageReporters: ['html', 'text'],
  coverageThreshold: {
    global: {
      branches: 90,
      functions: 90,
      lines: 90,
      statements: 90
    }
  },
  coverageDirectory: 'coverage',
  snapshotSerializers: ['enzyme-to-json/serializer'],
  testEnvironment: '<rootDir>/jest-environment.js',
  setupFilesAfterEnv: ['<rootDir>/enzyme.setup.js'],
  setupFiles: [
    '<rootDir>/__mocks__/localStorageMock.js',
    '<rootDir>/__mocks__/consoleMock.js'
  ],
  globals: {
    ENVIRONMENT: 'TESTING'
  },
  testPathIgnorePatterns: ['<rootDir>/src/components/v2'],
  reporters: [
    'default',
    [
      'jest-html-reporter',
      {
        pageTitle: 'Test Report',
        statusIgnoreFilter: 'passed',
        includeFailureMsg: 'true'
      }
    ]
  ]
};

envinfo

System:
OS: Linux 4.15 Ubuntu 18.04.4 LTS (Bionic Beaver)
CPU: (36) x64 Intel(R) Xeon(R) Platinum 8124M CPU @ 3.00GHz
Binaries:
Node: 14.1.0 - ~/.nvm/versions/node/v14.1.0/bin/node
Yarn: 1.22.4 - /usr/bin/yarn
npm: 6.14.4 - ~/.nvm/versions/node/v14.1.0/bin/npm
npmPackages:
jest: ^26.0.0 => 26.0.0

thymikee (Collaborator) commented May 5, 2020

We will need a repro that can be downloaded and analyzed.
Also, please make sure to clear the cache just in case, e.g. with jest --clearCache

simkessy (Author) commented May 5, 2020

Oh, --clearCache fixed it.

@simkessy simkessy closed this as completed May 5, 2020
thymikee (Collaborator) commented May 5, 2020

Thanks, that's good to know. Still weird.

simkessy (Author) commented May 5, 2020

I spoke too soon; it seems the issue is this helper function:

export function setItemsToBeReviewed(itemIds) {
  sessionStorage.setItem(ITEMS_TO_BE_REVIEWED_KEY, JSON.stringify(itemIds));
}

@simkessy simkessy reopened this May 5, 2020
SimenB (Member) commented May 6, 2020

We will need a repro that can be downloaded and analyzed.

This is still the case 🙂

thymikee (Collaborator) commented May 6, 2020

Also sounds like JSDOM leaking

zvs001 commented Jun 9, 2020

Not sure if it's related, but I get a heap leak for a simple expect:

  let items = tree.root.findAllByProps({ testID: 'CrewItem.Employee' })

  expect(items).toHaveLength(8) // stacked and throws leak in 30-60 seconds
  expect(items.length).toEqual(8) // works ok

Clearing the cache doesn't help.


heypran commented Jun 11, 2020

I am facing similar issues


j commented Jun 17, 2020

Same issue here as well. (using ts-jest)

lukeapage (Contributor) commented:

I got it during a full run in which some tests failed. I spent some time debugging, taking memory snapshots, and comparing them, but I couldn't find any leaks.
I ran it with --inspect in watch mode with --runInBand, took a snapshot after the first run, then ran again and took another. Is that the best way to find leaks?


20BBrown14 commented Jun 19, 2020

I think I'm running into the same issue. I created a new app recently with Jest 26, using Enzyme for snapshot testing. I updated a test to use mount instead of shallow, and now it gets out-of-memory errors every time I run it, even if it's the only test running. Node ends up using something like 1.5 GB. This happens with or without coverage, and I've tried clearing the cache as well. I can provide my repo as an example if needed.

I posted an issue to Enzyme enzymejs/enzyme#2405 (comment)

Below is the error I get on this test

Test suite failed to run

    Call retries were exceeded

      at ChildProcessWorker.initialize (node_modules/jest-runner/node_modules/jest-worker/build/workers/ChildProcessWorker.js:191:21)


<--- Last few GCs --->

[3466:0x39d1050]    32366 ms: Mark-sweep 1390.7 (1425.4) -> 1390.2 (1425.9) MB, 714.3 / 0.0 ms  (average mu = 0.110, current mu = 0.013) allocation failure scavenge might not succeed
[3466:0x39d1050]    33470 ms: Mark-sweep 1391.0 (1425.9) -> 1390.5 (1426.4) MB, 1091.8 / 0.0 ms  (average mu = 0.053, current mu = 0.010) allocation failure scavenge might not succeed


<--- JS stacktrace --->

==== JS stack trace =========================================

    0: ExitFrame [pc: 0x23bdb465be1d]
    1: StubFrame [pc: 0x23bdb465d1df]
Security context: 0x1e8e53f9e6c1 <JSObject>
    2: printBasicValue(aka printBasicValue) [0x2c6a1c7d28e1] [<root>/node_modules/jest-snapshot/node_modules/pretty-format/build/index.js:~108] [pc=0x23bdb4dcdac1](this=0x00329d2826f1 <undefined>,val=0x3125160c22e1 <String[14]: onSubMenuClick>,printFunctionName=0x00329...


heypran commented Jun 20, 2020

I tried removing random test suites from my tests, but Jest still leaks memory, so no particular test is causing the leak.

kckunal2612 commented:

I had a similar problem where I would run into an out-of-memory error when Jest started to collect coverage on "untested files".
Using v8 as the coverage provider solved the issue for me.
However, it's an experimental feature (per the documentation):
https://jestjs.io/blog/2020/01/21/jest-25#v8-code-coverage


alexfromapex commented Aug 19, 2020

After doing some research, it seems this memory leak has been an ongoing issue since 2019 (Jest 22), so I wanted to consolidate some notes here for posterity. Past issues have been related to graceful-fs, and I think some have solved it via a hack/workaround that removes graceful-fs and then re-adds it after running Jest. One troubleshooting thread was looking at compileFunction in the vm package as a potential cause. Jest, webpack-dev-server, Babel, and create-react-app all use graceful-fs as a dependency. The memory leak was supposed to be fixed in a newer release of Jest, but there may have been a regression, since it is popping up again. I can confirm everything was working fine until a substantial number of Jest tests were created in our environment; then the heap overflows on our CI machine once it grows larger than the allocated memory. I've tried using 1 worker, runInBand, etc. without success.

The common causes in the issues I've seen are coverage collection and graceful-fs. I haven't done an in-depth analysis of those issues, but seeing that they are both filesystem-related, and having solved my own issue, which was related to file imports, I suspect they are some version of the same problem I was having.

Wanted to provide the solution I found so others may reap benefits:

The cause:

Using namespace imports of the form import * as whatever from 'whatever'

The solution:

Using the form import { whatINeed } from 'whatever' instead dramatically reduced the memory accumulation.
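For illustration, the two import shapes look like this (using Node's built-in path module as a stand-in for any dependency; whether the namespace form actually retains more memory depends on the module and the transform in use):

```javascript
// Namespace import: binds the whole module object, so a reference to
// everything the module exports is kept alive.
import * as path from "node:path";

// Named import: binds only the members you actually use.
import { join } from "node:path";

console.log(path.join("a", "b") === join("a", "b")); // true
```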

Godofbrowser commented:

Often when this happens, I delete the src folder (provided it's under version control) and run
git checkout .
and
jest --clearCache
and then the tests run as before. In my case I'm not sure it has anything to do with an upgrade, but since it has occurred a few times over the last 6 months I thought I'd share.


klarkc commented Jan 22, 2021

+1. @alexfromapex's solution did not work for me.

Jest 26.6.3
Node 14.15.4

Dump: https://pastebin.com/Mfwi2iiA

It happens after some re-runs on any CI server (my runners are Docker containers). After a fresh boot it always works normally; after some runs it breaks again, and only comes back after another reboot. I tried 1 GB and 2 GB RAM machines with the same result. It does not seem to happen on 8 GB+ RAM hardware (my local machine).

Some other info I've gathered: it always happens after ~5 minutes of running, and every time the test log has the same size (it might be breaking at the same spot).

felipeloha commented:

I have the same issue.

5saviahv commented:

I have a very similar issue.

My test:

const first = [ div_obj, p_obj, a_obj ]; // array with three DOM elements
const second = [ div_obj, p_obj, a_obj ]; // array with same DOM elements
second.push( pre_obj ); // add new obj

expect(first).toEqual(second); // compare two arrays one 3 elements other 4 elements

The test should fail within 250 ms (the timeout), but instead it takes 40 seconds and spits out this message:

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
...
<--- JS stacktrace --->

==== JS stack trace =========================================

    0: ExitFrame [pc: 0x55e78ec781b9]
Security context: 0x086b49dc08d1 <JSObject>
    1: toString [0x2cb560879521](this=0x1a95f0796eb1 <Object map = 0x2b3cdd5e4839>)
    2: printComplexValue(aka printComplexValue) [0x329610605059] [/home/joe/../node_modules/pretty-format/build/index.js:~198] [pc=0x1413a8b0e5ac](this=0x059db4cc04b1 <undefined>,0x1a95f0796eb1 <Object map = 0x2b3cdd5...

The stack trace seems to point to printComplexValue. I also tried toMatchObject, but the result is exactly the same.

Jest: v26.6.3


jsbeckr commented Jan 28, 2021

I have a similar issue with:

Node: 15.5.1
Jest: 25.5.4


klarkc commented Jan 28, 2021

It seems not happening with 8GB+ RAM hardware (my local machine).

Update: it just happened on my local machine with 8 GB RAM, but this time in watch mode outside Docker, running a single test, after consecutive file saves (without waiting for the tests to finish).

Here is the dump: https://pastebin.com/jrDkCYiH

I don't know if this helps, but here is the memory status when it happened:

[klarkc@ssdarch ~]$ free -m
              total        used        free      shared  buff/cache   available
Mem:           7738        3731        2621         473        1385        3226
Swap:          8191        2133        6058


rimiti commented Jan 29, 2021

I had this same error on my GitLab CI, and temporarily adding the --clearCache Jest option made it work again.


viceice commented Mar 1, 2021

We see this regularly on our tests at https://github.com/renovatebot/renovate


Fyko commented Mar 13, 2021

We see this regularly in our tests: the first run succeeds, then the second one fails.

harish-bonify commented:

The issue happens with Jest v29 inside a Docker container; tests run fine on the host machine.

jordyvanraalte commented:

We are also struggling with this issue on Jest 26; upgrading to Jest 29 didn't help.

harish-bonify commented:

Specifying
NODE_OPTIONS="--max-old-space-size=2048"
helped us solve the issue.

jordyvanraalte commented:

Sadly, this didn't solve our issue. However, I can run our unit tests in two sessions, which works around it for now.

anhdoecourier commented:

I had the same problem (also while collecting coverage data, on GitHub Actions/CI) and fixed it by limiting the number of workers:

  • maxWorkers: 2 in jest.config.js
  • or -w 2 / --maxWorkers=2 as a command-line parameter.

This solved my issue:
https://stackoverflow.com/a/68307839/9331978
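As a sketch, the worker cap described above can live in the Jest config (the file name and the value 2 are just the example from that comment):

```javascript
// jest.config.js -- cap Jest at two worker processes (sketch)
module.exports = {
  maxWorkers: 2, // also accepts a percentage string such as '50%'
};
```

The CLI equivalent is npx jest --maxWorkers=2.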


idanen commented Jan 1, 2023

For me this happened only on CI.
It turned out that on CI the run effectively used runInBand, so adding that flag locally let me replicate the issue on my machine.
In my case it happened when an exception was thrown inside the callback given to a useLayoutEffect(): it just ran forever, consuming more and more memory.
Hope this helps someone in this thread 🙏

themorganthompson commented:

Updating my jest.config.ts with coverageProvider: 'v8' and maxWorkers: 2 did the trick for me!


alfasin commented Jul 5, 2023

I tried a few different solutions that didn't work for me:

  • increasing --max-old-space-size
  • using the node flag: node --no-compilation-cache
  • use --runInBand
  • use --expose-gc
  • limiting the number of workers

What did work for me:

Limiting the idle memory per worker using the flag workerIdleMemoryLimit

I'm also limiting the number of workers, so maybe it was a combination of the two.
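A sketch of the combination described above (the concrete values are illustrative, not taken from the comment):

```javascript
// jest.config.js -- restart a worker once its idle heap exceeds the limit,
// and cap the number of workers (values are illustrative)
module.exports = {
  workerIdleMemoryLimit: '512MB', // also accepts a percentage, e.g. '50%'
  maxWorkers: 2,
};
```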

brendonco commented:

testTimeout: 10000,
maxConcurrency: 3,
maxWorkers: '50%',

None of these help in my case; the run is still stuck.

diracdeltas added a commit to brave/brave-core that referenced this issue Oct 17, 2023

black-snow commented Oct 30, 2023

Jest 29.4.2 here. It happens with a single, rather simple test, and it boils down to:

    expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
      Bucket: IMAGES_BUCKET,
      Key: `${a}/${b}/info.json`,
      Body: expect.jsonMatching(infoFile),
      ContentType: "application/json",
    });

The s3 client comes from import { mockClient } from "aws-sdk-client-mock";

"aws-sdk-client-mock": "2.2.0", // hapens with 3.0.0 as well
"aws-sdk-client-mock-jest": "2.2.0",
"@aws-sdk/client-s3": "3.414.0",

There's nothing fancy in there; IMAGES_BUCKET is actually undefined, a and b are constant strings, and infoFile is:

    const infoFile: SomeType = {
      imageKeys: [imageObject1.Key!, imageObject2.Key!], // strings
      taskToken, // string
      location, // shallow dict
    };

Commenting out parts of it does not help, but as soon as I comment out the whole expectation the test turns green. With it I consistently get:

<--- Last few GCs --->

[186230:0x758f6a0]    55581 ms: Scavenge 2044.5 (2081.3) -> 2043.6 (2085.5) MB, 6.0 / 0.0 ms  (average mu = 0.244, current mu = 0.218) allocation failure; 
[186230:0x758f6a0]    56400 ms: Mark-sweep (reduce) 2045.6 (2085.5) -> 2044.3 (2079.8) MB, 465.3 / 0.0 ms  (+ 154.6 ms in 32 steps since start of marking, biggest step 6.9 ms, walltime since start of marking 638 ms) (average mu = 0.255, current mu = 0.268

<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0xb85bc0 node::Abort() [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]
 2: 0xa94834  [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]
 3: 0xd667f0 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]
 4: 0xd66b97 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]
 5: 0xf442a5  [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]
 6: 0xf451a8 v8::internal::Heap::RecomputeLimits(v8::internal::GarbageCollector) [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]
 7: 0xf556b3  [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]
 8: 0xf56528 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]
 9: 0xf30e8e v8::internal::HeapAllocator::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]
10: 0xf32257 v8::internal::HeapAllocator::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]
11: 0xf1342a v8::internal::Factory::NewFillerObject(int, v8::internal::AllocationAlignment, v8::internal::AllocationType, v8::internal::AllocationOrigin) [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]
12: 0x12d878f v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]
13: 0x17055f9  [/home/xxx/.nvm/versions/node/v18.18.1/bin/node]

I tried maxWorkers, runInBand, workerIdleMemoryLimit, and more, to no avail.
Running on Windows 11 inside WSL 2 (Ubuntu LTS), Node 18.18.1. It runs "fine" on colleagues' M2 Macs (except that it grills them).

What's also interesting: I use similar expectations right before and after the problematic one, and they run just fine.

SimenB (Member) commented Oct 30, 2023

Have you tried Node 21.1+?

EDIT: oh, a specific assertion, that's weird. Could you put together a minimal reproduction?

black-snow commented:

Hi @SimenB, I'll give it a try.

What's rather silly: this is my expectation:

    expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
      Bucket: IMAGES_BUCKET,
      Key: `${a}/${b}/info.json`,
      Body: expect.jsonMatching(infoFile),
      ContentType: "application/json",
    });

And this follows right after that and runs fine:

    expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
      Bucket: IMAGES_BUCKET,
      Key: `${a}/${IMAGES_DOCUMENT_NAME}`,
      Body: expect.anything(),
      ContentType: "application/pdf",
    });

But it doesn't seem to matter what the actual body of the first one is; it still runs out of memory even if I turn it into:

    expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
      Bucket: "b",
      Key: `a`,
      Body: "",
      ContentType: "application/json",
    });

This is driving me crazy. It's the same thing ...


black-snow commented Oct 30, 2023

When I strip it down to a size where I can almost share it, it starts working again -.-'

Another fun fact I forgot to mention: as soon as I attach the debugger (IntelliJ IDEA), it works too. It still blows up the RAM and takes forever, but it can step over it. I see that IDEA ships some "fixes" such as jest-intellij-stdin-fix.js, and it might do other non-obvious things.


/edit

I was, however, able to boil it down further.

    expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
      Body: expect.anything(),
    // ...

works, whereas

    expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
      Body: expect.jsonMatching({}), // or jsonMatchin(1)
    // ...

or even

    expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
      Body: expect.any(Number),
    // ...

blows up.

@SimenB is there any way I can profile the Jest worker? When I --inspect-brk jest, I only seem to get the coordinating process, not the actual runner that blows up.

/edit2: tried Node 20, same behaviour.

/edit3: expect.anything() was a red herring; it, too, takes >2 GB of RAM, and sometimes the GC is just lucky enough. The test seems to live right on the edge of my heap_size_limit: 2197815296.


enzious commented Oct 31, 2023

This started happening to us this week. Affects previously passing builds.


black-snow commented Nov 1, 2023

I tried it on plain Windows and on a Mac, and the behaviour is basically the same. The single test, which does virtually nothing, takes up about 2.5 GB of RAM. For some reason that's too much for my WSL; in PowerShell and on the Mac the max heap seems to be higher. With --max-old-space-size=4096 everything is "fine", it just takes 2.5 GB of RAM, but of course that's an absurd amount.
On Windows, Jest launches as many workers as I have virtual cores (I think), and each one takes up at least 500 MB even though there's nothing for them to do (I run one test).

Maybe related: #11956

My stripped-down sample only seems to use around 350 MB:

[130094:0x71dd830] 3764 ms: Scavenge 389.2 (415.0) -> 381.8 (418.8) MB, 3.80 / 0.00 ms (average mu = 0.973, current mu = 0.975) task;

It does the same thing; there's just less (unexecuted) code being loaded.

pedrohamarques commented:

Trying to migrate from 26.6.3 to 27.2.5 and got the same issue.

@mike-4040 I'm facing the same problem. Did you solve it?

black-snow commented:

Alright, I drilled further down into Jest, and here's what I've found:

There's a map-reduce equality check that counts whether any invocation matched the given args (swallowing any exception). In my test, one input was a "larger" image, and for some reason that equality check took 2+ GB of RAM for a 4 MB image. I don't know what Jest does down there to compare a 4 MB buffer to a 4-byte string, but I worked around the issue by reducing the image to 1x1 pixels (since the actual contents don't matter for now).

mike-4040 commented:

@pedrohamarques

I'm facing the same problem. Did you solve it?

Yes, by migrating to mocha + chai + sinon :)

SimenB (Member) commented Nov 13, 2023

@black-snow would you be able to share that image (perhaps privately?)

black-snow commented:

@SimenB the concrete image doesn't seem to be the issue; Jest hangs in the toHaveReceivedCommandWith matcher, specifically in the try block in

            check: ({ commandCalls }) => {
                const matchCount = commandCalls
                    .map(call => call.args[0].input) // eslint-disable-line @typescript-eslint/no-unsafe-return
                    .map(received => {
                    try {
                        expect(received).toEqual(expect.objectContaining(input)); // here
                        return true;
                    }
                    catch (e) {
                        return false;
                    }
                })
                    .reduce((acc, val) => acc + Number(val), 0);
                return { pass: matchCount > 0, data: { matchCount } };
            }

Jest iterates over all the commands it received and compares each one. When it hits my

    expect(s3.client).toHaveReceivedCommandWith(PutObjectCommand, {
      Bucket: IMAGES_BUCKET,
      Key: `${a}/${b}/info.json`,
      Body: expect.jsonMatching(infoFile),
      ContentType: "application/json",
    });

it compares a 4 MB buffer (probably containing the JPEG) with some 4-byte values, and for some reason this blows up the heap and takes forever.

P.S.: Why not return 0/1 instead of booleans, which get cast to Number right after?
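As an aside on that P.S., the same counting logic can accumulate numbers directly instead of casting booleans afterwards. A sketch (countMatches and the sample data are hypothetical, not the library's actual code):

```javascript
// Count how many received inputs satisfy a predicate, adding 0/1
// directly rather than reducing booleans and casting with Number().
const countMatches = (receivedInputs, matches) =>
  receivedInputs.reduce((acc, received) => acc + (matches(received) ? 1 : 0), 0);

// Example with a trivial predicate standing in for the deep-equality check:
const received = [{ Bucket: "b" }, { Bucket: "a" }, { Bucket: "b" }];
const matchCount = countMatches(received, (r) => r.Bucket === "b");
console.log(matchCount); // 2
```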

tasdemirbahadir commented:

I had the same problem (also while collecting coverage data, on GitHub Actions/CI) and fixed it by limiting the number of workers:

  • maxWorkers: 2 in jest.config.js
  • or --w 2 as a command line parameter.

This solved my issue: https://stackoverflow.com/a/68307839/9331978

This solved mine too, thanks!

CodingItWrong (Contributor) commented Jan 5, 2024

I ran into the issue with Node 16.19.0 and Jest 29.7.0.

What did not work for me:

What did work for me:

  • Setting workerIdleMemoryLimit to 1 GB

robblovell commented:

Same problem with Node 16.20.2 and Jest 29.7.0.

What did not work for me:

  • Clearing cache
  • Limiting maxWorkers to 2 or to 1, or runInBand
  • Setting workerIdleMemoryLimit to 1 GB, to 50%, or to 0.2
  • Changing Node version is not an option.

This worked:

  • Setting NODE_OPTIONS max_old_space_size before running jest:
export NODE_OPTIONS=--max_old_space_size=8192
jest

or

NODE_OPTIONS='--max_old_space_size=8192' jest
