feat(jest-core): Add performance markers around significant lifecycle events #13859

Merged
merged 4 commits into jestjs:main from robhogan:feat/perf-markers on Feb 5, 2023

Conversation

@robhogan (Contributor) commented Feb 3, 2023

Summary

I'd like to upstream some of the work we've been doing over in Metro to improve startup times (particularly with an overhaul of jest-haste-map that started by forking it), and possibly carve out some time to optimise other areas of Jest startup. A first step (especially to get some buy-in at Meta!) is instrumentation - this PR is a stab at adding some key markers via the standard global performance API.

I'm not wedded to this implementation - open to bikeshedding about labels, abstraction, or if there's a pre-existing Jest instrumentation pattern I'd missed, but I'm particularly looking for some way to time the startup independently of running tests.
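
Concretely, the markers follow a namespaced start/end pattern with an optional detail payload, roughly like the minimal sketch below (runPhase and its argument are placeholders, not Jest code; the detail option assumes Node 16+):

const {performance} = require('perf_hooks');

// Sketch of the pattern: a namespaced start/end mark pair per phase,
// optionally carrying a `detail` payload (here, numTests).
function runPhase(tests) {
  performance.mark('jest/scheduleAndRun:start', {detail: {numTests: tests.length}});
  // ...schedule and run the tests (placeholder)...
  performance.mark('jest/scheduleAndRun:end');
}

runPhase([]);
console.log(performance.getEntriesByName('jest/scheduleAndRun:start'));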

Test plan

Running Jest programmatically in a thin wrapper allows performance data to be gathered:

'use strict';
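
// Thin wrapper: default NODE_ENV to 'test', run Jest via jest-cli, then dump
// the performance entries recorded during the run.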

const {performance} = require('perf_hooks');

if (process.env.NODE_ENV == null) {
  process.env.NODE_ENV = 'test';
}

require('jest-cli')
  .run()
  .then(() => {
    console.log(performance.getEntries());
  });

Example output (the performance entries collected during the run):

[
  PerformanceMark {
    name: 'jest/runCLI:start',
    entryType: 'mark',
    startTime: 179.9593139886856,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/buildContextsAndHasteMaps:start',
    entryType: 'mark',
    startTime: 269.55708903074265,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/buildContextsAndHasteMaps:end',
    entryType: 'mark',
    startTime: 1525.8082609772682,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/getTestRunData:start',
    entryType: 'mark',
    startTime: 1535.9708949923515,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/getTestRunData:end',
    entryType: 'mark',
    startTime: 2911.5280399918556,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/globalSetup:start',
    entryType: 'mark',
    startTime: 2928.4168980121613,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/globalSetup:end',
    entryType: 'mark',
    startTime: 2928.6306880116463,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/scheduleAndRun:start',
    entryType: 'mark',
    startTime: 2938.5601339936256,
    duration: 0,
    detail: { numTests: 1 }
  },
  PerformanceMark {
    name: 'jest/scheduleAndRun:end',
    entryType: 'mark',
    startTime: 5480.210482001305,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/cacheResults:start',
    entryType: 'mark',
    startTime: 5480.292959034443,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/cacheResults:end',
    entryType: 'mark',
    startTime: 5480.848489999771,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/globalTeardown:start',
    entryType: 'mark',
    startTime: 5480.870894014835,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/globalTeardown:end',
    entryType: 'mark',
    startTime: 5480.897032022476,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/processResults:start',
    entryType: 'mark',
    startTime: 5480.90157699585,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/processResults:end',
    entryType: 'mark',
    startTime: 5481.04110699892,
    duration: 0,
    detail: null
  },
  PerformanceMark {
    name: 'jest/runCLI:end',
    entryType: 'mark',
    startTime: 5481.130716025829,
    duration: 0,
    detail: null
  }
]

@robhogan changed the title from "Add start/end performance markers around significant lifecycle events" to "feat(jest-core): Add performance markers around significant lifecycle events" on Feb 3, 2023
@SimenB (Member) commented Feb 5, 2023

Yay, super exciting! I've wanted to do this sort of thing for years, but never got around to it. I'm actually working on instrumentation at work at the moment (albeit using OpenTelemetry), so getting into it for Jest as well has been on my mind lately.

const {contexts, hasteMapInstances} = await buildContextsAndHasteMaps(
  configs,
  globalConfig,
  outputStream,
);
performance.mark('jest/buildContextsAndHasteMaps:end');
@SimenB (Member) commented on this snippet:
Would using something like https://nodejs.org/api/perf_hooks.html#performancetimerifyfn-options make sense, both here and in other places with a start & end pair?

@robhogan (Contributor, Author) replied Feb 5, 2023:

I did consider timerify, but it's limited - there's no way to pass detail, unlike with marks (I used that here for numTests), and the only label on the entry is the literal name of the function, which isn't necessarily descriptive out of context. It's not interchangeable with mark either, because it requires a PerformanceObserver and creates different entry types.
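
(For comparison, a minimal timerify sketch along the lines of the Node docs; buildContextsAndHasteMaps here is just a stand-in function, not the real one. The timing only surfaces through a PerformanceObserver, as a 'function' entry named after the wrapped function, with no detail:)

const {performance, PerformanceObserver} = require('perf_hooks');

// Stand-in for the real work, used only to illustrate timerify.
function buildContextsAndHasteMaps() {}

const obs = new PerformanceObserver(list => {
  for (const entry of list.getEntries()) {
    // entryType is 'function', name is the wrapped function's name, no `detail`.
    console.log(entry.entryType, entry.name, entry.duration);
  }
  obs.disconnect();
});
obs.observe({entryTypes: ['function']});

const timed = performance.timerify(buildContextsAndHasteMaps);
timed();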

One thing I like about specifying the labels ourselves is that we can namespace them to Jest, which matters if there's other instrumented code in the same process (if, e.g., Babel introduced instrumentation, we'd want to be able to distinguish it from Jest's, and at Meta we run Jest programmatically in a wrapper that has its own instrumentation).

Another alternative I looked at is measure, but it pretty much just changes the syntax of the end marker:

const fooStartMarker = performance.mark('foo:start');
// ...foo
performance.measure('foo', fooStartMarker); // OR performance.measure('foo', 'foo:start');

vs

performance.mark('foo:start');
// ...foo
performance.mark('foo:end');

I'm not sure the former is any cleaner. Also, a consumer can trivially create measures from marks if that's what they want (see the sketch below), whereas the reverse is harder.
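
For example (a sketch, meant to run after jest-cli's run() resolves, as in the wrapper from the test plan):

const {performance} = require('perf_hooks');

// Sketch: pair up `jest/<phase>:start` / `jest/<phase>:end` marks into measures,
// then read the per-phase durations from the resulting 'measure' entries.
for (const start of performance.getEntriesByType('mark')) {
  if (!start.name.startsWith('jest/') || !start.name.endsWith(':start')) continue;
  const endName = start.name.replace(/:start$/, ':end');
  if (performance.getEntriesByName(endName).length > 0) {
    performance.measure(start.name.slice(0, -':start'.length), start.name, endName);
  }
}
console.log(performance.getEntriesByType('measure').map(m => [m.name, m.duration]));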

So TL;DR IMO mark optimises for flexibility and simplicity. Admittedly it isn't super clean, but I wouldn't anticipate littering the whole codebase or even adding much more than what's in this PR - if we did end up adding lots more markers, we could reduce some duplication and strengthen typing with an abstraction (require('jest-instrumentation').startMark('foo') or similar; roughly sketched below).
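
A purely hypothetical sketch of such a helper (jest-instrumentation doesn't exist; the names are made up for illustration):

// Hypothetical 'jest-instrumentation' module sketched from the comment above;
// not part of this PR and not an existing package.
const {performance} = require('perf_hooks');

const NAMESPACE = 'jest/';

function startMark(label, detail) {
  performance.mark(`${NAMESPACE}${label}:start`, detail ? {detail} : undefined);
}

function endMark(label, detail) {
  performance.mark(`${NAMESPACE}${label}:end`, detail ? {detail} : undefined);
}

module.exports = {startMark, endMark};

// usage: startMark('scheduleAndRun', {numTests: tests.length}); ... endMark('scheduleAndRun');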

@SimenB (Member) replied:

Cool, thanks for the thorough response 👍

@SimenB merged commit c8e7577 into jestjs:main on Feb 5, 2023
@robhogan deleted the feat/perf-markers branch on Feb 5, 2023, 22:36
@SimenB (Member) commented Feb 7, 2023

@github-actions (bot) commented:
This pull request has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.

github-actions bot locked as resolved and limited conversation to collaborators on Mar 10, 2023