
[Bug]: rewrite-stack-trace logic break stack trace for Jest #14927

Closed · Fixed by #14930

smcenlly opened this issue Sep 14, 2022 · 14 comments

Labels: i: bug · pkg: core · outdated (a closed issue/PR that is archived due to age; recommended to make a new issue)

Comments

@smcenlly
  • 💻 Would you like to work on a fix?

How are you using Babel?

Programmatic API (babel.transform, babel.parse)

Input code

Steps to reproduce are described in jestjs/jest#13248 and were confirmed by the Jest team as an upstream (Babel) issue. More detail confirming the problem is given below.

In rewrite-stack-trace.ts, Error.stackTraceLimit is adjusted in a singleton function that runs just once per process.

What's happening is as follows:

  1. Jest starts and sets Error.stackTraceLimit to 100.
  2. At some point the Babel rewrite code runs Error.stackTraceLimit += STACK_TRACE_LIMIT_DELTA; (where const STACK_TRACE_LIMIT_DELTA = 100;).
  3. When errors are reported on the first run, they are truncated to newTrace.slice(0, Error.stackTraceLimit - STACK_TRACE_LIMIT_DELTA) (see here).
  4. When a process is re-used, Jest re-initializes Error.stackTraceLimit to 100.
  5. Because rewrite-stack-trace.ts operates as a singleton, it does not expect Error.stackTraceLimit to have been reset, and it completely truncates all subsequent stack traces: Error.stackTraceLimit is now 100 and STACK_TRACE_LIMIT_DELTA is 100, so newTrace.slice(0, Error.stackTraceLimit - STACK_TRACE_LIMIT_DELTA) becomes newTrace.slice(0, 100 - 100) (see the sketch after this list).
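A minimal sketch of that arithmetic, using the numbers from the steps above (the singleton setup is reduced to a bare += for illustration):

// Jest worker starts:
Error.stackTraceLimit = 100;

// Babel's singleton setup runs once and bumps the limit:
const STACK_TRACE_LIMIT_DELTA = 100;
Error.stackTraceLimit += STACK_TRACE_LIMIT_DELTA; // now 200

// First run: the slice keeps 200 - 100 = 100 user frames.
console.log(Error.stackTraceLimit - STACK_TRACE_LIMIT_DELTA); // 100

// Jest re-initializes a reused worker process:
Error.stackTraceLimit = 100;

// The singleton setup does not run again, so the slice budget collapses:
console.log(Error.stackTraceLimit - STACK_TRACE_LIMIT_DELTA); // 0 -> empty trace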

You can confirm this behavior by changing the setupPrepareStackTrace method in the distributed @babel/core/lib/errors/rewrite-stack-trace.js:

...
function setupPrepareStackTrace() {
  ...
  Error.prepareStackTrace = function stackTraceRewriter(err, trace) {
    ...
+   require('fs').appendFileSync('path_to_log_file.txt', (Error.stackTraceLimit - STACK_TRACE_LIMIT_DELTA).toString() + '\n');
    return prepareStackTrace(err, newTrace.slice(0, Error.stackTraceLimit - STACK_TRACE_LIMIT_DELTA));
  };
}
...

Sample output of Error.stackTraceLimit - STACK_TRACE_LIMIT_DELTA when running the sample repo provided in the Jest issue:

100
100
100
100
0

After this point, the error stack is always completely truncated.

Configuration file name

No response

Configuration

Default Jest Babel Configuration (not applicable to the issue)

Current and expected behavior

Babel should be tolerant of other tooling setting Error.stackTraceLimit.

Environment

@babel/core@7.19.0

Possible solution

No response

Additional context

No response

@babel-bot
Collaborator

Hey @smcenlly! We really appreciate you taking the time to report an issue. The collaborators on this project attempt to help as many people as possible, but we're a limited number of volunteers, so it's possible this won't be addressed swiftly.

If you need any help, or just have general Babel or JavaScript questions, we have a vibrant Slack community that almost always has someone willing to help. You can sign up here for an invite.

@nicolo-ribaudo
Member

I wonder what the expected interaction story is between multiple libraries that modify Error.stackTraceLimit.

Instead of blindly setting Error.stackTraceLimit to what we need, I only increased it by a specific delta, so that we know how many stack frames are present because of Babel and that "the rest" is because of user code. However, this fails if non-Babel code later sets Error.stackTraceLimit to a specific number.

I was thinking that we could fix this by intercepting assignments to Error.stackTraceLimit and adding our STACK_TRACE_LIMIT_DELTA when that happens, but that means this code stops working:

const oldLimit = Error.stackTraceLimit;
Error.stackTraceLimit = 10_000;
try {
  // ... do something ...
} finally {
  Error.stackTraceLimit = oldLimit;
}

because it would end up increasing Error.stackTraceLimit rather than leaving it unchanged.
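A sketch of that breakage, assuming the interception were done with a plain accessor pair (not real Babel code; and as noted further down, V8 does not honor an accessor here anyway):

const STACK_TRACE_LIMIT_DELTA = 100;
let actualLimit = Error.stackTraceLimit + STACK_TRACE_LIMIT_DELTA;

Object.defineProperty(Error, "stackTraceLimit", {
  get: () => actualLimit,
  set: value => { actualLimit = value + STACK_TRACE_LIMIT_DELTA; },
});

// The save/restore pattern above now drifts by one delta per round trip:
const oldLimit = Error.stackTraceLimit; // reads e.g. 200 (100 user + 100 delta)
Error.stackTraceLimit = oldLimit;       // stores 200 + 100 = 300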

@SimenB Do you have any idea? In the worst case I can disable this whenever we detect that we are running in Jest.

@liuxingbaoyu
Member

liuxingbaoyu commented Sep 14, 2022

Maybe we can increase stackTraceLimit at the beginning of each call and decrease it at the end? (Not sure whether that causes a problem in async methods.)
We could also detect stackTraceLimit: when it is not the default and is < 50 (30?), temporarily disable this feature.
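A minimal sketch of that per-call bump, using a hypothetical withRaisedLimit wrapper; the problem with async methods is that finally runs as soon as the function returns its promise, not when the awaited work finishes:

const STACK_TRACE_LIMIT_DELTA = 100;

function withRaisedLimit(fn) {
  Error.stackTraceLimit += STACK_TRACE_LIMIT_DELTA;
  try {
    return fn(); // an async fn returns a pending promise here...
  } finally {
    // ...so the limit is lowered again before the async work completes.
    Error.stackTraceLimit -= STACK_TRACE_LIMIT_DELTA;
  }
}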

@nicolo-ribaudo
Member

Right, that doesn't work with async methods (and all of our methods can be async).

@liuxingbaoyu
Member

Error.stackTraceLimit = 1;
console.log(new Error());
Error.stackTraceLimit = 5;
console.log(new Error());

Can we set it on every throw?

@nicolo-ribaudo
Member

We don't control every throw, because our config and plugins can invoke arbitrary third-party code.

@nicolo-ribaudo
Member

Oh well, we could do Error.stackTraceLimit = Math.min(Error.stackTraceLimit, 100), without the fancy "delta" stuff we do to preserve the correct stack trace sizes.

@liuxingbaoyu
Member

Maybe we can enable this only when stackTraceLimit is the default value, or when stackTraceLimit > 50?
Especially for stackTraceLimit == 0: I just found out during my search that someone uses it to avoid outputting error stacks to users on a production server. 😕
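A sketch of that guard, assuming V8's default Error.stackTraceLimit of 10 and the threshold of 50 floated above:

const V8_DEFAULT_LIMIT = 10;

// Only install the stack rewriting when trimming can't zero out user traces:
if (Error.stackTraceLimit === V8_DEFAULT_LIMIT || Error.stackTraceLimit > 50) {
  setupPrepareStackTrace();
}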

@SimenB
Contributor

SimenB commented Sep 14, 2022

Hmm, not sure. Could add a Proxy to the limit so you know when others modify it and can take that into account when resetting? Or just a getter/setter pair, I guess.

@liuxingbaoyu
Member

liuxingbaoyu commented Sep 14, 2022

This doesn't seem to detect +=.
We could also increment stackTraceLimit by a magic number, say 99, when enabled, and save the new value (109). If stackTraceLimit still equals 109 while processing the stack, we cut it; otherwise we don't cut it.
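A sketch of that magic-number check, with the 99/109 values from above (assuming the default limit of 10; expectedLimit is a name introduced here for illustration):

const MAGIC_DELTA = 99;
const expectedLimit = Error.stackTraceLimit + MAGIC_DELTA; // e.g. 10 + 99 = 109
Error.stackTraceLimit = expectedLimit;

// Later, while processing a captured stack:
function trimBabelFrames(trace) {
  if (Error.stackTraceLimit === expectedLimit) {
    // Nobody touched the limit since our bump: safe to cut our extra frames.
    return trace.slice(0, Error.stackTraceLimit - MAGIC_DELTA);
  }
  return trace; // the limit changed under us: leave the trace uncut
}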

@nicolo-ribaudo
Member

@SimenB I tried, and apparently V8 returns an undefined stack if Error.stackTraceLimit is a getter 😬

I opened #14930, which should be a good enough fix.

> I just found out during my search that someone is using this to avoid outputting the error stack to the user in the production server. 😕

Luckily Babel is used at compile-time and not at runtime in production 😛

@smcenlly
Author

@nicolo-ribaudo - I'm not sure if it's possible, but I had considered using a Number object and attaching some additional properties to it as a fix... something like:

function setupPrepareStackTrace() {
  // Bail out if our boxed Number (carrying its marker) is already installed.
  if ((Error.stackTraceLimit as any).BABEL_PATCHED) return;

  const { prepareStackTrace = defaultPrepareStackTrace } = Error;

  // Box the limit so it can carry a marker; anyone overwriting it with a
  // plain number wipes the marker, which is detected below.
  const stackTraceLimitNumberObject = new Number(
    Error.stackTraceLimit + STACK_TRACE_LIMIT_DELTA,
  );
  (stackTraceLimitNumberObject as any).BABEL_PATCHED = true;
  Error.stackTraceLimit = stackTraceLimitNumberObject as number;

  Error.prepareStackTrace = function stackTraceRewriter(err, trace) {
    // If the marker is gone, someone reset the limit: don't trim the trace.
    if (!(Error.stackTraceLimit as any).BABEL_PATCHED) {
      return prepareStackTrace(err, trace);
    }
...

I was planning on sending a PR, but I ran out of time to test and have some other priorities. I hope the idea helps.

@liuxingbaoyu
Member

Thank you for your detailed investigation and suggestion; we will release a new patch later!

@nicolo-ribaudo
Member

@smcenlly I tested it, but V8 expects a number primitive and not an object 😬

On Dec 15, 2022, the github-actions bot added the "outdated" label and locked this issue as resolved, limiting conversation to collaborators.