
Segfault with AsyncLocalStorage and process.exit #1380

Closed
antoinep92 opened this issue Jun 25, 2021 · 5 comments


antoinep92 commented Jun 25, 2021

Expected Behavior

Should not segfault :)
Native node doesn't segfault running the same code, and tsc doesn't segfault compiling it, so the problem must be inside ts-node, although ultimately it might be a Node.js bug. If so, I may need some help gathering more info before submitting upstream.

Actual Behavior

segmentation fault

Steps to reproduce the problem

Running my code with ts-node randomly segfaults about half the time when I have a failing unit test. I managed to reduce ~5k lines down to a minimal test case, although the reduced version segfaults less often.

Minimal reproduction

repro PR
segfault.ts:

import { AsyncLocalStorage } from 'async_hooks';
const asyncStore = new AsyncLocalStorage();
function sleep(ms: number) { return new Promise(resolve => setTimeout(resolve, ms)); }
for (let i = 0; i < 1000; ++i) {
  asyncStore.run(() => {}, async () => {
    await sleep(Math.random() * 100);
    process.exit();
  });
}

And then run with:

while true ; do ts-node segfault.ts || break ; done

The loop breaks when it segfaults in v8::Context::SetPromiseHooks:

       Message: Process 587352 (node) of user 1000 dumped core.
                
                Stack trace of thread 587352:
                #0  0x0000555ca11701fd v8::Context::SetPromiseHooks(v8::Local<v8::Function>, v8::Local<v8::Function>, v8::Local<v8::Function>, v8::Local<v8::Function>) (node + 0x8f01fd)
                #1  0x0000555ca0ec9f66 n/a (node + 0x649f66)
                #2  0x0000555ca11cad77 v8::internal::FunctionCallbackArguments::Call(v8::internal::CallHandlerInfo) (node + 0x94ad77)
                #3  0x0000555ca11cbc6c n/a (node + 0x94bc6c)
                #4  0x0000555ca11cc21f n/a (node + 0x94c21f)
                #5  0x0000555ca11cc536 v8::internal::Builtin_HandleApiCall(int, unsigned long*, v8::internal::Isolate*) (node + 0x94c536)
                #6  0x0000555ca1a2f5d9 n/a (node + 0x11af5d9)
                #7  0x0000555ca19c8085 n/a (node + 0x1148085)
                #8  0x0000555ca19c8085 n/a (node + 0x1148085)
                #9  0x0000555ca19c8085 n/a (node + 0x1148085)
                #10 0x0000555ca19c8085 n/a (node + 0x1148085)
                #11 0x0000555ca19c8085 n/a (node + 0x1148085)
                #12 0x0000555ca19c8085 n/a (node + 0x1148085)
                #13 0x0000555ca19c8085 n/a (node + 0x1148085)
                #14 0x0000555ca19c8085 n/a (node + 0x1148085)
                #15 0x0000555ca19c8085 n/a (node + 0x1148085)
                #16 0x0000555ca19c8085 n/a (node + 0x1148085)
                #17 0x0000555ca19c8085 n/a (node + 0x1148085)
                #18 0x0000555ca19c8085 n/a (node + 0x1148085)
                #19 0x0000555ca19c8085 n/a (node + 0x1148085)
                #20 0x0000555ca19c8085 n/a (node + 0x1148085)
                #21 0x0000555ca19c8085 n/a (node + 0x1148085)
                #22 0x0000555ca19c8085 n/a (node + 0x1148085)
                #23 0x0000555ca19c8085 n/a (node + 0x1148085)
                #24 0x0000555ca19c8085 n/a (node + 0x1148085)
                #25 0x0000555ca19c8085 n/a (node + 0x1148085)
                #26 0x0000555ca19c8085 n/a (node + 0x1148085)
                #27 0x0000555ca19c8085 n/a (node + 0x1148085)
                #28 0x0000555ca19c8085 n/a (node + 0x1148085)
                #29 0x0000555ca19c8085 n/a (node + 0x1148085)
                #30 0x0000555ca19c62fb n/a (node + 0x11462fb)
                #31 0x0000555ca19c6083 n/a (node + 0x1146083)
                #32 0x0000555ca12cc117 n/a (node + 0xa4c117)
                #33 0x0000555ca12cd10e v8::internal::Execution::Call(v8::internal::Isolate*, v8::internal::Handle<v8::internal::Object>, v8::internal::Handle<v8::internal::Object>, int, v8::internal::Handle<v8::internal::Object>*) (node + 0xa4d10e)
                #34 0x0000555ca1181ab1 v8::Function::Call(v8::Local<v8::Context>, v8::Local<v8::Value>, int, v8::Local<v8::Value>*) (node + 0x901ab1)
                #35 0x0000555ca0f4b767 node::ExecuteBootstrapper(node::Environment*, char const*, std::vector<v8::Local<v8::String>, std::allocator<v8::Local<v8::String> > >*, std::vector<v8::Local<v8::Value>, std::allocator<v8::Local<v8::Value> > >*) (node + 0x6cb767)
                #36 0x0000555ca0f4ba6c n/a (node + 0x6cba6c)
                #37 0x0000555ca0f4d065 node::StartExecution(node::Environment*, std::function<v8::MaybeLocal<v8::Value> (node::StartExecutionCallbackInfo const&)>) (node + 0x6cd065)
                #38 0x0000555ca0ec545f node::LoadEnvironment(node::Environment*, std::function<v8::MaybeLocal<v8::Value> (node::StartExecutionCallbackInfo const&)>) (node + 0x64545f)
                #39 0x0000555ca0fd448b node::NodeMainInstance::Run(node::EnvSerializeInfo const*) (node + 0x75448b)
                #40 0x0000555ca0f4ef65 node::Start(int, char**) (node + 0x6cef65)
                #41 0x00007fbd89665b25 __libc_start_main (libc.so.6 + 0x27b25)
                #42 0x0000555ca0ebf26e _start (node + 0x63f26e)
                
                Stack trace of thread 587353:
                #0  0x00007fbd8973c92e epoll_wait (libc.so.6 + 0xfe92e)
                #1  0x0000555ca1b6a29a n/a (node + 0x12ea29a)
                #2  0x0000555ca1b57967 uv_run (node + 0x12d7967)
                #3  0x0000555ca1004716 node::WorkerThreadsTaskRunner::DelayedTaskScheduler::Run() (node + 0x784716)
                #4  0x00007fbd89815259 start_thread (libpthread.so.0 + 0x9259)
                #5  0x00007fbd8973c5e3 __clone (libc.so.6 + 0xfe5e3)
                
                Stack trace of thread 587357:
                #0  0x00007fbd898218ca __futex_abstimed_wait_common64 (libpthread.so.0 + 0x158ca)
                #1  0x00007fbd8981b270 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf270)
                #2  0x0000555ca1b65f4a uv_cond_wait (node + 0x12e5f4a)
                #3  0x0000555ca100103c n/a (node + 0x78103c)
                #4  0x00007fbd89815259 start_thread (libpthread.so.0 + 0x9259)
                #5  0x00007fbd8973c5e3 __clone (libc.so.6 + 0xfe5e3)
                
                Stack trace of thread 587355:
                #0  0x00007fbd898218ca __futex_abstimed_wait_common64 (libpthread.so.0 + 0x158ca)
                #1  0x00007fbd8981b270 pthread_cond_wait@@GLIBC_2.3.2 (libpthread.so.0 + 0xf270)
                #2  0x0000555ca1b65f4a uv_cond_wait (node + 0x12e5f4a)
                #3  0x0000555ca100103c n/a (node + 0x78103c)
                #4  0x00007fbd89815259 start_thread (libpthread.so.0 + 0x9259)
                #5  0x00007fbd8973c5e3 __clone (libc.so.6 + 0xfe5e3)
                
                Stack trace of thread 587358:
                #0  0x00007fbd898218ca

Specifications

  • ts-node version: 10.0.0

  • node version: 16.3.0

  • TypeScript version: 4.3.2

  • tsconfig.json, if you're using one: (none)

  • Operating system and version: Linux 5.12.10 x64 (ArchLinux), libc 2.33

@antoinep92
Author

To clarify my request, I think the segfault happens within Node.js.

Unfortunately, just running the transpiled code under node does not crash, so I'm hoping to produce a minimal reproduction under native node (without large dependencies such as ts-node). This amounts to identifying the code pattern within ts-node that, in conjunction with my code, leads to the crash.

Is there some documentation on ts-node's design, or could someone point me to the relevant places in the sources?
I'm open to investigating a bit myself, but I need some pointers 🙏

antoinep92 added a commit to antoinep92/ts-node-repros that referenced this issue Jun 25, 2021
@cspotcode
Collaborator

Interesting, I can point you to all the places where we are hooking into node. Most of ts-node is concerned with loading the config file and talking to TypeScript's compiler APIs, so the parts where we actually hook node are fairly small.

Everything mentioned here is for our CommonJS loader, not our new ESM loader, which uses a different set of hooking APIs. Your reproduction is using our CommonJS loader.

In the information below, I mention ways that you can modify ts-node to simplify its behavior. You can do this by either modifying the ts-node .js files in ./node_modules/ts-node/dist/*, or you can clone the ts-node repository, make your changes, run the build, and tell npm to use your modified ts-node. There are many different ways to do this: npm link, npm pack and install directly from the tarball, etc.
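Roughly, the clone-and-link workflow looks like this (a sketch only; the exact npm script names in the ts-node repo may differ):

git clone https://github.com/TypeStrong/ts-node.git
cd ts-node
npm install && npm run build   # assumes the repo exposes a "build" script
npm link                       # register the local build globally
cd /path/to/your/project
npm link ts-node               # resolve ts-node to the local build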


We install a third-party dependency called source-map-support which hooks into node's stack trace API. It rewrites stack traces, using sourcemaps, to use .ts line numbers instead of .js. You can comment out that code to quickly test your reproduction without source-map-support installed.
https://github.com/TypeStrong/ts-node/blob/main/src/index.ts#L6
https://github.com/TypeStrong/ts-node/blob/main/src/index.ts#L550-L568
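For orientation, the stack-trace hook that source-map-support installs boils down to a call like this (an illustrative sketch of that library's documented API, not ts-node's exact code):

// Illustrative only: source-map-support patches Error.prepareStackTrace so
// stack traces are rewritten through source maps to point at .ts positions.
// ts-node additionally wires in a retrieveFile callback so traces map to its
// in-memory compiled output.
import * as sourceMapSupport from 'source-map-support';
sourceMapSupport.install({ environment: 'node' });

Commenting out the equivalent install call in the compiled ./node_modules/ts-node/dist/index.js disables that hook.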

We hook into the module loading process here:
https://github.com/TypeStrong/ts-node/blob/main/src/index.ts#L1266-L1292
It can be simplified to this:

// Simplified sketch: for every .ts file, wrap module._compile so the source
// is passed through ts-node's compiler service before node executes it.
require.extensions['.ts'] = function (m, filename) {
  const _compile = m._compile;
  m._compile = function (code, fileName) {
    return _compile.call(this, service.compile(code, fileName), fileName);
  };
  // Delegate to node's built-in .js handler, which reads the file from disk
  // and calls m._compile (now our wrapper) with its contents.
  return require.extensions['.js'](m, filename);
};

service is ts-node's compiler service. It is the same thing you get from our API when you call require('ts-node').register() or .create(). Docs for the service API are here: https://typestrong.org/ts-node/api/interfaces/Service.html
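As a standalone example of that API (a sketch; the source string and file name here are made up):

// Create a compiler service programmatically, the same object the CLI builds.
import { create } from 'ts-node';

const service = create();
// Compile a TypeScript source string to JavaScript; this is the same call the
// require hook makes for each .ts file it loads.
const js = service.compile('const n: number = 1;', 'example.ts');
console.log(js);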

Internally it uses TypeScript's APIs. The first time you compile a file, it needs to do a lot of CPU work because that's the way TypeScript works: it has to load lots of declaration files, and it may need to load a bunch of @types/* packages. If this segfault is related to timing in any way, I wonder if it's being triggered by the time it takes TypeScript to start compiling and type-checking.

Our CLI also does a few tricks to make sure the execution environment looks correct to whatever script we are running.
https://github.com/TypeStrong/ts-node/blob/main/src/bin.ts#L318-L329
To simplify your reproduction, you can avoid those tricks by running your code like this:

node -r ts-node/register ./segfault.ts

This will still install ts-node's hooks but will bypass our bin.ts.


I hope this helps, best of luck!

@cspotcode
Collaborator

Could it be related to this? nodejs/node#39019

@antoinep92
Author

Thanks a lot. Honestly I was not hoping to get a response so fast and with so many actionable details. Really appreciate it 👍

I'll definitely have a closer look as soon as I find some time. From a quick glance, the discussion at nodejs/node#39019 seems to cover at least two issues, one of which is fixed in master and the other apparently not, and the crash there also seems a bit intermittent. The stack traces are similar. I guess I'll start by trying to reproduce with node-nightly -r ts-node/register ./segfault.ts and work from there.

@cspotcode
Collaborator

I'm doing housekeeping and I think I can close this issue. If this turns out to be a ts-node bug, we can reopen it or file a new bug report.
