
async_hooks: execute destroy hooks earlier #34342

Closed

Conversation

@Flarna Flarna commented Jul 13, 2020

Use a microtask to call destroy hooks in case there are a lot queued, as the immediate may be scheduled late in case of long-running promise chains.

Queuing a microtask in GC context is not allowed, therefore an interrupt is triggered to do this in JS context as fast as possible.

fixes: #34328
refs: #33896
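
In essence, the change swaps the scheduling primitive used to drain the pending destroy ids. A rough sketch for illustration (DestroyAsyncIdsCallback and destroy_async_id_list() are the existing names in src/async_wrap.cc; the lambda wrapper exists only to match v8::MicrotaskCallback's signature and is not the literal patch):

// On master the pending destroy ids are drained from an unrefed immediate.
// Immediates only run once the event loop reaches its check phase, and a
// long-running promise chain keeps the microtask queue busy, so that point
// can be postponed for a very long time while destroy_async_id_list() grows:
env->SetImmediate(DestroyAsyncIdsCallback, CallbackFlags::kUnrefed);

// A microtask, by contrast, is processed as soon as the current JS stack
// unwinds, i.e. while the promise chain is still being worked through, so
// the list is emptied much earlier:
env->isolate()->EnqueueMicrotask(
    [](void* data) { DestroyAsyncIdsCallback(static_cast<Environment*>(data)); },
    env);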

@nodejs-github-bot nodejs-github-bot added the c++ Issues and PRs that require attention from people who are familiar with C++. label Jul 13, 2020
@Flarna Flarna added the async_hooks Issues and PRs related to the async hooks subsystem. label Jul 13, 2020
Flarna commented Jul 13, 2020

Not sure at all whether this change should be done or is really helpful, as long-running promise chains like the one in #34328 block I/O, so this is most likely not a typical use case – except maybe in workers?

addaleax commented Jul 13, 2020

Can you add a test that verifies that this also works when EmitDestroy() is being run during asynchronous GC (i.e. with no JS stack below it)?

(That complexity is why I haven’t opened a PR myself yet)

@addaleax addaleax left a comment

(Marking requested changes since having an explicit test verifying that the typical use case isn’t broken is a blocker for me, and ideally this should also come with a regression test of its own)

Flarna commented Jul 13, 2020

Can you add a test that verifies that this also works when EmitDestroy() is being run during asynchronous GC (i.e. with no JS stack below it)?

How can I enforce this? I guess calling global.gc() is exactly the opposite.

@Flarna Flarna added the wip Issues and PRs that are still a work in progress. label Jul 13, 2020
@addaleax

@Flarna Yeah, it’s tricky – V8 doesn’t provide any way to do this, afaik, so what I think we’ve done in the past is to allocate a lot of objects, but not too many, so that GC is triggered, but as a background task instead of as an interrupt.

(Alternatively, I think you could just queue an immediate + a microtask, separately.)

(Review thread on test/async-hooks/test-destroy-not-blocked.js; outdated, resolved)
@Flarna Flarna force-pushed the asynchooks-destroy-microtask branch from cc9aa0d to 3b62014 on July 16, 2020 16:50
Flarna commented Jul 16, 2020

@addaleax Is the test I added what you expected or should I play around a little bit more?

I also tried to run the benchmarks for async_hooks, but they hang after a while. Is this a known issue, or does this indicate that this change causes problems?

@addaleax

@Flarna What I had in mind for testing is something that verifies that this works even when a native weak AsyncWrap object (e.g. TLSWrap) is collected through background-task garbage collection – because we need to make sure that the microtask scheduled from that actually ends up being run after the background task, and not just waiting around until the next time the microtask queue is run (which might be an indefinite waiting period).

My understanding is that this does work, but it’s tricky to test because intentionally causing a background GC to run is not trivial. But I also realize that it might be annoying that my review is currently blocking this, so I can put in some time tomorrow to write a test myself and push that to this PR so that you don’t need to write a test like that yourself :)

Flarna commented Jul 16, 2020

No need to hurry. I'm on vacation next week and will not find any time this week.
I'm also a little worried about the benchmarks. Locally they didn't stall, but they showed regressions in some tests. Unfortunately I accidentally deleted the results; I will re-run them later.

Flarna commented Jul 17, 2020

Locally run benchmarks show some regressions (CI still hangs).

async_hooks\\gc-tracking.js method='trackingEnabledWithDestroyHook' n=1000000   ***  -1.35 %
async_hooks\\promises.js asyncHooks='disabled' n=1000000                        ***  -1.42 %
async_hooks\\promises.js asyncHooks='enabled' n=1000000                         ***  -2.77 %
async_hooks\\promises.js asyncHooks='enabledWithDestroy' n=1000000              ***  -6.45 %
Full result
                                                                                                                                                                           confidence improvement
 async_hooks\\async-resource-vs-destroy.js n=1000000 duration=5 connections=500 path='/' asyncMethod='async' type='async-local-storage' benchmarker='test-double-http'                    -0.17 %
 async_hooks\\async-resource-vs-destroy.js n=1000000 duration=5 connections=500 path='/' asyncMethod='async' type='async-resource' benchmarker='test-double-http'                          0.18 %
 async_hooks\\async-resource-vs-destroy.js n=1000000 duration=5 connections=500 path='/' asyncMethod='async' type='destroy' benchmarker='test-double-http'                                -0.43 %
 async_hooks\\async-resource-vs-destroy.js n=1000000 duration=5 connections=500 path='/' asyncMethod='callbacks' type='async-local-storage' benchmarker='test-double-http'                -0.17 %
 async_hooks\\async-resource-vs-destroy.js n=1000000 duration=5 connections=500 path='/' asyncMethod='callbacks' type='async-resource' benchmarker='test-double-http'                     -0.00 %
 async_hooks\\async-resource-vs-destroy.js n=1000000 duration=5 connections=500 path='/' asyncMethod='callbacks' type='destroy' benchmarker='test-double-http'                            -0.17 %
 async_hooks\\gc-tracking.js method='trackingDisabled' n=1000000                                                                                                                    *     -1.35 %
 async_hooks\\gc-tracking.js method='trackingEnabled' n=1000000                                                                                                                           -0.17 %
 async_hooks\\gc-tracking.js method='trackingEnabledWithDestroyHook' n=1000000                                                                                                    ***     -1.35 %
 async_hooks\\http-server.js duration=5 connections=50 asyncHooks='after' benchmarker='test-double-http'                                                                                  -0.88 %
 async_hooks\\http-server.js duration=5 connections=50 asyncHooks='all' benchmarker='test-double-http'                                                                                    -0.65 %
 async_hooks\\http-server.js duration=5 connections=50 asyncHooks='before' benchmarker='test-double-http'                                                                                 -0.67 %
 async_hooks\\http-server.js duration=5 connections=50 asyncHooks='disabled' benchmarker='test-double-http'                                                                               -0.57 %
 async_hooks\\http-server.js duration=5 connections=50 asyncHooks='init' benchmarker='test-double-http'                                                                                   -0.66 %
 async_hooks\\http-server.js duration=5 connections=50 asyncHooks='none' benchmarker='test-double-http'                                                                                   -0.53 %
 async_hooks\\http-server.js duration=5 connections=500 asyncHooks='after' benchmarker='test-double-http'                                                                                 -0.74 %
 async_hooks\\http-server.js duration=5 connections=500 asyncHooks='all' benchmarker='test-double-http'                                                                                   -0.79 %
 async_hooks\\http-server.js duration=5 connections=500 asyncHooks='before' benchmarker='test-double-http'                                                                                -0.79 %
 async_hooks\\http-server.js duration=5 connections=500 asyncHooks='disabled' benchmarker='test-double-http'                                                                              -0.31 %
 async_hooks\\http-server.js duration=5 connections=500 asyncHooks='init' benchmarker='test-double-http'                                                                                  -1.01 %
 async_hooks\\http-server.js duration=5 connections=500 asyncHooks='none' benchmarker='test-double-http'                                                                                  -0.30 %
 async_hooks\\promises.js asyncHooks='disabled' n=1000000                                                                                                                         ***     -1.42 %
 async_hooks\\promises.js asyncHooks='enabled' n=1000000                                                                                                                          ***     -2.77 %
 async_hooks\\promises.js asyncHooks='enabledWithDestroy' n=1000000                                                                                                               ***     -6.45 %
                                                                                                                                                                           accuracy (*)    (**)   (***)
 async_hooks\\async-resource-vs-destroy.js n=1000000 duration=5 connections=500 path='/' asyncMethod='async' type='async-local-storage' benchmarker='test-double-http'          ±0.87% ±1.15% ±1.49%
 async_hooks\\async-resource-vs-destroy.js n=1000000 duration=5 connections=500 path='/' asyncMethod='async' type='async-resource' benchmarker='test-double-http'               ±0.89% ±1.18% ±1.52%
 async_hooks\\async-resource-vs-destroy.js n=1000000 duration=5 connections=500 path='/' asyncMethod='async' type='destroy' benchmarker='test-double-http'                      ±0.88% ±1.16% ±1.49%
 async_hooks\\async-resource-vs-destroy.js n=1000000 duration=5 connections=500 path='/' asyncMethod='callbacks' type='async-local-storage' benchmarker='test-double-http'      ±0.89% ±1.18% ±1.52%
 async_hooks\\async-resource-vs-destroy.js n=1000000 duration=5 connections=500 path='/' asyncMethod='callbacks' type='async-resource' benchmarker='test-double-http'           ±0.91% ±1.20% ±1.55%
 async_hooks\\async-resource-vs-destroy.js n=1000000 duration=5 connections=500 path='/' asyncMethod='callbacks' type='destroy' benchmarker='test-double-http'                  ±0.89% ±1.17% ±1.51%
 async_hooks\\gc-tracking.js method='trackingDisabled' n=1000000                                                                                                                ±1.04% ±1.38% ±1.78%
 async_hooks\\gc-tracking.js method='trackingEnabled' n=1000000                                                                                                                 ±0.76% ±1.00% ±1.30%
 async_hooks\\gc-tracking.js method='trackingEnabledWithDestroyHook' n=1000000                                                                                                  ±0.59% ±0.78% ±1.00%
 async_hooks\\http-server.js duration=5 connections=50 asyncHooks='after' benchmarker='test-double-http'                                                                        ±2.51% ±3.31% ±4.27%
 async_hooks\\http-server.js duration=5 connections=50 asyncHooks='all' benchmarker='test-double-http'                                                                          ±1.52% ±2.01% ±2.59%
 async_hooks\\http-server.js duration=5 connections=50 asyncHooks='before' benchmarker='test-double-http'                                                                       ±2.54% ±3.35% ±4.33%
 async_hooks\\http-server.js duration=5 connections=50 asyncHooks='disabled' benchmarker='test-double-http'                                                                     ±1.03% ±1.36% ±1.76%
 async_hooks\\http-server.js duration=5 connections=50 asyncHooks='init' benchmarker='test-double-http'                                                                         ±2.79% ±3.69% ±4.76%
 async_hooks\\http-server.js duration=5 connections=50 asyncHooks='none' benchmarker='test-double-http'                                                                         ±1.13% ±1.49% ±1.92%
 async_hooks\\http-server.js duration=5 connections=500 asyncHooks='after' benchmarker='test-double-http'                                                                       ±1.92% ±2.54% ±3.28%
 async_hooks\\http-server.js duration=5 connections=500 asyncHooks='all' benchmarker='test-double-http'                                                                         ±1.06% ±1.41% ±1.82%
 async_hooks\\http-server.js duration=5 connections=500 asyncHooks='before' benchmarker='test-double-http'                                                                      ±2.52% ±3.33% ±4.30%
 async_hooks\\http-server.js duration=5 connections=500 asyncHooks='disabled' benchmarker='test-double-http'                                                                    ±1.06% ±1.40% ±1.80%
 async_hooks\\http-server.js duration=5 connections=500 asyncHooks='init' benchmarker='test-double-http'                                                                        ±2.61% ±3.45% ±4.45%
 async_hooks\\http-server.js duration=5 connections=500 asyncHooks='none' benchmarker='test-double-http'                                                                        ±1.08% ±1.43% ±1.85%
 async_hooks\\promises.js asyncHooks='disabled' n=1000000                                                                                                                       ±0.51% ±0.68% ±0.87%
 async_hooks\\promises.js asyncHooks='enabled' n=1000000                                                                                                                        ±0.53% ±0.70% ±0.90%
 async_hooks\\promises.js asyncHooks='enabledWithDestroy' n=1000000                                                                                                             ±0.38% ±0.50% ±0.64%

Be aware that when doing many comparisons the risk of a false-positive
result increases. In this case there are 24 comparisons, you can thus
expect the following amount of false-positive results:
  1.20 false positives, when considering a   5% risk acceptance (*, **, ***),
  0.24 false positives, when considering a   1% risk acceptance (**, ***),
  0.02 false positives, when considering a 0.1% risk acceptance (***)

@addaleax

@Flarna Okay, I’ll just try to push something here then (and maybe also take care of the perf regression :))

@addaleax

@Flarna How about this?

diff --git a/src/async_wrap.cc b/src/async_wrap.cc
index f53c53999471..add88aaac536 100644
--- a/src/async_wrap.cc
+++ b/src/async_wrap.cc
@@ -782,6 +782,11 @@ void AsyncWrap::EmitDestroy(Environment* env, double async_id) {
   }
 
   if (env->destroy_async_id_list()->empty()) {
+    env->SetImmediate(DestroyAsyncIdsCallback, CallbackFlags::kUnrefed);
+  }
+  // If the list gets very large, try scheduling a microtask to empty it
+  // at an earlier point in time.
+  if (env->destroy_async_id_list()->size() == 16384) {
     env->isolate()->EnqueueMicrotask(DestroyAsyncIdsCallback, env);
   }
 

It keeps performance where it is right now, but still keeps the id list from filling up without limit in the cases in which we can empty it through a microtask?

Flarna commented Jul 17, 2020

Looks fine. It should also be quite easy to craft a test for this.
If I understand your concerns regarding the current implementation and GC correctly, this should no longer be an issue afterwards, as we always have the setImmediate like on current master.

@addaleax

If I understand your concerns regarding the current implementation and GC correctly, this should no longer be an issue afterwards, as we always have the setImmediate like on current master.

Yes :) I think that way we get a best-of-both-worlds result.

@Flarna Flarna force-pushed the asynchooks-destroy-microtask branch from 3b62014 to c10f59f on July 27, 2020 22:57

Flarna commented Jul 28, 2020

Seems that this doesn't work in some cases as CI shows a crash in node-test-commit-linux-containered for ubuntu1804_sharedlibs_debug_x64 (see https://ci.nodejs.org/job/node-test-commit-linux-containered/21528/).

I think the crash could be avoided by checking from_gc but I fear that this also disables the optimization. Will try...

#
# Fatal error in ../deps/v8/src/heap/heap-inl.h, line 270
# Debug check failed: AllowHeapAllocation::IsAllowed().
#
#
#
#FailureMessage Object: 0x7ffd2d4953c0
 1: 0x55f3ad0dfed5 node::DumpBacktrace(_IO_FILE*) [out/Debug/node]
 2: 0x55f3ad283cff  [out/Debug/node]
 3: 0x55f3ad283d1f  [out/Debug/node]
 4: 0x55f3aecf4f36 V8_Fatal(char const*, int, char const*, ...) [out/Debug/node]
 5: 0x55f3aecf4f65  [out/Debug/node]
 6: 0x55f3ad762c08 v8::internal::HeapObject v8::internal::Heap::AllocateRawWith<(v8::internal::Heap::AllocationRetryMode)1>(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [out/Debug/node]
 7: 0x55f3ad743054 v8::internal::FactoryBase<v8::internal::Factory>::AllocateRawWithImmortalMap(int, v8::internal::AllocationType, v8::internal::Map, v8::internal::AllocationAlignment) [out/Debug/node]
 8: 0x55f3ad750aa1 v8::internal::Factory::NewForeign(unsigned long) [out/Debug/node]
 9: 0x55f3ad703bb7 v8::internal::MicrotaskQueue::EnqueueMicrotask(v8::Isolate*, void (*)(void*), void*) [out/Debug/node]
10: 0x55f3ad0b1231 node::AsyncWrap::EmitDestroy(node::Environment*, double) [out/Debug/node]
11: 0x55f3ad0a9c8c node::AsyncWrap::EmitDestroy(bool) [out/Debug/node]
12: 0x55f3ad0acde1 node::AsyncWrap::~AsyncWrap() [out/Debug/node]
13: 0x55f3ad0bec3c node::PromiseWrap::~PromiseWrap() [out/Debug/node]
14: 0x55f3ad0bec58 node::PromiseWrap::~PromiseWrap() [out/Debug/node]
15: 0x55f3ad0ba04e node::BaseObject::OnGCCollect() [out/Debug/node]
16: 0x55f3ad0b9f54 node::BaseObject::MakeWeak()::{lambda(v8::WeakCallbackInfo<node::BaseObject> const&)#1}::operator()(v8::WeakCallbackInfo<node::BaseObject> const&) const [out/Debug/node]
17: 0x55f3ad0b9f74 node::BaseObject::MakeWeak()::{lambda(v8::WeakCallbackInfo<node::BaseObject> const&)#1}::_FUN(v8::WeakCallbackInfo<node::BaseObject> const&) [out/Debug/node]
18: 0x55f3ad723f77 unsigned long v8::internal::GlobalHandles::InvokeFirstPassWeakCallbacks<v8::internal::GlobalHandles::Node>(std::vector<std::pair<v8::internal::GlobalHandles::Node*, v8::internal::GlobalHandles::PendingPhantomCallback>, std::allocator<std::pair<v8::internal::GlobalHandles::Node*, v8::internal::GlobalHandles::PendingPhantomCallback> > >*) [out/Debug/node]
19: 0x55f3ad724a32 v8::internal::GlobalHandles::InvokeFirstPassWeakCallbacks() [out/Debug/node]
20: 0x55f3ad7b1e2d v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [out/Debug/node]
21: 0x55f3ad7b265b v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [out/Debug/node]
22: 0x55f3ad7b674e v8::internal::Heap::PreciseCollectAllGarbage(int, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [out/Debug/node]
23: 0x55f3ad714ee0 v8::internal::GCExtension::GC(v8::FunctionCallbackInfo<v8::Value> const&) [out/Debug/node]
24: 0x55f3ad4fa0ed v8::internal::FunctionCallbackArguments::Call(v8::internal::CallHandlerInfo) [out/Debug/node]
25: 0x55f3ad4fada3  [out/Debug/node]
26: 0x55f3ad4ffbbc  [out/Debug/node]
27: 0x55f3ad5007c7 v8::internal::Builtin_HandleApiCall(int, unsigned long*, v8::internal::Isolate*) [out/Debug/node]
28: 0x55f3ae34a0c0  [out/Debug/node]

Flarna commented Jul 28, 2020

I think the crash could be avoided by checking from_gc but I fear that this also disables the optimization. Will try...

Yes, checking from_gc avoids the crash but renders the whole change useless.
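
For reference, the direction described in the PR description uses from_gc only to decide how the microtask gets enqueued, not whether. A rough sketch combining that with the threshold diff above (the exact landed code may differ; the from_gc parameter on EmitDestroy() and the interrupt handler shown here are assumptions):

// Sketch only: keep the unrefed immediate as the default path and use the
// microtask just to drain an unusually large list earlier.
void AsyncWrap::EmitDestroy(Environment* env, double async_id, bool from_gc) {
  if (env->destroy_async_id_list()->empty()) {
    env->SetImmediate(DestroyAsyncIdsCallback, CallbackFlags::kUnrefed);
  }

  if (env->destroy_async_id_list()->size() == 16384) {
    auto enqueue = [](v8::Isolate* isolate, void* data) {
      isolate->EnqueueMicrotask(
          [](void* d) { DestroyAsyncIdsCallback(static_cast<Environment*>(d)); },
          data);
    };
    if (from_gc) {
      // EnqueueMicrotask() allocates on the JS heap, which is forbidden while
      // a GC is in progress (see the crash above), so defer the call to an
      // interrupt that runs once execution is back in JS context.
      env->isolate()->RequestInterrupt(enqueue, env);
    } else {
      enqueue(env->isolate(), env);
    }
  }

  env->destroy_async_id_list()->push_back(async_id);
}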

@Flarna Flarna force-pushed the asynchooks-destroy-microtask branch from c10f59f to 9c7b852 on July 28, 2020 23:38

@Flarna Flarna changed the title async_hooks: execute destroy hooks in interrupt async_hooks: execute destroy hooks faster Jul 30, 2020
@Flarna Flarna changed the title async_hooks: execute destroy hooks faster async_hooks: execute destroy hooks earlier Aug 2, 2020
Flarna added a commit that referenced this pull request Aug 2, 2020
Use a microtask to call destroy hooks in case there are a lot queued,
as the immediate may be scheduled late in case of long-running
promise chains.

Queuing a microtask in GC context is not allowed, therefore an
interrupt is triggered to do this in JS context as fast as possible.

fixes: #34328
refs: #33896

PR-URL: #34342
Fixes: #34328
Refs: #33896
Reviewed-By: Gus Caplan <me@gus.host>
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: Ben Noordhuis <info@bnoordhuis.nl>
Reviewed-By: James M Snell <jasnell@gmail.com>
Flarna commented Aug 2, 2020

Landed in cb142d1

@Flarna Flarna closed this Aug 2, 2020
@Flarna Flarna deleted the asynchooks-destroy-microtask branch August 2, 2020 21:15
codebytere pushed a commit that referenced this pull request Aug 5, 2020
@codebytere codebytere mentioned this pull request Aug 10, 2020
addaleax pushed a commit that referenced this pull request Sep 22, 2020
addaleax pushed a commit that referenced this pull request Sep 22, 2020
@codebytere codebytere mentioned this pull request Sep 28, 2020
Labels
async_hooks: Issues and PRs related to the async hooks subsystem.
author ready: PRs that have at least one approval, no pending requests for changes, and a CI started.
c++: Issues and PRs that require attention from people who are familiar with C++.

Successfully merging this pull request may close these issues.

Memory leak on Promise.all with async/await