
Question: What about support for Worker Threads in the WRTC addon? #623

Open
nick-erm opened this issue May 4, 2020 · 15 comments
@nick-erm
Contributor

nick-erm commented May 4, 2020

I'm trying to create worker threads with a new wrtc.RTCPeerConnection object in every thread, and in the second created thread Node.js throws the error: "HandleScope::HandleScope Entering the V8 API without proper locking in place". I'm not sure, but in my opinion this error is thrown because the wrtc addon is not a context-aware addon. Maybe I'm wrong... Thank you in advance for your reply.

@markandrus
Member

Thanks, @nick-erm — I haven't looked into context-aware addons very closely, but I don't think we are context-aware. I believe we would need to update (at least) all the instances where we have

  static Napi::FunctionReference& constructor();

on a class (probably these need to become synchronized maps from context to constructor) as well as all the instances where we have, e.g.,

  static ::node_webrtc::Wrap <
  RTCDtlsTransport*,
  rtc::scoped_refptr<webrtc::DtlsTransportInterface>,
  PeerConnectionFactory*
  > * wrap();

The Wrap class would also need to take context into account. Also, PeerConnectionFactory's defaulting behavior would need to be made context aware:

  /**
   * Get or create the default PeerConnectionFactory. The default uses
   * webrtc::AudioDeviceModule::AudioLayer::kDummyAudio. Call {@link Release} when done.
   */
  static PeerConnectionFactory* GetOrCreateDefault();

@markandrus
Member

@nick-erm OTOH, if I read Node's documentation for Worker support, just below the context-aware information you shared, it claims that

In order to be loaded from multiple Node.js environments, such as a main thread and a Worker thread, an add-on needs to either:

  • Be an N-API addon, or
  • Be declared as context-aware using NODE_MODULE_INIT() as described above

As of 0.4.x, node-webrtc is indeed an N-API addon. It's unclear to me if that's actually sufficient…

Can you please share a minimal reproduction example that demonstrates the Handle issue? Can you also share Node and node-webrtc versions?

@nick-erm
Contributor Author

nick-erm commented May 5, 2020

Versions and Error stack:
*:***>node -v
v13.13.0

package.json
{
"_from": "wrtc@^0.4.4",
"_id": "wrtc@0.4.4",

2020-05-05 13:01:17.105 New workerThread Created
2020-05-05 13:01:17.105 PId/threadId: 8432/2
2020-05-05 13:01:17.105 Developer mode: true
2020-05-05 13:01:17.105 Config: port:9443 stunservers:[{"urls":"stun:***.streamlock.net:19302"}]
2020-05-05 13:01:17.110 Info from parent: MSG from MAIN: Your started!
2020-05-05 13:01:17.148 ON SOCKET CONNECT! Sending Token...
2020-05-05 13:01:17.228 Connection Ready! SocketID: PbY_pKyzR3DocTmQAAAD
2020-05-05 13:01:18.647 MSG chkWorkerState 1
2020-05-05 13:01:21.661 MSG chkWorkerState 2
2020-05-05 13:01:21.700 PEER Sending message: avatar
2020-05-05 13:01:23.648 MSG chkWorkerState 1
2020-05-05 13:01:26.661 MSG chkWorkerState 2
2020-05-05 13:01:26.699 PEER Sending message: avatar
FATAL ERROR: HandleScope::HandleScope Entering the V8 API without proper locking in place
1: 000000013F8D006F napi_wrap+110719
2: 000000013F87AFC6 v8::base::CPU::has_sse+65190
3: 000000013F87BDD6 v8::base::CPU::has_sse+68790
4: 000000014009475B v8::HandleScope::Initialize+107
5: 000000014007CAFE v8::HandleScope::HandleScope+14
6: 000000013F8B3D78 napi_open_handle_scope+120
7: 000007FEE8534187
8: 000007FEE8534364
9: 000000013F8AD0B8 node::Stop+29704
10: 000000013FF84341 v8::internal::GlobalHandles::InvokeSecondPassPhantomCallbacks+177
11: 000000013FF844EE v8::internal::GlobalHandles::InvokeSecondPassPhantomCallbacksFromTask+366
12: 000000013F8163FF std::basic_ostream<char,std::char_traits >::operator<<+70527
13: 000000013F815464 std::basic_ostream<char,std::char_traits >::operator<<+66532
14: 000000013F91FECB uv_async_send+331
15: 000000013F91F66C uv_loop_init+1212
16: 000000013F91F834 uv_run+244
17: 000000013F7C5D96 v8::internal::wasm::SignatureMap::Freeze+21318
18: 000000013F7C25D2 v8::internal::wasm::SignatureMap::Freeze+7042
19: 000000013F910E3D uv_poll_stop+765
20: 00000001405E2310 v8::internal::SetupIsolateDelegate::SetupHeap+1529216
21: 00000000779A556D BaseThreadInitThunk+13
22: 0000000077B0372D RtlUserThreadStart+29

@nick-erm
Contributor Author

nick-erm commented May 5, 2020

Maybe this is wrong in my implementation:
node::AtExit(dispose); (in binding.cc)
https://github.com/nodejs/node/blob/c17dcb32533aa007dfbf507d22c28ef3c7c11c29/src/node.h#L818
NODE_DEPRECATED(
    "Use the three-argument variant of AtExit() or AddEnvironmentCleanupHook()",
    NODE_EXTERN void AtExit(void (*cb)(void* arg), void* arg = nullptr));

Also, a warning is generated when compiling:
..\node-webrtc\src\binding.cc(67,9): warning C4996: 'node::AtExit': was declared deprecated

@nick-erm
Contributor Author

nick-erm commented May 7, 2020

Reflections on the topic:
AtExit() deprecation history: nodejs/node#30227
getenv:
"If the function is used in a multithreaded environment, then the buffer could be modified while someone is reading from it. It is also not re-entrant, meaning that if, while reading the buffer, you make a function call that also eventually calls getenv(), then the value could change unexpectedly." (https://stackoverflow.com/questions/48568707/getenv-function-may-be-unsafe-really)
(https://docs.microsoft.com/en-us/cpp/c-runtime-library/reference/getenv-wgetenv?view=vs-2019)
Worker support:
Do we need to add AddEnvironmentCleanupHook() in NODE_MODULE_INIT for all the Init objects?

@nick-erm
Contributor Author

Error stack with the new version of WRTC:
FATAL ERROR: HandleScope::HandleScope Entering the V8 API without proper locking in place
1: 000000013F7406EF napi_wrap+110719
2: 000000013F6EB546 v8::base::CPU::has_sse+65190
3: 000000013F6EC356 v8::base::CPU::has_sse+68790
4: 000000013FF051FB v8::HandleScope::Initialize+107
5: 000000013FEED59E v8::HandleScope::HandleScope+14
6: 000000013F7243F8 napi_open_handle_scope+120
7: 000007FEED754ADD node::ArrayBufferAllocator::operator=+571917
8: 000007FEED7499AB node::ArrayBufferAllocator::operator=+526555
9: 000007FEED7498DC node::ArrayBufferAllocator::operator=+526348
10: 000007FEED7774AD node::ArrayBufferAllocator::operator=+713693
11: 000007FEED76FB86 node::ArrayBufferAllocator::operator=+682678
12: 000007FEED76E10F node::ArrayBufferAllocator::operator=+675903
13: 000007FEED74F363 node::ArrayBufferAllocator::operator=+549523
14: 000000013F7908BB uv_async_send+331
15: 000000013F79005C uv_loop_init+1212
16: 000000013F790224 uv_run+244
17: 000000013F635FA6 v8::internal::interpreter::BytecodeLabel::bind+21318
18: 000000013F6327E2 v8::internal::interpreter::BytecodeLabel::bind+7042
19: 000000013F78143D uv_poll_stop+765
20: 0000000140452D60 v8::internal::SetupIsolateDelegate::SetupHeap+1529216
21: 0000000076E7556D BaseThreadInitThunk+13
22: 0000000076FD372D RtlUserThreadStart+29

@nick-erm
Contributor Author

...For each subsequent RTCPeerConnection, Node.js creates 7 child processes.
(see issue #614)
7: 000007FEED754ADD node::ArrayBufferAllocator::operator=+571917
8: 000007FEED7499AB node::ArrayBufferAllocator::operator=+526555
9: 000007FEED7498DC node::ArrayBufferAllocator::operator=+526348
10: 000007FEED7774AD node::ArrayBufferAllocator::operator=+713693
11: 000007FEED76FB86 node::ArrayBufferAllocator::operator=+682678
12: 000007FEED76E10F node::ArrayBufferAllocator::operator=+675903
13: 000007FEED74F363 node::ArrayBufferAllocator::operator=+549523

@nick-erm
Contributor Author

Greetings, @markandrus!
After some trouble with the zoo of versions and settings of the preprocessor, compiler, and linker for Windows, and after reading the docs and the code, I've got the following results (sorry for the long history and previously incorrect assumptions):
Part One:

  • normal release/debug build with current repo sources and settings

  • normal release/debug build with: Node v10.20.1/13.14.0, MSVC 2019 x64 19.14.25.28610/19.26.28805.0, NAPI v6, node-addon-api@3.0.0 and napi_add_env_cleanup_hook() as in Replace deprecated AtExit(). #633

  • exceptions when creating a new thread remain:
    when a video stream is present - an incorrect call to the RTCRtpReceiver destructor? -

6: 000000013FEB43F8 napi_open_handle_scope+120
7: 000007FEDD01BDA2 Napi::HandleScope::HandleScope+114 [..\node-webrtc\node-addon-api\napi-inl.h]:L3514
8: 000007FEDD28021C node_webrtc::RTCRtpReceiver::~RTCRtpReceiver+76 [..\node-webrtc\src\interfaces\rtc_rtp_receiver.cc]:L50
9: 000007FEDD28A91C node_webrtc::RTCRtpReceiver::`scalar deleting destructor'+44
10: 000007FEDD28B804 Napi::ObjectWrap<node_webrtc::RTCRtpReceiver>::FinalizeCallback+132 [..\node-webrtc\node-addon-api\napi-inl.h]:L3502
11: 000000013FEAD738 node::Stop+29720

when only a DataChannel is present - incorrect Wrap? and uv_poll_stop -

7: 000007FEDAC0BDA2 Napi::HandleScope::HandleScope+114 [..\node-webrtc\node-addon-api\napi-inl.h]:L3514
8: 000007FEDADD223B node_webrtc::RTCDataChannel::Create+75 [..\node-webrtc\src\interfaces\rtc_data_channel.cc]:L344
9: 000007FEDAE3DEBB <lambda_2956175782e5deb35b654a683c21462c>::operator()+91 [..\node-webrtc\src\node\wrap.h]:L28
10: 000007FEDAE21BD0 std::_Invoker_functor::_Call<<lambda_2956175782e5deb35b654a683c21462c> &>+48 [C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.25.28610\include\type_traits]:L1610
11: 000007FEDAE2E420 std::invoke<<lambda_2956175782e5deb35b654a683c21462c> &>+48 [C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.25.28610\include\type_traits]:L1610
12: 000007FEDAE21C10 std::_Invoker_ret<node_webrtc::RTCDataChannel *,0>::_Call<<lambda_2956175782e5deb35b654a683c21462c> &>+48 [C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.25.28610\include\type_traits]:L1637
13: 000007FEDAE4D5EF std::_Func_impl_no_alloc<<lambda_2956175782e5deb35b654a683c21462c>,node_webrtc::RTCDataChannel *>::_Do_call+47 [C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.25.28610\include\functional]:L927
14: 000007FEDAE40213 std::_Func_class<node_webrtc::RTCDataChannel *>::operator()+83 [C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.25.28610\include\functional]:L977
15: 000007FEDAE3E4DF <lambda_4f15fdc133e4de6219498bc02c05a6e8>::operator()+47 [..\node-webrtc\src\utilities\bidi_map.h]:L50
16: 000007FEDAE21D50 std::_Invoker_functor::_Call<<lambda_4f15fdc133e4de6219498bc02c05a6e8> &>+48 [C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.25.28610\include\type_traits]:L1610
17: 000007FEDAE2E4E0 std::invoke<<lambda_4f15fdc133e4de6219498bc02c05a6e8> &>+48 [C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.25.28610\include\type_traits]:L1610
18: 000007FEDAE21D90 std::_Invoker_ret<node_webrtc::RTCDataChannel *,0>::_Call<<lambda_4f15fdc133e4de6219498bc02c05a6e8> &>+48 [C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.25.28610\include\type_traits]:L1637
19: 000007FEDAE4D69F std::_Func_impl_no_alloc<<lambda_4f15fdc133e4de6219498bc02c05a6e8>,node_webrtc::RTCDataChannel *>::_Do_call+47 [C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.25.28610\include\functional]:L927
20: 000007FEDAE40213 std::_Func_class<node_webrtc::RTCDataChannel *>::operator()+83 [C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.25.28610\include\functional]:L977
21: 000007FEDAE49259 node_webrtc::Maybe<node_webrtc::RTCDataChannel *>::Or+73 [..\node-webrtc\src\functional\maybe.h]:L119
22: 000007FEDAE5B4F0 node_webrtc::BidiMap<rtc::scoped_refptr<webrtc::DataChannelInterface>,node_webrtc::RTCDataChannel *>::computeIfAbsent+256 [..\node-webrtc\src\utilities\bidi_map.h]:L49
23: 000007FEDAE470CD node_webrtc::Wrap<node_webrtc::RTCDataChannel *,rtc::scoped_refptr<webrtc::DataChannelInterface>,node_webrtc::DataChannelObserver *>::GetOrCreate+237 [..\node-webrtc\src\node\wrap.h]:L26
24: 000007FEDAE3DDF2 <lambda_1fc0630ff1c7605eb399ec9cd770941c>::operator()+114 [..\node-webrtc\src\interfaces\rtc_peer_connection.cc]:L197
25: 000007FEDAE46924 node_webrtc::Callback<<lambda_1fc0630ff1c7605eb399ec9cd770941c>,node_webrtc::RTCPeerConnection>::Dispatch+52 [..\node-webrtc\src\node\events.h]:L42
26: 000007FEDAE49B3E node_webrtc::EventLoop<node_webrtc::RTCPeerConnection>::Run+350 [..\node-webrtc\src\node\event_loop.h]:L59
27: 000007FEDAE1D540 <lambda_4e40db74d38cf7c089931fbf026cabb4>::operator()<uv_async_s *>+64 [..\node-webrtc\src\node\event_loop.h]:L44
28: 000007FEDAE15F1A <lambda_4e40db74d38cf7c089931fbf026cabb4>::<lambda_invoker_cdecl><uv_async_s *>+42 [..\node-webrtc\src\node\event_loop.h]:L44
29: 000000013FC808BB uv_async_send+331
30: 000000013FC8005C uv_loop_init+1212
31: 000000013FC80224 uv_run+244
32: 000000013FB25FA6 v8::internal::interpreter::BytecodeLabel::bind+21318
33: 000000013FB227E2 v8::internal::interpreter::BytecodeLabel::bind+7042
34: 000000013FC7143D uv_poll_stop+765

when the WorkerThread exits - an exception in Delete(RefBase* reference) in `anonymous namespace'

(paced_sender.cc:176): ProcessThreadAttached 0x0
The thread 0x2c30 has exited with code 0 (0x0).
(paced_sender.cc:176): ProcessThreadAttached 0x0
(peer_connection.cc:7011): Usage signature is 133119
(peer_connection.cc:1077): Session: 5990766547152992292 is destroyed.

Exception thrown at 0x000000013FEBD7E6 in node.exe: 0xC0000005: Access violation reading location 0xFFFFFFFFFFFFFFFF.

In ..\node\sources\src\js_native_api_v8.cc

  // The second way this is called is from
  // the finalizer and _delete_self is set. In this case we
  // know we need to do the deletion so just do it.
  static inline void Delete(RefBase* reference) {
    reference->Unlink();
    if ((reference->RefCount() != 0) ||
        (reference->_delete_self) ||
        (reference->_finalize_ran)) {
      delete reference;
    } else {
      // defer until finalizer runs as
      // it may already be queued
      reference->_delete_self = true;
    }
  }

Call stack -

[Inline Frame] node.exe!v8impl::`anonymous namespace'::RefBase::Delete(v8impl::`anonymous namespace'::RefBase *) Line 244 C++
node.exe!v8impl::`anonymous namespace'::RefBase::Finalize(bool is_env_teardown) Line 282 C++
[Inline Frame] node.exe!v8impl::RefTracker::FinalizeAll(v8impl::RefTracker *) Line 43 C++
node.exe!napi_env__::~napi_env__() Line 66 C++
[External Code]
node.exe!node::Environment::RunCleanup() Line 625 C++
[Inline Frame] node.exe!node::worker::Worker::Run::__l27::<lambda_7b21d699c7db37f4b0af4c8cd914626d>::operator()() Line 287 C++
node.exe!node::OnScopeLeaveImpl<<lambda_7b21d699c7db37f4b0af4c8cd914626d>>::~OnScopeLeaveImpl<<lambda_7b21d699c7db37f4b0af4c8cd914626d>>() Line 521 C++
node.exe!node::worker::Worker::Run() Line 397 C++
node.exe!node::worker::Worker::StartThread::__l2::(void * arg) Line 632 C++
node.exe!uv__thread_start(void * arg) Line 110 C
[Inline Frame] node.exe!invoke_thread_procedure(unsigned int (__cdecl*)(void *)) Line 91 C++
node.exe!thread_start<unsigned int (__cdecl*)(void * __ptr64)>(void * const parameter) Line 115 C++
[External Code]

@nick-erm
Contributor Author

nick-erm commented May 22, 2020

Part two:

"We find that webrtc uses a proxy (PeerConnectionFactoryProxy) with a signaling thread internally to wrap the PeerConnectionFactory. The reason is that webrtc needs a proxy to forward calls from different threads to the signaling thread, so that everything gets done in a single thread. That way, webrtc can write thread-safe code without locks." (https://unclerunning.github.io/2017/11/26/CreatePeerConnectionFactory.html)

  • Questions (also for myself):

Is it true that _signalingThread is the main thread for _workerThread and the other transport threads (dtls, sctp, ice)?
How do we correctly set the hierarchy of contexts for thread objects? As discussed in this issue?

@markandrus
Member

I've written up a little RTCDataChannel-based worker script for testing with. It's useful for reproducing at least some of the issues described, such as destructor issues:

FATAL ERROR: HandleScope::HandleScope Entering the V8 API without proper locking in place
 1: 0x10123d725 node::Abort() (.cold.1) [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
 2: 0x10009dcd9 node::Abort() [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
 3: 0x10009de3f node::OnFatalError(char const*, char const*) [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
 4: 0x1001dc731 v8::HandleScope::HandleScope(v8::Isolate*) [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
 5: 0x100063a68 napi_open_handle_scope [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
 6: 0x107005b64 Napi::HandleScope::HandleScope(Napi::Env) [/Users/mark/src/node-webrtc/build/Debug/wrtc.node]
 7: 0x107005add Napi::HandleScope::HandleScope(Napi::Env) [/Users/mark/src/node-webrtc/build/Debug/wrtc.node]
 8: 0x1072a2df4 node_webrtc::AsyncContextReleaser::GetDefault() [/Users/mark/src/node-webrtc/build/Debug/wrtc.node]
 9: 0x1071af6e7 node_webrtc::AsyncObjectWrap<node_webrtc::RTCDataChannel>::DestroyAsyncContext() [/Users/mark/src/node-webrtc/build/Debug/wrtc.node]
10: 0x1071af5f3 node_webrtc::AsyncObjectWrap<node_webrtc::RTCDataChannel>::~AsyncObjectWrap() [/Users/mark/src/node-webrtc/build/Debug/wrtc.node]
11: 0x1071a837f node_webrtc::AsyncObjectWrapWithLoop<node_webrtc::RTCDataChannel>::~AsyncObjectWrapWithLoop() [/Users/mark/src/node-webrtc/build/Debug/wrtc.node]
12: 0x1071a8491 node_webrtc::RTCDataChannel::~RTCDataChannel() [/Users/mark/src/node-webrtc/build/Debug/wrtc.node]
13: 0x1071a8615 node_webrtc::RTCDataChannel::~RTCDataChannel() [/Users/mark/src/node-webrtc/build/Debug/wrtc.node]
14: 0x1071a8679 node_webrtc::RTCDataChannel::~RTCDataChannel() [/Users/mark/src/node-webrtc/build/Debug/wrtc.node]
15: 0x1071b434b Napi::ObjectWrap<node_webrtc::RTCDataChannel>::FinalizeCallback(napi_env__*, void*, void*) [/Users/mark/src/node-webrtc/build/Debug/wrtc.node]
16: 0x100066066 v8impl::(anonymous namespace)::RefBase::Finalize(bool) [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
17: 0x10032e453 v8::internal::GlobalHandles::InvokeSecondPassPhantomCallbacksFromTask() [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
18: 0x1001036f3 node::PerIsolatePlatformData::RunForegroundTask(std::__1::unique_ptr<v8::Task, std::__1::default_delete<v8::Task> >) [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
19: 0x1001021c7 node::PerIsolatePlatformData::FlushForegroundTasksInternal() [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
20: 0x10096ee14 uv__async_io [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
21: 0x10098228f uv__io_poll [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
22: 0x10096f381 uv_run [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
23: 0x1000daea5 node::NodeMainInstance::Run() [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
24: 0x100072a22 node::Start(int, char**) [/Users/mark/.volta/tools/image/node/14.3.0/bin/node]
25: 0x7fff58f1f3d5 start [/usr/lib/system/libdyld.dylib]
26: 0x2 
'use strict';

const { RTCPeerConnection } = require('.');

const {
  Worker,
  isMainThread,
  parentPort,
  workerData
} = require('worker_threads');

const TIMEOUT = 10 * 1000;

if (isMainThread) {
  const worker = new Worker(__filename);
  startOfferer(worker);
} else {
  startAnswerer(parentPort);
}

async function startOfferer(worker) {
  const { pc, dc } = await createOfferer();
  const timeout = setTimeout(() => {
    console.log('Timed out! Stopping...');
    pc.close();
    process.exit(1);
  }, TIMEOUT);
  const offer = pc.localDescription;
  console.log('Offerer sending offer:\n', offer);
  worker.postMessage(JSON.stringify(offer));
  const answer = JSON.parse(await getMessage(worker));
  console.log('Offerer received answer:\n', answer);
  try {
    await pc.setRemoteDescription(answer);
  } catch (error) {
    pc.close();
    throw error;
  }
  dc.addEventListener('message', ({ data: message }) => {
    console.log('Offerer received message:\n', message);
    if (message === 'pong') {
      console.log('Success! Stopping...');
      clearTimeout(timeout);
      pc.close();
    }
  });
  await dataChannelIsOpen(dc);
  dc.send('ping');
}

async function startAnswerer(parentPort) {
  const offer = JSON.parse(await getMessage(parentPort));
  console.log('Answerer received offer:\n', offer);
  const { pc, dcPromise } = await createAnswerer(offer);
  const answer = pc.localDescription;
  console.log('Answerer sending answer:\n', answer);
  parentPort.postMessage(JSON.stringify(answer));
  const dc = await dcPromise;
  console.log('Answerer received RTCDataChannel:\n', dc);
  dc.addEventListener('message', ({ data: message }) => {
    console.log('Answerer received message:\n', message);
    if (message === 'ping') {
      dc.send('pong');
    }
  });
}

async function createOfferer() {
  const pc = new RTCPeerConnection();
  try {
    const dc = pc.createDataChannel('foo');
    console.log('Offerer created RTCDataChannel:\n', dc);
    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);
    await iceGatheringIsComplete(pc);
    return {
      pc,
      dc
    }
  } catch (error) {
    pc.close();
    throw error;
  }
}

async function createAnswerer(offer) {
  const pc = new RTCPeerConnection();
  const dcPromise = getDataChannel(pc);
  try {
    await pc.setRemoteDescription(offer);
    const answer = await pc.createAnswer();
    await pc.setLocalDescription(answer);
    return {
      pc,
      dcPromise
    };
  } catch (error) {
    pc.close();
    throw error;
  }
}

async function iceGatheringIsComplete(pc) {
  if (pc.iceGatheringState === 'complete') {
    return null;
  }
  return new Promise(resolve => pc.addEventListener('icecandidate', ({ candidate }) => {
    if (!candidate) {
      resolve(null);
    }
  }));
}

function getDataChannel(pc) {
  return new Promise(resolve => pc.addEventListener('datachannel', ({ channel }) => resolve(channel)));
}

async function dataChannelIsOpen(dc) {
  if (dc.readyState === 'open') {
    return null;
  }
  return new Promise(resolve => dc.addEventListener('open', () => resolve(null)));
}

function getMessage(parentPortOrWorker) {
  return new Promise(resolve => parentPortOrWorker.once('message', resolve));
}

@nick-erm
Contributor Author

Hmm... Thanks, @markandrus!
An exception with a single thread is new to me. In my environment I always put STUN/TURN in the PC options, so I did not get this exception.
If we modify it slightly (timeout 0 or >0, really more than one tick):

if (isMainThread) {
  const worker = new Worker(__filename);
  setTimeout(() => {
    startOfferer(worker);
  }, 30);
} else {
  setTimeout(() => {
    startAnswerer(parentPort);
  }, 30);
}

we will see new behavior...

If TURN/STUN is set, then wrtc finishes normally:

Timed out! Stopping...
(peer_connection.cc:4794): Session: 3848223522458020486 Old state: kStable New state: kClosed
(peer_connection.cc:6741): Tearing down data channel transport for mid=0
(openssl_stream_adapter.cc:912): Cleanup
(paced_sender.cc:176): ProcessThreadAttached 0x0
(paced_sender.cc:176): ProcessThreadAttached 0x0
(peer_connection.cc:7011): Usage signature is 1127
(peer_connection.cc:1077): Session: 3848223522458020486 is destroyed.
(peer_connection.cc:6741): Tearing down data channel transport for mid=0
(peer_connection.cc:1077): Session: 8055325123871393922 is destroyed.
(openssl_stream_adapter.cc:912): Cleanup
(paced_sender.cc:176): ProcessThreadAttached 0x0
(paced_sender.cc:176): ProcessThreadAttached 0x0

But the exception on exit is still there:

[Inline Frame] node.exe!v8impl::`anonymous namespace'::RefBase::Delete(v8impl::`anonymous namespace'::RefBase *) Line 244 C++
node.exe!v8impl::`anonymous namespace'::RefBase::Finalize(bool is_env_teardown) Line 282 C++
[Inline Frame] node.exe!v8impl::RefTracker::FinalizeAll(v8impl::RefTracker *) Line 43 C++
node.exe!napi_env__::~napi_env__() Line 66 C++
[External Code]
node.exe!node::Environment::RunCleanup() Line 625 C++
[Inline Frame] node.exe!node::worker::Worker::Run::__l27::<lambda_7b21d699c7db37f4b0af4c8cd914626d>::operator()() Line 287 C++
node.exe!node::OnScopeLeaveImpl<<lambda_7b21d699c7db37f4b0af4c8cd914626d>>::~OnScopeLeaveImpl<<lambda_7b21d699c7db37f4b0af4c8cd914626d>>() Line 521 C++
node.exe!node::worker::Worker::Run() Line 397 C++
node.exe!node::worker::Worker::StartThread::__l2::(void * arg) Line 632 C++
node.exe!uv__thread_start(void * arg) Line 110 C
[Inline Frame] node.exe!invoke_thread_procedure(unsigned int (__cdecl*)(void *)) Line 91 C++
node.exe!thread_start<unsigned int (__cdecl*)(void * __ptr64)>(void * const parameter) Line 115 C++

@nick-erm
Contributor Author

And more and more docs, and more and more bugs... :)
About the exception on exit (my comment, and #636) - I think this bug is very similar to nodejs/node-addon-api#729 and nodejs/node#33508, and we need to wait for the final implementation that closes it.
For Worker threads, it seems the main problem is nodejs/node-addon-api#711 (comment)! Its discussion continues for Constructor in nodejs/node-addon-api#654, and in nodejs/node-addon-api#730, nodejs/node-addon-api#725, nodejs/node-addon-api#738; an upgrade for ObjectWrap is in nodejs/node-addon-examples#137. A sample of multiple loading and unloading of a context-aware addon: https://napi.inspiredware.com/special-topics/context-awareness.html.
I would like to hear your opinion, @markandrus.
And could you give an example of passing context to the Wrap class, as suggested above? Thanks in advance.

@CharlesRA
Contributor

Hello, thank you for your amazing work!
Is there any update on this issue?

@nick-erm
Contributor Author

Hello, as I see in https://github.com/nodejs/node/pull/34093-03, a new version of Node.js (v14.5.0) with the necessary updates (nodejs/node#33508) is planned for release on 30.06.2020.

@Laky-64

Laky-64 commented Oct 4, 2021

I'm getting this error when I try to use Node.js 16.10 with Worker Threads, and it doesn't work:

FATAL ERROR: v8::HandleScope::CreateHandle() Cannot create a handle without a HandleScope
 1: 0xafd010 node::Abort() [node]
 2: 0xa141fb node::FatalError(char const*, char const*) [node]
 3: 0xce71aa v8::Utils::ReportApiFailure(char const*, char const*) [node]
 4: 0xe69292 v8::internal::HandleScope::Extend(v8::internal::Isolate*) [node]
 5: 0xce8851 v8::HandleScope::CreateHandle(v8::internal::Isolate*, unsigned long) [node]
 6: 0xaafea5 napi_get_reference_value [node]
 7: 0x7f6ed4e380ab  [/var/www/music_bot/pytgcalls/tests/env/lib/python3.8/site-packages/pytgcalls/node_modules/wrtc/build/Release/wrtc.node]
 8: 0x7f6ed4e3066a  [/var/www/music_bot/pytgcalls/tests/env/lib/python3.8/site-packages/pytgcalls/node_modules/wrtc/build/Release/wrtc.node]
 9: 0x7f6ed4e37440  [/var/www/music_bot/pytgcalls/tests/env/lib/python3.8/site-packages/pytgcalls/node_modules/wrtc/build/Release/wrtc.node]
10: 0x7f6ed4ed4089  [/var/www/music_bot/pytgcalls/tests/env/lib/python3.8/site-packages/pytgcalls/node_modules/wrtc/build/Release/wrtc.node]
11: 0x7f6ed4ed4a04  [/var/www/music_bot/pytgcalls/tests/env/lib/python3.8/site-packages/pytgcalls/node_modules/wrtc/build/Release/wrtc.node]
12: 0xaa7f8d  [node]
13: 0xd4339b  [node]
14: 0xd4462c  [node]
15: 0xd44b06 v8::internal::Builtin_HandleApiCall(int, unsigned long*, v8::internal::Isolate*) [node]
16: 0x15e8099  [node]
^C(env) root@Ubuntu-1804-bionic-64-minimal:/var/www/music_bot/pytgcalls/tests# python3.8 test3.py
PyTgCalls v0.8.1 Beta 25, Copyright (C) 2021 Laky-64 <https://github.com/Laky-64>
Licensed under the terms of the GNU Lesser General Public License v3 or later (LGPLv3+)

 [952166868] Started Node.js core! 
TEST AAA
TEST ABAB
TEST BBB
TEST CCC
TEST DDD
TEST AAA
FATAL ERROR: v8::HandleScope::CreateHandle() Cannot create a handle without a HandleScope
 1: 0xafd010 node::Abort() [node]
 2: 0xa141fb node::FatalError(char const*, char const*) [node]
 3: 0xce71aa v8::Utils::ReportApiFailure(char const*, char const*) [node]
 4: 0xe69292 v8::internal::HandleScope::Extend(v8::internal::Isolate*) [node]
 5: 0xce8851 v8::HandleScope::CreateHandle(v8::internal::Isolate*, unsigned long) [node]
 6: 0xaafea5 napi_get_reference_value [node]
 7: 0x7f64822000ab  [/var/www/music_bot/pytgcalls/tests/env/lib/python3.8/site-packages/pytgcalls/node_modules/wrtc/build/Release/wrtc.node]
 8: 0x7f64821f866a  [/var/www/music_bot/pytgcalls/tests/env/lib/python3.8/site-packages/pytgcalls/node_modules/wrtc/build/Release/wrtc.node]
 9: 0x7f64821ff440  [/var/www/music_bot/pytgcalls/tests/env/lib/python3.8/site-packages/pytgcalls/node_modules/wrtc/build/Release/wrtc.node]
10: 0x7f648229c089  [/var/www/music_bot/pytgcalls/tests/env/lib/python3.8/site-packages/pytgcalls/node_modules/wrtc/build/Release/wrtc.node]
11: 0x7f648229ca04  [/var/www/music_bot/pytgcalls/tests/env/lib/python3.8/site-packages/pytgcalls/node_modules/wrtc/build/Release/wrtc.node]
12: 0xaa7f8d  [node]
13: 0xd4339b  [node]
14: 0xd4462c  [node]
15: 0xd44b06 v8::internal::Builtin_HandleApiCall(int, unsigned long*, v8::internal::Isolate*) [node]
16: 0x15e8099  [node]
