From db291aaf065d5b96cf179f74be7bafb4d1023a6e Mon Sep 17 00:00:00 2001
From: Michael Dawson
Date: Fri, 24 Jan 2020 11:46:40 -0500
Subject: [PATCH 01/91] doc: guide - using valgrind to debug memory leaks

Add doc for using valgrind to debug native memory leaks.

Started writing this up as part of an effort in the Diagnostic WG
but think it's better to have it in the core guides and then be
referenced by the docs in the Diagnostic WG repo. For more details
on the Diagnostic WG effort see
https://github.com/nodejs/diagnostics/issues/254#issuecomment-538853390

This guide is related to `/step3 - using_native_tools.md`

PR-URL: https://github.com/nodejs/node/pull/31501
Reviewed-By: Gireesh Punathil
Reviewed-By: Denys Otrishko
Reviewed-By: James M Snell
---
 .../investigating_native_memory_leak.md | 447 ++++++++++++++++++
 1 file changed, 447 insertions(+)
 create mode 100644 doc/guides/investigating_native_memory_leak.md

diff --git a/doc/guides/investigating_native_memory_leak.md b/doc/guides/investigating_native_memory_leak.md
new file mode 100644
index 00000000000000..366cc2917f6a4c
--- /dev/null
+++ b/doc/guides/investigating_native_memory_leak.md
@@ -0,0 +1,447 @@
# Investigating Memory Leaks with valgrind

A Node.js process may run out of memory due to excessive consumption of
native memory. Native memory is memory that is not managed by the
V8 garbage collector and is allocated either by the Node.js runtime, its
dependencies or native [addons](https://nodejs.org/docs/latest/api/n-api.html).

This guide provides information on how to use valgrind to investigate these
issues on Linux platforms.

## valgrind

[Valgrind](https://valgrind.org/docs/manual/quick-start.html) is a
tool available on Linux distributions which can be used to investigate
memory usage, including identifying memory leaks (memory which is
allocated and not freed) and other memory-related problems
such as double-freeing memory.

To use valgrind:

* Be patient: running under valgrind slows execution significantly
  due to the checks being performed.
* Reduce your test case to the smallest possible reproduction. Due to the
  slowdown it is important to run the minimum test case in order to be able
  to do it in a reasonable time.

## Installation

It is an optional package in most cases and must be installed explicitly.
For example, on Debian/Ubuntu:

```console
apt-get install valgrind
```

## Invocation

The simplest invocation of valgrind is:

```console
valgrind node test.js
```

with the output being:

```console
user1@minikube1:~/valgrind/node-addon-examples/1_hello_world/napi$ valgrind node test.js
==28993== Memcheck, a memory error detector
==28993== Copyright (C) 2002-2017, and GNU GPL'd, by Julian Seward et al.
==28993== Using valgrind-3.13.0 and LibVEX; rerun with -h for copyright info
==28993== Command: node test.js
==28993==
==28993== Use of uninitialised value of size 8
==28993== at 0x12F2279: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993== by 0x12F3E9C: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993== by 0x12F3C77: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993== by 0xC7C9CF: v8::internal::(anonymous namespace)::Invoke(v8::internal::Isolate*, v8::internal::(anonymous namespace)::InvokeParams const&) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993== by 0xC7CE87: v8::internal::Execution::Call(v8::internal::Isolate*, v8::internal::Handle, v8::internal::Handle, int, v8::internal::Handle*) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993== by 0xB4CF3A: v8::Function::Call(v8::Local, v8::Local, int, v8::Local*) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==28993==
--28993-- WARNING: unhandled amd64-linux syscall: 332
--28993-- You may be able to write your own handler.
--28993-- Read the file README_MISSING_SYSCALL_OR_IOCTL.
--28993-- Nevertheless we consider this a bug.  Please report
--28993-- it at http://valgrind.org/support/bug_reports.html.
==28993==
==28993== HEAP SUMMARY:
==28993==     in use at exit: 6,140 bytes in 23 blocks
==28993==   total heap usage: 12,888 allocs, 12,865 frees, 13,033,244 bytes allocated
==28993==
==28993== LEAK SUMMARY:
==28993==    definitely lost: 0 bytes in 0 blocks
==28993==    indirectly lost: 0 bytes in 0 blocks
==28993==      possibly lost: 304 bytes in 1 blocks
==28993==    still reachable: 5,836 bytes in 22 blocks
==28993==         suppressed: 0 bytes in 0 blocks
==28993== Rerun with --leak-check=full to see details of leaked memory
==28993==
==28993== For counts of detected and suppressed errors, rerun with: -v
==28993== Use --track-origins=yes to see where uninitialised values come from
```

This reports that Node.js is not _completely_ clean as there is some memory
that was allocated but not freed when the process shut down. It is often
impractical or not worth the effort to be completely clean in this respect.
Modern operating systems will clean up the memory of the process after
shutdown, while attempting to free all memory to get a clean report may
have a negative impact on code complexity and shutdown times. Node.js does
a pretty good job, leaving only on the order of 6 KB that are not freed on
shutdown.

## An obvious memory leak

Leaks can be introduced in native addons and the following is a simple
example leak based on the "Hello world" addon from
[node-addon-examples](https://github.com/nodejs/node-addon-examples).

In this example, a loop which allocates ~1MB of memory and never frees it
has been added:

```C++
void* malloc_holder = nullptr;
napi_value Method(napi_env env, napi_callback_info info) {
  napi_status status;
  napi_value world;
  status = napi_create_string_utf8(env, "world", 5, &world);
  assert(status == napi_ok);

  // NEW LEAK HERE
  for (int i=0; i < 1000; i++) {
    malloc_holder = malloc(1000);
  }

  return world;
}
```

When trying to create a memory leak, you need to ensure that
the compiler has not optimized out the code that creates
the leak. For example, by assigning the result of the allocation
to either a global variable or a variable that will be read
afterwards, the compiler will not optimize it out along with
the malloc and Valgrind will properly report the memory leak.
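To see the difference, consider a hypothetical variant of the method above
in which the holder is a local variable instead of a global. This is a
hedged sketch for illustration only; it is not part of the
node-addon-examples code:

```C++
napi_value Method(napi_env env, napi_callback_info info) {
  napi_status status;
  napi_value world;
  status = napi_create_string_utf8(env, "world", 5, &world);
  assert(status == napi_ok);

  // The holder is now a local that is never read, so an optimizing
  // compiler is free to remove both the stores and the malloc() calls.
  void* malloc_holder = nullptr;
  for (int i=0; i < 1000; i++) {
    malloc_holder = malloc(1000);
  }

  return world;
}
```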
With `malloc_holder` as a local variable, the compiler may freely remove
it along with the allocations (since it is not used)
and Valgrind will not find any leaks since they
will no longer exist in the code being run.

Running valgrind on the original example (with the global holder) shows
the following:

```console
user1@minikube1:~/valgrind/node-addon-examples/1_hello_world/napi$ valgrind node hello.js
==1504== Memcheck, a memory error detector
==1504== Copyright (C) 2002-2017, and GNU GPL'd, by Julian Seward et al.
==1504== Using Valgrind-3.13.0 and LibVEX; rerun with -h for copyright info
==1504== Command: node hello.js
==1504==
==1504== Use of uninitialised value of size 8
==1504== at 0x12F2279: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504== by 0x12F3E9C: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504== by 0x12F3C77: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504== by 0xC7C9CF: v8::internal::(anonymous namespace)::Invoke(v8::internal::Isolate*, v8::internal::(anonymous namespace)::InvokeParams const&) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504== by 0xC7CE87: v8::internal::Execution::Call(v8::internal::Isolate*, v8::internal::Handle, v8::internal::Handle, int, v8::internal::Handle*) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504== by 0xB4CF3A: v8::Function::Call(v8::Local, v8::Local, int, v8::Local*) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==1504==
--1504-- WARNING: unhandled amd64-linux syscall: 332
--1504-- You may be able to write your own handler.
--1504-- Read the file README_MISSING_SYSCALL_OR_IOCTL.
--1504-- Nevertheless we consider this a bug.  Please report
--1504-- it at http://valgrind.org/support/bug_reports.html.
world
==1504==
==1504== HEAP SUMMARY:
==1504==     in use at exit: 1,008,003 bytes in 1,032 blocks
==1504==   total heap usage: 17,603 allocs, 16,571 frees, 18,306,103 bytes allocated
==1504==
==1504== LEAK SUMMARY:
==1504==    definitely lost: 996,064 bytes in 997 blocks
==1504==    indirectly lost: 0 bytes in 0 blocks
==1504==      possibly lost: 3,304 bytes in 4 blocks
==1504==    still reachable: 8,635 bytes in 31 blocks
==1504==                       of which reachable via heuristic:
==1504==                         multipleinheritance: 48 bytes in 1 blocks
==1504==         suppressed: 0 bytes in 0 blocks
==1504== Rerun with --leak-check=full to see details of leaked memory
==1504==
==1504== For counts of detected and suppressed errors, rerun with: -v
==1504== Use --track-origins=yes to see where uninitialised values come from
==1504== ERROR SUMMARY: 1 errors from 1 contexts (suppressed: 0 from 0)
```

Valgrind is reporting a problem as it shows 996,064 bytes as
definitely lost and the question is how to find where that memory was
allocated.
The next step is to rerun as suggested in the
output with `--leak-check=full`:

```console
user1@minikube1:~/valgrind/node-addon-examples/1_hello_world/napi$ valgrind --leak-check=full node hello.js
==4174== Memcheck, a memory error detector
==4174== Copyright (C) 2002-2017, and GNU GPL'd, by Julian Seward et al.
==4174== Using Valgrind-3.13.0 and LibVEX; rerun with -h for copyright info
==4174== Command: node hello.js
==4174==
==4174== Use of uninitialised value of size 8
==4174== at 0x12F2279: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F3E9C: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F3C77: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0xC7C9CF: v8::internal::(anonymous namespace)::Invoke(v8::internal::Isolate*, v8::internal::(anonymous namespace)::InvokeParams const&) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0xC7CE87: v8::internal::Execution::Call(v8::internal::Isolate*, v8::internal::Handle, v8::internal::Handle, int, v8::internal::Handle*) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0xB4CF3A: v8::Function::Call(v8::Local, v8::Local, int, v8::Local*) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174==
--4174-- WARNING: unhandled amd64-linux syscall: 332
--4174-- You may be able to write your own handler.
--4174-- Read the file README_MISSING_SYSCALL_OR_IOCTL.
--4174-- Nevertheless we consider this a bug.  Please report
--4174-- it at http://valgrind.org/support/bug_reports.html.
+world +==4174== +==4174== HEAP SUMMARY: +==4174== in use at exit: 1,008,003 bytes in 1,032 blocks +==4174== total heap usage: 17,606 allocs, 16,574 frees, 18,305,977 bytes allocated +==4174== +==4174== 64 bytes in 1 blocks are definitely lost in loss record 17 of 35 +==4174== at 0x4C3017F: operator new(unsigned long) (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so) +==4174== by 0x9AEAD5: napi_module_register (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x4010732: call_init (dl-init.c:72) +==4174== by 0x4010732: _dl_init (dl-init.c:119) +==4174== by 0x40151FE: dl_open_worker (dl-open.c:522) +==4174== by 0x5D052DE: _dl_catch_exception (dl-error-skeleton.c:196) +==4174== by 0x40147C9: _dl_open (dl-open.c:605) +==4174== by 0x4E3CF95: dlopen_doit (dlopen.c:66) +==4174== by 0x5D052DE: _dl_catch_exception (dl-error-skeleton.c:196) +==4174== by 0x5D0536E: _dl_catch_error (dl-error-skeleton.c:215) +==4174== by 0x4E3D734: _dlerror_run (dlerror.c:162) +==4174== by 0x4E3D050: dlopen@@GLIBC_2.2.5 (dlopen.c:87) +==4174== by 0x9B29A0: node::binding::DLOpen(v8::FunctionCallbackInfo const&)::{lambda(node::binding::DLib*)#1}::operator()(node::binding::DLib*) const (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== +==4174== 304 bytes in 1 blocks are possibly lost in loss record 27 of 35 +==4174== at 0x4C31B25: calloc (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so) +==4174== by 0x40134A6: allocate_dtv (dl-tls.c:286) +==4174== by 0x40134A6: _dl_allocate_tls (dl-tls.c:530) +==4174== by 0x5987227: allocate_stack (allocatestack.c:627) +==4174== by 0x5987227: pthread_create@@GLIBC_2.2.5 (pthread_create.c:644) +==4174== by 0xAAF9DC: node::inspector::Agent::Start(std::string const&, node::DebugOptions const&, std::shared_ptr, bool) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x9A8BE7: node::Environment::InitializeInspector(std::unique_ptr >) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0xA1C9A5: node::NodeMainInstance::CreateMainEnvironment(int*) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0xA1CB42: node::NodeMainInstance::Run() (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x9ACB67: node::Start(int, char**) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x5BBFB96: (below main) (libc-start.c:310) +==4174== +==4174== 2,000 bytes in 2 blocks are possibly lost in loss record 33 of 35 +==4174== at 0x4C2FB0F: malloc (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so) +==4174== by 0x9794979: Method(napi_env__*, napi_callback_info__*) (in /home/user1/valgrind/node-addon-examples/1_hello_world/napi/build/Release/hello.node) +==4174== by 0x98F764: v8impl::(anonymous namespace)::FunctionCallbackWrapper::Invoke(v8::FunctionCallbackInfo const&) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0xBA6FC8: v8::internal::MaybeHandle v8::internal::(anonymous namespace)::HandleApiCallHelper(v8::internal::Isolate*, v8::internal::Handle, v8::internal::Handle, v8::internal::Handle, v8::internal::Handle, v8::internal::BuiltinArguments) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0xBA8DB6: v8::internal::Builtin_HandleApiCall(int, unsigned long*, v8::internal::Isolate*) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x1376358: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x12F68A3: ??? 
(in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== +==4174== 997,000 bytes in 997 blocks are definitely lost in loss record 35 of 35 +==4174== at 0x4C2FB0F: malloc (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so) +==4174== by 0x9794979: Method(napi_env__*, napi_callback_info__*) (in /home/user1/valgrind/node-addon-examples/1_hello_world/napi/build/Release/hello.node) +==4174== by 0x98F764: v8impl::(anonymous namespace)::FunctionCallbackWrapper::Invoke(v8::FunctionCallbackInfo const&) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0xBA6FC8: v8::internal::MaybeHandle v8::internal::(anonymous namespace)::HandleApiCallHelper(v8::internal::Isolate*, v8::internal::Handle, v8::internal::Handle, v8::internal::Handle, v8::internal::Handle, v8::internal::BuiltinArguments) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0xBA8DB6: v8::internal::Builtin_HandleApiCall(int, unsigned long*, v8::internal::Isolate*) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x1376358: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node) +==4174== +==4174== LEAK SUMMARY: +==4174== definitely lost: 997,064 bytes in 998 blocks +==4174== indirectly lost: 0 bytes in 0 blocks +==4174== possibly lost: 2,304 bytes in 3 blocks +==4174== still reachable: 8,635 bytes in 31 blocks +==4174== of which reachable via heuristic: +==4174== multipleinheritance: 48 bytes in 1 blocks +==4174== suppressed: 0 bytes in 0 blocks +==4174== Reachable blocks (those to which a pointer was found) are not shown. 
==4174== To see them, rerun with: --leak-check=full --show-leak-kinds=all
==4174==
==4174== For counts of detected and suppressed errors, rerun with: -v
==4174== Use --track-origins=yes to see where uninitialised values come from
==4174== ERROR SUMMARY: 5 errors from 5 contexts (suppressed: 0 from 0)
```

This is the most interesting part of the report:

```console
==4174== 997,000 bytes in 997 blocks are definitely lost in loss record 35 of 35
==4174== at 0x4C2FB0F: malloc (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
==4174== by 0x9794979: Method(napi_env__*, napi_callback_info__*) (in /home/user1/valgrind/node-addon-examples/1_hello_world/napi/build/Release/hello.node)
==4174== by 0x98F764: v8impl::(anonymous namespace)::FunctionCallbackWrapper::Invoke(v8::FunctionCallbackInfo const&) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0xBA6FC8: v8::internal::MaybeHandle v8::internal::(anonymous namespace)::HandleApiCallHelper(v8::internal::Isolate*, v8::internal::Handle, v8::internal::Handle, v8::internal::Handle, v8::internal::Handle, v8::internal::BuiltinArguments) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0xBA8DB6: v8::internal::Builtin_HandleApiCall(int, unsigned long*, v8::internal::Isolate*) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x1376358: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
```

From the stack trace we can tell that the leak came from a native addon:

```console
==4174== by 0x9794979: Method(napi_env__*, napi_callback_info__*) (in /home/user1/valgrind/node-addon-examples/1_hello_world/napi/build/Release/hello.node)
```

What we can't tell is where in the native addon the memory is being
allocated. This is because by default the addon is compiled without
the debug symbols that valgrind needs to be able to provide more
information.

## Enabling debug symbols to get more information

Leaks may be either in addons or Node.js itself. The sections which
follow cover the steps needed to enable debug symbols to get more
information.

### Native addons

To enable debug symbols for all of your addons that are compiled on
install, use:

```console
npm install --debug
```

Any options which are not consumed by npm are passed on to node-gyp and this
results in the addons being compiled with the debug option.

If the native addon contains pre-built binaries you will need to force
a rebuild:

```console
npm install --debug
npm rebuild
```

The next step is to run valgrind again after the rebuild.
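For example, reusing the `--leak-check=full` option that valgrind itself
suggested in the earlier output (the script name assumes the example addon
from above):

```console
valgrind --leak-check=full node hello.js
```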
This time the information
for the leaking location includes the name of the source file and the
line number:

```console
==18481== 997,000 bytes in 997 blocks are definitely lost in loss record 35 of 35
==18481== at 0x4C2FB0F: malloc (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
>>>>> ==18481== by 0x9794989: Method(napi_env__*, napi_callback_info__*) (hello.cc:13) <<<<<
==18481== by 0x98F764: v8impl::(anonymous namespace)::FunctionCallbackWrapper::Invoke(v8::FunctionCallbackInfo const&) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==18481== by 0xBA6FC8: v8::internal::MaybeHandle v8::internal::(anonymous namespace)::HandleApiCallHelper(v8::internal::Isolate*, v8::internal::Handle, v8::internal::Handle, v8::internal::Handle, v8::internal::Handle, v8::internal::BuiltinArguments) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==18481== by 0xBA8DB6: v8::internal::Builtin_HandleApiCall(int, unsigned long*, v8::internal::Isolate*) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==18481== by 0x1376358: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==18481== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==18481== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==18481== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==18481== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==18481== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==18481== by 0x12F68A3: ??? (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
```

This new output shows us exactly where the leak is occurring in the file `hello.cc`:

```C++
  6 void* malloc_holder = nullptr;
  7 napi_value Method(napi_env env, napi_callback_info info) {
  8   napi_status status;
  9   napi_value world;
 10   status = napi_create_string_utf8(env, "world", 5, &world);
 11   assert(status == napi_ok);
 12   for (int i=0; i< 1000; i++) {
 13     malloc_holder = malloc(1000); // <<<<<< This is where we are allocating the memory that is not freed
 14   }
 15   return world;
 16 }
```

### Node.js binary

If the leak is not in an addon and is instead in the Node.js binary itself,
you may need to compile node yourself and turn on debug symbols.
Looking at
this entry reported by valgrind, with a release binary we see:

```console
==4174== 304 bytes in 1 blocks are possibly lost in loss record 27 of 35
==4174== at 0x4C31B25: calloc (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
==4174== by 0x40134A6: allocate_dtv (dl-tls.c:286)
==4174== by 0x40134A6: _dl_allocate_tls (dl-tls.c:530)
==4174== by 0x5987227: allocate_stack (allocatestack.c:627)
==4174== by 0x5987227: pthread_create@@GLIBC_2.2.5 (pthread_create.c:644)
==4174== by 0xAAF9DC: node::inspector::Agent::Start(std::string const&, node::DebugOptions const&, std::shared_ptr, bool) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x9A8BE7: node::Environment::InitializeInspector(std::unique_ptr >) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0xA1C9A5: node::NodeMainInstance::CreateMainEnvironment(int*) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0xA1CB42: node::NodeMainInstance::Run() (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x9ACB67: node::Start(int, char**) (in /home/user1/valgrind/node-v12.14.1-linux-x64/bin/node)
==4174== by 0x5BBFB96: (below main) (libc-start.c:310)
```

This gives us some information about where to look (`node::inspector::Agent::Start`)
but not where in that function. We get more information than you might expect
(or see by default with addons) because the Node.js binary exports many of
its symbols using `-rdynamic` so that they can be used by addons. If the stack
gives you enough information to track down where the leak is, that's great;
otherwise the next step is to compile a debug build of Node.js.

To get additional information with valgrind:

* Check out the Node.js source corresponding to the release that you
  want to debug. For example:

```console
git clone https://github.com/nodejs/node.git
git checkout v12.14.1
```

* Compile with debug enabled (for additional info see
  [building a debug build](https://github.com/nodejs/node/blob/v12.14.1/BUILDING.md#building-a-debug-build)).
  For example, on *nix:

```console
./configure --debug
make -j4
```

* Make sure to run with your compiled debug version of Node.js. Having used
  `./configure --debug`, two binaries will have been built when `make` was run.
  You must use the one which is in `out/Debug`.

Running valgrind using the debug build of Node.js shows:

```console
==44112== 592 bytes in 1 blocks are possibly lost in loss record 26 of 27
==44112== at 0x4C2BF79: calloc (vg_replace_malloc.c:762)
==44112== by 0x4012754: _dl_allocate_tls (in /usr/lib64/ld-2.17.so)
==44112== by 0x586287B: pthread_create@@GLIBC_2.2.5 (in /usr/lib64/libpthread-2.17.so)
==44112== by 0xFAB2D2: node::inspector::(anonymous namespace)::StartDebugSignalHandler() (inspector_agent.cc:140)
==44112== by 0xFACB10: node::inspector::Agent::Start(std::string const&, node::DebugOptions const&, std::shared_ptr, bool) (inspector_agent.cc:777)
==44112== by 0xE3A0BB: node::Environment::InitializeInspector(std::unique_ptr >) (node.cc:216)
==44112== by 0xEE8F3E: node::NodeMainInstance::CreateMainEnvironment(int*) (node_main_instance.cc:222)
==44112== by 0xEE8831: node::NodeMainInstance::Run() (node_main_instance.cc:108)
==44112== by 0xE3CDEC: node::Start(int, char**) (node.cc:996)
==44112== by 0x22D8BBF: main (node_main.cc:126)
```

Now we can see the specific file name and line in the Node.js code which
caused the allocation (`inspector_agent.cc:140`).
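Whether the leak is in an addon or in Node.js itself, once the allocation
site is known the fix is often straightforward. As a closing illustration,
here is a hedged sketch of one possible way to repair the leak in the addon
example from earlier in this guide (it is not the only possible fix):

```C++
napi_value Method(napi_env env, napi_callback_info info) {
  napi_status status;
  napi_value world;
  status = napi_create_string_utf8(env, "world", 5, &world);
  assert(status == napi_ok);

  for (int i=0; i < 1000; i++) {
    void* buffer = malloc(1000);
    // ... use the buffer ...
    free(buffer);  // released before the only pointer to it goes away
  }

  return world;
}
```

With a change along these lines, rerunning valgrind should show the
addon's ~997,000 bytes of definitely lost memory drop back to zero.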
From 57302f866e9ba954992ac32334b6dad3ebd4090f Mon Sep 17 00:00:00 2001
From: Anna Henningsen
Date: Thu, 13 Feb 2020 20:40:50 +0100
Subject: [PATCH 02/91] src: prefer 3-argument Array::New()
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

This is nicer, because:

1. It reduces overall code size,
2. It’s faster, because `Object::Set()` calls are relatively slow, and
3. It helps avoid invalid `.Check()`/`.FromJust()` calls.

PR-URL: https://github.com/nodejs/node/pull/31775
Reviewed-By: James M Snell
Reviewed-By: Ben Noordhuis
Reviewed-By: Colin Ihrig
Reviewed-By: David Carlier
---
 src/cares_wrap.cc  | 30 +++++++++---------
 src/js_stream.cc   |  9 +++---
 src/module_wrap.cc | 18 ++++++-----
 src/node_crypto.cc | 78 ++++++++++++++++++++--------------------------
 src/node_file.cc   | 27 ++++++----------
 src/node_v8.cc     | 16 +++++-----
 src/spawn_sync.cc  | 10 +++---
 7 files changed, 85 insertions(+), 103 deletions(-)

diff --git a/src/cares_wrap.cc b/src/cares_wrap.cc
index f7a02e469aa79a..8eeaeddf8300d8 100644
--- a/src/cares_wrap.cc
+++ b/src/cares_wrap.cc
@@ -749,16 +749,12 @@ Local<Array> AddrTTLToArray(Environment* env,
                             const T* addrttls,
                             size_t naddrttls) {
   auto isolate = env->isolate();
-  EscapableHandleScope escapable_handle_scope(isolate);
-  auto context = env->context();
-
-  Local<Array> ttls = Array::New(isolate, naddrttls);
-  for (size_t i = 0; i < naddrttls; i++) {
-    auto value = Integer::NewFromUnsigned(isolate, addrttls[i].ttl);
-    ttls->Set(context, i, value).Check();
-  }
+  MaybeStackBuffer<Local<Value>, 8> ttls(naddrttls);
+  for (size_t i = 0; i < naddrttls; i++)
+    ttls[i] = Integer::NewFromUnsigned(isolate, addrttls[i].ttl);
 
-  return escapable_handle_scope.Escape(ttls);
+  return Array::New(isolate, ttls.out(), naddrttls);
 }
 
 
@@ -2039,6 +2035,7 @@ void GetServers(const FunctionCallbackInfo<Value>& args) {
 
   int r = ares_get_servers_ports(channel->cares_channel(), &servers);
   CHECK_EQ(r, ARES_SUCCESS);
+  auto cleanup = OnScopeLeave([&]() { ares_free_data(servers); });
 
   ares_addr_port_node* cur = servers;
 
@@ -2049,17 +2046,18 @@ void GetServers(const FunctionCallbackInfo<Value>& args) {
     int err = uv_inet_ntop(cur->family, caddr, ip, sizeof(ip));
     CHECK_EQ(err, 0);
 
-    Local<Array> ret = Array::New(env->isolate(), 2);
-    ret->Set(env->context(), 0, OneByteString(env->isolate(), ip)).Check();
-    ret->Set(env->context(),
-             1,
-             Integer::New(env->isolate(), cur->udp_port)).Check();
+    Local<Value> ret[] = {
+      OneByteString(env->isolate(), ip),
+      Integer::New(env->isolate(), cur->udp_port)
+    };
 
-    server_array->Set(env->context(), i, ret).Check();
+    if (server_array->Set(env->context(), i,
+                          Array::New(env->isolate(), ret, arraysize(ret)))
+            .IsNothing()) {
+      return;
+    }
   }
 
-  ares_free_data(servers);
-
   args.GetReturnValue().Set(server_array);
 }

diff --git a/src/js_stream.cc b/src/js_stream.cc
index a67fd37dbdb2b6..64941b1c4e4fb7 100644
--- a/src/js_stream.cc
+++ b/src/js_stream.cc
@@ -116,16 +116,15 @@ int JSStream::DoWrite(WriteWrap* w,
   HandleScope scope(env()->isolate());
   Context::Scope context_scope(env()->context());
 
-  Local<Array> bufs_arr = Array::New(env()->isolate(), count);
-  Local<Value> buf;
+  MaybeStackBuffer<Local<Value>, 16> bufs_arr(count);
   for (size_t i = 0; i < count; i++) {
-    buf = Buffer::Copy(env(), bufs[i].base, bufs[i].len).ToLocalChecked();
-    bufs_arr->Set(env()->context(), i, buf).Check();
+    bufs_arr[i] =
+        Buffer::Copy(env(), bufs[i].base, bufs[i].len).ToLocalChecked();
   }
 
   Local<Value> argv[] = {
     w->object(),
-    bufs_arr
+    Array::New(env()->isolate(), bufs_arr.out(), count)
   };
 
   TryCatchScope try_catch(env());

diff --git a/src/module_wrap.cc b/src/module_wrap.cc
index 0bc32f7846b022..68359178f4ab38 100644
--- a/src/module_wrap.cc
+++ b/src/module_wrap.cc
@@ -264,11 +264,11 @@ void ModuleWrap::Link(const FunctionCallbackInfo<Value>& args) {
   Local<Context> mod_context = obj->context_.Get(isolate);
   Local<Module> module = obj->module_.Get(isolate);
 
-  Local<Array> promises = Array::New(isolate,
-                                     module->GetModuleRequestsLength());
+  const int module_requests_length = module->GetModuleRequestsLength();
+  MaybeStackBuffer<Local<Value>, 16> promises(module_requests_length);
 
   // call the dependency resolve callbacks
-  for (int i = 0; i < module->GetModuleRequestsLength(); i++) {
+  for (int i = 0; i < module_requests_length; i++) {
     Local<String> specifier = module->GetModuleRequest(i);
     Utf8Value specifier_utf8(env->isolate(), specifier);
     std::string specifier_std(*specifier_utf8, specifier_utf8.length());
@@ -290,10 +290,11 @@ void ModuleWrap::Link(const FunctionCallbackInfo<Value>& args) {
     Local<Promise> resolve_promise = resolve_return_value.As<Promise>();
     obj->resolve_cache_[specifier_std].Reset(env->isolate(), resolve_promise);
 
-    promises->Set(mod_context, i, resolve_promise).Check();
+    promises[i] = resolve_promise;
   }
 
-  args.GetReturnValue().Set(promises);
+  args.GetReturnValue().Set(
+      Array::New(isolate, promises.out(), promises.length()));
 }
 
 void ModuleWrap::Instantiate(const FunctionCallbackInfo<Value>& args) {
@@ -426,12 +427,13 @@ void ModuleWrap::GetStaticDependencySpecifiers(
 
   int count = module->GetModuleRequestsLength();
 
-  Local<Array> specifiers = Array::New(env->isolate(), count);
+  MaybeStackBuffer<Local<Value>, 16> specifiers(count);
 
   for (int i = 0; i < count; i++)
-    specifiers->Set(env->context(), i, module->GetModuleRequest(i)).Check();
+    specifiers[i] = module->GetModuleRequest(i);
 
-  args.GetReturnValue().Set(specifiers);
+  args.GetReturnValue().Set(
+      Array::New(env->isolate(), specifiers.out(), count));
 }
 
 void ModuleWrap::GetError(const FunctionCallbackInfo<Value>& args) {

diff --git a/src/node_crypto.cc b/src/node_crypto.cc
index 92760fb8c8577b..2176fffc543e0b 100644
--- a/src/node_crypto.cc
+++ b/src/node_crypto.cc
@@ -1011,19 +1011,19 @@ static X509_STORE* NewRootCertStore() {
 
 void GetRootCertificates(const FunctionCallbackInfo<Value>& args) {
   Environment* env = Environment::GetCurrent(args);
-  Local<Array> result = Array::New(env->isolate(), arraysize(root_certs));
+  Local<Value> result[arraysize(root_certs)];
 
   for (size_t i = 0; i < arraysize(root_certs); i++) {
-    Local<Value> value;
-    if (!String::NewFromOneByte(env->isolate(),
-                                reinterpret_cast<const uint8_t*>(root_certs[i]),
-                                NewStringType::kNormal).ToLocal(&value) ||
-        !result->Set(env->context(), i, value).FromMaybe(false)) {
+    if (!String::NewFromOneByte(
+            env->isolate(),
+            reinterpret_cast<const uint8_t*>(root_certs[i]),
+            NewStringType::kNormal).ToLocal(&result[i])) {
       return;
     }
   }
 
-  args.GetReturnValue().Set(result);
+  args.GetReturnValue().Set(
+      Array::New(env->isolate(), result, arraysize(root_certs)));
 }
 
 
@@ -2138,22 +2138,22 @@ static Local<Object> X509ToObject(Environment* env, X509* cert) {
   StackOfASN1 eku(static_cast<STACK_OF(ASN1_OBJECT)*>(
       X509_get_ext_d2i(cert, NID_ext_key_usage, nullptr, nullptr)));
   if (eku) {
-    Local<Array> ext_key_usage = Array::New(env->isolate());
+    const int count = sk_ASN1_OBJECT_num(eku.get());
+    MaybeStackBuffer<Local<Value>, 16> ext_key_usage(count);
     char buf[256];
 
     int j = 0;
-    for (int i = 0; i < sk_ASN1_OBJECT_num(eku.get()); i++) {
+    for (int i = 0; i < count; i++) {
       if (OBJ_obj2txt(buf, sizeof(buf), sk_ASN1_OBJECT_value(eku.get(), i), 1) >= 0) {
-        ext_key_usage->Set(context,
-                           j++,
-                           OneByteString(env->isolate(), buf)).Check();
+        ext_key_usage[j++] = OneByteString(env->isolate(), buf);
       }
     }
     eku.reset();
 
-    info->Set(context, env->ext_key_usage_string(), ext_key_usage).Check();
+    info->Set(context, env->ext_key_usage_string(),
+              Array::New(env->isolate(), ext_key_usage.out(), count)).Check();
   }
 
   if (ASN1_INTEGER* serial_number = X509_get_serialNumber(cert)) {
@@ -6799,15 +6799,8 @@ void GenerateKeyPair(const FunctionCallbackInfo<Value>& args,
   Local<Value> err, pubkey, privkey;
   job->ToResult(&err, &pubkey, &privkey);
 
-  bool (*IsNotTrue)(Maybe<bool>) = [](Maybe<bool> maybe) {
-    return maybe.IsNothing() || !maybe.ToChecked();
-  };
-  Local<Array> ret = Array::New(env->isolate(), 3);
-  if (IsNotTrue(ret->Set(env->context(), 0, err)) ||
-      IsNotTrue(ret->Set(env->context(), 1, pubkey)) ||
-      IsNotTrue(ret->Set(env->context(), 2, privkey)))
-    return;
-  args.GetReturnValue().Set(ret);
+  Local<Value> ret[] = { err, pubkey, privkey };
+  args.GetReturnValue().Set(Array::New(env->isolate(), ret, arraysize(ret)));
 }
 
 void GenerateKeyPairRSA(const FunctionCallbackInfo<Value>& args) {
@@ -6940,17 +6933,6 @@ void GetSSLCiphers(const FunctionCallbackInfo<Value>& args) {
   CHECK(ssl);
 
   STACK_OF(SSL_CIPHER)* ciphers = SSL_get_ciphers(ssl.get());
-  int n = sk_SSL_CIPHER_num(ciphers);
-  Local<Array> arr = Array::New(env->isolate(), n);
-
-  for (int i = 0; i < n; ++i) {
-    const SSL_CIPHER* cipher = sk_SSL_CIPHER_value(ciphers, i);
-    arr->Set(env->context(),
-             i,
-             OneByteString(args.GetIsolate(),
-                           SSL_CIPHER_get_name(cipher))).Check();
-  }
-
   // TLSv1.3 ciphers aren't listed by EVP. There are only 5, we could just
   // document them, but since there are only 5, easier to just add them manually
   // and not have to explain their absence in the API docs. They are lower-cased
@@ -6963,13 +6945,20 @@ void GetSSLCiphers(const FunctionCallbackInfo<Value>& args) {
     "tls_aes_128_ccm_sha256"
   };
 
+  const int n = sk_SSL_CIPHER_num(ciphers);
+  std::vector<Local<Value>> arr(n + arraysize(TLS13_CIPHERS));
+
+  for (int i = 0; i < n; ++i) {
+    const SSL_CIPHER* cipher = sk_SSL_CIPHER_value(ciphers, i);
+    arr[i] = OneByteString(env->isolate(), SSL_CIPHER_get_name(cipher));
+  }
+
   for (unsigned i = 0; i < arraysize(TLS13_CIPHERS); ++i) {
     const char* name = TLS13_CIPHERS[i];
-    arr->Set(env->context(),
-             arr->Length(), OneByteString(args.GetIsolate(), name)).Check();
+    arr[n + i] = OneByteString(env->isolate(), name);
   }
 
-  args.GetReturnValue().Set(arr);
+  args.GetReturnValue().Set(Array::New(env->isolate(), arr.data(), arr.size()));
 }
 
 
@@ -7020,22 +7009,23 @@ void GetHashes(const FunctionCallbackInfo<Value>& args) {
 void GetCurves(const FunctionCallbackInfo<Value>& args) {
   Environment* env = Environment::GetCurrent(args);
   const size_t num_curves = EC_get_builtin_curves(nullptr, 0);
-  Local<Array> arr = Array::New(env->isolate(), num_curves);
 
   if (num_curves) {
     std::vector<EC_builtin_curve> curves(num_curves);
 
    if (EC_get_builtin_curves(curves.data(), num_curves)) {
-      for (size_t i = 0; i < num_curves; i++) {
-        arr->Set(env->context(),
-                 i,
-                 OneByteString(env->isolate(),
-                               OBJ_nid2sn(curves[i].nid))).Check();
-      }
+      std::vector<Local<Value>> arr(num_curves);
+
+      for (size_t i = 0; i < num_curves; i++)
+        arr[i] = OneByteString(env->isolate(), OBJ_nid2sn(curves[i].nid));
+
+      args.GetReturnValue().Set(
+          Array::New(env->isolate(), arr.data(), arr.size()));
+      return;
    }
  }
 
-  args.GetReturnValue().Set(arr);
+  args.GetReturnValue().Set(Array::New(env->isolate()));
 }

diff --git a/src/node_file.cc b/src/node_file.cc
index ddba9ad42ccbc4..360f20e5f6c36a 100644
--- a/src/node_file.cc
+++ b/src/node_file.cc
@@ -700,16 +700,11 @@ void AfterScanDirWithTypes(uv_fs_t* req) {
     type_v.emplace_back(Integer::New(isolate, ent.type));
   }
 
-  Local<Array> result = Array::New(isolate, 2);
-  result->Set(env->context(),
-              0,
-              Array::New(isolate, name_v.data(),
-                         name_v.size())).Check();
-  result->Set(env->context(),
-              1,
-              Array::New(isolate, type_v.data(),
-                         type_v.size())).Check();
-  req_wrap->Resolve(result);
+  Local<Value> result[] = {
+    Array::New(isolate, name_v.data(), name_v.size()),
+    Array::New(isolate, type_v.data(), type_v.size())
+  };
+  req_wrap->Resolve(Array::New(isolate, result, arraysize(result)));
 }
 
 void Access(const FunctionCallbackInfo<Value>& args) {
@@ -1519,13 +1514,11 @@ static void ReadDir(const FunctionCallbackInfo<Value>& args) {
 
     Local<Array> names = Array::New(isolate, name_v.data(), name_v.size());
     if (with_types) {
-      Local<Array> result = Array::New(isolate, 2);
-      result->Set(env->context(), 0, names).Check();
-      result->Set(env->context(),
-                  1,
-                  Array::New(isolate, type_v.data(),
-                             type_v.size())).Check();
-      args.GetReturnValue().Set(result);
+      Local<Value> result[] = {
+        names,
+        Array::New(isolate, type_v.data(), type_v.size())
+      };
+      args.GetReturnValue().Set(Array::New(isolate, result, arraysize(result)));
     } else {
       args.GetReturnValue().Set(names);
     }

diff --git a/src/node_v8.cc b/src/node_v8.cc
index ed2e71de1069bb..32a85944da48c7 100644
--- a/src/node_v8.cc
+++ b/src/node_v8.cc
@@ -218,19 +218,19 @@ void Initialize(Local<Object> target,
   // Heap space names are extracted once and exposed to JavaScript to
   // avoid excessive creation of heap space name Strings.
   HeapSpaceStatistics s;
-  const Local<Array> heap_spaces = Array::New(env->isolate(),
-                                              number_of_heap_spaces);
+  MaybeStackBuffer<Local<Value>, 16> heap_spaces(number_of_heap_spaces);
   for (size_t i = 0; i < number_of_heap_spaces; i++) {
     env->isolate()->GetHeapSpaceStatistics(&s, i);
-    Local<String> heap_space_name = String::NewFromUtf8(env->isolate(),
-                                                        s.space_name(),
-                                                        NewStringType::kNormal)
-                                        .ToLocalChecked();
-    heap_spaces->Set(env->context(), i, heap_space_name).Check();
+    heap_spaces[i] = String::NewFromUtf8(env->isolate(),
+                                         s.space_name(),
+                                         NewStringType::kNormal)
+                         .ToLocalChecked();
   }
   target->Set(env->context(),
              FIXED_ONE_BYTE_STRING(env->isolate(), "kHeapSpaces"),
-              heap_spaces).Check();
+              Array::New(env->isolate(),
+                         heap_spaces.out(),
+                         number_of_heap_spaces)).Check();
 
   env->SetMethod(target,
                  "updateHeapSpaceStatisticsArrayBuffer",

diff --git a/src/spawn_sync.cc b/src/spawn_sync.cc
index 3b277ad70adb66..589b77f6c1eb95 100644
--- a/src/spawn_sync.cc
+++ b/src/spawn_sync.cc
@@ -721,18 +721,18 @@ Local<Array> SyncProcessRunner::BuildOutputArray() {
   CHECK(!stdio_pipes_.empty());
 
   EscapableHandleScope scope(env()->isolate());
-  Local<Context> context = env()->context();
-  Local<Array> js_output = Array::New(env()->isolate(), stdio_count_);
+  MaybeStackBuffer<Local<Value>, 8> js_output(stdio_pipes_.size());
 
   for (uint32_t i = 0; i < stdio_pipes_.size(); i++) {
     SyncProcessStdioPipe* h = stdio_pipes_[i].get();
     if (h != nullptr && h->writable())
-      js_output->Set(context, i, h->GetOutputAsBuffer(env())).Check();
+      js_output[i] = h->GetOutputAsBuffer(env());
     else
-      js_output->Set(context, i, Null(env()->isolate())).Check();
+      js_output[i] = Null(env()->isolate());
   }
 
-  return scope.Escape(js_output);
+  return scope.Escape(
+      Array::New(env()->isolate(), js_output.out(), js_output.length()));
 }
 
 Maybe<bool> SyncProcessRunner::ParseOptions(Local<Value> js_value) {

From eb2dce834259130b4652e4de8f8f034079b7a584 Mon Sep 17 00:00:00 2001
From: Samuel Attard
Date: Thu, 13 Feb 2020 12:23:58 -0800
Subject: [PATCH 03/91] doc: claim ABI version 82 for Electron 10
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

PR-URL: https://github.com/nodejs/node/pull/31778
Reviewed-By: Richard Lau
Reviewed-By: Anna
Henningsen Reviewed-By: Benjamin Gruenbaum Reviewed-By: Colin Ihrig Reviewed-By: James M Snell Reviewed-By: Jiawen Geng Reviewed-By: Michaël Zasso Reviewed-By: Myles Borins Reviewed-By: Luigi Pinca --- doc/abi_version_registry.json | 2 ++ 1 file changed, 2 insertions(+) diff --git a/doc/abi_version_registry.json b/doc/abi_version_registry.json index ba27aeb024d0d8..f044d97599fcc0 100644 --- a/doc/abi_version_registry.json +++ b/doc/abi_version_registry.json @@ -1,5 +1,7 @@ { "NODE_MODULE_VERSION": [ + { "modules": 82, "runtime": "electron", "variant": "electron", "versions": "10" }, + { "modules": 81, "runtime": "node", "variant": "v8_7.9", "versions": "14.0.0-pre" }, { "modules": 80, "runtime": "electron", "variant": "electron", "versions": "9" }, { "modules": 79, "runtime": "node", "variant": "v8_7.8", "versions": "13" }, { "modules": 78, "runtime": "node", "variant": "v8_7.7", "versions": "13.0.0-pre" }, From f87ac90849d44f26cecb38517c2b4cf53e99a98a Mon Sep 17 00:00:00 2001 From: Anna Henningsen Date: Thu, 13 Feb 2020 21:32:43 +0100 Subject: [PATCH 04/91] worker: unroll file extension regexp Refs: https://github.com/nodejs/node/pull/31662#discussion_r377016190 PR-URL: https://github.com/nodejs/node/pull/31779 Reviewed-By: Richard Lau Reviewed-By: Benjamin Gruenbaum Reviewed-By: Denys Otrishko Reviewed-By: Gus Caplan Reviewed-By: Colin Ihrig Reviewed-By: James M Snell Reviewed-By: Yongsheng Zhang Reviewed-By: Ruben Bridgewater --- lib/internal/worker.js | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/lib/internal/worker.js b/lib/internal/worker.js index b690ab82debc29..de626af1bef962 100644 --- a/lib/internal/worker.js +++ b/lib/internal/worker.js @@ -104,7 +104,7 @@ class Worker extends EventEmitter { filename = path.resolve(filename); const ext = path.extname(filename); - if (!/^\.[cm]?js$/.test(ext)) { + if (ext !== '.js' && ext !== '.mjs' && ext !== '.cjs') { throw new ERR_WORKER_UNSUPPORTED_EXTENSION(ext); } } From 9e4aad705f44f4d425e89f13aeb6c9c6daea95f4 Mon Sep 17 00:00:00 2001 From: Jeff Date: Fri, 14 Feb 2020 22:36:11 +0800 Subject: [PATCH 05/91] doc: fix typos in doc/api/https.md MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/31793 Reviewed-By: Michaël Zasso Reviewed-By: James M Snell Reviewed-By: Yongsheng Zhang Reviewed-By: Luigi Pinca Reviewed-By: Colin Ihrig Reviewed-By: Richard Lau --- doc/api/https.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/api/https.md b/doc/api/https.md index 19d2053d377c59..ac53b6f0fdeccf 100644 --- a/doc/api/https.md +++ b/doc/api/https.md @@ -372,7 +372,7 @@ const options = { return new Error(msg); } - // Pin the exact certificate, rather then the pub key + // Pin the exact certificate, rather than the pub key const cert256 = '25:FE:39:32:D9:63:8C:8A:FC:A1:9A:29:87:' + 'D8:3E:4C:1D:98:DB:71:E4:1A:48:03:98:EA:22:6A:BD:8B:93:16'; if (cert.fingerprint256 !== cert256) { From b177bba5553090e7a3e1abdd746560d48ec29d51 Mon Sep 17 00:00:00 2001 From: Jeremiah Senkpiel Date: Mon, 10 Feb 2020 10:46:02 -0800 Subject: [PATCH 06/91] doc: move @Fishrock123 to a previous releaser MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit I have not done a release in well over a year, maybe even two. I also don't really plan to do more, as Node.js releases are very tedious. 
PR-URL: https://github.com/nodejs/node/pull/31725 Reviewed-By: Matteo Collina Reviewed-By: Daniel Bevenius Reviewed-By: James M Snell Reviewed-By: Myles Borins Reviewed-By: Anna Henningsen Reviewed-By: Anto Aravinth Reviewed-By: Colin Ihrig Reviewed-By: Tobias Nießen Reviewed-By: Gus Caplan Reviewed-By: Gireesh Punathil Reviewed-By: Michael Dawson --- README.md | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/README.md b/README.md index 07d6ad4e7ed039..7f82c490ccb674 100644 --- a/README.md +++ b/README.md @@ -546,8 +546,6 @@ GPG keys used to sign Node.js releases: `77984A986EBC2AA786BC0F66B01FBB92821C587A` * **James M Snell** <jasnell@keybase.io> `71DCFD284A79C3B38668286BC97EC7A07EDE3FC1` -* **Jeremiah Senkpiel** <fishrock@keybase.io> -`FD3A5288F042B6850C66B31F09FE44734EB7990E` * **Michaël Zasso** <targos@protonmail.com> `8FCCA13FEF1D0C2E91008E09770F7A9A5AE15600` * **Myles Borins** <myles.borins@gmail.com> @@ -568,7 +566,6 @@ gpg --keyserver pool.sks-keyservers.net --recv-keys 94AE36675C464D64BAFA68DD7434 gpg --keyserver pool.sks-keyservers.net --recv-keys B9AE9905FFD7803F25714661B63B535A4C206CA9 gpg --keyserver pool.sks-keyservers.net --recv-keys 77984A986EBC2AA786BC0F66B01FBB92821C587A gpg --keyserver pool.sks-keyservers.net --recv-keys 71DCFD284A79C3B38668286BC97EC7A07EDE3FC1 -gpg --keyserver pool.sks-keyservers.net --recv-keys FD3A5288F042B6850C66B31F09FE44734EB7990E gpg --keyserver pool.sks-keyservers.net --recv-keys 8FCCA13FEF1D0C2E91008E09770F7A9A5AE15600 gpg --keyserver pool.sks-keyservers.net --recv-keys C4F0DFFF4E8C1A8236409D08E73BC641CC11F4C8 gpg --keyserver pool.sks-keyservers.net --recv-keys DD8F2338BAE7501E3DD5AC78C273792F7D83545D @@ -580,6 +577,8 @@ use these keys to verify a downloaded file. Other keys used to sign some previous releases: +* **Jeremiah Senkpiel** <fishrock@keybase.io> +`FD3A5288F042B6850C66B31F09FE44734EB7990E` * **Chris Dickinson** <christopher.s.dickinson@gmail.com> `9554F04D7259F04124DE6B476D5A82AC7E37093B` * **Isaac Z. Schlueter** <i@izs.me> From 7f4d6ee8eaac0a709ce4f3416625c90468ce4566 Mon Sep 17 00:00:00 2001 From: Jeremiah Senkpiel Date: Mon, 10 Feb 2020 10:56:58 -0800 Subject: [PATCH 07/91] doc: move @Fishrock123 to TSC Emeriti MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit It was a good run. Almost 5 years. I haven't really been involved in the last 3+? months though, so it's time I call it and 'retire'. I think it is unlikely that I'll be on the TSC again, as node is unfortunately becoming increasingly disinteresting (& frustrating) to me. (So long and thanks for all the fish!) 
PR-URL: https://github.com/nodejs/node/pull/31725 Reviewed-By: Matteo Collina Reviewed-By: Daniel Bevenius Reviewed-By: James M Snell Reviewed-By: Myles Borins Reviewed-By: Anna Henningsen Reviewed-By: Anto Aravinth Reviewed-By: Colin Ihrig Reviewed-By: Tobias Nießen Reviewed-By: Gus Caplan Reviewed-By: Gireesh Punathil Reviewed-By: Michael Dawson --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index 7f82c490ccb674..9f876bec200802 100644 --- a/README.md +++ b/README.md @@ -165,8 +165,6 @@ For information about the governance of the Node.js project, see **Daniel Bevenius** <daniel.bevenius@gmail.com> (he/him) * [fhinkel](https://github.com/fhinkel) - **Franziska Hinkelmann** <franziska.hinkelmann@gmail.com> (she/her) -* [Fishrock123](https://github.com/Fishrock123) - -**Jeremiah Senkpiel** <fishrock123@rocketmail.com> * [gabrielschulhof](https://github.com/gabrielschulhof) - **Gabriel Schulhof** <gabriel.schulhof@intel.com> * [gireeshpunathil](https://github.com/gireeshpunathil) - @@ -200,6 +198,8 @@ For information about the governance of the Node.js project, see **Chris Dickinson** <christopher.s.dickinson@gmail.com> * [evanlucas](https://github.com/evanlucas) - **Evan Lucas** <evanlucas@me.com> (he/him) +* [Fishrock123](https://github.com/Fishrock123) - +**Jeremiah Senkpiel** <fishrock123@rocketmail.com> * [gibfahn](https://github.com/gibfahn) - **Gibson Fahnestock** <gibfahn@gmail.com> (he/him) * [indutny](https://github.com/indutny) - From b07175853fa559a28b205dd100faf69595d0a23a Mon Sep 17 00:00:00 2001 From: Jeremiah Senkpiel Date: Mon, 10 Feb 2020 10:59:27 -0800 Subject: [PATCH 08/91] doc: pronouns for @Fishrock123 MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit might as well while I'm at it feels a bit weird being the first person on this list with '/they' but I guess someone's gota do it PR-URL: https://github.com/nodejs/node/pull/31725 Reviewed-By: Matteo Collina Reviewed-By: Daniel Bevenius Reviewed-By: James M Snell Reviewed-By: Myles Borins Reviewed-By: Anna Henningsen Reviewed-By: Anto Aravinth Reviewed-By: Colin Ihrig Reviewed-By: Tobias Nießen Reviewed-By: Gus Caplan Reviewed-By: Gireesh Punathil Reviewed-By: Michael Dawson --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index 9f876bec200802..c71cbba4090975 100644 --- a/README.md +++ b/README.md @@ -199,7 +199,7 @@ For information about the governance of the Node.js project, see * [evanlucas](https://github.com/evanlucas) - **Evan Lucas** <evanlucas@me.com> (he/him) * [Fishrock123](https://github.com/Fishrock123) - -**Jeremiah Senkpiel** <fishrock123@rocketmail.com> +**Jeremiah Senkpiel** <fishrock123@rocketmail.com> (he/they) * [gibfahn](https://github.com/gibfahn) - **Gibson Fahnestock** <gibfahn@gmail.com> (he/him) * [indutny](https://github.com/indutny) - @@ -292,7 +292,7 @@ For information about the governance of the Node.js project, see * [fhinkel](https://github.com/fhinkel) - **Franziska Hinkelmann** <franziska.hinkelmann@gmail.com> (she/her) * [Fishrock123](https://github.com/Fishrock123) - -**Jeremiah Senkpiel** <fishrock123@rocketmail.com> +**Jeremiah Senkpiel** <fishrock123@rocketmail.com> (he/they) * [gabrielschulhof](https://github.com/gabrielschulhof) - **Gabriel Schulhof** <gabriel.schulhof@intel.com> * [gdams](https://github.com/gdams) - From f9526057b343a7be3076b8659eeb39ddf35ae7fb Mon Sep 17 00:00:00 2001 From: Gireesh Punathil Date: Thu, 13 Feb 2020 
21:58:27 +0530 Subject: [PATCH 09/91] doc: move gireeshpunathil to TSC emeritus MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/31770 Reviewed-By: James M Snell Reviewed-By: Myles Borins Reviewed-By: Luigi Pinca Reviewed-By: Colin Ihrig Reviewed-By: Tobias Nießen Reviewed-By: Richard Lau Reviewed-By: Michael Dawson --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index c71cbba4090975..a06c51f22a4cd1 100644 --- a/README.md +++ b/README.md @@ -167,8 +167,6 @@ For information about the governance of the Node.js project, see **Franziska Hinkelmann** <franziska.hinkelmann@gmail.com> (she/her) * [gabrielschulhof](https://github.com/gabrielschulhof) - **Gabriel Schulhof** <gabriel.schulhof@intel.com> -* [gireeshpunathil](https://github.com/gireeshpunathil) - -**Gireesh Punathil** <gpunathi@in.ibm.com> (he/him) * [jasnell](https://github.com/jasnell) - **James M Snell** <jasnell@gmail.com> (he/him) * [joyeecheung](https://github.com/joyeecheung) - @@ -202,6 +200,8 @@ For information about the governance of the Node.js project, see **Jeremiah Senkpiel** <fishrock123@rocketmail.com> (he/they) * [gibfahn](https://github.com/gibfahn) - **Gibson Fahnestock** <gibfahn@gmail.com> (he/him) +* [gireeshpunathil](https://github.com/gireeshpunathil) - +**Gireesh Punathil** <gpunathi@in.ibm.com> (he/him) * [indutny](https://github.com/indutny) - **Fedor Indutny** <fedor.indutny@gmail.com> * [isaacs](https://github.com/isaacs) - From 60c71dcad21adb2916853f40cbe72635a6f0e075 Mon Sep 17 00:00:00 2001 From: James M Snell Date: Wed, 12 Feb 2020 12:21:34 -0800 Subject: [PATCH 10/91] test: add known issue test for sync writable callback If the write callbacks are invoked synchronously with an error, onwriteError would cause the error event to be emitted synchronously, making it impossible to attach an error handler after the call that triggered it. PR-URL: https://github.com/nodejs/node/pull/31756 Refs: https://github.com/nodejs/quic/commit/b0d469c69c49c9186c1a581a7cebce4c5d398947 Refs: https://github.com/nodejs/quic/pull/341 Reviewed-By: Robert Nagy Reviewed-By: Matteo Collina Reviewed-By: Anna Henningsen Reviewed-By: Minwoo Jung --- .../test-stream-writable-sync-error.js | 44 +++++++++++++++++++ 1 file changed, 44 insertions(+) create mode 100644 test/known_issues/test-stream-writable-sync-error.js diff --git a/test/known_issues/test-stream-writable-sync-error.js b/test/known_issues/test-stream-writable-sync-error.js new file mode 100644 index 00000000000000..202cf7bf23e2fd --- /dev/null +++ b/test/known_issues/test-stream-writable-sync-error.js @@ -0,0 +1,44 @@ +'use strict'; +const common = require('../common'); + +// Tests for the regression in _stream_writable discussed in +// https://github.com/nodejs/node/pull/31756 + +// Specifically, when a write callback is invoked synchronously +// with an error, and autoDestroy is not being used, the error +// should still be emitted on nextTick. + +const { Writable } = require('stream'); + +class MyStream extends Writable { + #cb = undefined; + + constructor() { + super({ autoDestroy: false }); + } + + _write(_, __, cb) { + this.#cb = cb; + } + + close() { + // Synchronously invoke the callback with an error. + this.#cb(new Error('foo')); + } +} + +const stream = new MyStream(); + +const mustError = common.mustCall(2); + +stream.write('test', () => {}); + +// Both error callbacks should be invoked. 
+ +stream.on('error', mustError); + +stream.close(); + +// Without the fix in #31756, the error handler +// added after the call to close will not be invoked. +stream.on('error', mustError); From 3969af43b432b363a19ed2a97ebd0772af16eea6 Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Tue, 11 Feb 2020 14:57:12 -1000 Subject: [PATCH 11/91] doc: reword possessive form of Node.js in debugger.md MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Throughout the docs, we sometimes write the possessive of _Node.js_ as _Node.js'_ and other times as _Node.js's_. The former conforms with some generally accepted style guides (e.g., Associated Press Stylebook) while the latter complies with others (e.g., Chicago Manual of Style). Since there is no clear authoritative answer as to which form is correct, and since (at least to me) both are visually jarring and sometimes cause a pause to understand, I'd like to reword things to eliminate the possessive form where possible. This is one of those examples. PR-URL: https://github.com/nodejs/node/pull/31748 Reviewed-By: Benjamin Gruenbaum Reviewed-By: Tobias Nießen Reviewed-By: Luigi Pinca Reviewed-By: Richard Lau Reviewed-By: Michael Dawson --- doc/api/debugger.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/api/debugger.md b/doc/api/debugger.md index e43cfa8a163cf2..2821cd605c568e 100644 --- a/doc/api/debugger.md +++ b/doc/api/debugger.md @@ -23,7 +23,7 @@ Break on start in myscript.js:1 debug> ``` -Node.js's debugger client is not a full-featured debugger, but simple step and +The Node.js debugger client is not a full-featured debugger, but simple step and inspection are possible. Inserting the statement `debugger;` into the source code of a script will From cb210e6b166a60577bff6f64bcba45557815bdcf Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Tue, 11 Feb 2020 14:59:59 -1000 Subject: [PATCH 12/91] doc: reword possessive form of Node.js in process.md MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Throughout the docs, we sometimes write the possessive of _Node.js_ as _Node.js'_ and other times as _Node.js's_. The former conforms with some generally accepted style guides (e.g., Associated Press Stylebook) while the latter complies with others (e.g., Chicago Manual of Style). Since there is no clear authoritative answer as to which form is correct, and since (at least to me) both are visually jarring and sometimes cause a pause to understand, I'd like to reword things to eliminate the possessive form where possible. This is one of those examples. PR-URL: https://github.com/nodejs/node/pull/31748 Reviewed-By: Benjamin Gruenbaum Reviewed-By: Tobias Nießen Reviewed-By: Luigi Pinca Reviewed-By: Richard Lau Reviewed-By: Michael Dawson --- doc/api/process.md | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/doc/api/process.md b/doc/api/process.md index 1c75f6915043f6..9850711d596e6f 100644 --- a/doc/api/process.md +++ b/doc/api/process.md @@ -805,7 +805,7 @@ added: v0.7.2 * {number} -The port used by Node.js's debugger when enabled. +The port used by the Node.js debugger when enabled. ```js process.debugPort = 5858; @@ -2458,11 +2458,11 @@ cases: handler. * `2`: Unused (reserved by Bash for builtin misuse) * `3` **Internal JavaScript Parse Error**: The JavaScript source code - internal in Node.js's bootstrapping process caused a parse error. This + internal in the Node.js bootstrapping process caused a parse error. 
This is extremely rare, and generally can only happen during development of Node.js itself. * `4` **Internal JavaScript Evaluation Failure**: The JavaScript - source code internal in Node.js's bootstrapping process failed to + source code internal in the Node.js bootstrapping process failed to return a function value when evaluated. This is extremely rare, and generally can only happen during development of Node.js itself. * `5` **Fatal Error**: There was a fatal unrecoverable error in V8. @@ -2481,7 +2481,7 @@ cases: * `9` **Invalid Argument**: Either an unknown option was specified, or an option requiring a value was provided without a value. * `10` **Internal JavaScript Run-Time Failure**: The JavaScript - source code internal in Node.js's bootstrapping process threw an error + source code internal in the Node.js bootstrapping process threw an error when the bootstrapping function was called. This is extremely rare, and generally can only happen during development of Node.js itself. * `12` **Invalid Debug Argument**: The `--inspect` and/or `--inspect-brk` From 3eaf37767e52a2d56829e7758517262fa4f2b1b4 Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Tue, 11 Feb 2020 15:03:31 -1000 Subject: [PATCH 13/91] doc: reword possessive form of Node.js in http.md MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Throughout the docs, we sometimes write the possessive of _Node.js_ as _Node.js'_ and other times as _Node.js's_. The former conforms with some generally accepted style guides (e.g., Associated Press Stylebook) while the latter complies with others (e.g., Chicago Manual of Style). Since there is no clear authoritative answer as to which form is correct, and since (at least to me) both are visually jarring and sometimes cause a pause to understand, I'd like to reword things to eliminate the possessive form where possible. This is one of those examples. PR-URL: https://github.com/nodejs/node/pull/31748 Reviewed-By: Benjamin Gruenbaum Reviewed-By: Tobias Nießen Reviewed-By: Luigi Pinca Reviewed-By: Richard Lau Reviewed-By: Michael Dawson --- doc/api/http.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/api/http.md b/doc/api/http.md index beba834bd72d4f..cf1e5d76bf42bc 100644 --- a/doc/api/http.md +++ b/doc/api/http.md @@ -25,7 +25,7 @@ HTTP message headers are represented by an object like this: Keys are lowercased. Values are not modified. -In order to support the full spectrum of possible HTTP applications, Node.js's +In order to support the full spectrum of possible HTTP applications, the Node.js HTTP API is very low-level. It deals with stream handling and message parsing only. It parses a message into headers and body but it does not parse the actual headers or the body. From 672f76d6bd5a416d2d798f0b79021a327990377d Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Tue, 11 Feb 2020 15:07:06 -1000 Subject: [PATCH 14/91] doc: reword possessive form of Node.js in adding-new-napi-api.md MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Throughout the docs, we sometimes write the possessive of _Node.js_ as _Node.js'_ and other times as _Node.js's_. The former conforms with some generally accepted style guides (e.g., Associated Press Stylebook) while the latter complies with others (e.g., Chicago Manual of Style). 
Since there is no clear authoritative answer as to which form is correct, and since (at least to me) both are visually jarring and sometimes cause a pause to understand, I'd like to reword things to eliminate the possessive form where possible. This is one of those examples. PR-URL: https://github.com/nodejs/node/pull/31748 Reviewed-By: Benjamin Gruenbaum Reviewed-By: Tobias Nießen Reviewed-By: Luigi Pinca Reviewed-By: Richard Lau Reviewed-By: Michael Dawson --- doc/guides/adding-new-napi-api.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/guides/adding-new-napi-api.md b/doc/guides/adding-new-napi-api.md index dc8d9dda233f31..825b4877783f9b 100644 --- a/doc/guides/adding-new-napi-api.md +++ b/doc/guides/adding-new-napi-api.md @@ -1,6 +1,6 @@ # Contributing a new API to N-API -N-API is Node.js's next generation ABI-stable API for native modules. +N-API is the next-generation ABI-stable API for native modules. While improving the API surface is encouraged and welcomed, the following are a set of principles and guidelines to keep in mind while adding a new N-API API. From 98d262e5f3d04b9e737e13e1e19ba3127ecce285 Mon Sep 17 00:00:00 2001 From: Anna Henningsen Date: Fri, 14 Feb 2020 22:25:57 +0100 Subject: [PATCH 15/91] src: inform callback scopes about exceptions in HTTP parser Refs: https://github.com/nodejs/node/commit/4aca277f16b8649b5fc21d41f340fad0a47c2e61 Refs: https://github.com/nodejs/node/pull/30236 Fixes: https://github.com/nodejs/node/issues/31796 PR-URL: https://github.com/nodejs/node/pull/31801 Reviewed-By: James M Snell Reviewed-By: David Carlier --- src/node_http_parser.cc | 2 ++ ...est-http-uncaught-from-request-callback.js | 29 +++++++++++++++++++ 2 files changed, 31 insertions(+) create mode 100644 test/parallel/test-http-uncaught-from-request-callback.js diff --git a/src/node_http_parser.cc b/src/node_http_parser.cc index a8c48999c57901..40ece82b625746 100644 --- a/src/node_http_parser.cc +++ b/src/node_http_parser.cc @@ -330,6 +330,7 @@ class Parser : public AsyncWrap, public StreamListener { this, InternalCallbackScope::kSkipTaskQueues); head_response = cb.As()->Call( env()->context(), object(), arraysize(argv), argv); + if (head_response.IsEmpty()) callback_scope.MarkAsFailed(); } int64_t val; @@ -401,6 +402,7 @@ class Parser : public AsyncWrap, public StreamListener { InternalCallbackScope callback_scope( this, InternalCallbackScope::kSkipTaskQueues); r = cb.As()->Call(env()->context(), object(), 0, nullptr); + if (r.IsEmpty()) callback_scope.MarkAsFailed(); } if (r.IsEmpty()) { diff --git a/test/parallel/test-http-uncaught-from-request-callback.js b/test/parallel/test-http-uncaught-from-request-callback.js new file mode 100644 index 00000000000000..5c75958617898f --- /dev/null +++ b/test/parallel/test-http-uncaught-from-request-callback.js @@ -0,0 +1,29 @@ +'use strict'; +const common = require('../common'); +const asyncHooks = require('async_hooks'); +const http = require('http'); + +// Regression test for https://github.com/nodejs/node/issues/31796 + +asyncHooks.createHook({ + after: () => {} +}).enable(); + + +process.once('uncaughtException', common.mustCall(() => { + server.close(); +})); + +const server = http.createServer(common.mustCall((request, response) => { + response.writeHead(200, { 'Content-Type': 'text/plain' }); + response.end(); +})); + +server.listen(0, common.mustCall(() => { + http.get({ + host: 'localhost', + port: server.address().port + }, common.mustCall(() => { + throw new Error('whoah'); + })); +})); From 
724bf3105be8c6eeea4138c11d82b87dad9fb6aa Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Tue, 11 Feb 2020 19:14:57 -0800 Subject: [PATCH 16/91] test: remove common.PORT from test-net-timeout MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Switch test-net-timeout from common.PORT to a port assigned by the operating system. PR-URL: https://github.com/nodejs/node/pull/31749 Reviewed-By: Denys Otrishko Reviewed-By: Tobias Nießen Reviewed-By: Luigi Pinca Reviewed-By: James M Snell Reviewed-By: Anna Henningsen Reviewed-By: Ruben Bridgewater --- test/pummel/test-net-timeout.js | 9 +++++---- 1 file changed, 5 insertions(+), 4 deletions(-) diff --git a/test/pummel/test-net-timeout.js b/test/pummel/test-net-timeout.js index 59a8d50f796d64..5b9f2a01b3823e 100644 --- a/test/pummel/test-net-timeout.js +++ b/test/pummel/test-net-timeout.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. 'use strict'; -const common = require('../common'); +require('../common'); const assert = require('assert'); const net = require('net'); @@ -54,10 +54,11 @@ const echo_server = net.createServer((socket) => { }); }); -echo_server.listen(common.PORT, () => { - console.log(`server listening at ${common.PORT}`); +echo_server.listen(0, () => { + const port = echo_server.address().port; + console.log(`server listening at ${port}`); - const client = net.createConnection(common.PORT); + const client = net.createConnection(port); client.setEncoding('UTF8'); client.setTimeout(0); // Disable the timeout for client client.on('connect', () => { From e76ac1d2c9b0275f7ee79d511bbf45567e9538bc Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Tue, 11 Feb 2020 19:18:27 -0800 Subject: [PATCH 17/91] test: remove common.PORT from test-net-throttle MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Switch test-net-throttle from common.PORT to a port assigned by the operating system. PR-URL: https://github.com/nodejs/node/pull/31749 Reviewed-By: Denys Otrishko Reviewed-By: Tobias Nießen Reviewed-By: Luigi Pinca Reviewed-By: James M Snell Reviewed-By: Anna Henningsen Reviewed-By: Ruben Bridgewater --- test/pummel/test-net-throttle.js | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/test/pummel/test-net-throttle.js b/test/pummel/test-net-throttle.js index 190c242d6e1636..9708d69f9621a3 100644 --- a/test/pummel/test-net-throttle.js +++ b/test/pummel/test-net-throttle.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. 
'use strict'; -const common = require('../common'); +require('../common'); const assert = require('assert'); const net = require('net'); @@ -32,8 +32,6 @@ let npauses = 0; console.log('build big string'); const body = 'C'.repeat(N); -console.log(`start server on port ${common.PORT}`); - const server = net.createServer((connection) => { connection.write(body.slice(0, part_N)); connection.write(body.slice(part_N, 2 * part_N)); @@ -44,9 +42,11 @@ const server = net.createServer((connection) => { connection.end(); }); -server.listen(common.PORT, () => { +server.listen(0, () => { + const port = server.address().port; + console.log(`server started on port ${port}`); let paused = false; - const client = net.createConnection(common.PORT); + const client = net.createConnection(port); client.setEncoding('ascii'); client.on('data', (d) => { chars_recved += d.length; From 3fbd5ab265fde73c594bc846f0ab88bc635b4db6 Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Tue, 11 Feb 2020 19:21:12 -0800 Subject: [PATCH 18/91] test: remove common.PORT from test-tls-server-large-request MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Switch test-tls-server-large-request from common.PORT to a port assigned by the operating system. PR-URL: https://github.com/nodejs/node/pull/31749 Reviewed-By: Denys Otrishko Reviewed-By: Tobias Nießen Reviewed-By: Luigi Pinca Reviewed-By: James M Snell Reviewed-By: Anna Henningsen Reviewed-By: Ruben Bridgewater --- test/pummel/test-tls-server-large-request.js | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/test/pummel/test-tls-server-large-request.js b/test/pummel/test-tls-server-large-request.js index 5d3a0615bad6e2..7537ca813af41c 100644 --- a/test/pummel/test-tls-server-large-request.js +++ b/test/pummel/test-tls-server-large-request.js @@ -59,9 +59,9 @@ const server = tls.Server(options, common.mustCall(function(socket) { socket.pipe(mediator); })); -server.listen(common.PORT, common.mustCall(function() { +server.listen(0, common.mustCall(() => { const client1 = tls.connect({ - port: common.PORT, + port: server.address().port, rejectUnauthorized: false }, common.mustCall(function() { client1.end(request); From 87e9014764a154294b0047f331bc218432af41ba Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Tue, 11 Feb 2020 19:23:51 -0800 Subject: [PATCH 19/91] test: remove common.PORT from test-net-pause MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Switch test-net-pause from common.PORT to a port assigned by the operating system. PR-URL: https://github.com/nodejs/node/pull/31749 Reviewed-By: Denys Otrishko Reviewed-By: Tobias Nießen Reviewed-By: Luigi Pinca Reviewed-By: James M Snell Reviewed-By: Anna Henningsen Reviewed-By: Ruben Bridgewater --- test/pummel/test-net-pause.js | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/test/pummel/test-net-pause.js b/test/pummel/test-net-pause.js index 512d833ae75717..76237c17214d23 100644 --- a/test/pummel/test-net-pause.js +++ b/test/pummel/test-net-pause.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. 
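Each of these `common.PORT` conversions applies the same idiom: pass `0` to
`listen()` so the operating system assigns a free ephemeral port, then read
the actual port back from `server.address()`. A standalone sketch of the
pattern:

```js
'use strict';
const net = require('net');

const server = net.createServer((socket) => {
  socket.end('ok');
});

// Port 0 asks the OS for any free port, avoiding collisions between tests
// that would otherwise compete for a single fixed common.PORT value.
server.listen(0, () => {
  const { port } = server.address();
  console.log(`server listening on ${port}`);
  const client = net.createConnection(port, () => {
    client.on('data', () => {
      client.end();
      server.close();
    });
  });
});
```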
'use strict'; -const common = require('../common'); +require('../common'); const assert = require('assert'); const net = require('net'); @@ -43,7 +43,7 @@ const server = net.createServer((connection) => { }); server.on('listening', () => { - const client = net.createConnection(common.PORT); + const client = net.createConnection(server.address().port); client.setEncoding('ascii'); client.on('data', (d) => { console.log(d); @@ -83,7 +83,7 @@ server.on('listening', () => { client.end(); }); }); -server.listen(common.PORT); +server.listen(0); process.on('exit', () => { assert.strictEqual(recv.length, N); From bbb6cc733cccf0e93a5683b21a8cbc08222e267c Mon Sep 17 00:00:00 2001 From: Guy Bedford Date: Mon, 3 Feb 2020 13:31:12 +0200 Subject: [PATCH 20/91] module: package "exports" error refinements PR-URL: https://github.com/nodejs/node/pull/31625 Reviewed-By: Jan Krems --- doc/api/errors.md | 20 ++ doc/api/esm.md | 75 ++--- lib/internal/errors.js | 29 +- lib/internal/modules/cjs/loader.js | 104 ++++--- src/module_wrap.cc | 258 ++++++++++-------- src/node_errors.h | 4 +- test/es-module/test-esm-exports.mjs | 54 ++-- .../node_modules/pkgexports/package.json | 4 + .../pkgexports/resolve-self-invalid.js | 1 + .../pkgexports/resolve-self-invalid.mjs | 1 + 10 files changed, 331 insertions(+), 219 deletions(-) create mode 100644 test/fixtures/node_modules/pkgexports/resolve-self-invalid.js create mode 100644 test/fixtures/node_modules/pkgexports/resolve-self-invalid.mjs diff --git a/doc/api/errors.md b/doc/api/errors.md index 74a4faa88b929f..d231cf28857dc0 100644 --- a/doc/api/errors.md +++ b/doc/api/errors.md @@ -1319,6 +1319,12 @@ An invalid HTTP token was supplied. An IP address is not valid. + +### `ERR_INVALID_MODULE_SPECIFIER` + +The imported module string is an invalid URL, package name, or package subpath +specifier. + ### `ERR_INVALID_OPT_VALUE` @@ -1334,6 +1340,12 @@ An invalid or unknown file encoding was passed. An invalid `package.json` file was found which failed parsing. + +### `ERR_INVALID_PACKAGE_TARGET` + +The `package.json` [exports][] field contains an invalid target mapping value +for the attempted module resolution. + ### `ERR_INVALID_PERFORMANCE_MARK` @@ -1640,6 +1652,13 @@ A non-context-aware native addon was loaded in a process that disallows them. A given value is out of the accepted range. + +### `ERR_PACKAGE_PATH_NOT_EXPORTED` + +The `package.json` [exports][] field does not export the requested subpath. +Because exports are encapsulated, private internal modules that are not exported +cannot be imported through the package resolution, unless using an absolute URL. + ### `ERR_REQUIRE_ESM` @@ -2499,6 +2518,7 @@ such as `process.stdout.on('data')`. [crypto digest algorithm]: crypto.html#crypto_crypto_gethashes [domains]: domain.html [event emitter-based]: events.html#events_class_eventemitter +[exports]: esm.html#esm_package_exports [file descriptors]: https://en.wikipedia.org/wiki/File_descriptor [policy]: policy.html [stream-based]: stream.html diff --git a/doc/api/esm.md b/doc/api/esm.md index f1378b59ad10a2..7abaab87e0f6f5 100644 --- a/doc/api/esm.md +++ b/doc/api/esm.md @@ -1391,6 +1391,17 @@ of these top-level routines unless stated otherwise. _defaultEnv_ is the conditional environment name priority array, `["node", "import"]`. +The resolver can throw the following errors: +* _Invalid Module Specifier_: Module specifier is an invalid URL, package name + or package subpath specifier. 
+* _Invalid Package Configuration_: package.json configuration is invalid or
+  contains an invalid configuration.
+* _Invalid Package Target_: Package exports define a target module within the
+  package that is an invalid type or string target.
+* _Package Path Not Exported_: Package exports do not define or permit a target
+  subpath in the package for the given module.
+* _Module Not Found_: The package or module requested does not exist.
+
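These error classes are observable from application code. A hedged sketch of
the most common one (the package name `some-pkg` is hypothetical, and its
`package.json` is assumed to define `"exports": { ".": "./index.js" }` and
nothing else):

```js
'use strict';

try {
  // './internal.js' is not listed in the package's "exports", so the
  // resolver refuses it even if the file exists on disk.
  require('some-pkg/internal.js');
} catch (err) {
  // Expected code after this change: 'ERR_PACKAGE_PATH_NOT_EXPORTED'
  console.error(err.code, err.message);
}
```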
Resolver algorithm specification @@ -1401,7 +1412,7 @@ _defaultEnv_ is the conditional environment name priority array, > 1. Set _resolvedURL_ to the result of parsing and reserializing > _specifier_ as a URL. > 1. Otherwise, if _specifier_ starts with _"/"_, then -> 1. Throw an _Invalid Specifier_ error. +> 1. Throw an _Invalid Module Specifier_ error. > 1. Otherwise, if _specifier_ starts with _"./"_ or _"../"_, then > 1. Set _resolvedURL_ to the URL resolution of _specifier_ relative to > _parentURL_. @@ -1411,7 +1422,7 @@ _defaultEnv_ is the conditional environment name priority array, > **PACKAGE_RESOLVE**(_specifier_, _parentURL_). > 1. If _resolvedURL_ contains any percent encodings of _"/"_ or _"\\"_ (_"%2f"_ > and _"%5C"_ respectively), then -> 1. Throw an _Invalid Specifier_ error. +> 1. Throw an _Invalid Module Specifier_ error. > 1. If _resolvedURL_ does not end with a trailing _"/"_ and the file at > _resolvedURL_ does not exist, then > 1. Throw a _Module Not Found_ error. @@ -1425,14 +1436,14 @@ _defaultEnv_ is the conditional environment name priority array, > 1. Let _packageName_ be *undefined*. > 1. Let _packageSubpath_ be *undefined*. > 1. If _packageSpecifier_ is an empty string, then -> 1. Throw an _Invalid Specifier_ error. +> 1. Throw an _Invalid Module Specifier_ error. > 1. Otherwise, > 1. If _packageSpecifier_ does not contain a _"/"_ separator, then -> 1. Throw an _Invalid Specifier_ error. +> 1. Throw an _Invalid Module Specifier_ error. > 1. Set _packageName_ to the substring of _packageSpecifier_ > until the second _"/"_ separator or the end of the string. > 1. If _packageName_ starts with _"."_ or contains _"\\"_ or _"%"_, then -> 1. Throw an _Invalid Specifier_ error. +> 1. Throw an _Invalid Module Specifier_ error. > 1. Let _packageSubpath_ be _undefined_. > 1. If the length of _packageSpecifier_ is greater than the length of > _packageName_, then @@ -1440,7 +1451,7 @@ _defaultEnv_ is the conditional environment name priority array, > _packageSpecifier_ from the position at the length of _packageName_. > 1. If _packageSubpath_ contains any _"."_ or _".."_ segments or percent > encoded strings for _"/"_ or _"\\"_, then -> 1. Throw an _Invalid Specifier_ error. +> 1. Throw an _Invalid Module Specifier_ error. > 1. Set _selfUrl_ to the result of > **SELF_REFERENCE_RESOLVE**(_packageName_, _packageSubpath_, _parentURL_). > 1. If _selfUrl_ isn't empty, return _selfUrl_. @@ -1497,7 +1508,7 @@ _defaultEnv_ is the conditional environment name priority array, > 1. Throw a _Module Not Found_ error. > 1. If _pjson.exports_ is not **null** or **undefined**, then > 1. If _exports_ is an Object with both a key starting with _"."_ and a key -> not starting with _"."_, throw an "Invalid Package Configuration" error. +> not starting with _"."_, throw an _Invalid Package Configuration_ error. > 1. If _pjson.exports_ is a String or Array, or an Object containing no > keys starting with _"."_, then > 1. Return **PACKAGE_EXPORTS_TARGET_RESOLVE**(_packageURL_, @@ -1506,6 +1517,7 @@ _defaultEnv_ is the conditional environment name priority array, > 1. Let _mainExport_ be the _"."_ property in _pjson.exports_. > 1. Return **PACKAGE_EXPORTS_TARGET_RESOLVE**(_packageURL_, > _mainExport_, _""_). +> 1. Throw a _Package Path Not Exported_ error. > 1. If _pjson.main_ is a String, then > 1. Let _resolvedMain_ be the URL resolution of _packageURL_, "/", and > _pjson.main_. 
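The _Invalid Module Specifier_ checks in **PACKAGE_RESOLVE** above are
reachable from ordinary code. A rough sketch (the package names are
hypothetical, and dynamic `import()` is assumed to be available):

```js
'use strict';

// Each specifier violates a PACKAGE_RESOLVE rule: a package name may not
// start with '.' or contain '%', and a subpath may not contain '..' segments.
for (const spec of ['.bad-name', 'pkg%name', 'some-pkg/../escape.js']) {
  import(spec).catch((err) => {
    console.error(`${spec} -> ${err.code}`); // ERR_INVALID_MODULE_SPECIFIER
  });
}
```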
@@ -1520,7 +1532,7 @@ _defaultEnv_ is the conditional environment name priority array, **PACKAGE_EXPORTS_RESOLVE**(_packageURL_, _packagePath_, _exports_) > 1. If _exports_ is an Object with both a key starting with _"."_ and a key not -> starting with _"."_, throw an "Invalid Package Configuration" error. +> starting with _"."_, throw an _Invalid Package Configuration_ error. > 1. If _exports_ is an Object and all keys of _exports_ start with _"."_, then > 1. Set _packagePath_ to _"./"_ concatenated with _packagePath_. > 1. If _packagePath_ is a key of _exports_, then @@ -1536,43 +1548,44 @@ _defaultEnv_ is the conditional environment name priority array, > of the length of _directory_. > 1. Return **PACKAGE_EXPORTS_TARGET_RESOLVE**(_packageURL_, _target_, > _subpath_, _defaultEnv_). -> 1. Throw a _Module Not Found_ error. +> 1. Throw a _Package Path Not Exported_ error. **PACKAGE_EXPORTS_TARGET_RESOLVE**(_packageURL_, _target_, _subpath_, _env_) -> 1. If _target_ is a String, then -> 1. If _target_ does not start with _"./"_, throw a _Module Not Found_ -> error. -> 1. If _subpath_ has non-zero length and _target_ does not end with _"/"_, -> throw a _Module Not Found_ error. -> 1. If _target_ or _subpath_ contain any _"node_modules"_ segments including -> _"node_modules"_ percent-encoding, throw a _Module Not Found_ error. +> 1.If _target_ is a String, then +> 1. If _target_ does not start with _"./"_ or contains any _"node_modules"_ +> segments including _"node_modules"_ percent-encoding, throw an +> _Invalid Package Target_ error. > 1. Let _resolvedTarget_ be the URL resolution of the concatenation of > _packageURL_ and _target_. -> 1. If _resolvedTarget_ is contained in _packageURL_, then -> 1. Let _resolved_ be the URL resolution of the concatenation of -> _subpath_ and _resolvedTarget_. -> 1. If _resolved_ is contained in _resolvedTarget_, then -> 1. Return _resolved_. +> 1. If _resolvedTarget_ is not contained in _packageURL_, throw an +> _Invalid Package Target_ error. +> 1. If _subpath_ has non-zero length and _target_ does not end with _"/"_, +> throw an _Invalid Module Specifier_ error. +> 1. Let _resolved_ be the URL resolution of the concatenation of +> _subpath_ and _resolvedTarget_. +> 1. If _resolved_ is not contained in _resolvedTarget_, throw an +> _Invalid Module Specifier_ error. +> 1. Return _resolved_. > 1. Otherwise, if _target_ is a non-null Object, then > 1. If _exports_ contains any index property keys, as defined in ECMA-262 > [6.1.7 Array Index][], throw an _Invalid Package Configuration_ error. > 1. For each property _p_ of _target_, in object insertion order as, > 1. If _env_ contains an entry for _p_, then > 1. Let _targetValue_ be the value of the _p_ property in _target_. -> 1. Let _resolved_ be the result of **PACKAGE_EXPORTS_TARGET_RESOLVE** -> (_packageURL_, _targetValue_, _subpath_, _env_). -> 1. Assert: _resolved_ is a String. -> 1. Return _resolved_. +> 1. Return the result of **PACKAGE_EXPORTS_TARGET_RESOLVE**( +> _packageURL_, _targetValue_, _subpath_, _env_), continuing the +> loop on any _Package Path Not Exported_ error. +> 1. Throw a _Package Path Not Exported_ error. > 1. Otherwise, if _target_ is an Array, then +> 1. If _target.length is zero, throw an _Invalid Package Target_ error. > 1. For each item _targetValue_ in _target_, do > 1. If _targetValue_ is an Array, continue the loop. -> 1. 
Let _resolved_ be the result of -> **PACKAGE_EXPORTS_TARGET_RESOLVE**(_packageURL_, _targetValue_, -> _subpath_, _env_), continuing the loop on abrupt completion. -> 1. Assert: _resolved_ is a String. -> 1. Return _resolved_. -> 1. Throw a _Module Not Found_ error. +> 1. Return the result of **PACKAGE_EXPORTS_TARGET_RESOLVE**(_packageURL_, +> _targetValue_, _subpath_, _env_), continuing the loop on any +> _Package Path Not Exported_ or _Invalid Package Target_ error. +> 1. Throw the last fallback resolution error. +> 1. Otherwise throw an _Invalid Package Target_ error. **ESM_FORMAT**(_url_) diff --git a/lib/internal/errors.js b/lib/internal/errors.js index 319c836695e88f..8c18eabb7a8d06 100644 --- a/lib/internal/errors.js +++ b/lib/internal/errors.js @@ -13,16 +13,20 @@ const { ArrayIsArray, Error, + JSONStringify, Map, MathAbs, NumberIsInteger, ObjectDefineProperty, ObjectKeys, + StringPrototypeSlice, Symbol, SymbolFor, WeakMap, } = primordials; +const sep = process.platform === 'win32' ? '\\' : '/'; + const messages = new Map(); const codes = {}; @@ -1069,6 +1073,11 @@ E('ERR_INVALID_FILE_URL_PATH', 'File URL path %s', TypeError); E('ERR_INVALID_HANDLE_TYPE', 'This handle type cannot be sent', TypeError); E('ERR_INVALID_HTTP_TOKEN', '%s must be a valid HTTP token ["%s"]', TypeError); E('ERR_INVALID_IP_ADDRESS', 'Invalid IP address: %s', TypeError); +E('ERR_INVALID_MODULE_SPECIFIER', (pkgPath, subpath) => { + assert(subpath !== '.'); + return `Package subpath '${subpath}' is not a valid module request for the ` + + `"exports" resolution of ${pkgPath}${sep}package.json`; +}, TypeError); E('ERR_INVALID_OPT_VALUE', (name, value) => `The value "${String(value)}" is invalid for option "${name}"`, TypeError, @@ -1076,7 +1085,17 @@ E('ERR_INVALID_OPT_VALUE', (name, value) => E('ERR_INVALID_OPT_VALUE_ENCODING', 'The value "%s" is invalid for option "encoding"', TypeError); E('ERR_INVALID_PACKAGE_CONFIG', - 'Invalid package config for \'%s\', %s', Error); + `Invalid package config %s${sep}package.json, %s`, Error); +E('ERR_INVALID_PACKAGE_TARGET', (pkgPath, key, subpath, target) => { + if (key === '.') { + return `Invalid "exports" main target ${JSONStringify(target)} defined ` + + `in the package config ${pkgPath}${sep}package.json`; + } else { + return `Invalid "exports" target ${JSONStringify(target)} defined for '${ + StringPrototypeSlice(key, 0, -subpath.length || key.length)}' in the ` + + `package config ${pkgPath}${sep}package.json`; + } +}, Error); E('ERR_INVALID_PERFORMANCE_MARK', 'The "%s" performance mark has not been set', Error); E('ERR_INVALID_PROTOCOL', @@ -1221,6 +1240,14 @@ E('ERR_OUT_OF_RANGE', msg += ` It must be ${range}. 
Received ${received}`; return msg; }, RangeError); +E('ERR_PACKAGE_PATH_NOT_EXPORTED', (pkgPath, subpath) => { + if (subpath === '.') { + return `No "exports" main resolved in ${pkgPath}${sep}package.json`; + } else { + return `Package subpath '${subpath}' is not defined by "exports" in ${ + pkgPath}${sep}package.json`; + } +}, Error); E('ERR_REQUIRE_ESM', (filename, parentPath = null, packageJsonPath = null) => { let msg = `Must use import to load ES Module: ${filename}`; diff --git a/lib/internal/modules/cjs/loader.js b/lib/internal/modules/cjs/loader.js index a1b3270b0919ec..c2a8b2a3f7435b 100644 --- a/lib/internal/modules/cjs/loader.js +++ b/lib/internal/modules/cjs/loader.js @@ -82,6 +82,9 @@ const { ERR_INVALID_ARG_VALUE, ERR_INVALID_OPT_VALUE, ERR_INVALID_PACKAGE_CONFIG, + ERR_INVALID_PACKAGE_TARGET, + ERR_INVALID_MODULE_SPECIFIER, + ERR_PACKAGE_PATH_NOT_EXPORTED, ERR_REQUIRE_ESM } = require('internal/errors').codes; const { validateString } = require('internal/validators'); @@ -498,13 +501,9 @@ function applyExports(basePath, expansion) { if (ObjectPrototypeHasOwnProperty(pkgExports, mappingKey)) { const mapping = pkgExports[mappingKey]; return resolveExportsTarget(pathToFileURL(basePath + '/'), mapping, '', - basePath, mappingKey); + mappingKey); } - // Fallback to CJS main lookup when no main export is defined - if (mappingKey === '.') - return basePath; - let dirMatch = ''; for (const candidateKey of ObjectKeys(pkgExports)) { if (candidateKey[candidateKey.length - 1] !== '/') continue; @@ -518,18 +517,11 @@ function applyExports(basePath, expansion) { const mapping = pkgExports[dirMatch]; const subpath = StringPrototypeSlice(mappingKey, dirMatch.length); return resolveExportsTarget(pathToFileURL(basePath + '/'), mapping, - subpath, basePath, mappingKey); + subpath, mappingKey); } } - // Fallback to CJS main lookup when no main export is defined - if (mappingKey === '.') - return basePath; - // eslint-disable-next-line no-restricted-syntax - const e = new Error(`Package exports for '${basePath}' do not define ` + - `a '${mappingKey}' subpath`); - e.code = 'MODULE_NOT_FOUND'; - throw e; + throw new ERR_PACKAGE_PATH_NOT_EXPORTED(basePath, mappingKey); } // This only applies to requests of a specific form: @@ -564,39 +556,53 @@ function isArrayIndex(p) { return n >= 0 && n < (2 ** 32) - 1; } -function resolveExportsTarget(pkgPath, target, subpath, basePath, mappingKey) { +function resolveExportsTarget(baseUrl, target, subpath, mappingKey) { if (typeof target === 'string') { - if (target.startsWith('./') && - (subpath.length === 0 || target.endsWith('/'))) { - const resolvedTarget = new URL(target, pkgPath); - const pkgPathPath = pkgPath.pathname; - const resolvedTargetPath = resolvedTarget.pathname; - if (StringPrototypeStartsWith(resolvedTargetPath, pkgPathPath) && + let resolvedTarget, resolvedTargetPath; + const pkgPathPath = baseUrl.pathname; + if (StringPrototypeStartsWith(target, './')) { + resolvedTarget = new URL(target, baseUrl); + resolvedTargetPath = resolvedTarget.pathname; + if (!StringPrototypeStartsWith(resolvedTargetPath, pkgPathPath) || StringPrototypeIndexOf(resolvedTargetPath, '/node_modules/', - pkgPathPath.length - 1) === -1) { - const resolved = new URL(subpath, resolvedTarget); - const resolvedPath = resolved.pathname; - if (StringPrototypeStartsWith(resolvedPath, resolvedTargetPath) && - StringPrototypeIndexOf(resolvedPath, '/node_modules/', - pkgPathPath.length - 1) === -1) { - return fileURLToPath(resolved); - } - } + pkgPathPath.length - 1) !== -1) + 
resolvedTarget = undefined; } + if (subpath.length > 0 && target[target.length - 1] !== '/') + resolvedTarget = undefined; + if (resolvedTarget === undefined) + throw new ERR_INVALID_PACKAGE_TARGET(StringPrototypeSlice(baseUrl.pathname + , 0, -1), mappingKey, subpath, target); + const resolved = new URL(subpath, resolvedTarget); + const resolvedPath = resolved.pathname; + if (StringPrototypeStartsWith(resolvedPath, resolvedTargetPath) && + StringPrototypeIndexOf(resolvedPath, '/node_modules/', + pkgPathPath.length - 1) === -1) { + return fileURLToPath(resolved); + } + throw new ERR_INVALID_MODULE_SPECIFIER(StringPrototypeSlice(baseUrl.pathname + , 0, -1), mappingKey); } else if (ArrayIsArray(target)) { + if (target.length === 0) + throw new ERR_INVALID_PACKAGE_TARGET(StringPrototypeSlice(baseUrl.pathname + , 0, -1), mappingKey, subpath, target); for (const targetValue of target) { - if (ArrayIsArray(targetValue)) continue; try { - return resolveExportsTarget(pkgPath, targetValue, subpath, basePath, - mappingKey); + return resolveExportsTarget(baseUrl, targetValue, subpath, mappingKey); } catch (e) { - if (e.code !== 'MODULE_NOT_FOUND') throw e; + if (e.code !== 'ERR_PACKAGE_PATH_NOT_EXPORTED' && + e.code !== 'ERR_INVALID_PACKAGE_TARGET') + throw e; } } + // Throw last fallback error + resolveExportsTarget(baseUrl, target[target.length - 1], subpath, + mappingKey); + assert(false); } else if (typeof target === 'object' && target !== null) { const keys = ObjectKeys(target); if (keys.some(isArrayIndex)) { - throw new ERR_INVALID_PACKAGE_CONFIG(basePath, '"exports" cannot ' + + throw new ERR_INVALID_PACKAGE_CONFIG(baseUrl, '"exports" cannot ' + 'contain numeric property keys.'); } for (const p of keys) { @@ -605,34 +611,26 @@ function resolveExportsTarget(pkgPath, target, subpath, basePath, mappingKey) { case 'require': try { emitExperimentalWarning('Conditional exports'); - const result = resolveExportsTarget(pkgPath, target[p], subpath, - basePath, mappingKey); - return result; + return resolveExportsTarget(baseUrl, target[p], subpath, + mappingKey); } catch (e) { - if (e.code !== 'MODULE_NOT_FOUND') throw e; + if (e.code !== 'ERR_PACKAGE_PATH_NOT_EXPORTED') throw e; } break; case 'default': try { - return resolveExportsTarget(pkgPath, target.default, subpath, - basePath, mappingKey); + return resolveExportsTarget(baseUrl, target.default, subpath, + mappingKey); } catch (e) { - if (e.code !== 'MODULE_NOT_FOUND') throw e; + if (e.code !== 'ERR_PACKAGE_PATH_NOT_EXPORTED') throw e; } } } + throw new ERR_PACKAGE_PATH_NOT_EXPORTED( + StringPrototypeSlice(baseUrl.pathname, 0, -1), mappingKey + subpath); } - let e; - if (mappingKey !== '.') { - // eslint-disable-next-line no-restricted-syntax - e = new Error(`Package exports for '${basePath}' do not define a ` + - `valid '${mappingKey}' target${subpath ? 
' for ' + subpath : ''}`); - } else { - // eslint-disable-next-line no-restricted-syntax - e = new Error(`No valid exports main found for '${basePath}'`); - } - e.code = 'MODULE_NOT_FOUND'; - throw e; + throw new ERR_INVALID_PACKAGE_TARGET( + StringPrototypeSlice(baseUrl.pathname, 0, -1), mappingKey, subpath, target); } Module._findPath = function(request, paths, isMain) { diff --git a/src/module_wrap.cc b/src/module_wrap.cc index 68359178f4ab38..436a6e98e73fe5 100644 --- a/src/module_wrap.cc +++ b/src/module_wrap.cc @@ -858,10 +858,20 @@ void ThrowExportsNotFound(Environment* env, const std::string& subpath, const URL& pjson_url, const URL& base) { - const std::string msg = "Package exports for " + - pjson_url.ToFilePath() + " do not define a '" + subpath + - "' subpath, imported from " + base.ToFilePath(); - node::THROW_ERR_MODULE_NOT_FOUND(env, msg.c_str()); + const std::string msg = "Package subpath '" + subpath + "' is not defined" + + " by \"exports\" in " + pjson_url.ToFilePath() + " imported from " + + base.ToFilePath(); + node::THROW_ERR_PACKAGE_PATH_NOT_EXPORTED(env, msg.c_str()); +} + +void ThrowSubpathInvalid(Environment* env, + const std::string& subpath, + const URL& pjson_url, + const URL& base) { + const std::string msg = "Package subpath '" + subpath + "' is not a valid " + + "module request for the \"exports\" resolution of " + + pjson_url.ToFilePath() + " imported from " + base.ToFilePath(); + node::THROW_ERR_INVALID_MODULE_SPECIFIER(env, msg.c_str()); } void ThrowExportsInvalid(Environment* env, @@ -870,14 +880,15 @@ void ThrowExportsInvalid(Environment* env, const URL& pjson_url, const URL& base) { if (subpath.length()) { - const std::string msg = "Cannot resolve package exports target '" + target + - "' matched for '" + subpath + "' in " + pjson_url.ToFilePath() + - ", imported from " + base.ToFilePath(); - node::THROW_ERR_MODULE_NOT_FOUND(env, msg.c_str()); + const std::string msg = "Invalid \"exports\" target \"" + target + + "\" defined for '" + subpath + "' in the package config " + + pjson_url.ToFilePath() + " imported from " + base.ToFilePath(); + node::THROW_ERR_INVALID_PACKAGE_TARGET(env, msg.c_str()); } else { - const std::string msg = "Cannot resolve package main '" + target + "' in" + - pjson_url.ToFilePath() + ", imported from " + base.ToFilePath(); - node::THROW_ERR_MODULE_NOT_FOUND(env, msg.c_str()); + const std::string msg = "Invalid \"exports\" main target " + target + + " defined in the package config " + pjson_url.ToFilePath() + + " imported from " + base.ToFilePath(); + node::THROW_ERR_INVALID_PACKAGE_TARGET(env, msg.c_str()); } } @@ -887,14 +898,20 @@ void ThrowExportsInvalid(Environment* env, const URL& pjson_url, const URL& base) { Local target_string; - if (target->ToString(env->context()).ToLocal(&target_string)) { - Utf8Value target_utf8(env->isolate(), target_string); - std::string target_str(*target_utf8, target_utf8.length()); - if (target->IsArray()) { - target_str = '[' + target_str + ']'; - } - ThrowExportsInvalid(env, subpath, target_str, pjson_url, base); + if (target->IsObject()) { + if (!v8::JSON::Stringify(env->context(), target.As(), + v8::String::Empty(env->isolate())).ToLocal(&target_string)) + return; + } else { + if (!target->ToString(env->context()).ToLocal(&target_string)) + return; + } + Utf8Value target_utf8(env->isolate(), target_string); + std::string target_str(*target_utf8, target_utf8.length()); + if (target->IsArray()) { + target_str = '[' + target_str + ']'; } + ThrowExportsInvalid(env, subpath, target_str, 
pjson_url, base); } Maybe ResolveExportsTargetString(Environment* env, @@ -902,18 +919,13 @@ Maybe ResolveExportsTargetString(Environment* env, const std::string& subpath, const std::string& match, const URL& pjson_url, - const URL& base, - bool throw_invalid = true) { + const URL& base) { if (target.substr(0, 2) != "./") { - if (throw_invalid) { - ThrowExportsInvalid(env, match, target, pjson_url, base); - } + ThrowExportsInvalid(env, match, target, pjson_url, base); return Nothing(); } if (subpath.length() > 0 && target.back() != '/') { - if (throw_invalid) { - ThrowExportsInvalid(env, match, target, pjson_url, base); - } + ThrowExportsInvalid(env, match, target, pjson_url, base); return Nothing(); } URL resolved(target, pjson_url); @@ -922,9 +934,7 @@ Maybe ResolveExportsTargetString(Environment* env, if (resolved_path.find(pkg_path) != 0 || resolved_path.find("/node_modules/", pkg_path.length() - 1) != std::string::npos) { - if (throw_invalid) { - ThrowExportsInvalid(env, match, target, pjson_url, base); - } + ThrowExportsInvalid(env, match, target, pjson_url, base); return Nothing(); } if (subpath.length() == 0) return Just(resolved); @@ -933,9 +943,7 @@ Maybe ResolveExportsTargetString(Environment* env, if (subpath_resolved_path.find(resolved_path) != 0 || subpath_resolved_path.find("/node_modules/", pkg_path.length() - 1) != std::string::npos) { - if (throw_invalid) { - ThrowExportsInvalid(env, match, target + subpath, pjson_url, base); - } + ThrowSubpathInvalid(env, match + subpath, pjson_url, base); return Nothing(); } return Just(subpath_resolved); @@ -965,15 +973,14 @@ Maybe ResolveExportsTarget(Environment* env, Local target, const std::string& subpath, const std::string& pkg_subpath, - const URL& base, - bool throw_invalid = true) { + const URL& base) { Isolate* isolate = env->isolate(); Local context = env->context(); if (target->IsString()) { Utf8Value target_utf8(isolate, target.As()); std::string target_str(*target_utf8, target_utf8.length()); Maybe resolved = ResolveExportsTargetString(env, target_str, subpath, - pkg_subpath, pjson_url, base, throw_invalid); + pkg_subpath, pjson_url, base); if (resolved.IsNothing()) { return Nothing(); } @@ -982,40 +989,56 @@ Maybe ResolveExportsTarget(Environment* env, Local target_arr = target.As(); const uint32_t length = target_arr->Length(); if (length == 0) { - if (throw_invalid) { - ThrowExportsInvalid(env, pkg_subpath, target, pjson_url, base); - } + ThrowExportsInvalid(env, pkg_subpath, target, pjson_url, base); return Nothing(); } for (uint32_t i = 0; i < length; i++) { auto target_item = target_arr->Get(context, i).ToLocalChecked(); - if (!target_item->IsArray()) { + { + TryCatchScope try_catch(env); Maybe resolved = ResolveExportsTarget(env, pjson_url, - target_item, subpath, pkg_subpath, base, false); - if (resolved.IsNothing()) continue; + target_item, subpath, pkg_subpath, base); + if (resolved.IsNothing()) { + CHECK(try_catch.HasCaught()); + if (try_catch.Exception().IsEmpty()) return Nothing(); + Local e; + if (!try_catch.Exception()->ToObject(context).ToLocal(&e)) + return Nothing(); + Local code; + if (!e->Get(context, env->code_string()).ToLocal(&code)) + return Nothing(); + Local code_string; + if (!code->ToString(context).ToLocal(&code_string)) + return Nothing(); + Utf8Value code_utf8(env->isolate(), code_string); + if (strcmp(*code_utf8, "ERR_PACKAGE_PATH_NOT_EXPORTED") == 0 || + strcmp(*code_utf8, "ERR_INVALID_PACKAGE_TARGET") == 0) { + continue; + } + try_catch.ReThrow(); + return Nothing(); + } + 
CHECK(!try_catch.HasCaught()); return FinalizeResolution(env, resolved.FromJust(), base); } } - if (throw_invalid) { - auto invalid = target_arr->Get(context, length - 1).ToLocalChecked(); - Maybe resolved = ResolveExportsTarget(env, pjson_url, invalid, - subpath, pkg_subpath, base, true); - CHECK(resolved.IsNothing()); - } + auto invalid = target_arr->Get(context, length - 1).ToLocalChecked(); + Maybe resolved = ResolveExportsTarget(env, pjson_url, invalid, + subpath, pkg_subpath, base); + CHECK(resolved.IsNothing()); return Nothing(); } else if (target->IsObject()) { Local target_obj = target.As(); Local target_obj_keys = target_obj->GetOwnPropertyNames(context).ToLocalChecked(); Local conditionalTarget; - bool matched = false; for (uint32_t i = 0; i < target_obj_keys->Length(); ++i) { Local key = target_obj_keys->Get(context, i).ToLocalChecked(); if (IsArrayIndex(env, key)) { - const std::string msg = "Invalid package config for " + - pjson_url.ToFilePath() + ", \"exports\" cannot contain numeric " + - "property keys."; + const std::string msg = "Invalid package config " + + pjson_url.ToFilePath() + " imported from " + base.ToFilePath() + + ". \"exports\" cannot contain numeric property keys."; node::THROW_ERR_INVALID_PACKAGE_CONFIG(env, msg.c_str()); return Nothing(); } @@ -1026,35 +1049,60 @@ Maybe ResolveExportsTarget(Environment* env, key->ToString(context).ToLocalChecked()); std::string key_str(*key_utf8, key_utf8.length()); if (key_str == "node" || key_str == "import") { - matched = true; conditionalTarget = target_obj->Get(context, key).ToLocalChecked(); - Maybe resolved = ResolveExportsTarget(env, pjson_url, - conditionalTarget, subpath, pkg_subpath, base, false); - if (!resolved.IsNothing()) { + { + TryCatchScope try_catch(env); + Maybe resolved = ResolveExportsTarget(env, pjson_url, + conditionalTarget, subpath, pkg_subpath, base); + if (resolved.IsNothing()) { + CHECK(try_catch.HasCaught()); + if (try_catch.Exception().IsEmpty()) return Nothing(); + Local e; + if (!try_catch.Exception()->ToObject(context).ToLocal(&e)) + return Nothing(); + Local code; + if (!e->Get(context, env->code_string()).ToLocal(&code)) + return Nothing(); + Local code_string; + if (!code->ToString(context).ToLocal(&code_string)) + return Nothing(); + Utf8Value code_utf8(env->isolate(), code_string); + if (strcmp(*code_utf8, "ERR_PACKAGE_PATH_NOT_EXPORTED") == 0) + continue; + try_catch.ReThrow(); + return Nothing(); + } + CHECK(!try_catch.HasCaught()); ProcessEmitExperimentalWarning(env, "Conditional exports"); return resolved; } } else if (key_str == "default") { - matched = true; conditionalTarget = target_obj->Get(context, key).ToLocalChecked(); - Maybe resolved = ResolveExportsTarget(env, pjson_url, - conditionalTarget, subpath, pkg_subpath, base, false); - if (!resolved.IsNothing()) { + { + TryCatchScope try_catch(env); + Maybe resolved = ResolveExportsTarget(env, pjson_url, + conditionalTarget, subpath, pkg_subpath, base); + if (resolved.IsNothing()) { + CHECK(try_catch.HasCaught() && !try_catch.Exception().IsEmpty()); + auto e = try_catch.Exception()->ToObject(context).ToLocalChecked(); + auto code = e->Get(context, env->code_string()).ToLocalChecked(); + Utf8Value code_utf8(env->isolate(), + code->ToString(context).ToLocalChecked()); + std::string code_str(*code_utf8, code_utf8.length()); + if (code_str == "ERR_PACKAGE_PATH_NOT_EXPORTED") continue; + try_catch.ReThrow(); + return Nothing(); + } + CHECK(!try_catch.HasCaught()); ProcessEmitExperimentalWarning(env, "Conditional exports"); return 
resolved; } } } - if (matched && throw_invalid) { - Maybe resolved = ResolveExportsTarget(env, pjson_url, - conditionalTarget, subpath, pkg_subpath, base, true); - CHECK(resolved.IsNothing()); - return Nothing(); - } - } - if (throw_invalid) { - ThrowExportsInvalid(env, pkg_subpath, target, pjson_url, base); + ThrowExportsNotFound(env, pkg_subpath, pjson_url, base); + return Nothing(); } + ThrowExportsInvalid(env, pkg_subpath, target, pjson_url, base); return Nothing(); } @@ -1076,8 +1124,8 @@ Maybe IsConditionalExportsMainSugar(Environment* env, if (i == 0) { isConditionalSugar = curIsConditionalSugar; } else if (isConditionalSugar != curIsConditionalSugar) { - const std::string msg = "Cannot resolve package exports in " + - pjson_url.ToFilePath() + ", imported from " + base.ToFilePath() + ". " + + const std::string msg = "Invalid package config " + pjson_url.ToFilePath() + + " imported from " + base.ToFilePath() + ". " + "\"exports\" cannot contain some keys starting with '.' and some not." + " The exports object must either be an object of package subpath keys" + " or an object of main entry condition name keys only."; @@ -1102,8 +1150,7 @@ Maybe PackageMainResolve(Environment* env, if (isConditionalExportsMainSugar.IsNothing()) return Nothing(); if (isConditionalExportsMainSugar.FromJust()) { - return ResolveExportsTarget(env, pjson_url, exports, "", "", base, - true); + return ResolveExportsTarget(env, pjson_url, exports, "", "", base); } else if (exports->IsObject()) { Local exports_obj = exports.As(); if (exports_obj->HasOwnProperty(env->context(), env->dot_string()) @@ -1111,10 +1158,12 @@ Maybe PackageMainResolve(Environment* env, Local target = exports_obj->Get(env->context(), env->dot_string()) .ToLocalChecked(); - return ResolveExportsTarget(env, pjson_url, target, "", "", base, - true); + return ResolveExportsTarget(env, pjson_url, target, "", "", base); } } + std::string msg = "No \"exports\" main resolved in " + + pjson_url.ToFilePath(); + node::THROW_ERR_PACKAGE_PATH_NOT_EXPORTED(env, msg.c_str()); } if (pcfg.has_main == HasMain::Yes) { URL resolved(pcfg.main, pjson_url); @@ -1206,39 +1255,6 @@ Maybe PackageExportsResolve(Environment* env, return Nothing(); } -Maybe ResolveSelf(Environment* env, - const std::string& pkg_name, - const std::string& pkg_subpath, - const URL& base) { - const PackageConfig* pcfg; - if (GetPackageScopeConfig(env, base, base).To(&pcfg) && - pcfg->exists == Exists::Yes) { - // TODO(jkrems): Find a way to forward the pair/iterator already generated - // while executing GetPackageScopeConfig - URL pjson_url(""); - bool found_pjson = false; - for (auto it = env->package_json_cache.begin(); - it != env->package_json_cache.end(); - ++it) { - if (&it->second == pcfg) { - pjson_url = URL::FromFilePath(it->first); - found_pjson = true; - } - } - if (!found_pjson || pcfg->name != pkg_name) return Nothing(); - if (pcfg->exports.IsEmpty()) return Nothing(); - if (pkg_subpath == "./") { - return Just(URL("./", pjson_url)); - } else if (!pkg_subpath.length()) { - return PackageMainResolve(env, pjson_url, *pcfg, base); - } else { - return PackageExportsResolve(env, pjson_url, pkg_subpath, *pcfg, base); - } - } - - return Nothing(); -} - Maybe PackageResolve(Environment* env, const std::string& specifier, const URL& base) { @@ -1279,10 +1295,30 @@ Maybe PackageResolve(Environment* env, pkg_subpath = "." 
+ specifier.substr(sep_index); } - Maybe self_url = ResolveSelf(env, pkg_name, pkg_subpath, base); - if (self_url.IsJust()) { - ProcessEmitExperimentalWarning(env, "Package name self resolution"); - return self_url; + // ResolveSelf + const PackageConfig* pcfg; + if (GetPackageScopeConfig(env, base, base).To(&pcfg) && + pcfg->exists == Exists::Yes) { + // TODO(jkrems): Find a way to forward the pair/iterator already generated + // while executing GetPackageScopeConfig + URL pjson_url(""); + bool found_pjson = false; + for (const auto& it : env->package_json_cache) { + if (&it.second == pcfg) { + pjson_url = URL::FromFilePath(it.first); + found_pjson = true; + } + } + if (found_pjson && pcfg->name == pkg_name && !pcfg->exports.IsEmpty()) { + ProcessEmitExperimentalWarning(env, "Package name self resolution"); + if (pkg_subpath == "./") { + return Just(URL("./", pjson_url)); + } else if (!pkg_subpath.length()) { + return PackageMainResolve(env, pjson_url, *pcfg, base); + } else { + return PackageExportsResolve(env, pjson_url, pkg_subpath, *pcfg, base); + } + } } URL pjson_url("./node_modules/" + pkg_name + "/package.json", &base); diff --git a/src/node_errors.h b/src/node_errors.h index d56bf7ef5a5a53..a05ce8f6bfb1bd 100644 --- a/src/node_errors.h +++ b/src/node_errors.h @@ -43,7 +43,8 @@ void OnFatalError(const char* location, const char* message); V(ERR_OSSL_EVP_INVALID_DIGEST, Error) \ V(ERR_INVALID_ARG_TYPE, TypeError) \ V(ERR_INVALID_MODULE_SPECIFIER, TypeError) \ - V(ERR_INVALID_PACKAGE_CONFIG, SyntaxError) \ + V(ERR_INVALID_PACKAGE_CONFIG, Error) \ + V(ERR_INVALID_PACKAGE_TARGET, Error) \ V(ERR_INVALID_TRANSFER_OBJECT, TypeError) \ V(ERR_MEMORY_ALLOCATION_FAILED, Error) \ V(ERR_MISSING_ARGS, TypeError) \ @@ -53,6 +54,7 @@ void OnFatalError(const char* location, const char* message); V(ERR_NON_CONTEXT_AWARE_DISABLED, Error) \ V(ERR_MODULE_NOT_FOUND, Error) \ V(ERR_OUT_OF_RANGE, RangeError) \ + V(ERR_PACKAGE_PATH_NOT_EXPORTED, Error) \ V(ERR_SCRIPT_EXECUTION_INTERRUPTED, Error) \ V(ERR_SCRIPT_EXECUTION_TIMEOUT, Error) \ V(ERR_STRING_TOO_LONG, Error) \ diff --git a/test/es-module/test-esm-exports.mjs b/test/es-module/test-esm-exports.mjs index bdd4a975cf748e..7dbc963502824e 100644 --- a/test/es-module/test-esm-exports.mjs +++ b/test/es-module/test-esm-exports.mjs @@ -32,7 +32,7 @@ import fromInside from '../fixtures/node_modules/pkgexports/lib/hole.js'; ['pkgexports/resolve-self', isRequire ? { default: 'self-cjs' } : { default: 'self-mjs' }], // Resolve self sugar - ['pkgexports-sugar', { default: 'main' }] + ['pkgexports-sugar', { default: 'main' }], ]); for (const [validSpecifier, expected] of validSpecifiers) { @@ -53,48 +53,59 @@ import fromInside from '../fixtures/node_modules/pkgexports/lib/hole.js'; // Sugar cases still encapsulate ['pkgexports-sugar/not-exported.js', './not-exported.js'], ['pkgexports-sugar2/not-exported.js', './not-exported.js'], + // Conditional exports with no match are "not exported" errors + ['pkgexports/invalid1', './invalid1'], + ['pkgexports/invalid4', './invalid4'], ]); const invalidExports = new Map([ - // Even though 'pkgexports/sub/asdf.js' works, alternate "path-like" - // variants do not to prevent confusion and accidental loopholes. 
- ['pkgexports/sub/./../asdf.js', './sub/./../asdf.js'], + // Directory mappings require a trailing / to work + ['pkgexports/missingtrailer/x', './missingtrailer/'], // This path steps back inside the package but goes through an exports // target that escapes the package, so we still catch that as invalid - ['pkgexports/belowdir/pkgexports/asdf.js', './belowdir/pkgexports/asdf.js'], + ['pkgexports/belowdir/pkgexports/asdf.js', './belowdir/'], // This target file steps below the package ['pkgexports/belowfile', './belowfile'], - // Directory mappings require a trailing / to work - ['pkgexports/missingtrailer/x', './missingtrailer/x'], // Invalid target handling ['pkgexports/null', './null'], - ['pkgexports/invalid1', './invalid1'], ['pkgexports/invalid2', './invalid2'], ['pkgexports/invalid3', './invalid3'], - ['pkgexports/invalid4', './invalid4'], // Missing / invalid fallbacks ['pkgexports/nofallback1', './nofallback1'], ['pkgexports/nofallback2', './nofallback2'], // Reaching into nested node_modules ['pkgexports/nodemodules', './nodemodules'], + // Self resolve invalid + ['pkgexports/resolve-self-invalid', './invalid2'], + ]); + + const invalidSpecifiers = new Map([ + // Even though 'pkgexports/sub/asdf.js' works, alternate "path-like" + // variants do not to prevent confusion and accidental loopholes. + ['pkgexports/sub/./../asdf.js', './sub/./../asdf.js'], ]); for (const [specifier, subpath] of undefinedExports) { loadFixture(specifier).catch(mustCall((err) => { - strictEqual(err.code, (isRequire ? '' : 'ERR_') + 'MODULE_NOT_FOUND'); - assertStartsWith(err.message, 'Package exports'); - assertIncludes(err.message, `do not define a '${subpath}' subpath`); + strictEqual(err.code, 'ERR_PACKAGE_PATH_NOT_EXPORTED'); + assertStartsWith(err.message, 'Package subpath '); + assertIncludes(err.message, subpath); })); } for (const [specifier, subpath] of invalidExports) { loadFixture(specifier).catch(mustCall((err) => { - strictEqual(err.code, (isRequire ? '' : 'ERR_') + 'MODULE_NOT_FOUND'); - assertStartsWith(err.message, (isRequire ? 'Package exports' : - 'Cannot resolve')); - assertIncludes(err.message, isRequire ? - `do not define a valid '${subpath}' target` : - `matched for '${subpath}'`); + strictEqual(err.code, 'ERR_INVALID_PACKAGE_TARGET'); + assertStartsWith(err.message, 'Invalid "exports"'); + assertIncludes(err.message, subpath); + })); + } + + for (const [specifier, subpath] of invalidSpecifiers) { + loadFixture(specifier).catch(mustCall((err) => { + strictEqual(err.code, 'ERR_INVALID_MODULE_SPECIFIER'); + assertStartsWith(err.message, 'Package subpath '); + assertIncludes(err.message, subpath); })); } @@ -102,8 +113,8 @@ import fromInside from '../fixtures/node_modules/pkgexports/lib/hole.js'; // of falling back to main if (isRequire) { loadFixture('pkgexports-main').catch(mustCall((err) => { - strictEqual(err.code, 'MODULE_NOT_FOUND'); - assertStartsWith(err.message, 'No valid export'); + strictEqual(err.code, 'ERR_PACKAGE_PATH_NOT_EXPORTED'); + assertStartsWith(err.message, 'No "exports" main '); })); } @@ -130,8 +141,7 @@ import fromInside from '../fixtures/node_modules/pkgexports/lib/hole.js'; // Sugar conditional exports main mixed failure case loadFixture('pkgexports-sugar-fail').catch(mustCall((err) => { strictEqual(err.code, 'ERR_INVALID_PACKAGE_CONFIG'); - assertStartsWith(err.message, (isRequire ? 
'Invalid package' : - 'Cannot resolve')); + assertStartsWith(err.message, 'Invalid package'); assertIncludes(err.message, '"exports" cannot contain some keys starting ' + 'with \'.\' and some not. The exports object must either be an object of ' + 'package subpath keys or an object of main entry condition name keys ' + diff --git a/test/fixtures/node_modules/pkgexports/package.json b/test/fixtures/node_modules/pkgexports/package.json index 02e06f0ebe5f3c..7f417ad5457bfc 100644 --- a/test/fixtures/node_modules/pkgexports/package.json +++ b/test/fixtures/node_modules/pkgexports/package.json @@ -35,6 +35,10 @@ "./resolve-self": { "require": "./resolve-self.js", "import": "./resolve-self.mjs" + }, + "./resolve-self-invalid": { + "require": "./resolve-self-invalid.js", + "import": "./resolve-self-invalid.mjs" } } } diff --git a/test/fixtures/node_modules/pkgexports/resolve-self-invalid.js b/test/fixtures/node_modules/pkgexports/resolve-self-invalid.js new file mode 100644 index 00000000000000..c3ebf76fc1b2f3 --- /dev/null +++ b/test/fixtures/node_modules/pkgexports/resolve-self-invalid.js @@ -0,0 +1 @@ +require('pkg-exports/invalid2'); diff --git a/test/fixtures/node_modules/pkgexports/resolve-self-invalid.mjs b/test/fixtures/node_modules/pkgexports/resolve-self-invalid.mjs new file mode 100644 index 00000000000000..1edbf62c4b0114 --- /dev/null +++ b/test/fixtures/node_modules/pkgexports/resolve-self-invalid.mjs @@ -0,0 +1 @@ +import 'pkg-exports/invalid2'; From 9079bb42ea20e25d384d0320caeed9cafe5131b4 Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Tue, 13 Nov 2018 23:25:51 +0100 Subject: [PATCH 21/91] http2: make compat finished match http/1 finished should true directly after end(). PR-URL: https://github.com/nodejs/node/pull/24347 Refs: https://github.com/nodejs/node/issues/24743 Reviewed-By: Matteo Collina Reviewed-By: Ujjwal Sharma Reviewed-By: Anatoli Papirovski Reviewed-By: James M Snell --- lib/internal/http2/compat.js | 11 ++++------- test/parallel/test-http2-compat-serverresponse-end.js | 4 +++- 2 files changed, 7 insertions(+), 8 deletions(-) diff --git a/lib/internal/http2/compat.js b/lib/internal/http2/compat.js index 8ef5f49a3dbd77..479510152c9522 100644 --- a/lib/internal/http2/compat.js +++ b/lib/internal/http2/compat.js @@ -471,10 +471,8 @@ class Http2ServerResponse extends Stream { } get finished() { - const stream = this[kStream]; - return stream.destroyed || - stream._writableState.ended || - this[kState].closed; + const state = this[kState]; + return state.ending; } get socket() { @@ -700,12 +698,11 @@ class Http2ServerResponse extends Stream { if (chunk !== null && chunk !== undefined) this.write(chunk, encoding); - const isFinished = this.finished; state.headRequest = stream.headRequest; state.ending = true; if (typeof cb === 'function') { - if (isFinished) + if (stream.writableEnded) this.once('finish', cb); else stream.once('finish', cb); @@ -714,7 +711,7 @@ class Http2ServerResponse extends Stream { if (!stream.headersSent) this.writeHead(this[kState].statusCode); - if (isFinished) + if (this[kState].closed || stream.destroyed) onStreamCloseResponse.call(stream); else stream.end(); diff --git a/test/parallel/test-http2-compat-serverresponse-end.js b/test/parallel/test-http2-compat-serverresponse-end.js index 5bbb24bb2edb31..8505d6c4969db1 100644 --- a/test/parallel/test-http2-compat-serverresponse-end.js +++ b/test/parallel/test-http2-compat-serverresponse-end.js @@ -149,11 +149,13 @@ const { // Http2ServerResponse.end is necessary on HEAD requests in compat // for 
http1 compatibility const server = createServer(mustCall((request, response) => { - strictEqual(response.finished, true); strictEqual(response.writableEnded, false); + strictEqual(response.finished, false); response.writeHead(HTTP_STATUS_OK, { foo: 'bar' }); + strictEqual(response.finished, false); response.end('data', mustCall()); strictEqual(response.writableEnded, true); + strictEqual(response.finished, true); })); server.listen(0, mustCall(() => { const { port } = server.address(); From 3438937a37cd27ba982af7e40a55f5664a9af219 Mon Sep 17 00:00:00 2001 From: Shelley Vohr Date: Tue, 18 Feb 2020 13:13:59 -0800 Subject: [PATCH 22/91] doc: fix notable changes for v13.9.0 MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/31857 Reviewed-By: Beth Griggs Reviewed-By: Anna Henningsen Reviewed-By: Tobias Nießen Reviewed-By: Myles Borins --- doc/changelogs/CHANGELOG_V13.md | 29 +++++++++++++++++++++-------- 1 file changed, 21 insertions(+), 8 deletions(-) diff --git a/doc/changelogs/CHANGELOG_V13.md b/doc/changelogs/CHANGELOG_V13.md index a9dfd87bf10735..3f255fb09d2df3 100644 --- a/doc/changelogs/CHANGELOG_V13.md +++ b/doc/changelogs/CHANGELOG_V13.md @@ -44,14 +44,27 @@ ### Notable changes -* [[`6be51296e4`](https://github.com/nodejs/node/commit/6be51296e4)] - **(SEMVER-MINOR)** **async_hooks**: add executionAsyncResource (Matteo Collina) [#30959](https://github.com/nodejs/node/pull/30959) -* [[`15b24b71ce`](https://github.com/nodejs/node/commit/15b24b71ce)] - **doc**: add ronag to collaborators (Robert Nagy) [#31498](https://github.com/nodejs/node/pull/31498) -* [[`1bcf2f9423`](https://github.com/nodejs/node/commit/1bcf2f9423)] - **report**: add support for Workers (Anna Henningsen) [#31386](https://github.com/nodejs/node/pull/31386) -* [[`676b84a803`](https://github.com/nodejs/node/commit/676b84a803)] - **(SEMVER-MINOR)** **test**: skip keygen tests on arm systems (Tobias Nießen) [#31178](https://github.com/nodejs/node/pull/31178) -* [[`bf46c304dd`](https://github.com/nodejs/node/commit/bf46c304dd)] - **(SEMVER-MINOR)** **crypto**: add crypto.diffieHellman (Tobias Nießen) [#31178](https://github.com/nodejs/node/pull/31178) -* [[`0d3e095941`](https://github.com/nodejs/node/commit/0d3e095941)] - **(SEMVER-MINOR)** **crypto**: add DH support to generateKeyPair (Tobias Nießen) [#31178](https://github.com/nodejs/node/pull/31178) -* [[`15bd2c9f0c`](https://github.com/nodejs/node/commit/15bd2c9f0c)] - **(SEMVER-MINOR)** **crypto**: simplify DH groups (Tobias Nießen) [#31178](https://github.com/nodejs/node/pull/31178) -* [[`572322fddf`](https://github.com/nodejs/node/commit/572322fddf)] - **(SEMVER-MINOR)** **crypto**: add key type 'dh' (Tobias Nießen) [#31178](https://github.com/nodejs/node/pull/31178) +* **async_hooks** + * add executionAsyncResource (Matteo Collina) [#30959](https://github.com/nodejs/node/pull/30959) +* **crypto** + * add crypto.diffieHellman (Tobias Nießen) [#31178](https://github.com/nodejs/node/pull/31178) + * add DH support to generateKeyPair (Tobias Nießen) [#31178](https://github.com/nodejs/node/pull/31178) + * simplify DH groups (Tobias Nießen) [#31178](https://github.com/nodejs/node/pull/31178) + * add key type 'dh' (Tobias Nießen) [#31178](https://github.com/nodejs/node/pull/31178) +* **test** + * skip keygen tests on arm systems (Tobias Nießen) [#31178](https://github.com/nodejs/node/pull/31178) +* **perf_hooks** + * add property flags to GCPerformanceEntry (Kirill Fomichev) 
[#29547](https://github.com/nodejs/node/pull/29547) +* **process** + * report ArrayBuffer memory in `memoryUsage()` (Anna Henningsen) [#31550](https://github.com/nodejs/node/pull/31550) +* **readline** + * make tab size configurable (Ruben Bridgewater) [#31318](https://github.com/nodejs/node/pull/31318) +* **report** + * add support for Workers (Anna Henningsen) [#31386](https://github.com/nodejs/node/pull/31386) +* **worker** + * add ability to take heap snapshot from parent thread (Anna Henningsen) [#31569](https://github.com/nodejs/node/pull/31569) +* **added new collaborators** + * add ronag to collaborators (Robert Nagy) [#31498](https://github.com/nodejs/node/pull/31498) ### Commits From 94a471a42241e6c61619c3523fd8d5743223314c Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Tue, 11 Feb 2020 00:42:17 -1000 Subject: [PATCH 23/91] meta: move eljefedelrodeodeljefe to emeritus MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit eljefedelrodeodeljefe confirmed in email that moving to emeritus was fine at this time. PR-URL: https://github.com/nodejs/node/pull/31735 Reviewed-By: Colin Ihrig Reviewed-By: James M Snell Reviewed-By: Luigi Pinca Reviewed-By: Ruben Bridgewater Reviewed-By: Gireesh Punathil Reviewed-By: Michaël Zasso --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index a06c51f22a4cd1..bff42f1114a3a9 100644 --- a/README.md +++ b/README.md @@ -283,8 +283,6 @@ For information about the governance of the Node.js project, see **Hitesh Kanwathirtha** <digitalinfinity@gmail.com> (he/him) * [edsadr](https://github.com/edsadr) - **Adrian Estrada** <edsadr@gmail.com> (he/him) -* [eljefedelrodeodeljefe](https://github.com/eljefedelrodeodeljefe) - -**Robert Jefe Lindstaedt** <robert.lindstaedt@gmail.com> * [eugeneo](https://github.com/eugeneo) - **Eugene Ostroukhov** <eostroukhov@google.com> * [evanlucas](https://github.com/evanlucas) - @@ -458,6 +456,8 @@ For information about the governance of the Node.js project, see **Chris Dickinson** <christopher.s.dickinson@gmail.com> * [DavidCai1993](https://github.com/DavidCai1993) - **David Cai** <davidcai1993@yahoo.com> (he/him) +* [eljefedelrodeodeljefe](https://github.com/eljefedelrodeodeljefe) - +**Robert Jefe Lindstaedt** <robert.lindstaedt@gmail.com> * [estliberitas](https://github.com/estliberitas) - **Alexander Makarenko** <estliberitas@gmail.com> * [firedfox](https://github.com/firedfox) - From a86cb0e480b0a47eed5cc36366d7b90737d587f4 Mon Sep 17 00:00:00 2001 From: Joyee Cheung Date: Tue, 11 Feb 2020 22:49:24 +0800 Subject: [PATCH 24/91] vm: lazily initialize primordials for vm contexts Lazily initialize primordials when cross-context support for builtins is needed to fix the performance regression in context creation. 
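A minimal sketch of the code path this targets, assuming only the built-in
`vm` module (the loop count and timing labels are illustrative, not from the
patch):

```js
// Before this change, every vm.createContext() call eagerly ran the full
// per-context primordials setup; with the patch that work is deferred until
// cross-context support for builtins is actually needed.
const vm = require('vm');

console.time('create 1000 contexts');
for (let i = 0; i < 1000; i++)
  vm.createContext({});
console.timeEnd('create 1000 contexts');
```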
PR-URL: https://github.com/nodejs/node/pull/31738 Fixes: https://github.com/nodejs/node/issues/29842 Reviewed-By: Gus Caplan Reviewed-By: Anna Henningsen Reviewed-By: David Carlier --- src/api/environment.cc | 84 +++++++++++++++++++++--------------------- src/node_contextify.cc | 7 +++- src/node_internals.h | 1 + 3 files changed, 49 insertions(+), 43 deletions(-) diff --git a/src/api/environment.cc b/src/api/environment.cc index f25a4cf183292f..0096b498e1c511 100644 --- a/src/api/environment.cc +++ b/src/api/environment.cc @@ -411,7 +411,8 @@ MaybeLocal GetPerContextExports(Local context) { return handle_scope.Escape(existing_value.As()); Local exports = Object::New(isolate); - if (context->Global()->SetPrivate(context, key, exports).IsNothing()) + if (context->Global()->SetPrivate(context, key, exports).IsNothing() || + !InitializePrimordials(context)) return MaybeLocal(); return handle_scope.Escape(exports); } @@ -467,49 +468,50 @@ bool InitializeContextForSnapshot(Local context) { context->SetEmbedderData(ContextEmbedderIndex::kAllowWasmCodeGeneration, True(isolate)); + return InitializePrimordials(context); +} + +bool InitializePrimordials(Local context) { + // Run per-context JS files. + Isolate* isolate = context->GetIsolate(); + Context::Scope context_scope(context); + Local exports; + + Local primordials_string = + FIXED_ONE_BYTE_STRING(isolate, "primordials"); + Local global_string = FIXED_ONE_BYTE_STRING(isolate, "global"); + Local exports_string = FIXED_ONE_BYTE_STRING(isolate, "exports"); + + // Create primordials first and make it available to per-context scripts. + Local primordials = Object::New(isolate); + if (!primordials->SetPrototype(context, Null(isolate)).FromJust() || + !GetPerContextExports(context).ToLocal(&exports) || + !exports->Set(context, primordials_string, primordials).FromJust()) { + return false; + } - { - // Run per-context JS files. - Context::Scope context_scope(context); - Local exports; - - Local primordials_string = - FIXED_ONE_BYTE_STRING(isolate, "primordials"); - Local global_string = FIXED_ONE_BYTE_STRING(isolate, "global"); - Local exports_string = FIXED_ONE_BYTE_STRING(isolate, "exports"); - - // Create primordials first and make it available to per-context scripts. 
-      Local<Object> primordials = Object::New(isolate);
-      if (!primordials->SetPrototype(context, Null(isolate)).FromJust() ||
-          !GetPerContextExports(context).ToLocal(&exports) ||
-          !exports->Set(context, primordials_string, primordials).FromJust()) {
+  static const char* context_files[] = {"internal/per_context/primordials",
+                                        "internal/per_context/domexception",
+                                        "internal/per_context/messageport",
+                                        nullptr};
+
+  for (const char** module = context_files; *module != nullptr; module++) {
+    std::vector<Local<String>> parameters = {
+        global_string, exports_string, primordials_string};
+    Local<Value> arguments[] = {context->Global(), exports, primordials};
+    MaybeLocal<Function> maybe_fn =
+        native_module::NativeModuleEnv::LookupAndCompile(
+            context, *module, &parameters, nullptr);
+    if (maybe_fn.IsEmpty()) {
       return false;
     }
-
-      static const char* context_files[] = {"internal/per_context/primordials",
-                                            "internal/per_context/domexception",
-                                            "internal/per_context/messageport",
-                                            nullptr};
-
-      for (const char** module = context_files; *module != nullptr; module++) {
-        std::vector<Local<String>> parameters = {
-            global_string, exports_string, primordials_string};
-        Local<Value> arguments[] = {context->Global(), exports, primordials};
-        MaybeLocal<Function> maybe_fn =
-            native_module::NativeModuleEnv::LookupAndCompile(
-                context, *module, &parameters, nullptr);
-        if (maybe_fn.IsEmpty()) {
-          return false;
-        }
-        Local<Function> fn = maybe_fn.ToLocalChecked();
-        MaybeLocal<Value> result =
-            fn->Call(context, Undefined(isolate),
-                     arraysize(arguments), arguments);
-        // Execution failed during context creation.
-        // TODO(joyeecheung): deprecate this signature and return a MaybeLocal.
-        if (result.IsEmpty()) {
-          return false;
-        }
+    Local<Function> fn = maybe_fn.ToLocalChecked();
+    MaybeLocal<Value> result =
+        fn->Call(context, Undefined(isolate), arraysize(arguments), arguments);
+    // Execution failed during context creation.
+    // TODO(joyeecheung): deprecate this signature and return a MaybeLocal.
+    if (result.IsEmpty()) {
+      return false;
     }
   }

diff --git a/src/node_contextify.cc b/src/node_contextify.cc
index 46a1d7c8ef0691..5f289aecb3414c 100644
--- a/src/node_contextify.cc
+++ b/src/node_contextify.cc
@@ -185,8 +185,11 @@ MaybeLocal<Context> ContextifyContext::CreateV8Context(
   object_template->SetHandler(config);
   object_template->SetHandler(indexed_config);
-
-  Local<Context> ctx = NewContext(env->isolate(), object_template);
+  Local<Context> ctx = Context::New(env->isolate(), nullptr, object_template);
+  if (ctx.IsEmpty()) return MaybeLocal<Context>();
+  // Only partially initialize the context - the primordials are left out
+  // and only initialized when necessary.
+  InitializeContextRuntime(ctx);

   if (ctx.IsEmpty()) {
     return MaybeLocal<Context>();
diff --git a/src/node_internals.h b/src/node_internals.h
index 8d63f023c1ffac..91ba2a58a755ae 100644
--- a/src/node_internals.h
+++ b/src/node_internals.h
@@ -99,6 +99,7 @@ std::string GetProcessTitle(const char* default_title);
 std::string GetHumanReadableProcessName();

 void InitializeContextRuntime(v8::Local<v8::Context>);
+bool InitializePrimordials(v8::Local<v8::Context> context);

 namespace task_queue {
 void PromiseRejectCallback(v8::PromiseRejectMessage message);

From f2389eba99909b6fcdcf76d28a687208341e50f4 Mon Sep 17 00:00:00 2001
From: Harshitha KP
Date: Mon, 3 Feb 2020 00:06:35 -0500
Subject: [PATCH 25/91] worker: emit runtime error on loop creation failure

Instead of hard asserting, throw a runtime error, which is more
consumable.
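A consumer-side sketch of the new behavior, assuming the built-in
`worker_threads` module (the failure condition and the reason string shown
are illustrative, not taken from the patch):

```js
// With this patch, a failure to set up the worker thread's event loop or
// isolate surfaces as an ordinary 'error' event whose message carries the
// underlying reason, instead of aborting the whole process on an assertion.
const { Worker } = require('worker_threads');

const worker = new Worker('void 0;', { eval: true });
worker.on('error', (err) => {
  // e.g. err.code === 'ERR_WORKER_INIT_FAILED', with a libuv error name
  // such as 'EMFILE' appended to err.message by the new reason plumbing.
  console.error(err.code, err.message);
});
worker.on('exit', (code) => console.log('worker exited with', code));
```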
Fixes: https://github.com/nodejs/node/issues/31614
PR-URL: https://github.com/nodejs/node/pull/31621
Reviewed-By: Anna Henningsen
---
 doc/api/errors.md                            |  5 +++
 lib/internal/errors.js                       |  5 ++-
 lib/internal/worker.js                       | 13 +++++--
 src/node_worker.cc                           | 39 +++++++++++++++-----
 src/node_worker.h                            |  2 +
 test/parallel/test-worker-resource-limits.js |  3 +-
 6 files changed, 50 insertions(+), 17 deletions(-)

diff --git a/doc/api/errors.md b/doc/api/errors.md
index d231cf28857dc0..09f43e5036729d 100644
--- a/doc/api/errors.md
+++ b/doc/api/errors.md
@@ -2066,6 +2066,11 @@ meaning of the error depends on the specific function.

 The WASI instance has already started.

+### `ERR_WORKER_INIT_FAILED`
+
+The `Worker` initialization failed.
+
 ### `ERR_WORKER_INVALID_EXEC_ARGV`

diff --git a/lib/internal/errors.js b/lib/internal/errors.js
index 8c18eabb7a8d06..dcadbedb9e2f77 100644
--- a/lib/internal/errors.js
+++ b/lib/internal/errors.js
@@ -1381,12 +1381,13 @@ E('ERR_VM_MODULE_NOT_MODULE',
   'Provided module is not an instance of Module', Error);
 E('ERR_VM_MODULE_STATUS', 'Module status %s', Error);
 E('ERR_WASI_ALREADY_STARTED', 'WASI instance has already started', Error);
+E('ERR_WORKER_INIT_FAILED', 'Worker initialization failure: %s', Error);
 E('ERR_WORKER_INVALID_EXEC_ARGV',
   (errors, msg = 'invalid execArgv flags') =>
     `Initiated Worker with ${msg}: ${errors.join(', ')}`, Error);
 E('ERR_WORKER_NOT_RUNNING', 'Worker instance not running', Error);
-E('ERR_WORKER_OUT_OF_MEMORY', 'Worker terminated due to reaching memory limit',
-  Error);
+E('ERR_WORKER_OUT_OF_MEMORY',
+  'Worker terminated due to reaching memory limit: %s', Error);
 E('ERR_WORKER_PATH',
   'The worker script filename must be an absolute path or a relative ' +
   'path starting with \'./\' or \'../\'. Received "%s"',
diff --git a/lib/internal/worker.js b/lib/internal/worker.js
index de626af1bef962..7fc45695851f8c 100644
--- a/lib/internal/worker.js
+++ b/lib/internal/worker.js
@@ -25,6 +25,8 @@ const {
   ERR_WORKER_UNSUPPORTED_EXTENSION,
   ERR_WORKER_INVALID_EXEC_ARGV,
   ERR_INVALID_ARG_TYPE,
+  // eslint-disable-next-line no-unused-vars
+  ERR_WORKER_INIT_FAILED,
 } = errorCodes;
 const { validateString } = require('internal/validators');
 const { getOptionValue } = require('internal/options');
@@ -136,7 +138,9 @@ class Worker extends EventEmitter {
       throw new ERR_WORKER_INVALID_EXEC_ARGV(
         this[kHandle].invalidNodeOptions, 'invalid NODE_OPTIONS env variable');
     }
-    this[kHandle].onexit = (code, customErr) => this[kOnExit](code, customErr);
+    this[kHandle].onexit = (code, customErr, customErrReason) => {
+      this[kOnExit](code, customErr, customErrReason);
+    };
     this[kPort] = this[kHandle].messagePort;
     this[kPort].on('message', (data) => this[kOnMessage](data));
     this[kPort].start();
@@ -181,14 +185,15 @@ class Worker extends EventEmitter {
     this[kHandle].startThread();
   }

-  [kOnExit](code, customErr) {
+  [kOnExit](code, customErr, customErrReason) {
     debug(`[${threadId}] hears end event for Worker ${this.threadId}`);
     drainMessagePort(this[kPublicPort]);
     drainMessagePort(this[kPort]);
     this[kDispose]();
     if (customErr) {
-      debug(`[${threadId}] failing with custom error ${customErr}`);
-      this.emit('error', new errorCodes[customErr]());
+      debug(`[${threadId}] failing with custom error ${customErr} \
+            and with reason ${customErrReason}`);
+      this.emit('error', new errorCodes[customErr](customErrReason));
     }
     this.emit('exit', code);
     this.removeAllListeners();
diff --git a/src/node_worker.cc b/src/node_worker.cc
index 2c984373308d91..a5dcec250e7a72 100644
--- a/src/node_worker.cc
+++ b/src/node_worker.cc @@ -133,7 +133,16 @@ class WorkerThreadData { public: explicit WorkerThreadData(Worker* w) : w_(w) { - CHECK_EQ(uv_loop_init(&loop_), 0); + int ret = uv_loop_init(&loop_); + if (ret != 0) { + char err_buf[128]; + uv_err_name_r(ret, err_buf, sizeof(err_buf)); + w->custom_error_ = "ERR_WORKER_INIT_FAILED"; + w->custom_error_str_ = err_buf; + w->loop_init_failed_ = true; + w->stopped_ = true; + return; + } std::shared_ptr allocator = ArrayBufferAllocator::Create(); @@ -146,6 +155,8 @@ class WorkerThreadData { Isolate* isolate = Isolate::Allocate(); if (isolate == nullptr) { w->custom_error_ = "ERR_WORKER_OUT_OF_MEMORY"; + w->custom_error_str_ = "Failed to create new Isolate"; + w->stopped_ = true; return; } @@ -204,11 +215,14 @@ class WorkerThreadData { isolate->Dispose(); // Wait until the platform has cleaned up all relevant resources. - while (!platform_finished) + while (!platform_finished) { + CHECK(!w_->loop_init_failed_); uv_run(&loop_, UV_RUN_ONCE); + } + } + if (!w_->loop_init_failed_) { + CheckedUvLoopClose(&loop_); } - - CheckedUvLoopClose(&loop_); } private: @@ -223,6 +237,7 @@ size_t Worker::NearHeapLimit(void* data, size_t current_heap_limit, size_t initial_heap_limit) { Worker* worker = static_cast(data); worker->custom_error_ = "ERR_WORKER_OUT_OF_MEMORY"; + worker->custom_error_str_ = "JS heap out of memory"; worker->Exit(1); // Give the current GC some extra leeway to let it finish rather than // crash hard. We are not going to perform further allocations anyway. @@ -242,6 +257,7 @@ void Worker::Run() { WorkerThreadData data(this); if (isolate_ == nullptr) return; + CHECK(!data.w_->loop_init_failed_); Debug(this, "Starting worker with id %llu", thread_id_); { @@ -287,9 +303,8 @@ void Worker::Run() { TryCatch try_catch(isolate_); context = NewContext(isolate_); if (context.IsEmpty()) { - // TODO(addaleax): Inform the target about the actual underlying - // failure. custom_error_ = "ERR_WORKER_OUT_OF_MEMORY"; + custom_error_str_ = "Failed to create new Context"; return; } } @@ -417,10 +432,14 @@ void Worker::JoinThread() { Undefined(env()->isolate())).Check(); Local args[] = { - Integer::New(env()->isolate(), exit_code_), - custom_error_ != nullptr ? - OneByteString(env()->isolate(), custom_error_).As() : - Null(env()->isolate()).As(), + Integer::New(env()->isolate(), exit_code_), + custom_error_ != nullptr + ? OneByteString(env()->isolate(), custom_error_).As() + : Null(env()->isolate()).As(), + !custom_error_str_.empty() + ? 
OneByteString(env()->isolate(), custom_error_str_.c_str()) + .As() + : Null(env()->isolate()).As(), }; MakeCallback(env()->onexit_string(), arraysize(args), args); diff --git a/src/node_worker.h b/src/node_worker.h index 0c6fd35c0ab032..dbd286109948da 100644 --- a/src/node_worker.h +++ b/src/node_worker.h @@ -85,6 +85,8 @@ class Worker : public AsyncWrap { bool thread_joined_ = true; const char* custom_error_ = nullptr; + std::string custom_error_str_; + bool loop_init_failed_ = false; int exit_code_ = 0; uint64_t thread_id_ = -1; uintptr_t stack_base_ = 0; diff --git a/test/parallel/test-worker-resource-limits.js b/test/parallel/test-worker-resource-limits.js index 2d4ebbc0ce6011..9332a132694e78 100644 --- a/test/parallel/test-worker-resource-limits.js +++ b/test/parallel/test-worker-resource-limits.js @@ -25,7 +25,8 @@ if (!process.env.HAS_STARTED_WORKER) { })); w.on('error', common.expectsError({ code: 'ERR_WORKER_OUT_OF_MEMORY', - message: 'Worker terminated due to reaching memory limit' + message: 'Worker terminated due to reaching memory limit: ' + + 'JS heap out of memory' })); return; } From 1933efa62fa83f91166a0d7dc53c44f3adb42c0c Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Mon, 17 Feb 2020 18:29:38 -0800 Subject: [PATCH 26/91] test: remove common.PORT from test-net-write-callbacks.js Switch test-net-write-callbacks.js from common.PORT to a port assigned by the operating system. PR-URL: https://github.com/nodejs/node/pull/31839 Reviewed-By: Anna Henningsen Reviewed-By: Luigi Pinca Reviewed-By: Yongsheng Zhang Reviewed-By: James M Snell Reviewed-By: Colin Ihrig Reviewed-By: Shelley Vohr Reviewed-By: Richard Lau --- test/pummel/test-net-write-callbacks.js | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/test/pummel/test-net-write-callbacks.js b/test/pummel/test-net-write-callbacks.js index 0bcc9e2dec121a..cb011ab0022251 100644 --- a/test/pummel/test-net-write-callbacks.js +++ b/test/pummel/test-net-write-callbacks.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. 
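// A standalone sketch of the pattern this patch adopts (illustrative only,
// not part of the diff below): listen(0) asks the operating system for a
// free ephemeral port, and the 'listening' callback reads the assigned port
// back from server.address(), which avoids the fixed common.PORT colliding
// with other tests.
//
//   const net = require('net');
//   const server = net.createServer();
//   server.listen(0, () => {
//     const { port } = server.address();  // OS-assigned port
//     const client = net.createConnection(port);
//     client.on('connect', () => { client.end(); server.close(); });
//   });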
'use strict'; -const common = require('../common'); +require('../common'); const net = require('net'); const assert = require('assert'); @@ -55,8 +55,8 @@ function makeCallback(c) { }; } -server.listen(common.PORT, function() { - const client = net.createConnection(common.PORT); +server.listen(0, function() { + const client = net.createConnection(server.address().port); client.on('connect', function() { for (let i = 0; i < N; i++) { From 7d5776e119f532493c8197705542e1e1a4ba464f Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Mon, 17 Feb 2020 21:48:38 -0800 Subject: [PATCH 27/91] test: remove flaky designation for test-net-connect-options-port Closes: https://github.com/nodejs/node/issues/23207 PR-URL: https://github.com/nodejs/node/pull/31841 Fixes: https://github.com/nodejs/node/issues/23207 Reviewed-By: Luigi Pinca Reviewed-By: Colin Ihrig Reviewed-By: David Carlier --- test/parallel/parallel.status | 2 -- 1 file changed, 2 deletions(-) diff --git a/test/parallel/parallel.status b/test/parallel/parallel.status index 9b8c4807d07646..6d5e233d561869 100644 --- a/test/parallel/parallel.status +++ b/test/parallel/parallel.status @@ -5,8 +5,6 @@ prefix parallel # sample-test : PASS,FLAKY [true] # This section applies to all platforms -# https://github.com/nodejs/node/issues/23207 -test-net-connect-options-port: PASS,FLAKY [$system==win32] # https://github.com/nodejs/node/issues/20750 From f858f2366c179b9611b7c7378eb5e1c0e49091a5 Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Tue, 18 Feb 2020 19:57:13 -0800 Subject: [PATCH 28/91] tools: update lint-md task to lint for possessives of Node.js Add a markdown lint rule to prohibit "Node.js'" and "Node.js's". Instead, of "Node.js' module system", use "the Node.js module system". Refs: https://github.com/nodejs/node/pull/31748#issuecomment-585087745 PR-URL: https://github.com/nodejs/node/pull/31862 Reviewed-By: Daijiro Wachi Reviewed-By: Ruben Bridgewater --- tools/lint-md.js | 533 +++++++++++------- .../node-lint-md-cli-rollup/package-lock.json | 74 ++- tools/node-lint-md-cli-rollup/package.json | 2 +- .../node-lint-md-cli-rollup/rollup.config.js | 3 +- 4 files changed, 394 insertions(+), 218 deletions(-) diff --git a/tools/lint-md.js b/tools/lint-md.js index 495f9536db6dc9..e6eecf265c5173 100644 --- a/tools/lint-md.js +++ b/tools/lint-md.js @@ -40838,62 +40838,10 @@ var remark = unified_1() .use(remarkStringify) .freeze(); -const _from = "remark@^11.0.2"; -const _id = "remark@11.0.2"; -const _inBundle = false; -const _integrity = "sha512-bh+eJgn8wgmbHmIBOuwJFdTVRVpl3fcVP6HxmpPWO0ULGP9Qkh6INJh0N5Uy7GqlV7DQYGoqaKiEIpM5LLvJ8w=="; -const _location = "/remark"; -const _phantomChildren = { -}; -const _requested = { - type: "range", - registry: true, - raw: "remark@^11.0.2", - name: "remark", - escapedName: "remark", - rawSpec: "^11.0.2", - saveSpec: null, - fetchSpec: "^11.0.2" -}; -const _requiredBy = [ - "/" -]; -const _resolved = "https://registry.npmjs.org/remark/-/remark-11.0.2.tgz"; -const _shasum = "12b90ea100ac3362b1976fa87a6e4e0ab5968202"; -const _spec = "remark@^11.0.2"; -const _where = "/mnt/c/orgs/nodejs/node-runtime/tools/node-lint-md-cli-rollup"; -const author = { - name: "Titus Wormer", - email: "tituswormer@gmail.com", - url: "https://wooorm.com" -}; -const bugs = { - url: "https://github.com/remarkjs/remark/issues" -}; -const bundleDependencies = false; -const contributors = [ - { - name: "Titus Wormer", - email: "tituswormer@gmail.com", - url: "https://wooorm.com" - } -]; -const dependencies = { - "remark-parse": "^7.0.0", - 
"remark-stringify": "^7.0.0", - unified: "^8.2.0" -}; -const deprecated$1 = false; +const name$1 = "remark"; +const version$1 = "11.0.2"; const description = "Markdown processor powered by plugins"; -const files = [ - "index.js", - "types/index.d.ts" -]; -const funding = { - type: "opencollective", - url: "https://opencollective.com/unified" -}; -const homepage = "https://remark.js.org"; +const license = "MIT"; const keywords = [ "unified", "remark", @@ -40907,83 +40855,77 @@ const keywords = [ "stringify", "process" ]; -const license = "MIT"; -const name$1 = "remark"; -const repository = { - type: "git", - url: "https://github.com/remarkjs/remark/tree/master/packages/remark" +const homepage = "https://remark.js.org"; +const repository = "https://github.com/remarkjs/remark/tree/master/packages/remark"; +const bugs = "https://github.com/remarkjs/remark/issues"; +const funding = { + type: "opencollective", + url: "https://opencollective.com/unified" +}; +const author = "Titus Wormer (https://wooorm.com)"; +const contributors = [ + "Titus Wormer (https://wooorm.com)" +]; +const files = [ + "index.js", + "types/index.d.ts" +]; +const types = "types/index.d.ts"; +const dependencies = { + "remark-parse": "^7.0.0", + "remark-stringify": "^7.0.0", + unified: "^8.2.0" }; const scripts = { test: "tape test.js" }; -const types = "types/index.d.ts"; -const version$1 = "11.0.2"; const xo = false; +const _resolved = "https://registry.npmjs.org/remark/-/remark-11.0.2.tgz"; +const _integrity = "sha512-bh+eJgn8wgmbHmIBOuwJFdTVRVpl3fcVP6HxmpPWO0ULGP9Qkh6INJh0N5Uy7GqlV7DQYGoqaKiEIpM5LLvJ8w=="; +const _from = "remark@11.0.2"; var _package = { - _from: _from, - _id: _id, - _inBundle: _inBundle, - _integrity: _integrity, - _location: _location, - _phantomChildren: _phantomChildren, - _requested: _requested, - _requiredBy: _requiredBy, - _resolved: _resolved, - _shasum: _shasum, - _spec: _spec, - _where: _where, - author: author, - bugs: bugs, - bundleDependencies: bundleDependencies, - contributors: contributors, - dependencies: dependencies, - deprecated: deprecated$1, + name: name$1, + version: version$1, description: description, - files: files, - funding: funding, - homepage: homepage, - keywords: keywords, license: license, - name: name$1, + keywords: keywords, + homepage: homepage, repository: repository, - scripts: scripts, + bugs: bugs, + funding: funding, + author: author, + contributors: contributors, + files: files, types: types, - version: version$1, - xo: xo + dependencies: dependencies, + scripts: scripts, + xo: xo, + _resolved: _resolved, + _integrity: _integrity, + _from: _from }; var _package$1 = /*#__PURE__*/Object.freeze({ __proto__: null, - _from: _from, - _id: _id, - _inBundle: _inBundle, - _integrity: _integrity, - _location: _location, - _phantomChildren: _phantomChildren, - _requested: _requested, - _requiredBy: _requiredBy, - _resolved: _resolved, - _shasum: _shasum, - _spec: _spec, - _where: _where, - author: author, - bugs: bugs, - bundleDependencies: bundleDependencies, - contributors: contributors, - dependencies: dependencies, - deprecated: deprecated$1, + name: name$1, + version: version$1, description: description, - files: files, - funding: funding, - homepage: homepage, - keywords: keywords, license: license, - name: name$1, + keywords: keywords, + homepage: homepage, repository: repository, - scripts: scripts, + bugs: bugs, + funding: funding, + author: author, + contributors: contributors, + files: files, types: types, - version: version$1, + dependencies: dependencies, + 
scripts: scripts, xo: xo, + _resolved: _resolved, + _integrity: _integrity, + _from: _from, 'default': _package }); @@ -41001,7 +40943,7 @@ const dependencies$1 = { "markdown-extensions": "^1.1.1", remark: "^11.0.2", "remark-lint": "^6.0.5", - "remark-preset-lint-node": "^1.12.0", + "remark-preset-lint-node": "^1.13.0", "unified-args": "^7.1.0" }; const main = "dist/index.js"; @@ -42476,14 +42418,19 @@ var plur = (word, plural, count) => { return Math.abs(count) === 1 ? word : plural; }; -var unistUtilPosition = createCommonjsModule(function (module, exports) { +var start$1 = factory$8('start'); +var end = factory$8('end'); -var position = exports; +var unistUtilPosition = position$1; -position.start = factory('start'); -position.end = factory('end'); +position$1.start = start$1; +position$1.end = end; -function factory(type) { +function position$1(node) { + return {start: start$1(node), end: end(node)} +} + +function factory$8(type) { point.displayName = type; return point @@ -42498,7 +42445,6 @@ function factory(type) { } } } -}); var unistUtilGenerated = generated; @@ -42519,7 +42465,7 @@ var remarkLintListItemBulletIndent = unifiedLintRule( listItemBulletIndent ); -var start$1 = unistUtilPosition.start; +var start$2 = unistUtilPosition.start; function listItemBulletIndent(tree, file) { var contents = String(file); @@ -42536,8 +42482,8 @@ function listItemBulletIndent(tree, file) { var reason; if (!unistUtilGenerated(item)) { - final = start$1(item.children[0]); - indent = contents.slice(start$1(item).offset, final.offset).match(/^\s*/)[0] + final = start$2(item.children[0]); + indent = contents.slice(start$2(item).offset, final.offset).match(/^\s*/)[0] .length; if (indent !== 0) { @@ -42558,7 +42504,7 @@ function listItemBulletIndent(tree, file) { var remarkLintListItemIndent = unifiedLintRule('remark-lint:list-item-indent', listItemIndent); -var start$2 = unistUtilPosition.start; +var start$3 = unistUtilPosition.start; var styles = {'tab-size': true, mixed: true, space: true}; @@ -42586,7 +42532,7 @@ function listItemIndent(tree, file, pref) { function visitItem(item) { var head = item.children[0]; - var final = start$2(head); + var final = start$3(head); var marker; var bulletSize; var style; @@ -42594,7 +42540,7 @@ function listItemIndent(tree, file, pref) { var reason; marker = contents - .slice(start$2(item).offset, final.offset) + .slice(start$3(item).offset, final.offset) .replace(/\[[x ]?]\s*$/i, ''); bulletSize = marker.trimRight().length; @@ -42645,8 +42591,8 @@ var remarkLintNoAutoLinkWithoutProtocol = unifiedLintRule( noAutoLinkWithoutProtocol ); -var start$3 = unistUtilPosition.start; -var end = unistUtilPosition.end; +var start$4 = unistUtilPosition.start; +var end$1 = unistUtilPosition.end; // Protocol expression. // See: . 
@@ -42664,8 +42610,8 @@ function noAutoLinkWithoutProtocol(tree, file) { children = node.children; if ( - start$3(node).column === start$3(children[0]).column - 1 && - end(node).column === end(children[children.length - 1]).column + 1 && + start$4(node).column === start$4(children[0]).column - 1 && + end$1(node).column === end$1(children[children.length - 1]).column + 1 && !protocol$2.test(mdastUtilToString(node)) ) { file.message(reason, node); @@ -42730,8 +42676,8 @@ function noBlockquoteWithoutMarker(tree, file) { var remarkLintNoLiteralUrls = unifiedLintRule('remark-lint:no-literal-urls', noLiteralURLs); -var start$4 = unistUtilPosition.start; -var end$1 = unistUtilPosition.end; +var start$5 = unistUtilPosition.start; +var end$2 = unistUtilPosition.end; var mailto$3 = 'mailto:'; var reason$2 = 'Don’t use literal URLs without angle brackets'; @@ -42744,8 +42690,8 @@ function noLiteralURLs(tree, file) { if ( !unistUtilGenerated(node) && - start$4(node).column === start$4(children[0]).column && - end$1(node).column === end$1(children[children.length - 1]).column && + start$5(node).column === start$5(children[0]).column && + end$2(node).column === end$2(children[children.length - 1]).column && (node.url === mailto$3 + value || node.url === value) ) { file.message(reason$2, node); @@ -42758,7 +42704,7 @@ var remarkLintOrderedListMarkerStyle = unifiedLintRule( orderedListMarkerStyle ); -var start$5 = unistUtilPosition.start; +var start$6 = unistUtilPosition.start; var styles$1 = { ')': true, @@ -42793,7 +42739,7 @@ function orderedListMarkerStyle(tree, file, pref) { if (!unistUtilGenerated(child)) { marker = contents - .slice(start$5(child).offset, start$5(child.children[0]).offset) + .slice(start$6(child).offset, start$6(child.children[0]).offset) .replace(/\s|\d/g, '') .replace(/\[[x ]?]\s*$/i, ''); @@ -42914,8 +42860,8 @@ var remarkLintNoHeadingContentIndent = unifiedLintRule( noHeadingContentIndent ); -var start$6 = unistUtilPosition.start; -var end$2 = unistUtilPosition.end; +var start$7 = unistUtilPosition.start; +var end$3 = unistUtilPosition.end; function noHeadingContentIndent(tree, file) { var contents = String(file); @@ -42943,7 +42889,7 @@ function noHeadingContentIndent(tree, file) { type = mdastUtilHeadingStyle(node, 'atx'); if (type === 'atx' || type === 'atx-closed') { - initial = start$6(node); + initial = start$7(node); index = initial.offset; char = contents.charAt(index); @@ -42957,7 +42903,7 @@ function noHeadingContentIndent(tree, file) { } index = depth + (index - initial.offset); - head = start$6(children[0]).column; + head = start$7(children[0]).column; // Ignore empty headings. if (!head) { @@ -42975,15 +42921,15 @@ function noHeadingContentIndent(tree, file) { plur('space', diff) + ' before this heading’s content'; - file.message(reason, start$6(children[0])); + file.message(reason, start$7(children[0])); } } // Closed ATX-heading always must have a space between their content and the // final hashes, thus, there is no `add x spaces`. 
if (type === 'atx-closed') { - final = end$2(children[children.length - 1]); - diff = end$2(node).column - final.column - 1 - depth; + final = end$3(children[children.length - 1]); + diff = end$3(node).column - final.column - 1 - depth; if (diff) { reason = @@ -43213,8 +43159,8 @@ var remarkLintCheckboxCharacterStyle = unifiedLintRule( checkboxCharacterStyle ); -var start$7 = unistUtilPosition.start; -var end$3 = unistUtilPosition.end; +var start$8 = unistUtilPosition.start; +var end$4 = unistUtilPosition.end; var checked = {x: true, X: true}; var unchecked = {' ': true, '\t': true}; @@ -43259,8 +43205,8 @@ function checkboxCharacterStyle(tree, file, pref) { } type = types$1[node.checked]; - initial = start$7(node).offset; - final = (node.children.length === 0 ? end$3(node) : start$7(node.children[0])) + initial = start$8(node).offset; + final = (node.children.length === 0 ? end$4(node) : start$8(node.children[0])) .offset; // For a checkbox to be parsed, it must be followed by a whitespace. @@ -43298,8 +43244,8 @@ var remarkLintCheckboxContentIndent = unifiedLintRule( checkboxContentIndent ); -var start$8 = unistUtilPosition.start; -var end$4 = unistUtilPosition.end; +var start$9 = unistUtilPosition.start; +var end$5 = unistUtilPosition.end; var reason$9 = 'Checkboxes should be followed by a single character'; @@ -43319,9 +43265,9 @@ function checkboxContentIndent(tree, file) { return } - initial = start$8(node).offset; + initial = start$9(node).offset; /* istanbul ignore next - hard to test, couldn’t find a case. */ - final = (node.children.length === 0 ? end$4(node) : start$8(node.children[0])) + final = (node.children.length === 0 ? end$5(node) : start$9(node.children[0])) .offset; while (/[^\S\n]/.test(contents.charAt(final))) { @@ -43343,8 +43289,8 @@ function checkboxContentIndent(tree, file) { var remarkLintCodeBlockStyle = unifiedLintRule('remark-lint:code-block-style', codeBlockStyle); -var start$9 = unistUtilPosition.start; -var end$5 = unistUtilPosition.end; +var start$a = unistUtilPosition.start; +var end$6 = unistUtilPosition.end; var styles$2 = {null: true, fenced: true, indented: true}; @@ -43377,8 +43323,8 @@ function codeBlockStyle(tree, file, pref) { // Get the style of `node`. 
function check(node) { - var initial = start$9(node).offset; - var final = end$5(node).offset; + var initial = start$a(node).offset; + var final = end$6(node).offset; if (unistUtilGenerated(node)) { return null @@ -43415,8 +43361,8 @@ function definitionSpacing(tree, file) { var remarkLintFencedCodeFlag = unifiedLintRule('remark-lint:fenced-code-flag', fencedCodeFlag); -var start$a = unistUtilPosition.start; -var end$6 = unistUtilPosition.end; +var start$b = unistUtilPosition.start; +var end$7 = unistUtilPosition.end; var fence$2 = /^ {0,3}([~`])\1{2,}/; var reasonInvalid = 'Invalid code-language flag'; @@ -43447,7 +43393,7 @@ function fencedCodeFlag(tree, file, pref) { file.message(reasonInvalid, node); } } else { - value = contents.slice(start$a(node).offset, end$6(node).offset); + value = contents.slice(start$b(node).offset, end$7(node).offset); if (!allowEmpty && fence$2.test(value)) { file.message(reasonMissing, node); @@ -43520,7 +43466,7 @@ function fileExtension(tree, file, pref) { var remarkLintFinalDefinition = unifiedLintRule('remark-lint:final-definition', finalDefinition); -var start$b = unistUtilPosition.start; +var start$c = unistUtilPosition.start; function finalDefinition(tree, file) { var last = null; @@ -43528,7 +43474,7 @@ function finalDefinition(tree, file) { unistUtilVisit(tree, visitor, true); function visitor(node) { - var line = start$b(node).line; + var line = start$c(node).line; // Ignore generated nodes. if (node.type === 'root' || unistUtilGenerated(node)) { @@ -43609,8 +43555,8 @@ function headingStyle(tree, file, pref) { var remarkLintMaximumLineLength = unifiedLintRule('remark-lint:maximum-line-length', maximumLineLength); -var start$c = unistUtilPosition.start; -var end$7 = unistUtilPosition.end; +var start$d = unistUtilPosition.start; +var end$8 = unistUtilPosition.end; function maximumLineLength(tree, file, pref) { var style = typeof pref === 'number' && !isNaN(pref) ? pref : 80; @@ -43650,8 +43596,8 @@ function maximumLineLength(tree, file, pref) { return } - initial = start$c(node); - final = end$7(node); + initial = start$d(node); + final = end$8(node); // No whitelisting when starting after the border, or ending before it. if (initial.column > style || final.column < style) { @@ -43661,7 +43607,7 @@ function maximumLineLength(tree, file, pref) { // No whitelisting when there’s whitespace after the link. 
if ( next && - start$c(next).line === initial.line && + start$d(next).line === initial.line && (!next.value || /^(.+?[ \t].+?)/.test(next.value)) ) { return @@ -43673,7 +43619,7 @@ function maximumLineLength(tree, file, pref) { function ignore(node) { /* istanbul ignore else - Hard to test, as we only run this case on `position: true` */ if (!unistUtilGenerated(node)) { - whitelist(start$c(node).line - 1, end$7(node).line); + whitelist(start$d(node).line - 1, end$8(node).line); } } @@ -43794,7 +43740,7 @@ function noFileNameOuterDashes(tree, file) { var remarkLintNoHeadingIndent = unifiedLintRule('remark-lint:no-heading-indent', noHeadingIndent); -var start$d = unistUtilPosition.start; +var start$e = unistUtilPosition.start; function noHeadingIndent(tree, file) { var contents = String(file); @@ -43813,7 +43759,7 @@ function noHeadingIndent(tree, file) { return } - initial = start$d(node); + initial = start$e(node); begin = initial.offset; index = begin - 1; @@ -43839,7 +43785,7 @@ function noHeadingIndent(tree, file) { } } -var start$e = unistUtilPosition.start; +var start$f = unistUtilPosition.start; @@ -43862,7 +43808,7 @@ function noMultipleToplevelHeadings(tree, file, pref) { node ); } else { - duplicate = unistUtilStringifyPosition(start$e(node)); + duplicate = unistUtilStringifyPosition(start$f(node)); } } } @@ -43987,10 +43933,214 @@ function noTrailingSpaces(ast, file) { } } +var convert_1$1 = convert$2; + +function convert$2(test) { + if (typeof test === 'string') { + return typeFactory$1(test) + } + + if (test === null || test === undefined) { + return ok$2 + } + + if (typeof test === 'object') { + return ('length' in test ? anyFactory$1 : matchesFactory$1)(test) + } + + if (typeof test === 'function') { + return test + } + + throw new Error('Expected function, string, or object as test') +} + +function convertAll$1(tests) { + var results = []; + var length = tests.length; + var index = -1; + + while (++index < length) { + results[index] = convert$2(tests[index]); + } + + return results +} + +// Utility assert each property in `test` is represented in `node`, and each +// values are strictly equal. +function matchesFactory$1(test) { + return matches + + function matches(node) { + var key; + + for (key in test) { + if (node[key] !== test[key]) { + return false + } + } + + return true + } +} + +function anyFactory$1(tests) { + var checks = convertAll$1(tests); + var length = checks.length; + + return matches + + function matches() { + var index = -1; + + while (++index < length) { + if (checks[index].apply(this, arguments)) { + return true + } + } + + return false + } +} + +// Utility to convert a string into a function which checks a given node’s type +// for said string. +function typeFactory$1(test) { + return type + + function type(node) { + return Boolean(node && node.type === test) + } +} + +// Utility to return true. +function ok$2() { + return true +} + +var unistUtilVisitParents$1 = visitParents$1; + + + +var CONTINUE$2 = true; +var SKIP$2 = 'skip'; +var EXIT$2 = false; + +visitParents$1.CONTINUE = CONTINUE$2; +visitParents$1.SKIP = SKIP$2; +visitParents$1.EXIT = EXIT$2; + +function visitParents$1(tree, test, visitor, reverse) { + var is; + + if (typeof test === 'function' && typeof visitor !== 'function') { + reverse = visitor; + visitor = test; + test = null; + } + + is = convert_1$1(test); + + one(tree, null, []); + + // Visit a single node. 
+ function one(node, index, parents) { + var result = []; + var subresult; + + if (!test || is(node, index, parents[parents.length - 1] || null)) { + result = toResult$1(visitor(node, parents)); + + if (result[0] === EXIT$2) { + return result + } + } + + if (node.children && result[0] !== SKIP$2) { + subresult = toResult$1(all(node.children, parents.concat(node))); + return subresult[0] === EXIT$2 ? subresult : result + } + + return result + } + + // Visit children in `parent`. + function all(children, parents) { + var min = -1; + var step = reverse ? -1 : 1; + var index = (reverse ? children.length : min) + step; + var result; + + while (index > min && index < children.length) { + result = one(children[index], index, parents); + + if (result[0] === EXIT$2) { + return result + } + + index = typeof result[1] === 'number' ? result[1] : index + step; + } + } +} + +function toResult$1(value) { + if (value !== null && typeof value === 'object' && 'length' in value) { + return value + } + + if (typeof value === 'number') { + return [CONTINUE$2, value] + } + + return [value] +} + +var unistUtilVisit$1 = visit$1; + + + +var CONTINUE$3 = unistUtilVisitParents$1.CONTINUE; +var SKIP$3 = unistUtilVisitParents$1.SKIP; +var EXIT$3 = unistUtilVisitParents$1.EXIT; + +visit$1.CONTINUE = CONTINUE$3; +visit$1.SKIP = SKIP$3; +visit$1.EXIT = EXIT$3; + +function visit$1(tree, test, visitor, reverse) { + if (typeof test === 'function' && typeof visitor !== 'function') { + reverse = visitor; + visitor = test; + test = null; + } + + unistUtilVisitParents$1(tree, test, overload, reverse); + + function overload(node, parents) { + var parent = parents[parents.length - 1]; + var index = parent ? parent.children.indexOf(node) : null; + return visitor(node, index, parent) + } +} + var remarkLintProhibitedStrings = unifiedLintRule('remark-lint:prohibited-strings', prohibitedStrings); function testProhibited(val, content) { - const re = new RegExp(`(\\.|@[a-z0-9/-]*)?\\b(${val.no})\\b(\\.\\w)?`, 'g'); + let regexpString = '(\\.|@[a-z0-9/-]*)?'; + + // If it starts with a letter, make sure it is a word break. + if (/^\b/.test(val.no)) { + regexpString += '\\b'; + } + regexpString += `(${val.no})`; + + // If it ends with a letter, make sure it is a word break. 
+ if (/\b$/.test(val.no)) { + regexpString += '\\b'; + } + regexpString += '(\\.\\w)?'; + const re = new RegExp(regexpString, 'g'); let result = null; while (result = re.exec(content)) { @@ -44003,7 +44153,7 @@ function testProhibited(val, content) { } function prohibitedStrings(ast, file, strings) { - unistUtilVisit(ast, 'text', checkText); + unistUtilVisit$1(ast, 'text', checkText); function checkText(node) { const content = node.value; @@ -44024,8 +44174,8 @@ var rule = unifiedLintRule; var remarkLintRuleStyle = rule('remark-lint:rule-style', ruleStyle); -var start$f = unistUtilPosition.start; -var end$8 = unistUtilPosition.end; +var start$g = unistUtilPosition.start; +var end$9 = unistUtilPosition.end; function ruleStyle(tree, file, pref) { var contents = String(file); @@ -44041,8 +44191,8 @@ function ruleStyle(tree, file, pref) { unistUtilVisit(tree, 'thematicBreak', visitor); function visitor(node) { - var initial = start$f(node).offset; - var final = end$8(node).offset; + var initial = start$g(node).offset; + var final = end$9(node).offset; var rule; if (!unistUtilGenerated(node)) { @@ -44095,8 +44245,8 @@ function strongMarker(tree, file, pref) { var remarkLintTableCellPadding = unifiedLintRule('remark-lint:table-cell-padding', tableCellPadding); -var start$g = unistUtilPosition.start; -var end$9 = unistUtilPosition.end; +var start$h = unistUtilPosition.start; +var end$a = unistUtilPosition.end; var styles$3 = {null: true, padded: true, compact: true}; @@ -44144,8 +44294,8 @@ function tableCellPadding(tree, file, pref) { next = cells[column + 1]; fence = contents.slice( - cell ? end$9(cell).offset : start$g(row).offset, - next ? start$g(next).offset : end$9(row).offset + cell ? end$a(cell).offset : start$h(row).offset, + next ? start$h(next).offset : end$a(row).offset ); pos = fence.indexOf('|'); @@ -44222,13 +44372,13 @@ function tableCellPadding(tree, file, pref) { } function size(node) { - return end$9(node).offset - start$g(node).offset + return end$a(node).offset - start$h(node).offset } var remarkLintTablePipes = unifiedLintRule('remark-lint:table-pipes', tablePipes); -var start$h = unistUtilPosition.start; -var end$a = unistUtilPosition.end; +var start$i = unistUtilPosition.start; +var end$b = unistUtilPosition.end; var reasonStart = 'Missing initial pipe in table fence'; var reasonEnd = 'Missing final pipe in table fence'; @@ -44256,15 +44406,15 @@ function tablePipes(tree, file) { cells = row.children; head = cells[0]; tail = cells[cells.length - 1]; - initial = contents.slice(start$h(row).offset, start$h(head).offset); - final = contents.slice(end$a(tail).offset, end$a(row).offset); + initial = contents.slice(start$i(row).offset, start$i(head).offset); + final = contents.slice(end$b(tail).offset, end$b(row).offset); if (initial.indexOf('|') === -1) { - file.message(reasonStart, start$h(row)); + file.message(reasonStart, start$i(row)); } if (final.indexOf('|') === -1) { - file.message(reasonEnd, end$a(row)); + file.message(reasonEnd, end$b(row)); } } } @@ -44276,7 +44426,7 @@ var remarkLintUnorderedListMarkerStyle = unifiedLintRule( unorderedListMarkerStyle ); -var start$i = unistUtilPosition.start; +var start$j = unistUtilPosition.start; var styles$4 = { '-': true, @@ -44312,7 +44462,7 @@ function unorderedListMarkerStyle(tree, file, pref) { if (!unistUtilGenerated(child)) { marker = contents - .slice(start$i(child).offset, start$i(child.children[0]).offset) + .slice(start$j(child).offset, start$j(child.children[0]).offset) .replace(/\[[x ]?]\s*$/i, '') .replace(/\s/g, 
''); @@ -44372,8 +44522,9 @@ var plugins$2 = [ { no: "hostname", yes: "host name" }, { no: "[Jj]avascript", yes: "JavaScript" }, { no: "Node", yes: "Node.js" }, - { no: "Node.JS", yes: "Node.js" }, - { no: "node.js", yes: "Node.js" }, + { no: "Node\\.JS", yes: "Node.js" }, + { no: "node\\.js", yes: "Node.js" }, + { no: "Node\\.js's?", yes: "the Node.js" }, { no: "[Nn]ote that", yes: "" }, { no: "Rfc", yes: "RFC" }, { no: "[Rr][Ff][Cc]\\d+", yes: "RFC " }, diff --git a/tools/node-lint-md-cli-rollup/package-lock.json b/tools/node-lint-md-cli-rollup/package-lock.json index af4ac3812a00e1..f619276709e70a 100644 --- a/tools/node-lint-md-cli-rollup/package-lock.json +++ b/tools/node-lint-md-cli-rollup/package-lock.json @@ -301,17 +301,17 @@ "integrity": "sha512-fjquC59cD7CyW6urNXK0FBufkZcoiGG80wTuPujX590cB5Ttln20E2UB4S/WARVqhXffZl2LNgS+gQdPIIim/g==" }, "fault": { - "version": "1.0.3", - "resolved": "https://registry.npmjs.org/fault/-/fault-1.0.3.tgz", - "integrity": "sha512-sfFuP4X0hzrbGKjAUNXYvNqsZ5F6ohx/dZ9I0KQud/aiZNwg263r5L9yGB0clvXHCkzXh5W3t7RSHchggYIFmA==", + "version": "1.0.4", + "resolved": "https://registry.npmjs.org/fault/-/fault-1.0.4.tgz", + "integrity": "sha512-CJ0HCB5tL5fYTEA7ToAq5+kTwd++Borf1/bifxd9iT70QcXr4MRrO3Llf8Ifs70q+SJcGHFtnIE/Nw6giCtECA==", "requires": { - "format": "^0.2.2" + "format": "^0.2.0" } }, "figures": { - "version": "3.1.0", - "resolved": "https://registry.npmjs.org/figures/-/figures-3.1.0.tgz", - "integrity": "sha512-ravh8VRXqHuMvZt/d8GblBeqDMkdJMBdv/2KntFH+ra5MXkO7nxNKpzQ3n6QD/2da1kH0aWmNISdvhM7gl2gVg==", + "version": "3.2.0", + "resolved": "https://registry.npmjs.org/figures/-/figures-3.2.0.tgz", + "integrity": "sha512-yaduQFRKLXYOGgEn6AZau90j3ggSOyiqXU0F9JZfeXYhNa+Jk4X+s45A2zg5jns87GAFa34BBm2kXw4XpNcbdg==", "requires": { "escape-string-regexp": "^1.0.5" } @@ -477,9 +477,9 @@ "integrity": "sha512-zxQ9//Q3D/34poZf8fiy3m3XVpbQc7ren15iKqrTtLPwkPD/t3Scy9Imp63FujULGxuK0ZlCwoo5xNpktFgbOA==" }, "is-hidden": { - "version": "1.1.2", - "resolved": "https://registry.npmjs.org/is-hidden/-/is-hidden-1.1.2.tgz", - "integrity": "sha512-kytBeNVW2QTIqZdJBDKIjP+EkUTzDT07rsc111w/gxqR6wK3ODkOswcpxgED6HU6t7fEhOxqojVZ2a2kU9rj+A==" + "version": "1.1.3", + "resolved": "https://registry.npmjs.org/is-hidden/-/is-hidden-1.1.3.tgz", + "integrity": "sha512-FFzhGKA9h59OFxeaJl0W5ILTYetI8WsdqdofKr69uLKZdV6hbDKxj8vkpG3L9uS/6Q/XYh1tkXm6xwRGFweETA==" }, "is-module": { "version": "1.0.0", @@ -708,9 +708,9 @@ } }, "readable-stream": { - "version": "3.5.0", - "resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.5.0.tgz", - "integrity": "sha512-gSz026xs2LfxBPudDuI41V1lka8cxg64E66SGe78zJlsUofOg/yqwezdIcdfwik6B4h8LFmWPA9ef9X3FiNFLA==", + "version": "3.6.0", + "resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.0.tgz", + "integrity": "sha512-BViHy7LKeTz4oNnkcLJ+lVSL6vpiFeX6/d3oSH8zCW7UxP2onchk+vTGB143xuFjHS3deTgkKoXXymXqymiIdA==", "requires": { "inherits": "^2.0.3", "string_decoder": "^1.1.1", @@ -1150,12 +1150,38 @@ } }, "remark-lint-prohibited-strings": { - "version": "1.2.0", - "resolved": "https://registry.npmjs.org/remark-lint-prohibited-strings/-/remark-lint-prohibited-strings-1.2.0.tgz", - "integrity": "sha512-k3Sa0Kk+OJHMnsaRmLzq85BomgVOHbDBq3s5v4BJ6bVNWwYM9KrunNb0iAGomM7l+HfosYoa9Q31xfCuwsWZ4A==", + "version": "1.2.1", + "resolved": "https://registry.npmjs.org/remark-lint-prohibited-strings/-/remark-lint-prohibited-strings-1.2.1.tgz", + "integrity": 
"sha512-i3LatoJn/eHkgawdi3eoynikQa5zIEDX+GYcvu4ns5LsOvIrT8WcuvgYQ2kbEFbV0KTy7yBAGLJ9040xs1ssXA==", "requires": { "unified-lint-rule": "^1.0.2", - "unist-util-visit": "^1.2.0" + "unist-util-visit": "^2.0.0" + }, + "dependencies": { + "unist-util-is": { + "version": "4.0.2", + "resolved": "https://registry.npmjs.org/unist-util-is/-/unist-util-is-4.0.2.tgz", + "integrity": "sha512-Ofx8uf6haexJwI1gxWMGg6I/dLnF2yE+KibhD3/diOqY2TinLcqHXCV6OI5gFVn3xQqDH+u0M625pfKwIwgBKQ==" + }, + "unist-util-visit": { + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/unist-util-visit/-/unist-util-visit-2.0.2.tgz", + "integrity": "sha512-HoHNhGnKj6y+Sq+7ASo2zpVdfdRifhTgX2KTU3B/sO/TTlZchp7E3S4vjRzDJ7L60KmrCPsQkVK3lEF3cz36XQ==", + "requires": { + "@types/unist": "^2.0.0", + "unist-util-is": "^4.0.0", + "unist-util-visit-parents": "^3.0.0" + } + }, + "unist-util-visit-parents": { + "version": "3.0.2", + "resolved": "https://registry.npmjs.org/unist-util-visit-parents/-/unist-util-visit-parents-3.0.2.tgz", + "integrity": "sha512-yJEfuZtzFpQmg1OSCyS9M5NJRrln/9FbYosH3iW0MG402QbdbaB8ZESwUv9RO6nRfLAKvWcMxCwdLWOov36x/g==", + "requires": { + "@types/unist": "^2.0.0", + "unist-util-is": "^4.0.0" + } + } } }, "remark-lint-rule-style": { @@ -1246,9 +1272,9 @@ } }, "remark-preset-lint-node": { - "version": "1.12.0", - "resolved": "https://registry.npmjs.org/remark-preset-lint-node/-/remark-preset-lint-node-1.12.0.tgz", - "integrity": "sha512-Un9RH6cSLgI/fECdgFh9cxRjYVtnwmxsRPwJIsKjX9aOIVM0ohRCPeJ/Sh4nhBtL7PUnF2qMsIwt9b8OlL9HnA==", + "version": "1.13.0", + "resolved": "https://registry.npmjs.org/remark-preset-lint-node/-/remark-preset-lint-node-1.13.0.tgz", + "integrity": "sha512-UNAoY4wl672d0qE+LM5rA0ILOTJN+siNGj3/qa5Zvl7nMIUwqMcz0G266Ck6OL6GOrpys/e4EOrkXiitEdEqNA==", "requires": { "remark-lint": "^6.0.5", "remark-lint-blockquote-indentation": "^1.0.3", @@ -1275,7 +1301,7 @@ "remark-lint-no-table-indentation": "^1.0.4", "remark-lint-no-tabs": "^1.0.3", "remark-lint-no-trailing-spaces": "^2.0.1", - "remark-lint-prohibited-strings": "^1.2.0", + "remark-lint-prohibited-strings": "^1.2.1", "remark-lint-rule-style": "^1.0.3", "remark-lint-strong-marker": "^1.0.3", "remark-lint-table-cell-padding": "^1.0.4", @@ -1618,9 +1644,9 @@ "integrity": "sha512-sVZZX3+kspVNmLWBPAB6r+7D9ZgAFPNWm66f7YNb420RlQSbn+n8rG8dGZSkrER7ZIXGQYNm5pqC3v3HopH24A==" }, "unist-util-position": { - "version": "3.0.4", - "resolved": "https://registry.npmjs.org/unist-util-position/-/unist-util-position-3.0.4.tgz", - "integrity": "sha512-tWvIbV8goayTjobxDIr4zVTyG+Q7ragMSMeKC3xnPl9xzIc0+she8mxXLM3JVNDDsfARPbCd3XdzkyLdo7fF3g==" + "version": "3.1.0", + "resolved": "https://registry.npmjs.org/unist-util-position/-/unist-util-position-3.1.0.tgz", + "integrity": "sha512-w+PkwCbYSFw8vpgWD0v7zRCl1FpY3fjDSQ3/N/wNd9Ffa4gPi8+4keqt99N3XW6F99t/mUzp2xAhNmfKWp95QA==" }, "unist-util-remove-position": { "version": "1.1.4", diff --git a/tools/node-lint-md-cli-rollup/package.json b/tools/node-lint-md-cli-rollup/package.json index 67cad50ce21516..be155b49e95fce 100644 --- a/tools/node-lint-md-cli-rollup/package.json +++ b/tools/node-lint-md-cli-rollup/package.json @@ -13,7 +13,7 @@ "markdown-extensions": "^1.1.1", "remark": "^11.0.2", "remark-lint": "^6.0.5", - "remark-preset-lint-node": "^1.12.0", + "remark-preset-lint-node": "^1.13.0", "unified-args": "^7.1.0" }, "main": "dist/index.js", diff --git a/tools/node-lint-md-cli-rollup/rollup.config.js b/tools/node-lint-md-cli-rollup/rollup.config.js index 73770d8836a9df..49b9817ca581bd 100644 --- 
a/tools/node-lint-md-cli-rollup/rollup.config.js +++ b/tools/node-lint-md-cli-rollup/rollup.config.js @@ -34,8 +34,7 @@ module.exports = { if (normID === '/node_modules/unified-args/lib/options.js') { return code.replace('\'./schema\'', '\'./schema.json\''); } - if (normID === '/node_modules/chokidar/lib/fsevents-handler.js' && - process.platform !== 'darwin') { + if (normID === '/node_modules/chokidar/lib/fsevents-handler.js') { return code.replace( 'fsevents = require(\'fsevents\');', 'fsevents = undefined;' ); From 091b4bfe2d73861c829d0dcddc4c4dfabae1df9e Mon Sep 17 00:00:00 2001 From: Shelley Vohr Date: Tue, 18 Feb 2020 13:08:31 -0800 Subject: [PATCH 29/91] doc: add note about ssh key to releases PR-URL: https://github.com/nodejs/node/pull/31856 Reviewed-By: Myles Borins Reviewed-By: Beth Griggs Reviewed-By: Ruben Bridgewater Reviewed-By: James M Snell --- doc/releases.md | 26 ++++++++++++++++++++++++-- 1 file changed, 24 insertions(+), 2 deletions(-) diff --git a/doc/releases.md b/doc/releases.md index b20bf3662e3f8c..76f6ad57412ab7 100644 --- a/doc/releases.md +++ b/doc/releases.md @@ -553,8 +553,30 @@ formatting passes the lint rules on `master`. to promote the builds as the `SHASUMS256.txt` file needs to be signed with the same GPG key!** -Use `tools/release.sh` to promote and sign the build. When run, it will perform -the following actions: +Use `tools/release.sh` to promote and sign the build. Before doing this, you'll +need to ensure you've loaded the correct ssh key, or you'll see the following: + +```sh +# Checking for releases ... +Enter passphrase for key '/Users//.ssh/id_rsa': +dist@direct.nodejs.org's password: +``` + +The key can be loaded either with `ssh-add`: + +```sh +# Substitute node_id_rsa with whatever you've named the key +$ ssh-add ~/.ssh/node_id_rsa +``` + +or at runtime with: + +```sh +# Substitute node_id_rsa with whatever you've named the key +$ ./tools/release.sh -i ~/.ssh/node_id_rsa +``` + +`tools/release.sh` will perform the following actions when run: **a.** Select a GPG key from your private keys. It will use a command similar to: `gpg --list-secret-keys` to list your keys. 
If you don't have any keys, it From 79b1f04b153d3d80799e6669f036a443fc011f07 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Micha=C3=ABl=20Zasso?= Date: Wed, 20 Nov 2019 19:54:04 +0100 Subject: [PATCH 30/91] tools: sync gyp code base with node-gyp repo PR-URL: https://github.com/nodejs/node/pull/30563 Reviewed-By: Ujjwal Sharma Reviewed-By: Christian Clauss Reviewed-By: Ruben Bridgewater Reviewed-By: Rich Trott --- tools/gyp/DEPS | 23 - tools/gyp/buildbot/buildbot_run.py | 137 -- tools/gyp/buildbot/commit_queue/OWNERS | 6 - tools/gyp/buildbot/commit_queue/README | 3 - .../gyp/buildbot/commit_queue/cq_config.json | 15 - tools/gyp/codereview.settings | 6 - tools/gyp/gyp_main.py | 38 +- tools/gyp/gyptest.py | 243 ---- tools/gyp/pylib/gyp/MSVSSettings.py | 8 +- tools/gyp/pylib/gyp/MSVSSettings_test.py | 2 + tools/gyp/pylib/gyp/MSVSVersion.py | 6 + tools/gyp/pylib/gyp/common.py | 28 +- tools/gyp/pylib/gyp/generator/android.py | 1097 +++++++++++++++++ tools/gyp/pylib/gyp/generator/cmake.py | 9 +- tools/gyp/pylib/gyp/generator/eclipse.py | 10 +- tools/gyp/pylib/gyp/generator/make.py | 70 +- tools/gyp/pylib/gyp/generator/msvs.py | 49 +- tools/gyp/pylib/gyp/generator/ninja.py | 19 +- tools/gyp/pylib/gyp/generator/xcode.py | 4 +- tools/gyp/pylib/gyp/input.py | 45 +- tools/gyp/pylib/gyp/mac_tool.py | 6 +- tools/gyp/pylib/gyp/msvs_emulation.py | 29 +- tools/gyp/pylib/gyp/ordered_dict.py | 289 ----- tools/gyp/pylib/gyp/win_tool.py | 11 + tools/gyp/pylib/gyp/xcode_emulation.py | 16 +- tools/gyp/pylib/gyp/xcodeproj_file.py | 2 +- tools/gyp/tools/pretty_gyp.py | 2 +- tools/gyp/tools/pretty_vcproj.py | 2 +- 28 files changed, 1354 insertions(+), 821 deletions(-) delete mode 100644 tools/gyp/DEPS delete mode 100755 tools/gyp/buildbot/buildbot_run.py delete mode 100644 tools/gyp/buildbot/commit_queue/OWNERS delete mode 100644 tools/gyp/buildbot/commit_queue/README delete mode 100644 tools/gyp/buildbot/commit_queue/cq_config.json delete mode 100644 tools/gyp/codereview.settings delete mode 100755 tools/gyp/gyptest.py create mode 100644 tools/gyp/pylib/gyp/generator/android.py delete mode 100644 tools/gyp/pylib/gyp/ordered_dict.py diff --git a/tools/gyp/DEPS b/tools/gyp/DEPS deleted file mode 100644 index 167fb779b0e1be..00000000000000 --- a/tools/gyp/DEPS +++ /dev/null @@ -1,23 +0,0 @@ -# DEPS file for gclient use in buildbot execution of gyp tests. -# -# (You don't need to use gclient for normal GYP development work.) - -vars = { - "chromium_git": "https://chromium.googlesource.com/", -} - -deps = { -} - -deps_os = { - "win": { - "third_party/cygwin": - Var("chromium_git") + "chromium/deps/cygwin@4fbd5b9", - - "third_party/python_26": - Var("chromium_git") + "chromium/deps/python_26@5bb4080", - - "src/third_party/pefile": - Var("chromium_git") + "external/pefile@72c6ae4", - }, -} diff --git a/tools/gyp/buildbot/buildbot_run.py b/tools/gyp/buildbot/buildbot_run.py deleted file mode 100755 index cdd347d0bcc95a..00000000000000 --- a/tools/gyp/buildbot/buildbot_run.py +++ /dev/null @@ -1,137 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) 2012 Google Inc. All rights reserved. -# Use of this source code is governed by a BSD-style license that can be -# found in the LICENSE file. 
- -"""Argument-less script to select what to run on the buildbots.""" -from __future__ import print_function - -import os -import shutil -import subprocess -import sys - - -BUILDBOT_DIR = os.path.dirname(os.path.abspath(__file__)) -TRUNK_DIR = os.path.dirname(BUILDBOT_DIR) -ROOT_DIR = os.path.dirname(TRUNK_DIR) -CMAKE_DIR = os.path.join(ROOT_DIR, 'cmake') -CMAKE_BIN_DIR = os.path.join(CMAKE_DIR, 'bin') -OUT_DIR = os.path.join(TRUNK_DIR, 'out') - - -def CallSubProcess(*args, **kwargs): - """Wrapper around subprocess.call which treats errors as build exceptions.""" - with open(os.devnull) as devnull_fd: - retcode = subprocess.call(stdin=devnull_fd, *args, **kwargs) - if retcode != 0: - print('@@@STEP_EXCEPTION@@@') - sys.exit(1) - - -def PrepareCmake(): - """Build CMake 2.8.8 since the version in Precise is 2.8.7.""" - if os.environ['BUILDBOT_CLOBBER'] == '1': - print('@@@BUILD_STEP Clobber CMake checkout@@@') - shutil.rmtree(CMAKE_DIR) - - # We always build CMake 2.8.8, so no need to do anything - # if the directory already exists. - if os.path.isdir(CMAKE_DIR): - return - - print('@@@BUILD_STEP Initialize CMake checkout@@@') - os.mkdir(CMAKE_DIR) - - print('@@@BUILD_STEP Sync CMake@@@') - CallSubProcess( - ['git', 'clone', - '--depth', '1', - '--single-branch', - '--branch', 'v2.8.8', - '--', - 'git://cmake.org/cmake.git', - CMAKE_DIR], - cwd=CMAKE_DIR) - - print('@@@BUILD_STEP Build CMake@@@') - CallSubProcess( - ['/bin/bash', 'bootstrap', '--prefix=%s' % CMAKE_DIR], - cwd=CMAKE_DIR) - - CallSubProcess( ['make', 'cmake'], cwd=CMAKE_DIR) - - -def GypTestFormat(title, format=None, msvs_version=None, tests=[]): - """Run the gyp tests for a given format, emitting annotator tags. - - See annotator docs at: - https://sites.google.com/a/chromium.org/dev/developers/testing/chromium-build-infrastructure/buildbot-annotations - Args: - format: gyp format to test. - Returns: - 0 for sucesss, 1 for failure. - """ - if not format: - format = title - - print('@@@BUILD_STEP ' + title + '@@@') - sys.stdout.flush() - env = os.environ.copy() - if msvs_version: - env['GYP_MSVS_VERSION'] = msvs_version - command = ' '.join( - [sys.executable, 'gyp/gyptest.py', - '--all', - '--passed', - '--format', format, - '--path', CMAKE_BIN_DIR, - '--chdir', 'gyp'] + tests) - retcode = subprocess.call(command, cwd=ROOT_DIR, env=env, shell=True) - if retcode: - # Emit failure tag, and keep going. - print('@@@STEP_FAILURE@@@') - return 1 - return 0 - - -def GypBuild(): - # Dump out/ directory. - print('@@@BUILD_STEP cleanup@@@') - print('Removing %s...' 
% OUT_DIR) - shutil.rmtree(OUT_DIR, ignore_errors=True) - print('Done.') - - retcode = 0 - if sys.platform.startswith('linux'): - retcode += GypTestFormat('ninja') - retcode += GypTestFormat('make') - PrepareCmake() - retcode += GypTestFormat('cmake') - elif sys.platform == 'darwin': - retcode += GypTestFormat('ninja') - retcode += GypTestFormat('xcode') - retcode += GypTestFormat('make') - elif sys.platform == 'win32': - retcode += GypTestFormat('ninja') - if os.environ['BUILDBOT_BUILDERNAME'] == 'gyp-win64': - retcode += GypTestFormat('msvs-ninja-2013', format='msvs-ninja', - msvs_version='2013', - tests=[ - r'test\generator-output\gyptest-actions.py', - r'test\generator-output\gyptest-relocate.py', - r'test\generator-output\gyptest-rules.py']) - retcode += GypTestFormat('msvs-2013', format='msvs', msvs_version='2013') - else: - raise Exception('Unknown platform') - if retcode: - # TODO(bradnelson): once the annotator supports a postscript (section for - # after the build proper that could be used for cumulative failures), - # use that instead of this. This isolates the final return value so - # that it isn't misattributed to the last stage. - print('@@@BUILD_STEP failures@@@') - sys.exit(retcode) - - -if __name__ == '__main__': - GypBuild() diff --git a/tools/gyp/buildbot/commit_queue/OWNERS b/tools/gyp/buildbot/commit_queue/OWNERS deleted file mode 100644 index b269c198b43e3e..00000000000000 --- a/tools/gyp/buildbot/commit_queue/OWNERS +++ /dev/null @@ -1,6 +0,0 @@ -set noparent -bradnelson@chromium.org -bradnelson@google.com -iannucci@chromium.org -scottmg@chromium.org -thakis@chromium.org diff --git a/tools/gyp/buildbot/commit_queue/README b/tools/gyp/buildbot/commit_queue/README deleted file mode 100644 index 94284978832702..00000000000000 --- a/tools/gyp/buildbot/commit_queue/README +++ /dev/null @@ -1,3 +0,0 @@ -cq_config.json describes the trybots that must pass in order -to land a change through the commit queue. -Comments are here as the file is strictly JSON. diff --git a/tools/gyp/buildbot/commit_queue/cq_config.json b/tools/gyp/buildbot/commit_queue/cq_config.json deleted file mode 100644 index 656c21e54fb12f..00000000000000 --- a/tools/gyp/buildbot/commit_queue/cq_config.json +++ /dev/null @@ -1,15 +0,0 @@ -{ - "trybots": { - "launched": { - "tryserver.nacl": { - "gyp-presubmit": ["defaulttests"], - "gyp-linux": ["defaulttests"], - "gyp-mac": ["defaulttests"], - "gyp-win32": ["defaulttests"], - "gyp-win64": ["defaulttests"] - } - }, - "triggered": { - } - } -} diff --git a/tools/gyp/codereview.settings b/tools/gyp/codereview.settings deleted file mode 100644 index 27fb9f99e25111..00000000000000 --- a/tools/gyp/codereview.settings +++ /dev/null @@ -1,6 +0,0 @@ -# This file is used by git cl to get repository specific information. 
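# An illustrative parser (hypothetical helper, not used by gyp or git-cl)
# for the simple "KEY: value" format of this settings file; '#' lines are
# comments, and a value may itself contain ':' (e.g. the VIEW_VC URL),
# hence the single-split partition below.
def parse_codereview_settings(text):
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blanks and comment lines
        key, _, value = line.partition(':')  # split at the first ':' only
        settings[key.strip()] = value.strip()
    return settings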
-CC_LIST: gyp-developer@googlegroups.com -CODE_REVIEW_SERVER: codereview.chromium.org -GERRIT_HOST: True -PROJECT: gyp -VIEW_VC: https://chromium.googlesource.com/external/gyp/+/ diff --git a/tools/gyp/gyp_main.py b/tools/gyp/gyp_main.py index 25a6eba94aae7d..f738e8009f71e7 100755 --- a/tools/gyp/gyp_main.py +++ b/tools/gyp/gyp_main.py @@ -6,10 +6,44 @@ import os import sys +import subprocess + +PY3 = bytes != str + +# Below IsCygwin() function copied from pylib/gyp/common.py +def IsCygwin(): + try: + out = subprocess.Popen("uname", + stdout=subprocess.PIPE, + stderr=subprocess.STDOUT) + stdout, stderr = out.communicate() + if PY3: + stdout = stdout.decode("utf-8") + return "CYGWIN" in str(stdout) + except Exception: + return False + + +def UnixifyPath(path): + try: + if not IsCygwin(): + return path + out = subprocess.Popen(["cygpath", "-u", path], + stdout=subprocess.PIPE, + stderr=subprocess.STDOUT) + stdout, _ = out.communicate() + if PY3: + stdout = stdout.decode("utf-8") + return str(stdout) + except Exception: + return path + # Make sure we're using the version of pylib in this repo, not one installed -# elsewhere on the system. -sys.path.insert(0, os.path.join(os.path.dirname(sys.argv[0]), 'pylib')) +# elsewhere on the system. Also convert to Unix style path on Cygwin systems, +# else the 'gyp' library will not be found +path = UnixifyPath(sys.argv[0]) +sys.path.insert(0, os.path.join(os.path.dirname(path), 'pylib')) import gyp if __name__ == '__main__': diff --git a/tools/gyp/gyptest.py b/tools/gyp/gyptest.py deleted file mode 100755 index 1a9ffca7a134ae..00000000000000 --- a/tools/gyp/gyptest.py +++ /dev/null @@ -1,243 +0,0 @@ -#!/usr/bin/env python -# Copyright (c) 2012 Google Inc. All rights reserved. -# Use of this source code is governed by a BSD-style license that can be -# found in the LICENSE file. 
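# For orientation only: typical invocations of the test runner deleted
# below, using flags from its own argparse setup (directories are
# hypothetical):
#
#   python gyptest.py -a                  # run every gyptest-*.py test
#   python gyptest.py -f make,ninja test  # limit formats and test subtree
#   python gyptest.py -l test             # just list the tests found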
- -"""gyptest.py -- test runner for GYP tests.""" - -from __future__ import print_function - -import argparse -import math -import os -import platform -import subprocess -import sys -import time - - -def is_test_name(f): - return f.startswith('gyptest') and f.endswith('.py') - - -def find_all_gyptest_files(directory): - result = [] - for root, dirs, files in os.walk(directory): - result.extend([ os.path.join(root, f) for f in files if is_test_name(f) ]) - result.sort() - return result - - -def main(argv=None): - if argv is None: - argv = sys.argv - - parser = argparse.ArgumentParser() - parser.add_argument("-a", "--all", action="store_true", - help="run all tests") - parser.add_argument("-C", "--chdir", action="store", - help="change to directory") - parser.add_argument("-f", "--format", action="store", default='', - help="run tests with the specified formats") - parser.add_argument("-G", '--gyp_option', action="append", default=[], - help="Add -G options to the gyp command line") - parser.add_argument("-l", "--list", action="store_true", - help="list available tests and exit") - parser.add_argument("-n", "--no-exec", action="store_true", - help="no execute, just print the command line") - parser.add_argument("--path", action="append", default=[], - help="additional $PATH directory") - parser.add_argument("-q", "--quiet", action="store_true", - help="quiet, don't print anything unless there are failures") - parser.add_argument("-v", "--verbose", action="store_true", - help="print configuration info and test results.") - parser.add_argument('tests', nargs='*') - args = parser.parse_args(argv[1:]) - - if args.chdir: - os.chdir(args.chdir) - - if args.path: - extra_path = [os.path.abspath(p) for p in args.path] - extra_path = os.pathsep.join(extra_path) - os.environ['PATH'] = extra_path + os.pathsep + os.environ['PATH'] - - if not args.tests: - if not args.all: - sys.stderr.write('Specify -a to get all tests.\n') - return 1 - args.tests = ['test'] - - tests = [] - for arg in args.tests: - if os.path.isdir(arg): - tests.extend(find_all_gyptest_files(os.path.normpath(arg))) - else: - if not is_test_name(os.path.basename(arg)): - print(arg, 'is not a valid gyp test name.', file=sys.stderr) - sys.exit(1) - tests.append(arg) - - if args.list: - for test in tests: - print(test) - sys.exit(0) - - os.environ['PYTHONPATH'] = os.path.abspath('test/lib') - - if args.verbose: - print_configuration_info() - - if args.gyp_option and not args.quiet: - print('Extra Gyp options: %s\n' % args.gyp_option) - - if args.format: - format_list = args.format.split(',') - else: - format_list = { - 'aix5': ['make'], - 'freebsd7': ['make'], - 'freebsd8': ['make'], - 'openbsd5': ['make'], - 'cygwin': ['msvs'], - 'win32': ['msvs', 'ninja'], - 'linux': ['make', 'ninja'], - 'linux2': ['make', 'ninja'], - 'linux3': ['make', 'ninja'], - - # TODO: Re-enable xcode-ninja. 
- # https://bugs.chromium.org/p/gyp/issues/detail?id=530 - # 'darwin': ['make', 'ninja', 'xcode', 'xcode-ninja'], - 'darwin': ['make', 'ninja', 'xcode'], - }[sys.platform] - - gyp_options = [] - for option in args.gyp_option: - gyp_options += ['-G', option] - - runner = Runner(format_list, tests, gyp_options, args.verbose) - runner.run() - - if not args.quiet: - runner.print_results() - - if runner.failures: - return 1 - else: - return 0 - - -def print_configuration_info(): - print('Test configuration:') - if sys.platform == 'darwin': - sys.path.append(os.path.abspath('test/lib')) - import TestMac - print(' Mac %s %s' % (platform.mac_ver()[0], platform.mac_ver()[2])) - print(' Xcode %s' % TestMac.Xcode.Version()) - elif sys.platform == 'win32': - sys.path.append(os.path.abspath('pylib')) - import gyp.MSVSVersion - print(' Win %s %s\n' % platform.win32_ver()[0:2]) - print(' MSVS %s' % - gyp.MSVSVersion.SelectVisualStudioVersion().Description()) - elif sys.platform in ('linux', 'linux2'): - print(' Linux %s' % ' '.join(platform.linux_distribution())) - print(' Python %s' % platform.python_version()) - print(' PYTHONPATH=%s' % os.environ['PYTHONPATH']) - print() - - -class Runner(object): - def __init__(self, formats, tests, gyp_options, verbose): - self.formats = formats - self.tests = tests - self.verbose = verbose - self.gyp_options = gyp_options - self.failures = [] - self.num_tests = len(formats) * len(tests) - num_digits = len(str(self.num_tests)) - self.fmt_str = '[%%%dd/%%%dd] (%%s) %%s' % (num_digits, num_digits) - self.isatty = sys.stdout.isatty() and not self.verbose - self.env = os.environ.copy() - self.hpos = 0 - - def run(self): - run_start = time.time() - - i = 1 - for fmt in self.formats: - for test in self.tests: - self.run_test(test, fmt, i) - i += 1 - - if self.isatty: - self.erase_current_line() - - self.took = time.time() - run_start - - def run_test(self, test, fmt, i): - if self.isatty: - self.erase_current_line() - - msg = self.fmt_str % (i, self.num_tests, fmt, test) - self.print_(msg) - - start = time.time() - cmd = [sys.executable, test] + self.gyp_options - self.env['TESTGYP_FORMAT'] = fmt - proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, - stderr=subprocess.STDOUT, env=self.env) - proc.wait() - took = time.time() - start - - stdout = proc.stdout.read().decode('utf8') - if proc.returncode == 2: - res = 'skipped' - elif proc.returncode: - res = 'failed' - self.failures.append('(%s) %s' % (test, fmt)) - else: - res = 'passed' - res_msg = ' %s %.3fs' % (res, took) - self.print_(res_msg) - - if (stdout and - not stdout.endswith('PASSED\n') and - not (stdout.endswith('NO RESULT\n'))): - print() - for l in stdout.splitlines(): - print(' %s' % l) - elif not self.isatty: - print() - - def print_(self, msg): - print(msg, end='') - index = msg.rfind('\n') - if index == -1: - self.hpos += len(msg) - else: - self.hpos = len(msg) - index - sys.stdout.flush() - - def erase_current_line(self): - print('\b' * self.hpos + ' ' * self.hpos + '\b' * self.hpos, end='') - sys.stdout.flush() - self.hpos = 0 - - def print_results(self): - num_failures = len(self.failures) - if num_failures: - print() - if num_failures == 1: - print("Failed the following test:") - else: - print("Failed the following %d tests:" % num_failures) - print("\t" + "\n\t".join(sorted(self.failures))) - print() - print('Ran %d tests in %.3fs, %d failed.' 
% (self.num_tests, self.took, - num_failures)) - print() - - -if __name__ == "__main__": - sys.exit(main()) diff --git a/tools/gyp/pylib/gyp/MSVSSettings.py b/tools/gyp/pylib/gyp/MSVSSettings.py index 0f53ff87c77d54..5dd8f8c1e6e242 100644 --- a/tools/gyp/pylib/gyp/MSVSSettings.py +++ b/tools/gyp/pylib/gyp/MSVSSettings.py @@ -522,8 +522,8 @@ def _ValidateSettings(validators, settings, stderr): try: tool_validators[setting](value) except ValueError as e: - print('Warning: for %s/%s, %s' % (tool_name, setting, e), - file=stderr) + print('Warning: for %s/%s, %s' % + (tool_name, setting, e), file=stderr) else: _ValidateExclusionSetting(setting, tool_validators, @@ -976,7 +976,9 @@ def _ValidateSettings(validators, settings, stderr): _Enumeration(['NotSet', 'Win32', # /env win32 'Itanium', # /env ia64 - 'X64'])) # /env x64 + 'X64', # /env x64 + 'ARM64', # /env arm64 + ])) _Same(_midl, 'EnableErrorChecks', _Enumeration(['EnableCustom', 'None', # /error none diff --git a/tools/gyp/pylib/gyp/MSVSSettings_test.py b/tools/gyp/pylib/gyp/MSVSSettings_test.py index 245478c8dae4ed..77b79e650d8b4e 100755 --- a/tools/gyp/pylib/gyp/MSVSSettings_test.py +++ b/tools/gyp/pylib/gyp/MSVSSettings_test.py @@ -1085,6 +1085,7 @@ def testConvertToMSBuildSettings_full_synthetic(self): 'GenerateManifest': 'true', 'IgnoreImportLibrary': 'true', 'LinkIncremental': 'false'}} + self.maxDiff = 9999 # on failure display a long diff actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings( msvs_settings, self.stderr) @@ -1476,6 +1477,7 @@ def testConvertToMSBuildSettings_actual(self): 'ResourceOutputFileName': '$(IntDir)$(TargetFileName).embed.manifest.resfdsf'} } + self.maxDiff = 9999 # on failure display a long diff actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings( msvs_settings, self.stderr) diff --git a/tools/gyp/pylib/gyp/MSVSVersion.py b/tools/gyp/pylib/gyp/MSVSVersion.py index f89f1d0fc2efd6..ce9b349834c947 100644 --- a/tools/gyp/pylib/gyp/MSVSVersion.py +++ b/tools/gyp/pylib/gyp/MSVSVersion.py @@ -12,6 +12,8 @@ import gyp import glob +PY3 = bytes != str + def JoinPath(*args): return os.path.normpath(os.path.join(*args)) @@ -163,6 +165,8 @@ def _RegistryQueryBase(sysdir, key, value): # Obtain the stdout from reg.exe, reading to the end so p.returncode is valid # Note that the error text may be in [1] in some cases text = p.communicate()[0] + if PY3: + text = text.decode('utf-8') # Check return code from reg.exe; officially 0==success and 1==error if p.returncode: return None @@ -385,6 +389,8 @@ def _ConvertToCygpath(path): if sys.platform == 'cygwin': p = subprocess.Popen(['cygpath', path], stdout=subprocess.PIPE) path = p.communicate()[0].strip() + if PY3: + path = path.decode('utf-8') return path diff --git a/tools/gyp/pylib/gyp/common.py b/tools/gyp/pylib/gyp/common.py index 351800ee25e23e..aa410e1dfddbab 100644 --- a/tools/gyp/pylib/gyp/common.py +++ b/tools/gyp/pylib/gyp/common.py @@ -8,12 +8,15 @@ import re import tempfile import sys +import subprocess try: from collections.abc import MutableSet except ImportError: from collections import MutableSet +PY3 = bytes != str + # A minimal memoizing decorator. It'll blow up if the args aren't immutable, # among other "problems". @@ -341,11 +344,16 @@ def WriteOnDiff(filename): class Writer(object): """Wrapper around file which only covers the target if it differs.""" def __init__(self): + # On Cygwin remove the "dir" argument because `C:` prefixed paths are treated as relative, + # consequently ending up with current dir "/cygdrive/c/..." 
being prefixed to those, which was + # obviously a non-existent path, for example: "/cygdrive/c//C:\". + # See https://docs.python.org/2/library/tempfile.html#tempfile.mkstemp for more details + base_temp_dir = "" if IsCygwin() else os.path.dirname(filename) # Pick temporary file. tmp_fd, self.tmp_path = tempfile.mkstemp( suffix='.tmp', prefix=os.path.split(filename)[1] + '.gyp.', - dir=os.path.split(filename)[0]) + dir=base_temp_dir) try: self.tmp_file = os.fdopen(tmp_fd, 'wb') except Exception: @@ -426,9 +434,7 @@ def GetFlavor(params): return flavors[sys.platform] if sys.platform.startswith('sunos'): return 'solaris' - if sys.platform.startswith('freebsd'): - return 'freebsd' - if sys.platform.startswith('dragonfly'): + if sys.platform.startswith(('dragonfly', 'freebsd')): return 'freebsd' if sys.platform.startswith('openbsd'): return 'openbsd' @@ -436,6 +442,8 @@ def GetFlavor(params): return 'netbsd' if sys.platform.startswith('aix'): return 'aix' + if sys.platform.startswith(('os390', 'zos')): + return 'zos' return 'linux' @@ -620,3 +628,15 @@ def CrossCompileRequested(): os.environ.get('AR_target') or os.environ.get('CC_target') or os.environ.get('CXX_target')) + +def IsCygwin(): + try: + out = subprocess.Popen("uname", + stdout=subprocess.PIPE, + stderr=subprocess.STDOUT) + stdout, stderr = out.communicate() + if PY3: + stdout = stdout.decode("utf-8") + return "CYGWIN" in str(stdout) + except Exception: + return False diff --git a/tools/gyp/pylib/gyp/generator/android.py b/tools/gyp/pylib/gyp/generator/android.py new file mode 100644 index 00000000000000..cecb28c3660b5e --- /dev/null +++ b/tools/gyp/pylib/gyp/generator/android.py @@ -0,0 +1,1097 @@ +# Copyright (c) 2012 Google Inc. All rights reserved. +# Use of this source code is governed by a BSD-style license that can be +# found in the LICENSE file. + +# Notes: +# +# This generates makefiles suitable for inclusion into the Android build system +# via an Android.mk file. It is based on make.py, the standard makefile +# generator. +# +# The code below generates a separate .mk file for each target, but +# all are sourced by the top-level GypAndroid.mk. This means that all +# variables in .mk-files clobber one another, and furthermore that any +# variables set potentially clash with other Android build system variables. +# Try to avoid setting global variables where possible. + +from __future__ import print_function + +import gyp +import gyp.common +import gyp.generator.make as make # Reuse global functions from make backend. +import os +import re +import subprocess + +generator_default_variables = { + 'OS': 'android', + 'EXECUTABLE_PREFIX': '', + 'EXECUTABLE_SUFFIX': '', + 'STATIC_LIB_PREFIX': 'lib', + 'SHARED_LIB_PREFIX': 'lib', + 'STATIC_LIB_SUFFIX': '.a', + 'SHARED_LIB_SUFFIX': '.so', + 'INTERMEDIATE_DIR': '$(gyp_intermediate_dir)', + 'SHARED_INTERMEDIATE_DIR': '$(gyp_shared_intermediate_dir)', + 'PRODUCT_DIR': '$(gyp_shared_intermediate_dir)', + 'SHARED_LIB_DIR': '$(builddir)/lib.$(TOOLSET)', + 'LIB_DIR': '$(obj).$(TOOLSET)', + 'RULE_INPUT_ROOT': '%(INPUT_ROOT)s', # This gets expanded by Python. + 'RULE_INPUT_DIRNAME': '%(INPUT_DIRNAME)s', # This gets expanded by Python. + 'RULE_INPUT_PATH': '$(RULE_SOURCES)', + 'RULE_INPUT_EXT': '$(suffix $<)', + 'RULE_INPUT_NAME': '$(notdir $<)', + 'CONFIGURATION_NAME': '$(GYP_CONFIGURATION)', +} + +# Make supports multiple toolsets +generator_supports_multiple_toolsets = True + + +# Generator-specific gyp specs. 
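# An illustrative target dict (the target itself is hypothetical) showing
# how a .gyp file would use the two generator-specific keys declared just
# below; WriteTarget() later emits each aosp_build_settings entry verbatim
# into the generated Android.mk:
example_spec = {
    'target_name': 'libfoo',
    'type': 'shared_library',
    'android_unmangled_name': 1,          # keep the module named 'libfoo'
    'aosp_build_settings': {
        'LOCAL_MODULE_TAGS': 'optional',  # copied straight into Android.mk
    },
}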
+generator_additional_non_configuration_keys = [ + # Boolean to declare that this target does not want its name mangled. + 'android_unmangled_name', + # Map of android build system variables to set. + 'aosp_build_settings', +] +generator_additional_path_sections = [] +generator_extra_sources_for_rules = [] + + +ALL_MODULES_FOOTER = """\ +# "gyp_all_modules" is a concatenation of the "gyp_all_modules" targets from +# all the included sub-makefiles. This is just here to clarify. +gyp_all_modules: +""" + +header = """\ +# This file is generated by gyp; do not edit. + +""" + +# Map gyp target types to Android module classes. +MODULE_CLASSES = { + 'static_library': 'STATIC_LIBRARIES', + 'shared_library': 'SHARED_LIBRARIES', + 'executable': 'EXECUTABLES', +} + + +def IsCPPExtension(ext): + return make.COMPILABLE_EXTENSIONS.get(ext) == 'cxx' + + +def Sourceify(path): + """Convert a path to its source directory form. The Android backend does not + support options.generator_output, so this function is a noop.""" + return path + + +# Map from qualified target to path to output. +# For Android, the target of these maps is a tuple ('static', 'modulename'), +# ('dynamic', 'modulename'), or ('path', 'some/path') instead of a string, +# since we link by module. +target_outputs = {} +# Map from qualified target to any linkable output. A subset +# of target_outputs. E.g. when mybinary depends on liba, we want to +# include liba in the linker line; when otherbinary depends on +# mybinary, we just want to build mybinary first. +target_link_deps = {} + + +class AndroidMkWriter(object): + """AndroidMkWriter packages up the writing of one target-specific Android.mk. + + Its only real entry point is Write(), and is mostly used for namespacing. + """ + + def __init__(self, android_top_dir): + self.android_top_dir = android_top_dir + + def Write(self, qualified_target, relative_target, base_path, output_filename, + spec, configs, part_of_all, write_alias_target, sdk_version): + """The main entry point: writes a .mk file for a single target. + + Arguments: + qualified_target: target we're generating + relative_target: qualified target name relative to the root + base_path: path relative to source root we're building in, used to resolve + target-relative paths + output_filename: output .mk file name to write + spec, configs: gyp info + part_of_all: flag indicating this target is part of 'all' + write_alias_target: flag indicating whether to create short aliases for + this target + sdk_version: what to emit for LOCAL_SDK_VERSION in output + """ + gyp.common.EnsureDirExists(output_filename) + + self.fp = open(output_filename, 'w') + + self.fp.write(header) + + self.qualified_target = qualified_target + self.relative_target = relative_target + self.path = base_path + self.target = spec['target_name'] + self.type = spec['type'] + self.toolset = spec['toolset'] + + deps, link_deps = self.ComputeDeps(spec) + + # Some of the generation below can add extra output, sources, or + # link dependencies. All of the out params of the functions that + # follow use names like extra_foo. + extra_outputs = [] + extra_sources = [] + + self.android_class = MODULE_CLASSES.get(self.type, 'GYP') + self.android_module = self.ComputeAndroidModule(spec) + (self.android_stem, self.android_suffix) = self.ComputeOutputParts(spec) + self.output = self.output_binary = self.ComputeOutput(spec) + + # Standard header. + self.WriteLn('include $(CLEAR_VARS)\n') + + # Module class and name. 
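# For orientation (hypothetical target): for a static_library 'foo' under
# 'bar/', built with the target toolset, the writes below come out as
#
#   LOCAL_MODULE_CLASS := STATIC_LIBRARIES
#   LOCAL_MODULE := bar_foo_gyp
#
# with the mangled module name produced by ComputeAndroidModule() further
# down ('' prefix + 'bar_foo' + '_gyp' suffix).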
+ self.WriteLn('LOCAL_MODULE_CLASS := ' + self.android_class) + self.WriteLn('LOCAL_MODULE := ' + self.android_module) + # Only emit LOCAL_MODULE_STEM if it's different to LOCAL_MODULE. + # The library module classes fail if the stem is set. ComputeOutputParts + # makes sure that stem == modulename in these cases. + if self.android_stem != self.android_module: + self.WriteLn('LOCAL_MODULE_STEM := ' + self.android_stem) + self.WriteLn('LOCAL_MODULE_SUFFIX := ' + self.android_suffix) + if self.toolset == 'host': + self.WriteLn('LOCAL_IS_HOST_MODULE := true') + self.WriteLn('LOCAL_MULTILIB := $(GYP_HOST_MULTILIB)') + elif sdk_version > 0: + self.WriteLn('LOCAL_MODULE_TARGET_ARCH := ' + '$(TARGET_$(GYP_VAR_PREFIX)ARCH)') + self.WriteLn('LOCAL_SDK_VERSION := %s' % sdk_version) + + # Grab output directories; needed for Actions and Rules. + if self.toolset == 'host': + self.WriteLn('gyp_intermediate_dir := ' + '$(call local-intermediates-dir,,$(GYP_HOST_VAR_PREFIX))') + else: + self.WriteLn('gyp_intermediate_dir := ' + '$(call local-intermediates-dir,,$(GYP_VAR_PREFIX))') + self.WriteLn('gyp_shared_intermediate_dir := ' + '$(call intermediates-dir-for,GYP,shared,,,$(GYP_VAR_PREFIX))') + self.WriteLn() + + # List files this target depends on so that actions/rules/copies/sources + # can depend on the list. + # TODO: doesn't pull in things through transitive link deps; needed? + target_dependencies = [x[1] for x in deps if x[0] == 'path'] + self.WriteLn('# Make sure our deps are built first.') + self.WriteList(target_dependencies, 'GYP_TARGET_DEPENDENCIES', + local_pathify=True) + + # Actions must come first, since they can generate more OBJs for use below. + if 'actions' in spec: + self.WriteActions(spec['actions'], extra_sources, extra_outputs) + + # Rules must be early like actions. + if 'rules' in spec: + self.WriteRules(spec['rules'], extra_sources, extra_outputs) + + if 'copies' in spec: + self.WriteCopies(spec['copies'], extra_outputs) + + # GYP generated outputs. + self.WriteList(extra_outputs, 'GYP_GENERATED_OUTPUTS', local_pathify=True) + + # Set LOCAL_ADDITIONAL_DEPENDENCIES so that Android's build rules depend + # on both our dependency targets and our generated files. + self.WriteLn('# Make sure our deps and generated files are built first.') + self.WriteLn('LOCAL_ADDITIONAL_DEPENDENCIES := $(GYP_TARGET_DEPENDENCIES) ' + '$(GYP_GENERATED_OUTPUTS)') + self.WriteLn() + + # Sources. + if spec.get('sources', []) or extra_sources: + self.WriteSources(spec, configs, extra_sources) + + self.WriteTarget(spec, configs, deps, link_deps, part_of_all, + write_alias_target) + + # Update global list of target outputs, used in dependency tracking. + target_outputs[qualified_target] = ('path', self.output_binary) + + # Update global list of link dependencies. + if self.type == 'static_library': + target_link_deps[qualified_target] = ('static', self.android_module) + elif self.type == 'shared_library': + target_link_deps[qualified_target] = ('shared', self.android_module) + + self.fp.close() + return self.android_module + + + def WriteActions(self, actions, extra_sources, extra_outputs): + """Write Makefile code for any 'actions' from the gyp input. 
+ + extra_sources: a list that will be filled in with newly generated source + files, if any + extra_outputs: a list that will be filled in with any outputs of these + actions (used to make other pieces dependent on these + actions) + """ + for action in actions: + name = make.StringToMakefileVariable('%s_%s' % (self.relative_target, + action['action_name'])) + self.WriteLn('### Rules for action "%s":' % action['action_name']) + inputs = action['inputs'] + outputs = action['outputs'] + + # Build up a list of outputs. + # Collect the output dirs we'll need. + dirs = set() + for out in outputs: + if not out.startswith('$'): + print('WARNING: Action for target "%s" writes output to local path ' + '"%s".' % (self.target, out)) + dir = os.path.split(out)[0] + if dir: + dirs.add(dir) + if int(action.get('process_outputs_as_sources', False)): + extra_sources += outputs + + # Prepare the actual command. + command = gyp.common.EncodePOSIXShellList(action['action']) + if 'message' in action: + quiet_cmd = 'Gyp action: %s ($@)' % action['message'] + else: + quiet_cmd = 'Gyp action: %s ($@)' % name + if len(dirs) > 0: + command = 'mkdir -p %s' % ' '.join(dirs) + '; ' + command + + cd_action = 'cd $(gyp_local_path)/%s; ' % self.path + command = cd_action + command + + # The makefile rules are all relative to the top dir, but the gyp actions + # are defined relative to their containing dir. This replaces the gyp_* + # variables for the action rule with an absolute version so that the + # output goes in the right place. + # Only write the gyp_* rules for the "primary" output (:1); + # it's superfluous for the "extra outputs", and this avoids accidentally + # writing duplicate dummy rules for those outputs. + main_output = make.QuoteSpaces(self.LocalPathify(outputs[0])) + self.WriteLn('%s: gyp_local_path := $(LOCAL_PATH)' % main_output) + self.WriteLn('%s: gyp_var_prefix := $(GYP_VAR_PREFIX)' % main_output) + self.WriteLn('%s: gyp_intermediate_dir := ' + '$(abspath $(gyp_intermediate_dir))' % main_output) + self.WriteLn('%s: gyp_shared_intermediate_dir := ' + '$(abspath $(gyp_shared_intermediate_dir))' % main_output) + + # Android's envsetup.sh adds a number of directories to the path including + # the built host binary directory. This causes actions/rules invoked by + # gyp to sometimes use these instead of system versions, e.g. bison. + # The built host binaries may not be suitable, and can cause errors. + # So, we remove them from the PATH using the ANDROID_BUILD_PATHS variable + # set by envsetup. + self.WriteLn('%s: export PATH := $(subst $(ANDROID_BUILD_PATHS),,$(PATH))' + % main_output) + + # Don't allow spaces in input/output filenames, but make an exception for + # filenames which start with '$(' since it's okay for there to be spaces + # inside of make function/macro invocations. 
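# The rule above restated as a standalone predicate (illustrative only;
# the loops below raise GypError instead of returning a flag):
def _has_disallowed_space(name):
    # Spaces are tolerated only inside make function/macro invocations,
    # which always begin with '$('.
    return not name.startswith('$(') and ' ' in name

assert _has_disallowed_space('out dir/foo.o')
assert not _has_disallowed_space('$(intermediates)/foo bar.o')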
+ for input in inputs: + if not input.startswith('$(') and ' ' in input: + raise gyp.common.GypError( + 'Action input filename "%s" in target %s contains a space' % + (input, self.target)) + for output in outputs: + if not output.startswith('$(') and ' ' in output: + raise gyp.common.GypError( + 'Action output filename "%s" in target %s contains a space' % + (output, self.target)) + + self.WriteLn('%s: %s $(GYP_TARGET_DEPENDENCIES)' % + (main_output, ' '.join(map(self.LocalPathify, inputs)))) + self.WriteLn('\t@echo "%s"' % quiet_cmd) + self.WriteLn('\t$(hide)%s\n' % command) + for output in outputs[1:]: + # Make each output depend on the main output, with an empty command + # to force make to notice that the mtime has changed. + self.WriteLn('%s: %s ;' % (self.LocalPathify(output), main_output)) + + extra_outputs += outputs + self.WriteLn() + + self.WriteLn() + + + def WriteRules(self, rules, extra_sources, extra_outputs): + """Write Makefile code for any 'rules' from the gyp input. + + extra_sources: a list that will be filled in with newly generated source + files, if any + extra_outputs: a list that will be filled in with any outputs of these + rules (used to make other pieces dependent on these rules) + """ + if len(rules) == 0: + return + + for rule in rules: + if len(rule.get('rule_sources', [])) == 0: + continue + name = make.StringToMakefileVariable('%s_%s' % (self.relative_target, + rule['rule_name'])) + self.WriteLn('\n### Generated for rule "%s":' % name) + self.WriteLn('# "%s":' % rule) + + inputs = rule.get('inputs') + for rule_source in rule.get('rule_sources', []): + (rule_source_dirname, rule_source_basename) = os.path.split(rule_source) + (rule_source_root, rule_source_ext) = \ + os.path.splitext(rule_source_basename) + + outputs = [self.ExpandInputRoot(out, rule_source_root, + rule_source_dirname) + for out in rule['outputs']] + + dirs = set() + for out in outputs: + if not out.startswith('$'): + print('WARNING: Rule for target %s writes output to local path %s' + % (self.target, out)) + dir = os.path.dirname(out) + if dir: + dirs.add(dir) + extra_outputs += outputs + if int(rule.get('process_outputs_as_sources', False)): + extra_sources.extend(outputs) + + components = [] + for component in rule['action']: + component = self.ExpandInputRoot(component, rule_source_root, + rule_source_dirname) + if '$(RULE_SOURCES)' in component: + component = component.replace('$(RULE_SOURCES)', + rule_source) + components.append(component) + + command = gyp.common.EncodePOSIXShellList(components) + cd_action = 'cd $(gyp_local_path)/%s; ' % self.path + command = cd_action + command + if dirs: + command = 'mkdir -p %s' % ' '.join(dirs) + '; ' + command + + # We set up a rule to build the first output, and then set up + # a rule for each additional output to depend on the first. + outputs = map(self.LocalPathify, outputs) + main_output = outputs[0] + self.WriteLn('%s: gyp_local_path := $(LOCAL_PATH)' % main_output) + self.WriteLn('%s: gyp_var_prefix := $(GYP_VAR_PREFIX)' % main_output) + self.WriteLn('%s: gyp_intermediate_dir := ' + '$(abspath $(gyp_intermediate_dir))' % main_output) + self.WriteLn('%s: gyp_shared_intermediate_dir := ' + '$(abspath $(gyp_shared_intermediate_dir))' % main_output) + + # See explanation in WriteActions. 
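# An illustrative Python equivalent (hypothetical helper) of the
# $(subst $(ANDROID_BUILD_PATHS),,$(PATH)) scrub emitted just below;
# make's $(subst) is a plain textual replacement, so the built host-binary
# directories that envsetup.sh prepended simply vanish from PATH:
def _scrub_android_paths(path, android_build_paths):
    return path.replace(android_build_paths, '')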
+ self.WriteLn('%s: export PATH := ' + '$(subst $(ANDROID_BUILD_PATHS),,$(PATH))' % main_output) + + main_output_deps = self.LocalPathify(rule_source) + if inputs: + main_output_deps += ' ' + main_output_deps += ' '.join([self.LocalPathify(f) for f in inputs]) + + self.WriteLn('%s: %s $(GYP_TARGET_DEPENDENCIES)' % + (main_output, main_output_deps)) + self.WriteLn('\t%s\n' % command) + for output in outputs[1:]: + # Make each output depend on the main output, with an empty command + # to force make to notice that the mtime has changed. + self.WriteLn('%s: %s ;' % (output, main_output)) + self.WriteLn() + + self.WriteLn() + + + def WriteCopies(self, copies, extra_outputs): + """Write Makefile code for any 'copies' from the gyp input. + + extra_outputs: a list that will be filled in with any outputs of this action + (used to make other pieces dependent on this action) + """ + self.WriteLn('### Generated for copy rule.') + + variable = make.StringToMakefileVariable(self.relative_target + '_copies') + outputs = [] + for copy in copies: + for path in copy['files']: + # The Android build system does not allow generation of files into the + # source tree. The destination should start with a variable, which will + # typically be $(gyp_intermediate_dir) or + # $(gyp_shared_intermediate_dir). Note that we can't use an assertion + # because some of the gyp tests depend on this. + if not copy['destination'].startswith('$'): + print('WARNING: Copy rule for target %s writes output to ' + 'local path %s' % (self.target, copy['destination'])) + + # LocalPathify() calls normpath, stripping trailing slashes. + path = Sourceify(self.LocalPathify(path)) + filename = os.path.split(path)[1] + output = Sourceify(self.LocalPathify(os.path.join(copy['destination'], + filename))) + + self.WriteLn('%s: %s $(GYP_TARGET_DEPENDENCIES) | $(ACP)' % + (output, path)) + self.WriteLn('\t@echo Copying: $@') + self.WriteLn('\t$(hide) mkdir -p $(dir $@)') + self.WriteLn('\t$(hide) $(ACP) -rpf $< $@') + self.WriteLn() + outputs.append(output) + self.WriteLn('%s = %s' % (variable, + ' '.join(map(make.QuoteSpaces, outputs)))) + extra_outputs.append('$(%s)' % variable) + self.WriteLn() + + + def WriteSourceFlags(self, spec, configs): + """Write out the flags and include paths used to compile source files for + the current target. + + Args: + spec, configs: input from gyp. 
+ """ + for configname, config in sorted(configs.items()): + extracted_includes = [] + + self.WriteLn('\n# Flags passed to both C and C++ files.') + cflags, includes_from_cflags = self.ExtractIncludesFromCFlags( + config.get('cflags', []) + config.get('cflags_c', [])) + extracted_includes.extend(includes_from_cflags) + self.WriteList(cflags, 'MY_CFLAGS_%s' % configname) + + self.WriteList(config.get('defines'), 'MY_DEFS_%s' % configname, + prefix='-D', quoter=make.EscapeCppDefine) + + self.WriteLn('\n# Include paths placed before CFLAGS/CPPFLAGS') + includes = list(config.get('include_dirs', [])) + includes.extend(extracted_includes) + includes = map(Sourceify, map(self.LocalPathify, includes)) + includes = self.NormalizeIncludePaths(includes) + self.WriteList(includes, 'LOCAL_C_INCLUDES_%s' % configname) + + self.WriteLn('\n# Flags passed to only C++ (and not C) files.') + self.WriteList(config.get('cflags_cc'), 'LOCAL_CPPFLAGS_%s' % configname) + + self.WriteLn('\nLOCAL_CFLAGS := $(MY_CFLAGS_$(GYP_CONFIGURATION)) ' + '$(MY_DEFS_$(GYP_CONFIGURATION))') + # Undefine ANDROID for host modules + # TODO: the source code should not use macro ANDROID to tell if it's host + # or target module. + if self.toolset == 'host': + self.WriteLn('# Undefine ANDROID for host modules') + self.WriteLn('LOCAL_CFLAGS += -UANDROID') + self.WriteLn('LOCAL_C_INCLUDES := $(GYP_COPIED_SOURCE_ORIGIN_DIRS) ' + '$(LOCAL_C_INCLUDES_$(GYP_CONFIGURATION))') + self.WriteLn('LOCAL_CPPFLAGS := $(LOCAL_CPPFLAGS_$(GYP_CONFIGURATION))') + # Android uses separate flags for assembly file invocations, but gyp expects + # the same CFLAGS to be applied: + self.WriteLn('LOCAL_ASFLAGS := $(LOCAL_CFLAGS)') + + + def WriteSources(self, spec, configs, extra_sources): + """Write Makefile code for any 'sources' from the gyp input. + These are source files necessary to build the current target. + We need to handle shared_intermediate directory source files as + a special case by copying them to the intermediate directory and + treating them as a genereated sources. Otherwise the Android build + rules won't pick them up. + + Args: + spec, configs: input from gyp. + extra_sources: Sources generated from Actions or Rules. + """ + sources = filter(make.Compilable, spec.get('sources', [])) + generated_not_sources = [x for x in extra_sources if not make.Compilable(x)] + extra_sources = filter(make.Compilable, extra_sources) + + # Determine and output the C++ extension used by these sources. + # We simply find the first C++ file and use that extension. + all_sources = sources + extra_sources + local_cpp_extension = '.cpp' + for source in all_sources: + (root, ext) = os.path.splitext(source) + if IsCPPExtension(ext): + local_cpp_extension = ext + break + if local_cpp_extension != '.cpp': + self.WriteLn('LOCAL_CPP_EXTENSION := %s' % local_cpp_extension) + + # We need to move any non-generated sources that are coming from the + # shared intermediate directory out of LOCAL_SRC_FILES and put them + # into LOCAL_GENERATED_SOURCES. We also need to move over any C++ files + # that don't match our local_cpp_extension, since Android will only + # generate Makefile rules for a single LOCAL_CPP_EXTENSION. 
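# The triage performed by the loop below, restated as a sketch
# (hypothetical helper): anything in an intermediate dir, or any C++ file
# whose extension differs from local_cpp_extension, is routed to
# LOCAL_GENERATED_SOURCES; everything else stays in LOCAL_SRC_FILES.
def _route_source(source, ext, local_cpp_extension):
    if '$(gyp_shared_intermediate_dir)' in source:
        return 'generated'
    if '$(gyp_intermediate_dir)' in source:
        return 'generated'
    if IsCPPExtension(ext) and ext != local_cpp_extension:
        return 'generated'  # later copied/renamed to the local extension
    return 'local'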
+ local_files = [] + for source in sources: + (root, ext) = os.path.splitext(source) + if '$(gyp_shared_intermediate_dir)' in source: + extra_sources.append(source) + elif '$(gyp_intermediate_dir)' in source: + extra_sources.append(source) + elif IsCPPExtension(ext) and ext != local_cpp_extension: + extra_sources.append(source) + else: + local_files.append(os.path.normpath(os.path.join(self.path, source))) + + # For any generated source, if it is coming from the shared intermediate + # directory then we add a Make rule to copy them to the local intermediate + # directory first. This is because the Android LOCAL_GENERATED_SOURCES + # must be in the local module intermediate directory for the compile rules + # to work properly. If the file has the wrong C++ extension, then we add + # a rule to copy that to intermediates and use the new version. + final_generated_sources = [] + # If a source file gets copied, we still need to add the original source + # directory as header search path, for GCC searches headers in the + # directory that contains the source file by default. + origin_src_dirs = [] + for source in extra_sources: + local_file = source + if not '$(gyp_intermediate_dir)/' in local_file: + basename = os.path.basename(local_file) + local_file = '$(gyp_intermediate_dir)/' + basename + (root, ext) = os.path.splitext(local_file) + if IsCPPExtension(ext) and ext != local_cpp_extension: + local_file = root + local_cpp_extension + if local_file != source: + self.WriteLn('%s: %s' % (local_file, self.LocalPathify(source))) + self.WriteLn('\tmkdir -p $(@D); cp $< $@') + origin_src_dirs.append(os.path.dirname(source)) + final_generated_sources.append(local_file) + + # We add back in all of the non-compilable stuff to make sure that the + # make rules have dependencies on them. + final_generated_sources.extend(generated_not_sources) + self.WriteList(final_generated_sources, 'LOCAL_GENERATED_SOURCES') + + origin_src_dirs = gyp.common.uniquer(origin_src_dirs) + origin_src_dirs = map(Sourceify, map(self.LocalPathify, origin_src_dirs)) + self.WriteList(origin_src_dirs, 'GYP_COPIED_SOURCE_ORIGIN_DIRS') + + self.WriteList(local_files, 'LOCAL_SRC_FILES') + + # Write out the flags used to compile the source; this must be done last + # so that GYP_COPIED_SOURCE_ORIGIN_DIRS can be used as an include path. + self.WriteSourceFlags(spec, configs) + + + def ComputeAndroidModule(self, spec): + """Return the Android module name used for a gyp spec. + + We use the complete qualified target name to avoid collisions between + duplicate targets in different directories. We also add a suffix to + distinguish gyp-generated module names. + """ + + if int(spec.get('android_unmangled_name', 0)): + assert self.type != 'shared_library' or self.target.startswith('lib') + return self.target + + if self.type == 'shared_library': + # For reasons of convention, the Android build system requires that all + # shared library modules are named 'libfoo' when generating -l flags. + prefix = 'lib_' + else: + prefix = '' + + if spec['toolset'] == 'host': + suffix = '_$(TARGET_$(GYP_VAR_PREFIX)ARCH)_host_gyp' + else: + suffix = '_gyp' + + if self.path: + middle = make.StringToMakefileVariable('%s_%s' % (self.path, self.target)) + else: + middle = make.StringToMakefileVariable(self.target) + + return ''.join([prefix, middle, suffix]) + + + def ComputeOutputParts(self, spec): + """Return the 'output basename' of a gyp spec, split into filename + ext. 
+ + Android libraries must be named the same thing as their module name, + otherwise the linker can't find them, so product_name and so on must be + ignored if we are building a library, and the "lib" prepending is + not done for Android. + """ + assert self.type != 'loadable_module' # TODO: not supported? + + target = spec['target_name'] + target_prefix = '' + target_ext = '' + if self.type == 'static_library': + target = self.ComputeAndroidModule(spec) + target_ext = '.a' + elif self.type == 'shared_library': + target = self.ComputeAndroidModule(spec) + target_ext = '.so' + elif self.type == 'none': + target_ext = '.stamp' + elif self.type != 'executable': + print("ERROR: What output file should be generated?", + "type", self.type, "target", target) + + if self.type != 'static_library' and self.type != 'shared_library': + target_prefix = spec.get('product_prefix', target_prefix) + target = spec.get('product_name', target) + product_ext = spec.get('product_extension') + if product_ext: + target_ext = '.' + product_ext + + target_stem = target_prefix + target + return (target_stem, target_ext) + + + def ComputeOutputBasename(self, spec): + """Return the 'output basename' of a gyp spec. + + E.g., the loadable module 'foobar' in directory 'baz' will produce + 'libfoobar.so' + """ + return ''.join(self.ComputeOutputParts(spec)) + + + def ComputeOutput(self, spec): + """Return the 'output' (full output path) of a gyp spec. + + E.g., the loadable module 'foobar' in directory 'baz' will produce + '$(obj)/baz/libfoobar.so' + """ + if self.type == 'executable': + # We install host executables into shared_intermediate_dir so they can be + # run by gyp rules that refer to PRODUCT_DIR. + path = '$(gyp_shared_intermediate_dir)' + elif self.type == 'shared_library': + if self.toolset == 'host': + path = '$($(GYP_HOST_VAR_PREFIX)HOST_OUT_INTERMEDIATE_LIBRARIES)' + else: + path = '$($(GYP_VAR_PREFIX)TARGET_OUT_INTERMEDIATE_LIBRARIES)' + else: + # Other targets just get built into their intermediate dir. + if self.toolset == 'host': + path = ('$(call intermediates-dir-for,%s,%s,true,,' + '$(GYP_HOST_VAR_PREFIX))' % (self.android_class, + self.android_module)) + else: + path = ('$(call intermediates-dir-for,%s,%s,,,$(GYP_VAR_PREFIX))' + % (self.android_class, self.android_module)) + + assert spec.get('product_dir') is None # TODO: not supported? + return os.path.join(path, self.ComputeOutputBasename(spec)) + + def NormalizeIncludePaths(self, include_paths): + """ Normalize include_paths. + Convert absolute paths to relative to the Android top directory. + + Args: + include_paths: A list of unprocessed include paths. + Returns: + A list of normalized include paths. + """ + normalized = [] + for path in include_paths: + if path[0] == '/': + path = gyp.common.RelativePath(path, self.android_top_dir) + normalized.append(path) + return normalized + + def ExtractIncludesFromCFlags(self, cflags): + """Extract includes "-I..." out from cflags + + Args: + cflags: A list of compiler flags, which may be mixed with "-I.." + Returns: + A tuple of lists: (clean_clfags, include_paths). "-I.." is trimmed. + """ + clean_cflags = [] + include_paths = [] + for flag in cflags: + if flag.startswith('-I'): + include_paths.append(flag[2:]) + else: + clean_cflags.append(flag) + + return (clean_cflags, include_paths) + + def FilterLibraries(self, libraries): + """Filter the 'libraries' key to separate things that shouldn't be ldflags. 
+ + Library entries that look like filenames should be converted to android + module names instead of being passed to the linker as flags. + + Args: + libraries: the value of spec.get('libraries') + Returns: + A tuple (static_lib_modules, dynamic_lib_modules, ldflags) + """ + static_lib_modules = [] + dynamic_lib_modules = [] + ldflags = [] + for libs in libraries: + # Libs can have multiple words. + for lib in libs.split(): + # Filter the system libraries, which are added by default by the Android + # build system. + if (lib == '-lc' or lib == '-lstdc++' or lib == '-lm' or + lib.endswith('libgcc.a')): + continue + match = re.search(r'([^/]+)\.a$', lib) + if match: + static_lib_modules.append(match.group(1)) + continue + match = re.search(r'([^/]+)\.so$', lib) + if match: + dynamic_lib_modules.append(match.group(1)) + continue + if lib.startswith('-l'): + ldflags.append(lib) + return (static_lib_modules, dynamic_lib_modules, ldflags) + + + def ComputeDeps(self, spec): + """Compute the dependencies of a gyp spec. + + Returns a tuple (deps, link_deps), where each is a list of + filenames that will need to be put in front of make for either + building (deps) or linking (link_deps). + """ + deps = [] + link_deps = [] + if 'dependencies' in spec: + deps.extend([target_outputs[dep] for dep in spec['dependencies'] + if target_outputs[dep]]) + for dep in spec['dependencies']: + if dep in target_link_deps: + link_deps.append(target_link_deps[dep]) + deps.extend(link_deps) + return (gyp.common.uniquer(deps), gyp.common.uniquer(link_deps)) + + + def WriteTargetFlags(self, spec, configs, link_deps): + """Write Makefile code to specify the link flags and library dependencies. + + spec, configs: input from gyp. + link_deps: link dependency list; see ComputeDeps() + """ + # Libraries (i.e. -lfoo) + # These must be included even for static libraries as some of them provide + # implicit include paths through the build system. + libraries = gyp.common.uniquer(spec.get('libraries', [])) + static_libs, dynamic_libs, ldflags_libs = self.FilterLibraries(libraries) + + if self.type != 'static_library': + for configname, config in sorted(configs.items()): + ldflags = list(config.get('ldflags', [])) + self.WriteLn('') + self.WriteList(ldflags, 'LOCAL_LDFLAGS_%s' % configname) + self.WriteList(ldflags_libs, 'LOCAL_GYP_LIBS') + self.WriteLn('LOCAL_LDFLAGS := $(LOCAL_LDFLAGS_$(GYP_CONFIGURATION)) ' + '$(LOCAL_GYP_LIBS)') + + # Link dependencies (i.e. other gyp targets this target depends on) + # These need not be included for static libraries as within the gyp build + # we do not use the implicit include path mechanism. + if self.type != 'static_library': + static_link_deps = [x[1] for x in link_deps if x[0] == 'static'] + shared_link_deps = [x[1] for x in link_deps if x[0] == 'shared'] + else: + static_link_deps = [] + shared_link_deps = [] + + # Only write the lists if they are non-empty. + if static_libs or static_link_deps: + self.WriteLn('') + self.WriteList(static_libs + static_link_deps, + 'LOCAL_STATIC_LIBRARIES') + self.WriteLn('# Enable grouping to fix circular references') + self.WriteLn('LOCAL_GROUP_STATIC_LIBRARIES := true') + if dynamic_libs or shared_link_deps: + self.WriteLn('') + self.WriteList(dynamic_libs + shared_link_deps, + 'LOCAL_SHARED_LIBRARIES') + + + def WriteTarget(self, spec, configs, deps, link_deps, part_of_all, + write_alias_target): + """Write Makefile code to produce the final target of the gyp spec. + + spec, configs: input from gyp. 
+ deps, link_deps: dependency lists; see ComputeDeps() + part_of_all: flag indicating this target is part of 'all' + write_alias_target: flag indicating whether to create short aliases for this + target + """ + self.WriteLn('### Rules for final target.') + + if self.type != 'none': + self.WriteTargetFlags(spec, configs, link_deps) + + settings = spec.get('aosp_build_settings', {}) + if settings: + self.WriteLn('### Set directly by aosp_build_settings.') + for k, v in settings.items(): + if isinstance(v, list): + self.WriteList(v, k) + else: + self.WriteLn('%s := %s' % (k, make.QuoteIfNecessary(v))) + self.WriteLn('') + + # Add to the set of targets which represent the gyp 'all' target. We use the + # name 'gyp_all_modules' as the Android build system doesn't allow the use + # of the Make target 'all' and because 'all_modules' is the equivalent of + # the Make target 'all' on Android. + if part_of_all and write_alias_target: + self.WriteLn('# Add target alias to "gyp_all_modules" target.') + self.WriteLn('.PHONY: gyp_all_modules') + self.WriteLn('gyp_all_modules: %s' % self.android_module) + self.WriteLn('') + + # Add an alias from the gyp target name to the Android module name. This + # simplifies manual builds of the target, and is required by the test + # framework. + if self.target != self.android_module and write_alias_target: + self.WriteLn('# Alias gyp target name.') + self.WriteLn('.PHONY: %s' % self.target) + self.WriteLn('%s: %s' % (self.target, self.android_module)) + self.WriteLn('') + + # Add the command to trigger build of the target type depending + # on the toolset. Ex: BUILD_STATIC_LIBRARY vs. BUILD_HOST_STATIC_LIBRARY + # NOTE: This has to come last! + modifier = '' + if self.toolset == 'host': + modifier = 'HOST_' + if self.type == 'static_library': + self.WriteLn('include $(BUILD_%sSTATIC_LIBRARY)' % modifier) + elif self.type == 'shared_library': + self.WriteLn('LOCAL_PRELINK_MODULE := false') + self.WriteLn('include $(BUILD_%sSHARED_LIBRARY)' % modifier) + elif self.type == 'executable': + self.WriteLn('LOCAL_CXX_STL := libc++_static') + # Executables are for build and test purposes only, so they're installed + # to a directory that doesn't get included in the system image. + self.WriteLn('LOCAL_MODULE_PATH := $(gyp_shared_intermediate_dir)') + self.WriteLn('include $(BUILD_%sEXECUTABLE)' % modifier) + else: + self.WriteLn('LOCAL_MODULE_PATH := $(PRODUCT_OUT)/gyp_stamp') + self.WriteLn('LOCAL_UNINSTALLABLE_MODULE := true') + if self.toolset == 'target': + self.WriteLn('LOCAL_2ND_ARCH_VAR_PREFIX := $(GYP_VAR_PREFIX)') + else: + self.WriteLn('LOCAL_2ND_ARCH_VAR_PREFIX := $(GYP_HOST_VAR_PREFIX)') + self.WriteLn() + self.WriteLn('include $(BUILD_SYSTEM)/base_rules.mk') + self.WriteLn() + self.WriteLn('$(LOCAL_BUILT_MODULE): $(LOCAL_ADDITIONAL_DEPENDENCIES)') + self.WriteLn('\t$(hide) echo "Gyp timestamp: $@"') + self.WriteLn('\t$(hide) mkdir -p $(dir $@)') + self.WriteLn('\t$(hide) touch $@') + self.WriteLn() + self.WriteLn('LOCAL_2ND_ARCH_VAR_PREFIX :=') + + + def WriteList(self, value_list, variable=None, prefix='', + quoter=make.QuoteIfNecessary, local_pathify=False): + """Write a variable definition that is a list of values. + + E.g. WriteList(['a','b'], 'foo', prefix='blah') writes out + foo = blaha blahb + but in a pretty-printed style. 
+ """ + values = '' + if value_list: + value_list = [quoter(prefix + l) for l in value_list] + if local_pathify: + value_list = [self.LocalPathify(l) for l in value_list] + values = ' \\\n\t' + ' \\\n\t'.join(value_list) + self.fp.write('%s :=%s\n\n' % (variable, values)) + + + def WriteLn(self, text=''): + self.fp.write(text + '\n') + + + def LocalPathify(self, path): + """Convert a subdirectory-relative path into a normalized path which starts + with the make variable $(LOCAL_PATH) (i.e. the top of the project tree). + Absolute paths, or paths that contain variables, are just normalized.""" + if '$(' in path or os.path.isabs(path): + # path is not a file in the project tree in this case, but calling + # normpath is still important for trimming trailing slashes. + return os.path.normpath(path) + local_path = os.path.join('$(LOCAL_PATH)', self.path, path) + local_path = os.path.normpath(local_path) + # Check that normalizing the path didn't ../ itself out of $(LOCAL_PATH) + # - i.e. that the resulting path is still inside the project tree. The + # path may legitimately have ended up containing just $(LOCAL_PATH), though, + # so we don't look for a slash. + assert local_path.startswith('$(LOCAL_PATH)'), ( + 'Path %s attempts to escape from gyp path %s !)' % (path, self.path)) + return local_path + + + def ExpandInputRoot(self, template, expansion, dirname): + if '%(INPUT_ROOT)s' not in template and '%(INPUT_DIRNAME)s' not in template: + return template + path = template % { + 'INPUT_ROOT': expansion, + 'INPUT_DIRNAME': dirname, + } + return os.path.normpath(path) + + +def PerformBuild(data, configurations, params): + # The android backend only supports the default configuration. + options = params['options'] + makefile = os.path.abspath(os.path.join(options.toplevel_dir, + 'GypAndroid.mk')) + env = dict(os.environ) + env['ONE_SHOT_MAKEFILE'] = makefile + arguments = ['make', '-C', os.environ['ANDROID_BUILD_TOP'], 'gyp_all_modules'] + print('Building: %s' % arguments) + subprocess.check_call(arguments, env=env) + + +def GenerateOutput(target_list, target_dicts, data, params): + options = params['options'] + generator_flags = params.get('generator_flags', {}) + builddir_name = generator_flags.get('output_dir', 'out') + limit_to_target_all = generator_flags.get('limit_to_target_all', False) + write_alias_targets = generator_flags.get('write_alias_targets', True) + sdk_version = generator_flags.get('aosp_sdk_version', 0) + android_top_dir = os.environ.get('ANDROID_BUILD_TOP') + assert android_top_dir, '$ANDROID_BUILD_TOP not set; you need to run lunch.' + + def CalculateMakefilePath(build_file, base_name): + """Determine where to write a Makefile for a given gyp file.""" + # Paths in gyp files are relative to the .gyp file, but we want + # paths relative to the source root for the master makefile. Grab + # the path of the .gyp file as the base to relativize against. + # E.g. "foo/bar" when we're constructing targets for "foo/bar/baz.gyp". + base_path = gyp.common.RelativePath(os.path.dirname(build_file), + options.depth) + # We write the file in the base_path directory. + output_file = os.path.join(options.depth, base_path, base_name) + assert not options.generator_output, ( + 'The Android backend does not support options.generator_output.') + base_path = gyp.common.RelativePath(os.path.dirname(build_file), + options.toplevel_dir) + return base_path, output_file + + # TODO: search for the first non-'Default' target. 
This can go + # away when we add verification that all targets have the + # necessary configurations. + default_configuration = None + toolsets = set([target_dicts[target]['toolset'] for target in target_list]) + for target in target_list: + spec = target_dicts[target] + if spec['default_configuration'] != 'Default': + default_configuration = spec['default_configuration'] + break + if not default_configuration: + default_configuration = 'Default' + + srcdir = '.' + makefile_name = 'GypAndroid' + options.suffix + '.mk' + makefile_path = os.path.join(options.toplevel_dir, makefile_name) + assert not options.generator_output, ( + 'The Android backend does not support options.generator_output.') + gyp.common.EnsureDirExists(makefile_path) + root_makefile = open(makefile_path, 'w') + + root_makefile.write(header) + + # We set LOCAL_PATH just once, here, to the top of the project tree. This + # allows all the other paths we use to be relative to the Android.mk file, + # as the Android build system expects. + root_makefile.write('\nLOCAL_PATH := $(call my-dir)\n') + + # Find the list of targets that derive from the gyp file(s) being built. + needed_targets = set() + for build_file in params['build_files']: + for target in gyp.common.AllTargets(target_list, target_dicts, build_file): + needed_targets.add(target) + + build_files = set() + include_list = set() + android_modules = {} + for qualified_target in target_list: + build_file, target, toolset = gyp.common.ParseQualifiedTarget( + qualified_target) + relative_build_file = gyp.common.RelativePath(build_file, + options.toplevel_dir) + build_files.add(relative_build_file) + included_files = data[build_file]['included_files'] + for included_file in included_files: + # The included_files entries are relative to the dir of the build file + # that included them, so we have to undo that and then make them relative + # to the root dir. + relative_include_file = gyp.common.RelativePath( + gyp.common.UnrelativePath(included_file, build_file), + options.toplevel_dir) + abs_include_file = os.path.abspath(relative_include_file) + # If the include file is from the ~/.gyp dir, we should use absolute path + # so that relocating the src dir doesn't break the path. + if (params['home_dot_gyp'] and + abs_include_file.startswith(params['home_dot_gyp'])): + build_files.add(abs_include_file) + else: + build_files.add(relative_include_file) + + base_path, output_file = CalculateMakefilePath(build_file, + target + '.' + toolset + options.suffix + '.mk') + + spec = target_dicts[qualified_target] + configs = spec['configurations'] + + part_of_all = qualified_target in needed_targets + if limit_to_target_all and not part_of_all: + continue + + relative_target = gyp.common.QualifiedTarget(relative_build_file, target, + toolset) + writer = AndroidMkWriter(android_top_dir) + android_module = writer.Write(qualified_target, relative_target, base_path, + output_file, spec, configs, + part_of_all=part_of_all, + write_alias_target=write_alias_targets, + sdk_version=sdk_version) + if android_module in android_modules: + print('ERROR: Android module names must be unique. The following ' + 'targets both generate Android module name %s.\n %s\n %s' % + (android_module, android_modules[android_module], + qualified_target)) + return + android_modules[android_module] = qualified_target + + # Our root_makefile lives at the source root. Compute the relative path + # from there to the output_file for including. 
+ mkfile_rel_path = gyp.common.RelativePath(output_file, + os.path.dirname(makefile_path)) + include_list.add(mkfile_rel_path) + + root_makefile.write('GYP_CONFIGURATION ?= %s\n' % default_configuration) + root_makefile.write('GYP_VAR_PREFIX ?=\n') + root_makefile.write('GYP_HOST_VAR_PREFIX ?=\n') + root_makefile.write('GYP_HOST_MULTILIB ?= first\n') + + # Write out the sorted list of includes. + root_makefile.write('\n') + for include_file in sorted(include_list): + root_makefile.write('include $(LOCAL_PATH)/' + include_file + '\n') + root_makefile.write('\n') + + if write_alias_targets: + root_makefile.write(ALL_MODULES_FOOTER) + + root_makefile.close() diff --git a/tools/gyp/pylib/gyp/generator/cmake.py b/tools/gyp/pylib/gyp/generator/cmake.py index 149268711b8b9b..e966a8f23e1f09 100644 --- a/tools/gyp/pylib/gyp/generator/cmake.py +++ b/tools/gyp/pylib/gyp/generator/cmake.py @@ -240,7 +240,10 @@ def StringToCMakeTargetName(a): Invalid for make: ':' Invalid for unknown reasons but cause failures: '.' """ - return a.translate(string.maketrans(' /():."', '_______')) + try: + return a.translate(str.maketrans(' /():."', '_______')) + except AttributeError: + return a.translate(string.maketrans(' /():."', '_______')) def WriteActions(target_name, actions, extra_sources, extra_deps, @@ -575,7 +578,7 @@ class CMakeNamer(object): """Converts Gyp target names into CMake target names. CMake requires that target names be globally unique. One way to ensure - this is to fully qualify the names of the targets. Unfortunatly, this + this is to fully qualify the names of the targets. Unfortunately, this ends up with all targets looking like "chrome_chrome_gyp_chrome" instead of just "chrome". If this generator were only interested in building, it would be possible to fully qualify all target names, then create @@ -647,7 +650,7 @@ def WriteTarget(namer, qualified_target, target_dicts, build_dir, config_to_use, cmake_target_type = cmake_target_type_from_gyp_target_type.get(target_type) if cmake_target_type is None: print('Target %s has unknown target type %s, skipping.' % - ( target_name, target_type)) + ( target_name, target_type )) return SetVariable(output, 'TARGET', target_name) diff --git a/tools/gyp/pylib/gyp/generator/eclipse.py b/tools/gyp/pylib/gyp/generator/eclipse.py index 372ceec246dedb..80e5fb6302c5d5 100644 --- a/tools/gyp/pylib/gyp/generator/eclipse.py +++ b/tools/gyp/pylib/gyp/generator/eclipse.py @@ -26,6 +26,8 @@ import shlex import xml.etree.cElementTree as ET +PY3 = bytes != str + generator_wants_static_library_dependencies_adjusted = False generator_default_variables = { @@ -97,6 +99,8 @@ def GetAllIncludeDirectories(target_list, target_dicts, proc = subprocess.Popen(args=command, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE) output = proc.communicate()[1] + if PY3: + output = output.decode('utf-8') # Extract the list of include dirs from the output, which has this format: # ... # #include "..." search starts here: @@ -195,8 +199,8 @@ def GetAllDefines(target_list, target_dicts, data, config_name, params, """Calculate the defines for a project. Returns: - A dict that includes explict defines declared in gyp files along with all of - the default defines that the compiler uses. + A dict that includes explicit defines declared in gyp files along with all + of the default defines that the compiler uses. """ # Get defines declared in the gyp files. 
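
[Editor's note: the `PY3 = bytes != str` guard introduced above keys off a simple difference between interpreters: on Python 2, `subprocess` pipes yield `str`, while on Python 3 they yield `bytes` that must be decoded before any string processing. The following is a minimal standalone sketch of the same pattern; the function name `run_and_capture` and the `echo` command are illustrative only, not part of gyp.]

```python
import subprocess

# True on Python 3, where bytes and str are distinct types.
PY3 = bytes != str

def run_and_capture(command):
    """Run a command and return its stdout as text on Python 2 and 3."""
    proc = subprocess.Popen(command, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    output = proc.communicate()[0]
    if PY3:
        # Python 3 pipes yield bytes; decode before any str processing.
        output = output.decode('utf-8')
    return output

# Illustrative only; any command that writes to stdout works here.
print(run_and_capture(['echo', 'hello']))
```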
@@ -234,6 +238,8 @@ def GetAllDefines(target_list, target_dicts, data, config_name, params, cpp_proc = subprocess.Popen(args=command, cwd='.', stdin=subprocess.PIPE, stdout=subprocess.PIPE) cpp_output = cpp_proc.communicate()[0] + if PY3: + cpp_output = cpp_output.decode('utf-8') cpp_lines = cpp_output.split('\n') for cpp_line in cpp_lines: if not cpp_line.strip(): diff --git a/tools/gyp/pylib/gyp/generator/make.py b/tools/gyp/pylib/gyp/generator/make.py index 91a119c5a57694..26cf88cccf275d 100644 --- a/tools/gyp/pylib/gyp/generator/make.py +++ b/tools/gyp/pylib/gyp/generator/make.py @@ -12,7 +12,7 @@ # all are sourced by the top-level Makefile. This means that all # variables in .mk-files clobber one another. Be careful to use := # where appropriate for immediate evaluation, and similarly to watch -# that you're not relying on a variable value to last beween different +# that you're not relying on a variable value to last between different # .mk files. # # TODOs: @@ -234,6 +234,25 @@ def CalculateGeneratorInputInfo(params): """ +LINK_COMMANDS_OS390 = """\ +quiet_cmd_alink = AR($(TOOLSET)) $@ +cmd_alink = rm -f $@ && $(AR.$(TOOLSET)) crs $@ $(filter %.o,$^) + +quiet_cmd_alink_thin = AR($(TOOLSET)) $@ +cmd_alink_thin = rm -f $@ && $(AR.$(TOOLSET)) crsT $@ $(filter %.o,$^) + +quiet_cmd_link = LINK($(TOOLSET)) $@ +cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(LD_INPUTS) $(LIBS) + +quiet_cmd_solink = SOLINK($(TOOLSET)) $@ +cmd_solink = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(LD_INPUTS) $(LIBS) -Wl,DLL + +quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@ +cmd_solink_module = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(filter-out FORCE_DO_CMD, $^) $(LIBS) -Wl,DLL + +""" + + # Header of toplevel Makefile. # This should go into the build tree, but it's easier to keep it here for now. SHARED_HEADER = ("""\ @@ -317,7 +336,7 @@ def CalculateGeneratorInputInfo(params): # We write to a dep file on the side first and then rename at the end # so we can't end up with a broken dep file. depfile = $(depsdir)/$(call replace_spaces,$@).d -DEPFLAGS = -MMD -MF $(depfile).raw +DEPFLAGS = %(makedep_args)s -MF $(depfile).raw # We have to fixup the deps output in a few ways. # (1) the file output should mention the proper .o file. @@ -630,6 +649,9 @@ def Sourceify(path): def QuoteSpaces(s, quote=r'\ '): return s.replace(' ', quote) +def SourceifyAndQuoteSpaces(path): + """Convert a path to its source directory form and quote spaces.""" + return QuoteSpaces(Sourceify(path)) # TODO: Avoid code duplication with _ValidateSourcesForMSVSProject in msvs.py. def _ValidateSourcesForOSX(spec, all_sources): @@ -657,9 +679,8 @@ def _ValidateSourcesForOSX(spec, all_sources): error += ' %s: %s\n' % (basename, ' '.join(files)) if error: - print('static library %s has several files with the same basename:\n' % - spec['target_name'] + error + 'libtool on OS X will generate' + - ' warnings for them.') + print(('static library %s has several files with the same basename:\n' % spec['target_name']) + + error + 'libtool on OS X will generate' + ' warnings for them.') raise GypError('Duplicate basenames in sources section, see list above') @@ -1755,8 +1776,8 @@ def WriteMakeRule(self, outputs, inputs, actions=None, comment=None, # - The multi-output rule will have an do-nothing recipe. # Hash the target name to avoid generating overlong filenames. 
- cmddigest = hashlib.sha1((command or self.target).encode("utf-8")).hexdigest() - intermediate = "%s.intermediate" % (cmddigest) + cmddigest = hashlib.sha1((command or self.target).encode('utf-8')).hexdigest() + intermediate = "%s.intermediate" % cmddigest self.WriteLn('%s: %s' % (' '.join(outputs), intermediate)) self.WriteLn('\t%s' % '@:') self.WriteLn('%s: %s' % ('.INTERMEDIATE', intermediate)) @@ -1956,7 +1977,7 @@ def WriteAutoRegenerationRule(params, root_makefile, makefile_name, "%(makefile_name)s: %(deps)s\n" "\t$(call do_cmd,regen_makefile)\n\n" % { 'makefile_name': makefile_name, - 'deps': ' '.join(Sourceify(bf) for bf in build_files), + 'deps': ' '.join(SourceifyAndQuoteSpaces(bf) for bf in build_files), 'cmd': gyp.common.EncodePOSIXShellList( [gyp_binary, '-fmake'] + gyp.RegenerateFlags(options) + @@ -2024,6 +2045,7 @@ def CalculateMakefilePath(build_file, base_name): flock_command= 'flock' copy_archive_arguments = '-af' + makedep_arguments = '-MMD' header_params = { 'default_target': default_target, 'builddir': builddir_name, @@ -2034,6 +2056,15 @@ def CalculateMakefilePath(build_file, base_name): 'extra_commands': '', 'srcdir': srcdir, 'copy_archive_args': copy_archive_arguments, + 'makedep_args': makedep_arguments, + 'CC.target': GetEnvironFallback(('CC_target', 'CC'), '$(CC)'), + 'AR.target': GetEnvironFallback(('AR_target', 'AR'), '$(AR)'), + 'CXX.target': GetEnvironFallback(('CXX_target', 'CXX'), '$(CXX)'), + 'LINK.target': GetEnvironFallback(('LINK_target', 'LINK'), '$(LINK)'), + 'CC.host': GetEnvironFallback(('CC_host', 'CC'), 'gcc'), + 'AR.host': GetEnvironFallback(('AR_host', 'AR'), 'ar'), + 'CXX.host': GetEnvironFallback(('CXX_host', 'CXX'), 'g++'), + 'LINK.host': GetEnvironFallback(('LINK_host', 'LINK'), '$(CXX.host)'), } if flavor == 'mac': flock_command = './gyp-mac-tool flock' @@ -2047,6 +2078,18 @@ def CalculateMakefilePath(build_file, base_name): header_params.update({ 'link_commands': LINK_COMMANDS_ANDROID, }) + elif flavor == 'zos': + copy_archive_arguments = '-fPR' + makedep_arguments = '-qmakedep=gcc' + header_params.update({ + 'copy_archive_args': copy_archive_arguments, + 'makedep_args': makedep_arguments, + 'link_commands': LINK_COMMANDS_OS390, + 'CC.target': GetEnvironFallback(('CC_target', 'CC'), 'njsc'), + 'CXX.target': GetEnvironFallback(('CXX_target', 'CXX'), 'njsc++'), + 'CC.host': GetEnvironFallback(('CC_host', 'CC'), 'njsc'), + 'CXX.host': GetEnvironFallback(('CXX_host', 'CXX'), 'njsc++'), + }) elif flavor == 'solaris': header_params.update({ 'flock': './gyp-flock-tool flock', @@ -2071,17 +2114,6 @@ def CalculateMakefilePath(build_file, base_name): 'flock_index': 2, }) - header_params.update({ - 'CC.target': GetEnvironFallback(('CC_target', 'CC'), '$(CC)'), - 'AR.target': GetEnvironFallback(('AR_target', 'AR'), '$(AR)'), - 'CXX.target': GetEnvironFallback(('CXX_target', 'CXX'), '$(CXX)'), - 'LINK.target': GetEnvironFallback(('LINK_target', 'LINK'), '$(LINK)'), - 'CC.host': GetEnvironFallback(('CC_host', 'CC'), 'gcc'), - 'AR.host': GetEnvironFallback(('AR_host', 'AR'), 'ar'), - 'CXX.host': GetEnvironFallback(('CXX_host', 'CXX'), 'g++'), - 'LINK.host': GetEnvironFallback(('LINK_host', 'LINK'), '$(CXX.host)'), - }) - build_file, _, _ = gyp.common.ParseQualifiedTarget(target_list[0]) make_global_settings_array = data[build_file].get('make_global_settings', []) wrappers = {} diff --git a/tools/gyp/pylib/gyp/generator/msvs.py b/tools/gyp/pylib/gyp/generator/msvs.py index 1aed4ca8aa7e00..933042c7113c59 100644 --- a/tools/gyp/pylib/gyp/generator/msvs.py 
+++ b/tools/gyp/pylib/gyp/generator/msvs.py @@ -12,6 +12,8 @@ import subprocess import sys +from collections import OrderedDict + import gyp.common import gyp.easy_xml as easy_xml import gyp.generator.ninja as ninja_generator @@ -25,15 +27,7 @@ from gyp.common import GypError from gyp.common import OrderedSet -# TODO: Remove once bots are on 2.7, http://crbug.com/241769 -def _import_OrderedDict(): - import collections - try: - return collections.OrderedDict - except AttributeError: - import gyp.ordered_dict - return gyp.ordered_dict.OrderedDict -OrderedDict = _import_OrderedDict() +PY3 = bytes != str # Regular expression for validating Visual Studio GUIDs. If the GUID @@ -90,6 +84,7 @@ def _import_OrderedDict(): 'msvs_enable_winrt', 'msvs_requires_importlibrary', 'msvs_enable_winphone', + 'msvs_enable_marmasm', 'msvs_application_type_revision', 'msvs_target_platform_version', 'msvs_target_platform_minversion', @@ -126,6 +121,8 @@ def _GetDomainAndUserName(): call = subprocess.Popen(['net', 'config', 'Workstation'], stdout=subprocess.PIPE) config = call.communicate()[0] + if PY3: + config = config.decode('utf-8') username_re = re.compile(r'^User name\s+(\S+)', re.MULTILINE) username_match = username_re.search(config) if username_match: @@ -167,7 +164,7 @@ def _FixPath(path): Returns: The path with all slashes made into backslashes. """ - if fixpath_prefix and path and not os.path.isabs(path) and not path[0] == '$': + if fixpath_prefix and path and not os.path.isabs(path) and not path[0] == '$' and not _IsWindowsAbsPath(path): path = os.path.join(fixpath_prefix, path) path = path.replace('/', '\\') path = _NormalizedSource(path) @@ -176,6 +173,15 @@ def _FixPath(path): return path +def _IsWindowsAbsPath(path): + """ + On Cygwin systems Python needs a little help determining if a path is an absolute Windows path or not, so that + it does not treat those as relative, which results in bad paths like: + '..\C:\\some_source_code_file.cc' + """ + return path.startswith('c:') or path.startswith('C:') + + def _FixPaths(paths): """Fix each of the paths of the list.""" return [_FixPath(i) for i in paths] @@ -297,6 +303,9 @@ def _ConfigFullName(config_name, config_data): def _ConfigWindowsTargetPlatformVersion(config_data, version): + target_ver = config_data.get('msvs_windows_target_platform_version') + if target_ver and re.match(r'^\d+', target_ver): + return target_ver config_ver = config_data.get('msvs_windows_sdk_version') vers = [config_ver] if config_ver else version.compatible_sdks for ver in vers: @@ -775,8 +784,8 @@ def _Replace(match): # the VCProj but cause the same problem on the final command-line. Moving # the item to the end of the list does works, but that's only possible if # there's only one such item. Let's just warn the user. 
- print('Warning: MSVS may misinterpret the odd number of ' - 'quotes in ' + s, file=sys.stderr) + print('Warning: MSVS may misinterpret the odd number of ' + + 'quotes in ' + s, file=sys.stderr) return s @@ -996,8 +1005,8 @@ def _ValidateSourcesForMSVSProject(spec, version): error += ' %s: %s\n' % (basename, ' '.join(files)) if error: - print('static library %s has several files with the same basename:\n' % - spec['target_name'] + error + 'MSVC08 cannot handle that.') + print('static library %s has several files with the same basename:\n' % spec['target_name'] + + error + 'MSVC08 cannot handle that.') raise GypError('Duplicate basenames in sources section, see list above') @@ -1913,6 +1922,8 @@ def _InitNinjaFlavor(params, target_list, target_dicts): configuration = '$(Configuration)' if params.get('target_arch') == 'x64': configuration += '_x64' + if params.get('target_arch') == 'arm64': + configuration += '_arm64' spec['msvs_external_builder_out_dir'] = os.path.join( gyp.common.RelativePath(params['options'].toplevel_dir, gyp_dir), ninja_generator.ComputeOutputDir(params), @@ -2163,7 +2174,7 @@ def _MapFileToMsBuildSourceType(source, rule_dependencies, if ext in extension_to_rule_name: group = 'rule' element = extension_to_rule_name[ext] - elif ext in ['.cc', '.cpp', '.c', '.cxx']: + elif ext in ['.cc', '.cpp', '.c', '.cxx', '.mm']: group = 'compile' element = 'ClCompile' elif ext in ['.h', '.hxx']: @@ -3106,7 +3117,7 @@ def _FinalizeMSBuildSettings(spec, configuration): _ToolAppend(msbuild_settings, 'ResourceCompile', 'AdditionalIncludeDirectories', resource_include_dirs) # Add in libraries, note that even for empty libraries, we want this - # set, to prevent inheriting default libraries from the enviroment. + # set, to prevent inheriting default libraries from the environment. 
_ToolSetOrAppend(msbuild_settings, 'Link', 'AdditionalDependencies', libraries) _ToolAppend(msbuild_settings, 'Link', 'AdditionalLibraryDirectories', @@ -3411,7 +3422,8 @@ def _GenerateMSBuildProject(project, options, version, generator_flags): content += _GetMSBuildLocalProperties(project.msbuild_toolset) content += import_cpp_props_section content += import_masm_props_section - content += import_marmasm_props_section + if spec.get('msvs_enable_marmasm'): + content += import_marmasm_props_section content += _GetMSBuildExtensions(props_files_of_rules) content += _GetMSBuildPropertySheets(configurations) content += macro_section @@ -3424,7 +3436,8 @@ def _GenerateMSBuildProject(project, options, version, generator_flags): content += _GetMSBuildProjectReferences(project) content += import_cpp_targets_section content += import_masm_targets_section - content += import_marmasm_targets_section + if spec.get('msvs_enable_marmasm'): + content += import_marmasm_targets_section content += _GetMSBuildExtensionTargets(targets_files_of_rules) if spec.get('msvs_external_builder'): diff --git a/tools/gyp/pylib/gyp/generator/ninja.py b/tools/gyp/pylib/gyp/generator/ninja.py index 75743e770d16f9..d5006bf84a0b2a 100644 --- a/tools/gyp/pylib/gyp/generator/ninja.py +++ b/tools/gyp/pylib/gyp/generator/ninja.py @@ -744,7 +744,7 @@ def cygwin_munge(path): elif var == 'name': extra_bindings.append(('name', cygwin_munge(basename))) else: - assert var == None, repr(var) + assert var is None, repr(var) outputs = [self.GypPathToNinja(o, env) for o in outputs] if self.flavor == 'win': @@ -1880,7 +1880,7 @@ def GenerateOutputForConfig(target_list, target_dicts, data, params, # - The priority from low to high is gcc/g++, the 'make_global_settings' in # gyp, the environment variable. # - If there is no 'make_global_settings' for CC.host/CXX.host or - # 'CC_host'/'CXX_host' enviroment variable, cc_host/cxx_host should be set + # 'CC_host'/'CXX_host' environment variable, cc_host/cxx_host should be set # to cc/cxx. if flavor == 'win': ar = 'lib.exe' @@ -2321,15 +2321,22 @@ def GenerateOutputForConfig(target_list, target_dicts, data, params, 'stamp', description='STAMP $out', command='%s gyp-win-tool stamp $out' % sys.executable) - master_ninja.rule( - 'copy', - description='COPY $in $out', - command='%s gyp-win-tool recursive-mirror $in $out' % sys.executable) else: master_ninja.rule( 'stamp', description='STAMP $out', command='${postbuilds}touch $out') + if flavor == 'win': + master_ninja.rule( + 'copy', + description='COPY $in $out', + command='%s gyp-win-tool recursive-mirror $in $out' % sys.executable) + elif flavor == 'zos': + master_ninja.rule( + 'copy', + description='COPY $in $out', + command='rm -rf $out && cp -fRP $in $out') + else: master_ninja.rule( 'copy', description='COPY $in $out', diff --git a/tools/gyp/pylib/gyp/generator/xcode.py b/tools/gyp/pylib/gyp/generator/xcode.py index 9242324196d2d3..4917ba77b9d577 100644 --- a/tools/gyp/pylib/gyp/generator/xcode.py +++ b/tools/gyp/pylib/gyp/generator/xcode.py @@ -541,7 +541,7 @@ def ExpandXcodeVariables(string, expansions): """ matches = _xcode_variable_re.findall(string) - if matches == None: + if matches is None: return string matches.reverse() @@ -1010,7 +1010,7 @@ def GenerateOutput(target_list, target_dicts, data, params): actions.append(action) if len(concrete_outputs_all) > 0: - # TODO(mark): There's a possibilty for collision here. Consider + # TODO(mark): There's a possibility for collision here. 
Consider # target "t" rule "A_r" and target "t_A" rule "r". makefile_name = '%s.make' % re.sub( '[^a-zA-Z0-9_]', '_' , '%s_%s' % (target_name, rule['rule_name'])) diff --git a/tools/gyp/pylib/gyp/input.py b/tools/gyp/pylib/gyp/input.py index 6db204e4010284..1f40abb06951bb 100644 --- a/tools/gyp/pylib/gyp/input.py +++ b/tools/gyp/pylib/gyp/input.py @@ -23,6 +23,7 @@ from gyp.common import GypError from gyp.common import OrderedSet +PY3 = bytes != str # A list of types that are treated as linkable. linkable_types = [ @@ -157,7 +158,7 @@ def GetIncludedBuildFiles(build_file_path, aux_data, included=None): in the list will be relative to the current directory. """ - if included == None: + if included is None: included = [] if build_file_path in included: @@ -222,7 +223,15 @@ def LoadOneBuildFile(build_file_path, data, aux_data, includes, return data[build_file_path] if os.path.exists(build_file_path): - build_file_contents = open(build_file_path).read() + # Open the build file for read ('r') with universal-newlines mode ('U') + # to make sure platform specific newlines ('\r\n' or '\r') are converted to '\n' + # which otherwise will fail eval() + if sys.platform == 'zos': + # On z/OS, universal-newlines mode treats the file as an ascii file. But since + # node-gyp produces ebcdic files, do not use that mode. + build_file_contents = open(build_file_path, 'r').read() + else: + build_file_contents = open(build_file_path, 'rU').read() else: raise GypError("%s not found (cwd: %s)" % (build_file_path, os.getcwd())) @@ -231,7 +240,7 @@ def LoadOneBuildFile(build_file_path, data, aux_data, includes, if check: build_file_data = CheckedEval(build_file_contents) else: - build_file_data = eval(build_file_contents, {'__builtins__': None}, + build_file_data = eval(build_file_contents, {'__builtins__': {}}, None) except SyntaxError as e: e.filename = build_file_path @@ -700,9 +709,6 @@ def FixupPlatformCommand(cmd): def ExpandVariables(input, phase, variables, build_file): # Look for the pattern that gets expanded into variables - def to_utf8(s): - return s if isinstance(s, str) else s.decode('utf-8') - if phase == PHASE_EARLY: variable_re = early_variable_re expansion_symbol = '<' @@ -906,8 +912,9 @@ def to_utf8(s): (e, contents, build_file)) p_stdout, p_stderr = p.communicate('') - p_stdout = to_utf8(p_stdout) - p_stderr = to_utf8(p_stderr) + if PY3: + p_stdout = p_stdout.decode('utf-8') + p_stderr = p_stderr.decode('utf-8') if p.wait() != 0 or p_stderr: sys.stderr.write(p_stderr) @@ -1061,7 +1068,7 @@ def EvalCondition(condition, conditions_key, phase, variables, build_file): else: false_dict = None i = i + 2 - if result == None: + if result is None: result = EvalSingleCondition( cond_expr, true_dict, false_dict, phase, variables, build_file) @@ -1072,7 +1079,7 @@ def EvalSingleCondition( cond_expr, true_dict, false_dict, phase, variables, build_file): """Returns true_dict if cond_expr evaluates to true, and false_dict otherwise.""" - # Do expansions on the condition itself. Since the conditon can naturally + # Do expansions on the condition itself. Since the condition can naturally # contain variable references without needing to resort to GYP expansion # syntax, this is of dubious value for variables, but someone might want to # use a command expansion directly inside a condition. 
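
[Editor's note: the condition handling above evaluates a small Python expression against the gyp variables dictionary while suppressing access to builtins; the change from `{'__builtins__': None}` to `{'__builtins__': {}}` keeps that sandboxing valid across Python versions. Below is a simplified, self-contained sketch of the approach; `eval_condition` and the sample variables are illustrative, not gyp's actual API.]

```python
variables = {'OS': 'linux', 'target_arch': 'x64'}  # illustrative values

def eval_condition(cond_expr, variables):
    # Compile once; gyp caches the compiled expression keyed by its string.
    ast_code = compile(cond_expr, '<string>', 'eval')
    # An empty dict for __builtins__ blocks access to built-in functions
    # while remaining a valid globals entry on both Python 2 and Python 3.
    # (gyp's env additionally exposes `v` as distutils' StrictVersion.)
    env = {'__builtins__': {}}
    return eval(ast_code, env, variables)

print(eval_condition('OS == "linux" and target_arch == "x64"', variables))  # True
print(eval_condition('OS == "mac"', variables))  # False
```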
@@ -1089,7 +1096,7 @@ def EvalSingleCondition( else: ast_code = compile(cond_expr_expanded, '', 'eval') cached_conditions_asts[cond_expr_expanded] = ast_code - env = {'__builtins__': None, 'v': StrictVersion} + env = {'__builtins__': {}, 'v': StrictVersion} if eval(ast_code, env, variables): return true_dict return false_dict @@ -1178,7 +1185,7 @@ def LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key): continue if the_dict_key == 'variables' and variable_name in the_dict: # If the variable is set without a % in the_dict, and the_dict is a - # variables dict (making |variables| a varaibles sub-dict of a + # variables dict (making |variables| a variables sub-dict of a # variables dict), use the_dict's definition. value = the_dict[variable_name] else: @@ -1608,7 +1615,7 @@ def Visit(node, path): def DirectDependencies(self, dependencies=None): """Returns a list of just direct dependencies.""" - if dependencies == None: + if dependencies is None: dependencies = [] for dependency in self.dependencies: @@ -1636,7 +1643,7 @@ def _AddImportedDependencies(self, targets, dependencies=None): public entry point. """ - if dependencies == None: + if dependencies is None: dependencies = [] index = 0 @@ -1870,7 +1877,7 @@ def VerifyNoGYPFileCircularDependencies(targets): continue dependency_node = dependency_nodes.get(dependency_build_file) if not dependency_node: - raise GypError("Dependancy '%s' not found" % dependency_build_file) + raise GypError("Dependency '%s' not found" % dependency_build_file) if dependency_node not in build_file_node.dependencies: build_file_node.dependencies.append(dependency_node) dependency_node.dependents.append(build_file_node) @@ -2040,7 +2047,7 @@ def MakePathRelative(to_file, fro_file, item): gyp.common.RelativePath(os.path.dirname(fro_file), os.path.dirname(to_file)), item)).replace('\\', '/') - if item[-1] == '/': + if item.endswith('/'): ret += '/' return ret @@ -2288,7 +2295,7 @@ def SetUpConfigurations(target, target_dict): merged_configurations[configuration]) # Now drop all the abstract ones. - for configuration in target_dict['configurations'].keys(): + for configuration in list(target_dict['configurations']): old_configuration_dict = target_dict['configurations'][configuration] if old_configuration_dict.get('abstract'): del target_dict['configurations'][configuration] @@ -2531,8 +2538,8 @@ def ValidateSourcesInTarget(target, target_dict, build_file, error += ' %s: %s\n' % (basename, ' '.join(files)) if error: - print('static library %s has several files with the same basename:\n' % - target + error + 'libtool on Mac cannot handle that. Use ' + print('static library %s has several files with the same basename:\n' % target + + error + 'libtool on Mac cannot handle that. 
Use ' '--no-duplicate-basename-check to disable this validation.') raise GypError('Duplicate basenames in sources section, see list above') diff --git a/tools/gyp/pylib/gyp/mac_tool.py b/tools/gyp/pylib/gyp/mac_tool.py index c4c4a6df130404..781a8633bc2c7f 100755 --- a/tools/gyp/pylib/gyp/mac_tool.py +++ b/tools/gyp/pylib/gyp/mac_tool.py @@ -478,8 +478,7 @@ def _FindProvisioningProfile(self, profile, bundle_identifier): profiles_dir = os.path.join( os.environ['HOME'], 'Library', 'MobileDevice', 'Provisioning Profiles') if not os.path.isdir(profiles_dir): - print('cannot find mobile provisioning for %s' % bundle_identifier, - file=sys.stderr) + print('cannot find mobile provisioning for %s' % (bundle_identifier), file=sys.stderr) sys.exit(1) provisioning_profiles = None if profile: @@ -500,8 +499,7 @@ def _FindProvisioningProfile(self, profile, bundle_identifier): valid_provisioning_profiles[app_id_pattern] = ( profile_path, profile_data, team_identifier) if not valid_provisioning_profiles: - print('cannot find mobile provisioning for %s' % bundle_identifier, - file=sys.stderr) + print('cannot find mobile provisioning for %s' % (bundle_identifier), file=sys.stderr) sys.exit(1) # If the user has multiple provisioning profiles installed that can be # used for ${bundle_identifier}, pick the most specific one (ie. the diff --git a/tools/gyp/pylib/gyp/msvs_emulation.py b/tools/gyp/pylib/gyp/msvs_emulation.py index e130b53271c73e..d42e2e47b98a53 100644 --- a/tools/gyp/pylib/gyp/msvs_emulation.py +++ b/tools/gyp/pylib/gyp/msvs_emulation.py @@ -16,6 +16,7 @@ import gyp.MSVSUtil import gyp.MSVSVersion +PY3 = bytes != str windows_quoter_regex = re.compile(r'(\\*)"') @@ -130,7 +131,10 @@ def _FindDirectXInstallation(): # Setup params to pass to and attempt to launch reg.exe. cmd = ['reg.exe', 'query', r'HKLM\Software\Microsoft\DirectX', '/s'] p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE) - for line in p.communicate()[0].splitlines(): + stdout = p.communicate()[0] + if PY3: + stdout = stdout.decode('utf-8') + for line in stdout.splitlines(): if 'InstallPath' in line: dxsdk_dir = line.split(' ')[3] + "\\" @@ -241,7 +245,11 @@ def GetExtension(self): def GetVSMacroEnv(self, base_to_build=None, config=None): """Get a dict of variables mapping internal VS macro names to their gyp equivalents.""" - target_platform = 'Win32' if self.GetArch(config) == 'x86' else 'x64' + target_arch = self.GetArch(config) + if target_arch == 'x86': + target_platform = 'Win32' + else: + target_platform = target_arch target_name = self.spec.get('product_prefix', '') + \ self.spec.get('product_name', self.spec['target_name']) target_dir = base_to_build + '\\' if base_to_build else '' @@ -304,7 +312,7 @@ def GetArch(self, config): if not platform: # If no specific override, use the configuration's. platform = configuration_platform # Map from platform to architecture. 
- return {'Win32': 'x86', 'x64': 'x64'}.get(platform, 'x86') + return {'Win32': 'x86', 'x64': 'x64', 'ARM64': 'arm64'}.get(platform, 'x86') def _TargetConfig(self, config): """Returns the target-specific configuration.""" @@ -379,7 +387,7 @@ def GetCompilerPdbName(self, config, expand_special): return pdbname def GetMapFileName(self, config, expand_special): - """Gets the explicitly overriden map file name for a target or returns None + """Gets the explicitly overridden map file name for a target or returns None if it's not set.""" config = self._TargetConfig(config) map_file = self._Setting(('VCLinkerTool', 'MapFileName'), config) @@ -575,7 +583,10 @@ def GetLdflags(self, config, gyp_to_build_path, expand_special, 'VCLinkerTool', append=ldflags) self._GetDefFileAsLdflags(ldflags, gyp_to_build_path) ld('GenerateDebugInformation', map={'true': '/DEBUG'}) - ld('TargetMachine', map={'1': 'X86', '17': 'X64', '3': 'ARM'}, + # TODO: These 'map' values come from machineTypeOption enum, + # and does not have an official value for ARM64 in VS2017 (yet). + # It needs to verify the ARM64 value when machineTypeOption is updated. + ld('TargetMachine', map={'1': 'X86', '17': 'X64', '3': 'ARM', '18': 'ARM64'}, prefix='/MACHINE:') ldflags.extend(self._GetAdditionalLibraryDirectories( 'VCLinkerTool', config, gyp_to_build_path)) @@ -872,7 +883,9 @@ def midl(name, default=None): ('iid', iid), ('proxy', proxy)] # TODO(scottmg): Are there configuration settings to set these flags? - target_platform = 'win32' if self.GetArch(config) == 'x86' else 'x64' + target_platform = self.GetArch(config) + if target_platform == 'x86': + target_platform = 'win32' flags = ['/char', 'signed', '/env', target_platform, '/Oicf'] return outdir, output, variables, flags @@ -1045,6 +1058,8 @@ def GenerateEnvironmentFiles(toplevel_build_dir, generator_flags, popen = subprocess.Popen( args, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) variables, _ = popen.communicate() + if PY3: + variables = variables.decode('utf-8') if popen.returncode != 0: raise Exception('"%s" failed with error %d' % (args, popen.returncode)) env = _ExtractImportantEnvironment(variables) @@ -1066,6 +1081,8 @@ def GenerateEnvironmentFiles(toplevel_build_dir, generator_flags, 'for', '%i', 'in', '(cl.exe)', 'do', '@echo', 'LOC:%~$PATH:i')) popen = subprocess.Popen(args, shell=True, stdout=subprocess.PIPE) output, _ = popen.communicate() + if PY3: + output = output.decode('utf-8') cl_paths[arch] = _ExtractCLPath(output) return cl_paths diff --git a/tools/gyp/pylib/gyp/ordered_dict.py b/tools/gyp/pylib/gyp/ordered_dict.py deleted file mode 100644 index 6fe9c1f6c7c22b..00000000000000 --- a/tools/gyp/pylib/gyp/ordered_dict.py +++ /dev/null @@ -1,289 +0,0 @@ -# Unmodified from http://code.activestate.com/recipes/576693/ -# other than to add MIT license header (as specified on page, but not in code). -# Linked from Python documentation here: -# http://docs.python.org/2/library/collections.html#collections.OrderedDict -# -# This should be deleted once Py2.7 is available on all bots, see -# http://crbug.com/241769. -# -# Copyright (c) 2009 Raymond Hettinger. 
-# -# Permission is hereby granted, free of charge, to any person obtaining a copy -# of this software and associated documentation files (the "Software"), to deal -# in the Software without restriction, including without limitation the rights -# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell -# copies of the Software, and to permit persons to whom the Software is -# furnished to do so, subject to the following conditions: -# -# The above copyright notice and this permission notice shall be included in -# all copies or substantial portions of the Software. -# -# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR -# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, -# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE -# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER -# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, -# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN -# THE SOFTWARE. - -# Backport of OrderedDict() class that runs on Python 2.4, 2.5, 2.6, 2.7 and pypy. -# Passes Python2.7's test suite and incorporates all the latest updates. - -try: - from thread import get_ident as _get_ident -except ImportError: - from dummy_thread import get_ident as _get_ident - -try: - from _abcoll import KeysView, ValuesView, ItemsView -except ImportError: - pass - - -class OrderedDict(dict): - 'Dictionary that remembers insertion order' - # An inherited dict maps keys to values. - # The inherited dict provides __getitem__, __len__, __contains__, and get. - # The remaining methods are order-aware. - # Big-O running times for all methods are the same as for regular dictionaries. - - # The internal self.__map dictionary maps keys to links in a doubly linked list. - # The circular doubly linked list starts and ends with a sentinel element. - # The sentinel element never gets deleted (this simplifies the algorithm). - # Each link is stored as a list of length three: [PREV, NEXT, KEY]. - - def __init__(self, *args, **kwds): - '''Initialize an ordered dictionary. Signature is the same as for - regular dictionaries, but keyword arguments are not recommended - because their insertion order is arbitrary. - - ''' - if len(args) > 1: - raise TypeError('expected at most 1 arguments, got %d' % len(args)) - try: - self.__root - except AttributeError: - self.__root = root = [] # sentinel node - root[:] = [root, root, None] - self.__map = {} - self.__update(*args, **kwds) - - def __setitem__(self, key, value, dict_setitem=dict.__setitem__): - 'od.__setitem__(i, y) <==> od[i]=y' - # Setting a new item creates a new link which goes at the end of the linked - # list, and the inherited dictionary is updated with the new key/value pair. - if key not in self: - root = self.__root - last = root[0] - last[1] = root[0] = self.__map[key] = [last, root, key] - dict_setitem(self, key, value) - - def __delitem__(self, key, dict_delitem=dict.__delitem__): - 'od.__delitem__(y) <==> del od[y]' - # Deleting an existing item uses self.__map to find the link which is - # then removed by updating the links in the predecessor and successor nodes. 
- dict_delitem(self, key) - link_prev, link_next, key = self.__map.pop(key) - link_prev[1] = link_next - link_next[0] = link_prev - - def __iter__(self): - 'od.__iter__() <==> iter(od)' - root = self.__root - curr = root[1] - while curr is not root: - yield curr[2] - curr = curr[1] - - def __reversed__(self): - 'od.__reversed__() <==> reversed(od)' - root = self.__root - curr = root[0] - while curr is not root: - yield curr[2] - curr = curr[0] - - def clear(self): - 'od.clear() -> None. Remove all items from od.' - try: - for node in self.__map.itervalues(): - del node[:] - root = self.__root - root[:] = [root, root, None] - self.__map.clear() - except AttributeError: - pass - dict.clear(self) - - def popitem(self, last=True): - '''od.popitem() -> (k, v), return and remove a (key, value) pair. - Pairs are returned in LIFO order if last is true or FIFO order if false. - - ''' - if not self: - raise KeyError('dictionary is empty') - root = self.__root - if last: - link = root[0] - link_prev = link[0] - link_prev[1] = root - root[0] = link_prev - else: - link = root[1] - link_next = link[1] - root[1] = link_next - link_next[0] = root - key = link[2] - del self.__map[key] - value = dict.pop(self, key) - return key, value - - # -- the following methods do not depend on the internal structure -- - - def keys(self): - 'od.keys() -> list of keys in od' - return list(self) - - def values(self): - 'od.values() -> list of values in od' - return [self[key] for key in self] - - def items(self): - 'od.items() -> list of (key, value) pairs in od' - return [(key, self[key]) for key in self] - - def iterkeys(self): - 'od.iterkeys() -> an iterator over the keys in od' - return iter(self) - - def itervalues(self): - 'od.itervalues -> an iterator over the values in od' - for k in self: - yield self[k] - - def items(self): - 'od.items -> an iterator over the (key, value) items in od' - for k in self: - yield (k, self[k]) - - # Suppress 'OrderedDict.update: Method has no argument': - # pylint: disable=E0211 - def update(*args, **kwds): - '''od.update(E, **F) -> None. Update od from dict/iterable E and F. - - If E is a dict instance, does: for k in E: od[k] = E[k] - If E has a .keys() method, does: for k in E.keys(): od[k] = E[k] - Or if E is an iterable of items, does: for k, v in E: od[k] = v - In either case, this is followed by: for k, v in F.items(): od[k] = v - - ''' - if len(args) > 2: - raise TypeError('update() takes at most 2 positional ' - 'arguments (%d given)' % (len(args),)) - elif not args: - raise TypeError('update() takes at least 1 argument (0 given)') - self = args[0] - # Make progressively weaker assumptions about "other" - other = () - if len(args) == 2: - other = args[1] - if isinstance(other, dict): - for key in other: - self[key] = other[key] - elif hasattr(other, 'keys'): - for key in other.keys(): - self[key] = other[key] - else: - for key, value in other: - self[key] = value - for key, value in kwds.items(): - self[key] = value - - __update = update # let subclasses override update without breaking __init__ - - __marker = object() - - def pop(self, key, default=__marker): - '''od.pop(k[,d]) -> v, remove specified key and return the corresponding value. - If key is not found, d is returned if given, otherwise KeyError is raised. 
- - ''' - if key in self: - result = self[key] - del self[key] - return result - if default is self.__marker: - raise KeyError(key) - return default - - def setdefault(self, key, default=None): - 'od.setdefault(k[,d]) -> od.get(k,d), also set od[k]=d if k not in od' - if key in self: - return self[key] - self[key] = default - return default - - def __repr__(self, _repr_running={}): - 'od.__repr__() <==> repr(od)' - call_key = id(self), _get_ident() - if call_key in _repr_running: - return '...' - _repr_running[call_key] = 1 - try: - if not self: - return '%s()' % (self.__class__.__name__,) - return '%s(%r)' % (self.__class__.__name__, self.items()) - finally: - del _repr_running[call_key] - - def __reduce__(self): - 'Return state information for pickling' - items = [[k, self[k]] for k in self] - inst_dict = vars(self).copy() - for k in vars(OrderedDict()): - inst_dict.pop(k, None) - if inst_dict: - return (self.__class__, (items,), inst_dict) - return self.__class__, (items,) - - def copy(self): - 'od.copy() -> a shallow copy of od' - return self.__class__(self) - - @classmethod - def fromkeys(cls, iterable, value=None): - '''OD.fromkeys(S[, v]) -> New ordered dictionary with keys from S - and values equal to v (which defaults to None). - - ''' - d = cls() - for key in iterable: - d[key] = value - return d - - def __eq__(self, other): - '''od.__eq__(y) <==> od==y. Comparison to another OD is order-sensitive - while comparison to a regular mapping is order-insensitive. - - ''' - if isinstance(other, OrderedDict): - return len(self)==len(other) and self.items() == other.items() - return dict.__eq__(self, other) - - def __ne__(self, other): - return not self == other - - # -- the following methods are only used in Python 2.7 -- - - def viewkeys(self): - "od.viewkeys() -> a set-like object providing a view on od's keys" - return KeysView(self) - - def viewvalues(self): - "od.viewvalues() -> an object providing a view on od's values" - return ValuesView(self) - - def viewitems(self): - "od.viewitems() -> a set-like object providing a view on od's items" - return ItemsView(self) - diff --git a/tools/gyp/pylib/gyp/win_tool.py b/tools/gyp/pylib/gyp/win_tool.py index ab6db1c4e047fb..cfdacb0d7ccd10 100755 --- a/tools/gyp/pylib/gyp/win_tool.py +++ b/tools/gyp/pylib/gyp/win_tool.py @@ -20,6 +20,7 @@ import sys BASE_DIR = os.path.dirname(os.path.abspath(__file__)) +PY3 = bytes != str # A regex matching an argument corresponding to the output filename passed to # link.exe. @@ -132,6 +133,8 @@ def ExecLinkWrapper(self, arch, use_separate_mspdbsrv, *args): link = subprocess.Popen(args, shell=sys.platform == 'win32', env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = link.communicate() + if PY3: + out = out.decode('utf-8') for line in out.splitlines(): if (not line.startswith(' Creating library ') and not line.startswith('Generating code') and @@ -223,6 +226,8 @@ def ExecManifestWrapper(self, arch, *args): popen = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = popen.communicate() + if PY3: + out = out.decode('utf-8') for line in out.splitlines(): if line and 'manifest authoring warning 81010002' not in line: print(line) @@ -255,6 +260,8 @@ def ExecMidlWrapper(self, arch, outdir, tlb, h, dlldata, iid, proxy, idl, popen = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = popen.communicate() + if PY3: + out = out.decode('utf-8') # Filter junk out of stdout, and write filtered versions. 
Output we want # to filter is pairs of lines that look like this: # Processing C:\Program Files (x86)\Microsoft SDKs\...\include\objidl.idl @@ -274,6 +281,8 @@ def ExecAsmWrapper(self, arch, *args): popen = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = popen.communicate() + if PY3: + out = out.decode('utf-8') for line in out.splitlines(): if (not line.startswith('Copyright (C) Microsoft Corporation') and not line.startswith('Microsoft (R) Macro Assembler') and @@ -289,6 +298,8 @@ def ExecRcWrapper(self, arch, *args): popen = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = popen.communicate() + if PY3: + out = out.decode('utf-8') for line in out.splitlines(): if (not line.startswith('Microsoft (R) Windows (R) Resource Compiler') and not line.startswith('Copyright (C) Microsoft Corporation') and diff --git a/tools/gyp/pylib/gyp/xcode_emulation.py b/tools/gyp/pylib/gyp/xcode_emulation.py index 905bec7be34ba8..c3daba5fb82e1a 100644 --- a/tools/gyp/pylib/gyp/xcode_emulation.py +++ b/tools/gyp/pylib/gyp/xcode_emulation.py @@ -854,7 +854,7 @@ def GetLdflags(self, configname, product_dir, gyp_to_build_path, arch=None): product_dir: The directory where products such static and dynamic libraries are placed. This is added to the library search path. gyp_to_build_path: A function that converts paths relative to the - current gyp file to paths relative to the build direcotry. + current gyp file to paths relative to the build directory. """ self.configname = configname ldflags = [] @@ -1002,7 +1002,7 @@ def GetPerTargetSetting(self, setting, default=None): def _GetStripPostbuilds(self, configname, output_binary, quiet): """Returns a list of shell commands that contain the shell commands - neccessary to strip this target's binary. These should be run as postbuilds + necessary to strip this target's binary. These should be run as postbuilds before the actual postbuilds run.""" self.configname = configname @@ -1037,7 +1037,7 @@ def _GetStripPostbuilds(self, configname, output_binary, quiet): def _GetDebugInfoPostbuilds(self, configname, output, output_binary, quiet): """Returns a list of shell commands that contain the shell commands - neccessary to massage this target's debug information. These should be run + necessary to massage this target's debug information. These should be run as postbuilds before the actual postbuilds run.""" self.configname = configname @@ -1173,7 +1173,7 @@ def _AdjustLibrary(self, library, config_name=None): # "/usr/lib" libraries, is do "-L/usr/lib -lname" which is dependent on the # library order and cause collision when building Chrome. # - # Instead substitude ".tbd" to ".dylib" in the generated project when the + # Instead substitute ".tbd" to ".dylib" in the generated project when the # following conditions are both true: # - library is referenced in the gyp file as "$(SDKROOT)/**/*.dylib", # - the ".dylib" file does not exists but a ".tbd" file do. @@ -1476,7 +1476,7 @@ def GetStdout(cmdlist): def MergeGlobalXcodeSettingsToSpec(global_dict, spec): """Merges the global xcode_settings dictionary into each configuration of the target represented by spec. For keys that are both in the global and the local - xcode_settings dict, the local key gets precendence. + xcode_settings dict, the local key gets precedence. 
""" # The xcode generator special-cases global xcode_settings and does something # that amounts to merging in the global xcode_settings into each local @@ -1522,7 +1522,7 @@ def GetMacBundleResources(product_dir, xcode_settings, resources): output = dest # The make generator doesn't support it, so forbid it everywhere - # to keep the generators more interchangable. + # to keep the generators more interchangeable. assert ' ' not in res, ( "Spaces in resource filenames not supported (%s)" % res) @@ -1564,14 +1564,14 @@ def GetMacInfoPlist(product_dir, xcode_settings, gyp_path_to_build_path): relative to the build directory. xcode_settings: The XcodeSettings of the current target. gyp_to_build_path: A function that converts paths relative to the - current gyp file to paths relative to the build direcotry. + current gyp file to paths relative to the build directory. """ info_plist = xcode_settings.GetPerTargetSetting('INFOPLIST_FILE') if not info_plist: return None, None, [], {} # The make generator doesn't support it, so forbid it everywhere - # to keep the generators more interchangable. + # to keep the generators more interchangeable. assert ' ' not in info_plist, ( "Spaces in Info.plist filenames not supported (%s)" % info_plist) diff --git a/tools/gyp/pylib/gyp/xcodeproj_file.py b/tools/gyp/pylib/gyp/xcodeproj_file.py index 0534f51fe5cf8d..1e950dce8f0a05 100644 --- a/tools/gyp/pylib/gyp/xcodeproj_file.py +++ b/tools/gyp/pylib/gyp/xcodeproj_file.py @@ -220,7 +220,7 @@ class XCObject(object): an empty string ("", in the case of property_type str) or list ([], in the case of is_list True) from being set for the property. - default: Optional. If is_requried is True, default may be set + default: Optional. If is_required is True, default may be set to provide a default value for objects that do not supply their own value. If is_required is True and default is not provided, users of the class must supply their own diff --git a/tools/gyp/tools/pretty_gyp.py b/tools/gyp/tools/pretty_gyp.py index d01c692edcf8d6..633048a59ad28c 100755 --- a/tools/gyp/tools/pretty_gyp.py +++ b/tools/gyp/tools/pretty_gyp.py @@ -18,7 +18,7 @@ # Regex to remove quoted strings when we're counting braces. # It takes into account quoted quotes, and makes sure that the quotes match. # NOTE: It does not handle quotes that span more than one line, or -# cases where an escaped quote is preceeded by an escaped backslash. +# cases where an escaped quote is preceded by an escaped backslash. QUOTE_RE_STR = r'(?P[\'"])(.*?)(? Date: Tue, 18 Feb 2020 20:18:31 -0800 Subject: [PATCH 31/91] meta: move julianduque to emeritus MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit julianduque confirmed in email that they can be moved to emeritus. 
PR-URL: https://github.com/nodejs/node/pull/31863 Reviewed-By: Gireesh Punathil Reviewed-By: Colin Ihrig Reviewed-By: Сковорода Никита Андреевич Reviewed-By: Ruben Bridgewater Reviewed-By: Tobias Nießen Reviewed-By: Michael Dawson --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index bff42f1114a3a9..d0ee4645eea7e9 100644 --- a/README.md +++ b/README.md @@ -331,8 +331,6 @@ For information about the governance of the Node.js project, see **João Reis** <reis@janeasystems.com> * [joyeecheung](https://github.com/joyeecheung) - **Joyee Cheung** <joyeec9h3@gmail.com> (she/her) -* [julianduque](https://github.com/julianduque) - -**Julian Duque** <julianduquej@gmail.com> (he/him) * [JungMinu](https://github.com/JungMinu) - **Minwoo Jung** <nodecorelab@gmail.com> (he/him) * [kfarnung](https://github.com/kfarnung) - @@ -474,6 +472,8 @@ For information about the governance of the Node.js project, see **Yuval Brik** <yuval@brik.org.il> * [joshgav](https://github.com/joshgav) - **Josh Gavant** <josh.gavant@outlook.com> +* [julianduque](https://github.com/julianduque) - +**Julian Duque** <julianduquej@gmail.com> (he/him) * [kunalspathak](https://github.com/kunalspathak) - **Kunal Pathak** <kunal.pathak@microsoft.com> * [lucamaraschi](https://github.com/lucamaraschi) - From d0e94fc77eed49ea1b8522de2193e947d80cc40c Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Tobias=20Nie=C3=9Fen?= Date: Wed, 19 Feb 2020 20:16:14 -0400 Subject: [PATCH 32/91] crypto: fix ieee-p1363 for createVerify Fixes: https://github.com/nodejs/node/issues/31866 PR-URL: https://github.com/nodejs/node/pull/31876 Reviewed-By: Ben Noordhuis Reviewed-By: Colin Ihrig Reviewed-By: James M Snell --- src/node_crypto.cc | 12 ++++-------- src/node_crypto.h | 3 +-- test/parallel/test-crypto-sign-verify.js | 11 +++++++++++ 3 files changed, 16 insertions(+), 10 deletions(-) diff --git a/src/node_crypto.cc b/src/node_crypto.cc index 2176fffc543e0b..d47cc4e1e82ff7 100644 --- a/src/node_crypto.cc +++ b/src/node_crypto.cc @@ -5323,8 +5323,7 @@ void Verify::VerifyUpdate(const FunctionCallbackInfo& args) { SignBase::Error Verify::VerifyFinal(const ManagedEVPPKey& pkey, - const char* sig, - int siglen, + const ByteSource& sig, int padding, const Maybe& saltlen, bool* verify_result) { @@ -5345,11 +5344,8 @@ SignBase::Error Verify::VerifyFinal(const ManagedEVPPKey& pkey, ApplyRSAOptions(pkey, pkctx.get(), padding, saltlen) && EVP_PKEY_CTX_set_signature_md(pkctx.get(), EVP_MD_CTX_md(mdctx.get())) > 0) { - const int r = EVP_PKEY_verify(pkctx.get(), - reinterpret_cast(sig), - siglen, - m, - m_len); + const unsigned char* s = reinterpret_cast(sig.get()); + const int r = EVP_PKEY_verify(pkctx.get(), s, sig.size(), m, m_len); *verify_result = r == 1; } @@ -5394,7 +5390,7 @@ void Verify::VerifyFinal(const FunctionCallbackInfo& args) { } bool verify_result; - Error err = verify->VerifyFinal(pkey, hbuf.data(), hbuf.length(), padding, + Error err = verify->VerifyFinal(pkey, signature, padding, salt_len, &verify_result); if (err != kSignOk) return verify->CheckThrow(err); diff --git a/src/node_crypto.h b/src/node_crypto.h index b57dc29de29785..ea6778daceb851 100644 --- a/src/node_crypto.h +++ b/src/node_crypto.h @@ -700,8 +700,7 @@ class Verify : public SignBase { static void Initialize(Environment* env, v8::Local target); Error VerifyFinal(const ManagedEVPPKey& key, - const char* sig, - int siglen, + const ByteSource& sig, int padding, const v8::Maybe& saltlen, bool* verify_result); diff --git 
a/test/parallel/test-crypto-sign-verify.js b/test/parallel/test-crypto-sign-verify.js
index e3d3d818a1ace9..b70bfccae47eef 100644
--- a/test/parallel/test-crypto-sign-verify.js
+++ b/test/parallel/test-crypto-sign-verify.js
@@ -527,6 +527,9 @@
   // Unlike DER signatures, IEEE P1363 signatures have a predictable length.
   assert.strictEqual(sig.length, length);
   assert.strictEqual(crypto.verify('sha1', data, opts, sig), true);
+  assert.strictEqual(crypto.createVerify('sha1')
+                           .update(data)
+                           .verify(opts, sig), true);

   // Test invalid signature lengths.
   for (const i of [-2, -1, 1, 2, 4, 8]) {
@@ -552,6 +555,14 @@ assert.throws(
     ok
   );

+  assert.strictEqual(
+    crypto.createVerify('sha256').update(data).verify({
+      key: fixtures.readKey('ec-key.pem'),
+      dsaEncoding: 'ieee-p1363'
+    }, extSig),
+    ok
+  );
+
   extSig[Math.floor(Math.random() * extSig.length)] ^= 1;
 }

From 2046652b4e20ad2ed98545239730b8f6cff1824c Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Tobias=20Nie=C3=9Fen?=
Date: Sat, 22 Feb 2020 15:18:56 -0400
Subject: [PATCH 33/91] doc: fix anchor for ERR_TLS_INVALID_CONTEXT

PR-URL: https://github.com/nodejs/node/pull/31915
Reviewed-By: Luigi Pinca
Reviewed-By: Rich Trott
Reviewed-By: Colin Ihrig
---
 doc/api/errors.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/doc/api/errors.md b/doc/api/errors.md
index 09f43e5036729d..ed4bfacf1cd57b 100644
--- a/doc/api/errors.md
+++ b/doc/api/errors.md
@@ -1850,7 +1850,7 @@ recommended to use 2048 bits or larger for stronger security.
 A TLS/SSL handshake timed out. In this case, the server must also abort the
 connection.

-
+
 ### `ERR_TLS_INVALID_CONTEXT`
+
+The TLS socket must be connected and securely established. Ensure the 'secure'
+event is emitted before you continue.
+
 ### `ERR_TLS_INVALID_PROTOCOL_METHOD`

diff --git a/doc/api/tls.md b/doc/api/tls.md
index fee6e33d610581..3341e6e9ea514b 100644
--- a/doc/api/tls.md
+++ b/doc/api/tls.md
@@ -1094,6 +1094,39 @@ See
 [SSL_get_shared_sigalgs](https://www.openssl.org/docs/man1.1.1/man3/SSL_get_shared_sigalgs.html)
 for more information.

+### `tlsSocket.exportKeyingMaterial(length, label[, context])`
+
+
+* `length` {number} number of bytes to retrieve from keying material
+* `label` {string} an application-specific label, typically this will be a
+value from the
+[IANA Exporter Label Registry](https://www.iana.org/assignments/tls-parameters/tls-parameters.xhtml#exporter-labels).
+* `context` {Buffer} Optionally provide a context.
+
+* Returns: {Buffer} requested bytes of the keying material
+
+Keying material is used for validations to prevent different kinds of attacks
+in network protocols, for example in the specifications of IEEE 802.1X.
+
+Example:
+
+```js
+const keyingMaterial = tlsSocket.exportKeyingMaterial(
+  128,
+  'client finished');
+
+/**
+ Example return value of keyingMaterial:
+
+*/
+```
+See the OpenSSL [`SSL_export_keying_material`][] documentation for more
+information.
+
 ### `tlsSocket.getTLSTicket()`

+`--perf-basic-prof-only-functions`, `--perf-basic-prof`,
+`--perf-prof-unwinding-info`, and `--perf-prof` are only available on Linux.
+
 ### `NODE_PATH=path[:…]`

+
+This class is used to create asynchronous state within callbacks and promise
+chains. It allows storing data throughout the lifetime of a web request
+or any other asynchronous duration. It is similar to thread-local storage
+in other languages.
+
+The following example builds a logger that will always know the current HTTP
+request and uses it to display enhanced logs without needing to explicitly
+provide the current HTTP request to it.
+
+```js
+const { AsyncLocalStorage } = require('async_hooks');
+const http = require('http');
+
+const kReq = 'CURRENT_REQUEST';
+const asyncLocalStorage = new AsyncLocalStorage();
+
+function log(...args) {
+  const store = asyncLocalStorage.getStore();
+  // Make sure the store exists and contains a request.
+  if (store && store.has(kReq)) {
+    const req = store.get(kReq);
+    // Prints `GET /items ERR could not do something`
+    console.log(req.method, req.url, ...args);
+  } else {
+    console.log(...args);
+  }
+}
+
+http.createServer((request, response) => {
+  asyncLocalStorage.run(() => {
+    const store = asyncLocalStorage.getStore();
+    store.set(kReq, request);
+    someAsyncOperation((err, result) => {
+      if (err) {
+        log('ERR', err.message);
+      }
+    });
+  });
+})
+.listen(8080);
+```
+
+Multiple instances of `AsyncLocalStorage` are independent of each other. It is
+safe to instantiate this class multiple times.
+
+### `new AsyncLocalStorage()`
+
+
+Creates a new instance of `AsyncLocalStorage`. A store is only provided within
+a `run` or a `runSyncAndReturn` method call.
+
+### `asyncLocalStorage.disable()`
+
+
+This method disables the instance of `AsyncLocalStorage`. All subsequent calls
+to `asyncLocalStorage.getStore()` will return `undefined` until
+`asyncLocalStorage.run()` or `asyncLocalStorage.runSyncAndReturn()`
+is called again.
+
+When calling `asyncLocalStorage.disable()`, all current contexts linked to the
+instance will be exited.
+
+Calling `asyncLocalStorage.disable()` is required before the
+`asyncLocalStorage` can be garbage collected. This does not apply to stores
+provided by the `asyncLocalStorage`, as those objects are garbage collected
+along with the corresponding async resources.
+
+Use this method when the `asyncLocalStorage` is no longer in use in the
+current process.
+
+### `asyncLocalStorage.getStore()`
+
+
+* Returns: {Map}
+
+This method returns the current store.
+If this method is called outside of an asynchronous context initialized by
+calling `asyncLocalStorage.run` or `asyncLocalStorage.runSyncAndReturn`, it
+will return `undefined`.
+
+### `asyncLocalStorage.run(callback[, ...args])`
+
+
+* `callback` {Function}
+* `...args` {any}
+
+Calling `asyncLocalStorage.run(callback)` will create a new asynchronous
+context.
+Within the callback function and the asynchronous operations from the callback,
+`asyncLocalStorage.getStore()` will return an instance of `Map` known as
+"the store". This store will persist through the subsequent
+asynchronous calls.
+
+The callback will be run asynchronously. Optionally, arguments can be passed;
+they will be forwarded to the callback function.
+
+If an error is thrown by the callback function, it will not be caught by
+a `try/catch` block, as the callback runs in a new asynchronous resource.
+Also, the stack trace will be impacted by the asynchronous call.
+
+Example:
+
+```js
+asyncLocalStorage.run(() => {
+  asyncLocalStorage.getStore(); // Returns a Map
+  someAsyncOperation(() => {
+    asyncLocalStorage.getStore(); // Returns the same Map
+  });
+});
+asyncLocalStorage.getStore(); // Returns undefined
+```
+
+### `asyncLocalStorage.exit(callback[, ...args])`
+
+
+* `callback` {Function}
+* `...args` {any}
+
+Calling `asyncLocalStorage.exit(callback)` will create a new asynchronous
+context.
+Within the callback function and the asynchronous operations from the callback,
+`asyncLocalStorage.getStore()` will return `undefined`.
+
+The callback will be run asynchronously. Optionally, arguments can be
+passed; they will be forwarded to the callback function.
+
+If an error is thrown by the callback function, it will not be caught by
+a `try/catch` block, as the callback is run in a new asynchronous resource.
+Also, the stack trace will be impacted by the asynchronous call.
+
+Example:
+
+```js
+asyncLocalStorage.run(() => {
+  asyncLocalStorage.getStore(); // Returns a Map
+  asyncLocalStorage.exit(() => {
+    asyncLocalStorage.getStore(); // Returns undefined
+  });
+  asyncLocalStorage.getStore(); // Returns the same Map
+});
+```
+
+### `asyncLocalStorage.runSyncAndReturn(callback[, ...args])`
+
+
+* `callback` {Function}
+* `...args` {any}
+
+This method runs a function synchronously within a context and returns its
+return value. The store is not accessible outside of the callback function or
+the asynchronous operations created within the callback.
+
+Optionally, arguments can be passed; they will be forwarded to
+the callback function.
+
+If the callback function throws an error, it will be thrown by
+`runSyncAndReturn` too. The stack trace will not be impacted by this call and
+the context will be exited.
+
+Example:
+
+```js
+try {
+  asyncLocalStorage.runSyncAndReturn(() => {
+    asyncLocalStorage.getStore(); // Returns a Map
+    throw new Error();
+  });
+} catch (e) {
+  asyncLocalStorage.getStore(); // Returns undefined
+  // The error will be caught here
+}
+```
+
+### `asyncLocalStorage.exitSyncAndReturn(callback[, ...args])`
+
+
+* `callback` {Function}
+* `...args` {any}
+
+This method runs a function synchronously outside of a context and returns its
+return value. The store is not accessible within the callback function or
+the asynchronous operations created within the callback.
+
+Optionally, arguments can be passed; they will be forwarded to
+the callback function.
+
+If the callback function throws an error, it will be thrown by
+`exitSyncAndReturn` too. The stack trace will not be impacted by this call and
+the context will be re-entered.
+
+Example:
+
+```js
+// Within a call to run or runSyncAndReturn
+try {
+  asyncLocalStorage.getStore(); // Returns a Map
+  asyncLocalStorage.exitSyncAndReturn(() => {
+    asyncLocalStorage.getStore(); // Returns undefined
+    throw new Error();
+  });
+} catch (e) {
+  asyncLocalStorage.getStore(); // Returns the same Map
+  // The error will be caught here
+}
+```
+
+### Choosing between `run` and `runSyncAndReturn`
+
+#### When to choose `run`
+
+`run` is asynchronous. It is called with a callback function that
+runs within a new asynchronous call. This is the most explicit behavior, as
+everything that is executed within the callback of `run` (including further
+asynchronous operations) will have access to the store.
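For instance (a small sketch, with `setTimeout` and `setImmediate` standing in
for any asynchronous operations):

```js
asyncLocalStorage.run(() => {
  asyncLocalStorage.getStore().set('requestId', 42);
  setTimeout(() => {
    setImmediate(() => {
      // Still the same store, two asynchronous hops later.
      console.log(asyncLocalStorage.getStore().get('requestId')); // 42
    });
  }, 10);
});
```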
+ +If an instance of `AsyncLocalStorage` is used for error management (for +instance, with `process.setUncaughtExceptionCaptureCallback`), only +exceptions thrown in the scope of the callback function will be associated +with the context. + +This method is the safest as it provides strong scoping and consistent +behavior. + +It cannot be promisified using `util.promisify`. If needed, the `Promise` +constructor can be used: + +```js +new Promise((resolve, reject) => { + asyncLocalStorage.run(() => { + someFunction((err, result) => { + if (err) { + return reject(err); + } + return resolve(result); + }); + }); +}); +``` + +#### When to choose `runSyncAndReturn` + +`runSyncAndReturn` is synchronous. The callback function will be executed +synchronously and its return value will be returned by `runSyncAndReturn`. +The store will only be accessible from within the callback +function and the asynchronous operations created within this scope. +If the callback throws an error, `runSyncAndReturn` will throw it and it will +not be associated with the context. + +This method provides good scoping while being synchronous. + +#### Usage with `async/await` + +If, within an async function, only one `await` call is to run within a context, +the following pattern should be used: + +```js +async function fn() { + await asyncLocalStorage.runSyncAndReturn(() => { + asyncLocalStorage.getStore().set('key', value); + return foo(); // The return value of foo will be awaited + }); +} +``` + +In this example, the store is only available in the callback function and the +functions called by `foo`. Outside of `runSyncAndReturn`, calling `getStore` +will return `undefined`. + [`after` callback]: #async_hooks_after_asyncid [`before` callback]: #async_hooks_before_asyncid [`destroy` callback]: #async_hooks_destroy_asyncid diff --git a/lib/async_hooks.js b/lib/async_hooks.js index 3ebc9af473d5c8..23f8ddde671e30 100644 --- a/lib/async_hooks.js +++ b/lib/async_hooks.js @@ -1,9 +1,11 @@ 'use strict'; const { + Map, NumberIsSafeInteger, ReflectApply, Symbol, + } = primordials; const { @@ -209,11 +211,102 @@ class AsyncResource { } } +const storageList = []; +const storageHook = createHook({ + init(asyncId, type, triggerAsyncId, resource) { + const currentResource = executionAsyncResource(); + // Value of currentResource is always a non null object + for (let i = 0; i < storageList.length; ++i) { + storageList[i]._propagate(resource, currentResource); + } + } +}); + +class AsyncLocalStorage { + constructor() { + this.kResourceStore = Symbol('kResourceStore'); + this.enabled = false; + } + + disable() { + if (this.enabled) { + this.enabled = false; + // If this.enabled, the instance must be in storageList + storageList.splice(storageList.indexOf(this), 1); + if (storageList.length === 0) { + storageHook.disable(); + } + } + } + + // Propagate the context from a parent resource to a child one + _propagate(resource, triggerResource) { + const store = triggerResource[this.kResourceStore]; + if (this.enabled) { + resource[this.kResourceStore] = store; + } + } + + _enter() { + if (!this.enabled) { + this.enabled = true; + storageList.push(this); + storageHook.enable(); + } + const resource = executionAsyncResource(); + resource[this.kResourceStore] = new Map(); + } + + _exit() { + const resource = executionAsyncResource(); + if (resource) { + resource[this.kResourceStore] = undefined; + } + } + + runSyncAndReturn(callback, ...args) { + this._enter(); + try { + return callback(...args); + } finally { + this._exit(); + } + } + + 
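  // How the store propagates (see _enter() and _propagate() above): _enter()
  // attaches a fresh Map to the current executionAsyncResource() under this
  // instance's symbol, and the init hook copies that reference onto every
  // child async resource, which is how getStore() finds the store deeper in
  // the asynchronous chain.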
exitSyncAndReturn(callback, ...args) { + this.enabled = false; + try { + return callback(...args); + } finally { + this.enabled = true; + } + } + + getStore() { + const resource = executionAsyncResource(); + if (this.enabled) { + return resource[this.kResourceStore]; + } + } + + run(callback, ...args) { + this._enter(); + process.nextTick(callback, ...args); + this._exit(); + } + + exit(callback, ...args) { + this.enabled = false; + process.nextTick(callback, ...args); + this.enabled = true; + } +} // Placing all exports down here because the exported classes won't export // otherwise. module.exports = { // Public API + AsyncLocalStorage, createHook, executionAsyncId, triggerAsyncId, diff --git a/test/async-hooks/test-async-local-storage-args.js b/test/async-hooks/test-async-local-storage-args.js new file mode 100644 index 00000000000000..91a3385e6eeb16 --- /dev/null +++ b/test/async-hooks/test-async-local-storage-args.js @@ -0,0 +1,20 @@ +'use strict'; +require('../common'); +const assert = require('assert'); +const { AsyncLocalStorage } = require('async_hooks'); + +const asyncLocalStorage = new AsyncLocalStorage(); + +asyncLocalStorage.run((runArg) => { + assert.strictEqual(runArg, 1); + asyncLocalStorage.exit((exitArg) => { + assert.strictEqual(exitArg, 2); + }, 2); +}, 1); + +asyncLocalStorage.runSyncAndReturn((runArg) => { + assert.strictEqual(runArg, 'foo'); + asyncLocalStorage.exitSyncAndReturn((exitArg) => { + assert.strictEqual(exitArg, 'bar'); + }, 'bar'); +}, 'foo'); diff --git a/test/async-hooks/test-async-local-storage-async-await.js b/test/async-hooks/test-async-local-storage-async-await.js new file mode 100644 index 00000000000000..28c8488da62c53 --- /dev/null +++ b/test/async-hooks/test-async-local-storage-async-await.js @@ -0,0 +1,19 @@ +'use strict'; +require('../common'); +const assert = require('assert'); +const { AsyncLocalStorage } = require('async_hooks'); + +const asyncLocalStorage = new AsyncLocalStorage(); + +async function test() { + asyncLocalStorage.getStore().set('foo', 'bar'); + await Promise.resolve(); + assert.strictEqual(asyncLocalStorage.getStore().get('foo'), 'bar'); +} + +async function main() { + await asyncLocalStorage.runSyncAndReturn(test); + assert.strictEqual(asyncLocalStorage.getStore(), undefined); +} + +main(); diff --git a/test/async-hooks/test-async-local-storage-async-functions.js b/test/async-hooks/test-async-local-storage-async-functions.js new file mode 100644 index 00000000000000..89ac0be62c7488 --- /dev/null +++ b/test/async-hooks/test-async-local-storage-async-functions.js @@ -0,0 +1,27 @@ +'use strict'; +require('../common'); +const assert = require('assert'); +const { AsyncLocalStorage } = require('async_hooks'); + +async function foo() {} + +const asyncLocalStorage = new AsyncLocalStorage(); + +async function testOut() { + await foo(); + assert.strictEqual(asyncLocalStorage.getStore(), undefined); +} + +async function testAwait() { + await foo(); + assert.notStrictEqual(asyncLocalStorage.getStore(), undefined); + assert.strictEqual(asyncLocalStorage.getStore().get('key'), 'value'); + await asyncLocalStorage.exitSyncAndReturn(testOut); +} + +asyncLocalStorage.run(() => { + const store = asyncLocalStorage.getStore(); + store.set('key', 'value'); + testAwait(); // should not reject +}); +assert.strictEqual(asyncLocalStorage.getStore(), undefined); diff --git a/test/async-hooks/test-async-local-storage-enable-disable.js b/test/async-hooks/test-async-local-storage-enable-disable.js new file mode 100644 index 
00000000000000..c30d72eb805d5d --- /dev/null +++ b/test/async-hooks/test-async-local-storage-enable-disable.js @@ -0,0 +1,21 @@ +'use strict'; +require('../common'); +const assert = require('assert'); +const { AsyncLocalStorage } = require('async_hooks'); + +const asyncLocalStorage = new AsyncLocalStorage(); + +asyncLocalStorage.runSyncAndReturn(() => { + asyncLocalStorage.getStore().set('foo', 'bar'); + process.nextTick(() => { + assert.strictEqual(asyncLocalStorage.getStore().get('foo'), 'bar'); + asyncLocalStorage.disable(); + assert.strictEqual(asyncLocalStorage.getStore(), undefined); + process.nextTick(() => { + assert.strictEqual(asyncLocalStorage.getStore(), undefined); + asyncLocalStorage.runSyncAndReturn(() => { + assert.notStrictEqual(asyncLocalStorage.getStore(), undefined); + }); + }); + }); +}); diff --git a/test/async-hooks/test-async-local-storage-errors-async.js b/test/async-hooks/test-async-local-storage-errors-async.js new file mode 100644 index 00000000000000..c782b383e9ca95 --- /dev/null +++ b/test/async-hooks/test-async-local-storage-errors-async.js @@ -0,0 +1,26 @@ +'use strict'; +require('../common'); +const assert = require('assert'); +const { AsyncLocalStorage } = require('async_hooks'); + +// case 1 fully async APIS (safe) +const asyncLocalStorage = new AsyncLocalStorage(); + +let i = 0; +process.setUncaughtExceptionCaptureCallback((err) => { + ++i; + assert.strictEqual(err.message, 'err' + i); + assert.strictEqual(asyncLocalStorage.getStore().get('hello'), 'node'); +}); + +asyncLocalStorage.run(() => { + const store = asyncLocalStorage.getStore(); + store.set('hello', 'node'); + setTimeout(() => { + process.nextTick(() => { + assert.strictEqual(i, 2); + }); + throw new Error('err2'); + }, 0); + throw new Error('err1'); +}); diff --git a/test/async-hooks/test-async-local-storage-errors-sync-ret.js b/test/async-hooks/test-async-local-storage-errors-sync-ret.js new file mode 100644 index 00000000000000..f112df2b99dff7 --- /dev/null +++ b/test/async-hooks/test-async-local-storage-errors-sync-ret.js @@ -0,0 +1,31 @@ +'use strict'; +require('../common'); +const assert = require('assert'); +const { AsyncLocalStorage } = require('async_hooks'); + +// case 2 using *AndReturn calls (dual behaviors) +const asyncLocalStorage = new AsyncLocalStorage(); + +let i = 0; +process.setUncaughtExceptionCaptureCallback((err) => { + ++i; + assert.strictEqual(err.message, 'err2'); + assert.strictEqual(asyncLocalStorage.getStore().get('hello'), 'node'); +}); + +try { + asyncLocalStorage.runSyncAndReturn(() => { + const store = asyncLocalStorage.getStore(); + store.set('hello', 'node'); + setTimeout(() => { + process.nextTick(() => { + assert.strictEqual(i, 1); + }); + throw new Error('err2'); + }, 0); + throw new Error('err1'); + }); +} catch (e) { + assert.strictEqual(e.message, 'err1'); + assert.strictEqual(asyncLocalStorage.getStore(), undefined); +} diff --git a/test/async-hooks/test-async-local-storage-http.js b/test/async-hooks/test-async-local-storage-http.js new file mode 100644 index 00000000000000..9f107148402ec5 --- /dev/null +++ b/test/async-hooks/test-async-local-storage-http.js @@ -0,0 +1,21 @@ +'use strict'; +require('../common'); +const assert = require('assert'); +const { AsyncLocalStorage } = require('async_hooks'); +const http = require('http'); + +const asyncLocalStorage = new AsyncLocalStorage(); +const server = http.createServer((req, res) => { + res.end('ok'); +}); + +server.listen(0, () => { + asyncLocalStorage.run(() => { + const store = 
asyncLocalStorage.getStore(); + store.set('hello', 'world'); + http.get({ host: 'localhost', port: server.address().port }, () => { + assert.strictEqual(asyncLocalStorage.getStore().get('hello'), 'world'); + server.close(); + }); + }); +}); diff --git a/test/async-hooks/test-async-local-storage-nested.js b/test/async-hooks/test-async-local-storage-nested.js new file mode 100644 index 00000000000000..38330fff607ce2 --- /dev/null +++ b/test/async-hooks/test-async-local-storage-nested.js @@ -0,0 +1,22 @@ +'use strict'; +require('../common'); +const assert = require('assert'); +const { AsyncLocalStorage } = require('async_hooks'); + +const asyncLocalStorage = new AsyncLocalStorage(); + +setTimeout(() => { + asyncLocalStorage.run(() => { + const asyncLocalStorage2 = new AsyncLocalStorage(); + asyncLocalStorage2.run(() => { + const store = asyncLocalStorage.getStore(); + const store2 = asyncLocalStorage2.getStore(); + store.set('hello', 'world'); + store2.set('hello', 'foo'); + setTimeout(() => { + assert.strictEqual(asyncLocalStorage.getStore().get('hello'), 'world'); + assert.strictEqual(asyncLocalStorage2.getStore().get('hello'), 'foo'); + }, 200); + }); + }); +}, 100); diff --git a/test/async-hooks/test-async-local-storage-no-mix-contexts.js b/test/async-hooks/test-async-local-storage-no-mix-contexts.js new file mode 100644 index 00000000000000..561df546d4aa45 --- /dev/null +++ b/test/async-hooks/test-async-local-storage-no-mix-contexts.js @@ -0,0 +1,38 @@ +'use strict'; +require('../common'); +const assert = require('assert'); +const { AsyncLocalStorage } = require('async_hooks'); + +const asyncLocalStorage = new AsyncLocalStorage(); +const asyncLocalStorage2 = new AsyncLocalStorage(); + +setTimeout(() => { + asyncLocalStorage.run(() => { + asyncLocalStorage2.run(() => { + const store = asyncLocalStorage.getStore(); + const store2 = asyncLocalStorage2.getStore(); + store.set('hello', 'world'); + store2.set('hello', 'foo'); + setTimeout(() => { + assert.strictEqual(asyncLocalStorage.getStore().get('hello'), 'world'); + assert.strictEqual(asyncLocalStorage2.getStore().get('hello'), 'foo'); + asyncLocalStorage.exit(() => { + assert.strictEqual(asyncLocalStorage.getStore(), undefined); + assert.strictEqual(asyncLocalStorage2.getStore().get('hello'), 'foo'); + }); + assert.strictEqual(asyncLocalStorage.getStore().get('hello'), 'world'); + assert.strictEqual(asyncLocalStorage2.getStore().get('hello'), 'foo'); + }, 200); + }); + }); +}, 100); + +setTimeout(() => { + asyncLocalStorage.run(() => { + const store = asyncLocalStorage.getStore(); + store.set('hello', 'earth'); + setTimeout(() => { + assert.strictEqual(asyncLocalStorage.getStore().get('hello'), 'earth'); + }, 100); + }); +}, 100); diff --git a/test/async-hooks/test-async-local-storage-promises.js b/test/async-hooks/test-async-local-storage-promises.js new file mode 100644 index 00000000000000..3b05d0f1981a3c --- /dev/null +++ b/test/async-hooks/test-async-local-storage-promises.js @@ -0,0 +1,28 @@ +'use strict'; +require('../common'); +const assert = require('assert'); +const { AsyncLocalStorage } = require('async_hooks'); + +async function main() { + const asyncLocalStorage = new AsyncLocalStorage(); + const err = new Error(); + const next = () => Promise.resolve() + .then(() => { + assert.strictEqual(asyncLocalStorage.getStore().get('a'), 1); + throw err; + }); + await new Promise((resolve, reject) => { + asyncLocalStorage.run(() => { + const store = asyncLocalStorage.getStore(); + store.set('a', 1); + next().then(resolve, reject); + 
}); + }) + .catch((e) => { + assert.strictEqual(asyncLocalStorage.getStore(), undefined); + assert.strictEqual(e, err); + }); + assert.strictEqual(asyncLocalStorage.getStore(), undefined); +} + +main(); From 15cc9b01260c0099bc1c1ca1ef5f1fb669f24725 Mon Sep 17 00:00:00 2001 From: Eric Eastwood Date: Thu, 9 Jan 2020 01:14:44 -0600 Subject: [PATCH 39/91] doc: update assert.rejects() docs with a validation function example Spawned from my own struggle to use in https://gitlab.com/gitlab-org/gitter/webapp/merge_requests/1702#note_268452483 PR-URL: https://github.com/nodejs/node/pull/31271 Reviewed-By: Anna Henningsen Reviewed-By: Colin Ihrig --- doc/api/assert.md | 15 +++++++++++++++ 1 file changed, 15 insertions(+) diff --git a/doc/api/assert.md b/doc/api/assert.md index 809dc01340fcd7..ca53a69f564b4b 100644 --- a/doc/api/assert.md +++ b/doc/api/assert.md @@ -1111,6 +1111,21 @@ if the `asyncFn` fails to reject. })(); ``` +```js +(async () => { + await assert.rejects( + async () => { + throw new TypeError('Wrong value'); + }, + (err) => { + assert.strictEqual(err.name, 'TypeError'); + assert.strictEqual(err.message, 'Wrong value'); + return true; + } + ); +})(); +``` + ```js assert.rejects( Promise.reject(new Error('Wrong value')), From 88ccb444e3b0add8f140e4bc50cb0a8f924fe1cb Mon Sep 17 00:00:00 2001 From: Anna Henningsen Date: Fri, 6 Dec 2019 13:45:40 +0100 Subject: [PATCH 40/91] src: move BaseObject subclass dtors/ctors out of node_crypto.h MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Originally landed in the QUIC repo Move constructor and destructors for subclasses of `BaseObject` from node_crypto.h to node_crypto.cc. This removes the need to include base_object-inl.h when using node_crypto.h in some cases. 
Original review metadata: ``` PR-URL: https://github.com/nodejs/quic/pull/220 Reviewed-By: Anna Henningsen Reviewed-By: James M Snell ``` PR-URL: https://github.com/nodejs/node/pull/31872 Reviewed-By: Sam Roberts Reviewed-By: David Carlier Reviewed-By: Colin Ihrig Reviewed-By: Denys Otrishko Reviewed-By: Tobias Nießen --- src/node_crypto.cc | 79 +++++++++++++++++++++++++++++++++++++ src/node_crypto.h | 98 +++++++++------------------------------------- 2 files changed, 97 insertions(+), 80 deletions(-) diff --git a/src/node_crypto.cc b/src/node_crypto.cc index e129c7f3f595ae..a8a067086f62d8 100644 --- a/src/node_crypto.cc +++ b/src/node_crypto.cc @@ -531,6 +531,24 @@ void SecureContext::Initialize(Environment* env, Local target) { env->set_secure_context_constructor_template(t); } +SecureContext::SecureContext(Environment* env, v8::Local wrap) + : BaseObject(env, wrap) { + MakeWeak(); + env->isolate()->AdjustAmountOfExternalAllocatedMemory(kExternalSize); +} + +inline void SecureContext::Reset() { + if (ctx_ != nullptr) { + env()->isolate()->AdjustAmountOfExternalAllocatedMemory(-kExternalSize); + } + ctx_.reset(); + cert_.reset(); + issuer_.reset(); +} + +SecureContext::~SecureContext() { + Reset(); +} void SecureContext::New(const FunctionCallbackInfo& args) { Environment* env = Environment::GetCurrent(args); @@ -3854,6 +3872,15 @@ KeyType KeyObject::GetKeyType() const { return this->key_type_; } +KeyObject::KeyObject(Environment* env, + v8::Local wrap, + KeyType key_type) + : BaseObject(env, wrap), + key_type_(key_type), + symmetric_key_(nullptr, nullptr) { + MakeWeak(); +} + void KeyObject::Init(const FunctionCallbackInfo& args) { KeyObject* key; ASSIGN_OR_RETURN_UNWRAP(&key, args.Holder()); @@ -3998,6 +4025,17 @@ MaybeLocal KeyObject::ExportPrivateKey( return WritePrivateKey(env(), asymmetric_key_.get(), config); } +CipherBase::CipherBase(Environment* env, + v8::Local wrap, + CipherKind kind) + : BaseObject(env, wrap), + ctx_(nullptr), + kind_(kind), + auth_tag_state_(kAuthTagUnknown), + auth_tag_len_(kNoAuthTagLength), + pending_auth_failed_(false) { + MakeWeak(); +} void CipherBase::Initialize(Environment* env, Local target) { Local t = env->NewFunctionTemplate(New); @@ -4620,6 +4658,11 @@ void CipherBase::Final(const FunctionCallbackInfo& args) { args.GetReturnValue().Set(out.ToBuffer().ToLocalChecked()); } +Hmac::Hmac(Environment* env, v8::Local wrap) + : BaseObject(env, wrap), + ctx_(nullptr) { + MakeWeak(); +} void Hmac::Initialize(Environment* env, Local target) { Local t = env->NewFunctionTemplate(New); @@ -4739,6 +4782,13 @@ void Hmac::HmacDigest(const FunctionCallbackInfo& args) { args.GetReturnValue().Set(rc.ToLocalChecked()); } +Hash::Hash(Environment* env, v8::Local wrap) + : BaseObject(env, wrap), + mdctx_(nullptr), + has_md_(false), + md_value_(nullptr) { + MakeWeak(); +} void Hash::Initialize(Environment* env, Local target) { Local t = env->NewFunctionTemplate(New); @@ -4753,6 +4803,10 @@ void Hash::Initialize(Environment* env, Local target) { t->GetFunction(env->context()).ToLocalChecked()).Check(); } +Hash::~Hash() { + if (md_value_ != nullptr) + OPENSSL_clear_free(md_value_, md_len_); +} void Hash::New(const FunctionCallbackInfo& args) { Environment* env = Environment::GetCurrent(args); @@ -4977,6 +5031,10 @@ void CheckThrow(Environment* env, SignBase::Error error) { } } +SignBase::SignBase(Environment* env, v8::Local wrap) + : BaseObject(env, wrap) { +} + void SignBase::CheckThrow(SignBase::Error error) { node::crypto::CheckThrow(env(), error); } @@ -5000,6 +5058,9 
@@ static bool ApplyRSAOptions(const ManagedEVPPKey& pkey, } +Sign::Sign(Environment* env, v8::Local wrap) : SignBase(env, wrap) { + MakeWeak(); +} void Sign::Initialize(Environment* env, Local target) { Local t = env->NewFunctionTemplate(New); @@ -5320,6 +5381,11 @@ void SignOneShot(const FunctionCallbackInfo& args) { args.GetReturnValue().Set(signature.ToBuffer().ToLocalChecked()); } +Verify::Verify(Environment* env, v8::Local wrap) : + SignBase(env, wrap) { + MakeWeak(); +} + void Verify::Initialize(Environment* env, Local target) { Local t = env->NewFunctionTemplate(New); @@ -5623,6 +5689,10 @@ void PublicKeyCipher::Cipher(const FunctionCallbackInfo& args) { args.GetReturnValue().Set(out.ToBuffer().ToLocalChecked()); } +DiffieHellman::DiffieHellman(Environment* env, v8::Local wrap) + : BaseObject(env, wrap), verifyError_(0) { + MakeWeak(); +} void DiffieHellman::Initialize(Environment* env, Local target) { auto make = [&] (Local name, FunctionCallback callback) { @@ -5992,6 +6062,15 @@ void ECDH::Initialize(Environment* env, Local target) { t->GetFunction(env->context()).ToLocalChecked()).Check(); } +ECDH::ECDH(Environment* env, v8::Local wrap, ECKeyPointer&& key) + : BaseObject(env, wrap), + key_(std::move(key)), + group_(EC_KEY_get0_group(key_.get())) { + MakeWeak(); + CHECK_NOT_NULL(group_); +} + +ECDH::~ECDH() {} void ECDH::New(const FunctionCallbackInfo& args) { Environment* env = Environment::GetCurrent(args); diff --git a/src/node_crypto.h b/src/node_crypto.h index b1270789b18ba2..655605290b092f 100644 --- a/src/node_crypto.h +++ b/src/node_crypto.h @@ -84,11 +84,9 @@ extern void UseExtraCaCerts(const std::string& file); void InitCryptoOnce(); -class SecureContext : public BaseObject { +class SecureContext final : public BaseObject { public: - ~SecureContext() override { - Reset(); - } + ~SecureContext() override; static void Initialize(Environment* env, v8::Local target); @@ -177,20 +175,8 @@ class SecureContext : public BaseObject { HMAC_CTX* hctx, int enc); - SecureContext(Environment* env, v8::Local wrap) - : BaseObject(env, wrap) { - MakeWeak(); - env->isolate()->AdjustAmountOfExternalAllocatedMemory(kExternalSize); - } - - inline void Reset() { - if (ctx_ != nullptr) { - env()->isolate()->AdjustAmountOfExternalAllocatedMemory(-kExternalSize); - } - ctx_.reset(); - cert_.reset(); - issuer_.reset(); - } + SecureContext(Environment* env, v8::Local wrap); + void Reset(); }; // SSLWrap implicitly depends on the inheriting class' handle having an @@ -463,14 +449,7 @@ class KeyObject : public BaseObject { v8::MaybeLocal ExportPrivateKey( const PrivateKeyEncodingConfig& config) const; - KeyObject(Environment* env, - v8::Local wrap, - KeyType key_type) - : BaseObject(env, wrap), - key_type_(key_type), - symmetric_key_(nullptr, nullptr) { - MakeWeak(); - } + KeyObject(Environment* env, v8::Local wrap, KeyType key_type); private: const KeyType key_type_; @@ -544,17 +523,7 @@ class CipherBase : public BaseObject { static void SetAuthTag(const v8::FunctionCallbackInfo& args); static void SetAAD(const v8::FunctionCallbackInfo& args); - CipherBase(Environment* env, - v8::Local wrap, - CipherKind kind) - : BaseObject(env, wrap), - ctx_(nullptr), - kind_(kind), - auth_tag_state_(kAuthTagUnknown), - auth_tag_len_(kNoAuthTagLength), - pending_auth_failed_(false) { - MakeWeak(); - } + CipherBase(Environment* env, v8::Local wrap, CipherKind kind); private: DeleteFnPtr ctx_; @@ -584,18 +553,16 @@ class Hmac : public BaseObject { static void HmacUpdate(const v8::FunctionCallbackInfo& args); 
static void HmacDigest(const v8::FunctionCallbackInfo& args); - Hmac(Environment* env, v8::Local wrap) - : BaseObject(env, wrap), - ctx_(nullptr) { - MakeWeak(); - } + Hmac(Environment* env, v8::Local wrap); private: DeleteFnPtr ctx_; }; -class Hash : public BaseObject { +class Hash final : public BaseObject { public: + ~Hash() override; + static void Initialize(Environment* env, v8::Local target); // TODO(joyeecheung): track the memory used by OpenSSL types @@ -611,18 +578,7 @@ class Hash : public BaseObject { static void HashUpdate(const v8::FunctionCallbackInfo& args); static void HashDigest(const v8::FunctionCallbackInfo& args); - Hash(Environment* env, v8::Local wrap) - : BaseObject(env, wrap), - mdctx_(nullptr), - has_md_(false), - md_value_(nullptr) { - MakeWeak(); - } - - ~Hash() override { - if (md_value_ != nullptr) - OPENSSL_clear_free(md_value_, md_len_); - } + Hash(Environment* env, v8::Local wrap); private: EVPMDPointer mdctx_; @@ -644,9 +600,7 @@ class SignBase : public BaseObject { kSignMalformedSignature } Error; - SignBase(Environment* env, v8::Local wrap) - : BaseObject(env, wrap) { - } + SignBase(Environment* env, v8::Local wrap); Error Init(const char* sign_type); Error Update(const char* data, int len); @@ -692,9 +646,7 @@ class Sign : public SignBase { static void SignUpdate(const v8::FunctionCallbackInfo& args); static void SignFinal(const v8::FunctionCallbackInfo& args); - Sign(Environment* env, v8::Local wrap) : SignBase(env, wrap) { - MakeWeak(); - } + Sign(Environment* env, v8::Local wrap); }; class Verify : public SignBase { @@ -713,9 +665,7 @@ class Verify : public SignBase { static void VerifyUpdate(const v8::FunctionCallbackInfo& args); static void VerifyFinal(const v8::FunctionCallbackInfo& args); - Verify(Environment* env, v8::Local wrap) : SignBase(env, wrap) { - MakeWeak(); - } + Verify(Environment* env, v8::Local wrap); }; class PublicKeyCipher { @@ -772,11 +722,7 @@ class DiffieHellman : public BaseObject { static void VerifyErrorGetter( const v8::FunctionCallbackInfo& args); - DiffieHellman(Environment* env, v8::Local wrap) - : BaseObject(env, wrap), - verifyError_(0) { - MakeWeak(); - } + DiffieHellman(Environment* env, v8::Local wrap); // TODO(joyeecheung): track the memory used by OpenSSL types SET_NO_MEMORY_INFO() @@ -795,11 +741,9 @@ class DiffieHellman : public BaseObject { DHPointer dh_; }; -class ECDH : public BaseObject { +class ECDH final : public BaseObject { public: - ~ECDH() override { - group_ = nullptr; - } + ~ECDH() override; static void Initialize(Environment* env, v8::Local target); static ECPointPointer BufferToPoint(Environment* env, @@ -812,13 +756,7 @@ class ECDH : public BaseObject { SET_SELF_SIZE(ECDH) protected: - ECDH(Environment* env, v8::Local wrap, ECKeyPointer&& key) - : BaseObject(env, wrap), - key_(std::move(key)), - group_(EC_KEY_get0_group(key_.get())) { - MakeWeak(); - CHECK_NOT_NULL(group_); - } + ECDH(Environment* env, v8::Local wrap, ECKeyPointer&& key); static void New(const v8::FunctionCallbackInfo& args); static void GenerateKeys(const v8::FunctionCallbackInfo& args); From 8fa6373e628612ca9b977865efa16f134f895893 Mon Sep 17 00:00:00 2001 From: Anna Henningsen Date: Tue, 1 Oct 2019 21:58:12 +0200 Subject: [PATCH 41/91] src: allow unique_ptrs with custom deleter in memory tracker Originally landed in the QUIC repo Original review metadata: ``` PR-URL: https://github.com/nodejs/quic/pull/145 Reviewed-By: James M Snell ``` PR-URL: https://github.com/nodejs/node/pull/31870 Reviewed-By: Anna Henningsen Reviewed-By: 
David Carlier Reviewed-By: Joyee Cheung Reviewed-By: Denys Otrishko Reviewed-By: Colin Ihrig --- src/memory_tracker-inl.h | 4 ++-- src/memory_tracker.h | 4 ++-- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/src/memory_tracker-inl.h b/src/memory_tracker-inl.h index 1a28e2dd792747..9e6201442ab6b1 100644 --- a/src/memory_tracker-inl.h +++ b/src/memory_tracker-inl.h @@ -107,9 +107,9 @@ void MemoryTracker::TrackField(const char* edge_name, } } -template +template void MemoryTracker::TrackField(const char* edge_name, - const std::unique_ptr& value, + const std::unique_ptr& value, const char* node_name) { if (value.get() == nullptr) { return; diff --git a/src/memory_tracker.h b/src/memory_tracker.h index 616976ab2afab1..4a66e9ce74ccc5 100644 --- a/src/memory_tracker.h +++ b/src/memory_tracker.h @@ -140,9 +140,9 @@ class MemoryTracker { const char* node_name = nullptr); // Shortcut to extract the underlying object out of the smart pointer - template + template inline void TrackField(const char* edge_name, - const std::unique_ptr& value, + const std::unique_ptr& value, const char* node_name = nullptr); template From a46839279f6641b2f285f19e05b9fa081cc56182 Mon Sep 17 00:00:00 2001 From: Myles Borins Date: Tue, 18 Feb 2020 15:03:18 -0500 Subject: [PATCH 42/91] doc: update releases guide re pushing tags Currently we specify pushing the tag before updating any of the branches. This creates a potential of creating and pushing a tag on an out of sync branch, and only really when attempting to merge the release commit that things have gone out of sync. Changing the order here would minimize the possibility of this error PR-URL: https://github.com/nodejs/node/pull/31855 Reviewed-By: Richard Lau Reviewed-By: Shelley Vohr Reviewed-By: Beth Griggs Reviewed-By: Ruben Bridgewater Reviewed-By: Trivikram Kamat --- doc/releases.md | 38 ++++++++++++++++++++------------------ 1 file changed, 20 insertions(+), 18 deletions(-) diff --git a/doc/releases.md b/doc/releases.md index 76f6ad57412ab7..37018e1b3fd002 100644 --- a/doc/releases.md +++ b/doc/releases.md @@ -498,17 +498,6 @@ $ git secure-tag -sm "YYYY-MM-DD Node.js vx.y.z ( -``` - -*Note*: Please do not push the tag unless you are ready to complete the -remainder of the release steps. - ### 12. Set Up For the Next Release On release proposal branch, edit `src/node_version.h` again and: @@ -547,7 +536,20 @@ cherry-pick the "Working on vx.y.z" commit to `master`. Run `make lint` before pushing to `master`, to make sure the Changelog formatting passes the lint rules on `master`. -### 13. Promote and Sign the Release Builds +### 13. Push the release tag + +Push the tag to the repo before you promote the builds. If you haven't pushed +your tag first, then build promotion won't work properly. Push the tag using the +following command: + +```console +$ git push +``` + +*Note*: Please do not push the tag unless you are ready to complete the +remainder of the release steps. + +### 14. Promote and Sign the Release Builds **The same individual who signed the release tag must be the one to promote the builds as the `SHASUMS256.txt` file needs to be signed with the @@ -620,7 +622,7 @@ be prompted to re-sign `SHASUMS256.txt`. **It is possible to only sign a release by running `./tools/release.sh -s vX.Y.Z`.** -### 14. Check the Release +### 15. Check the Release Your release should be available at `https://nodejs.org/dist/vx.y.z/` and . Check that the appropriate files are in @@ -629,7 +631,7 @@ have the right internal version strings. 
Check that the API docs are available at . Check that the release catalog files are correct at and . -### 15. Create a Blog Post +### 16. Create a Blog Post There is an automatic build that is kicked off when you promote new builds, so within a few minutes nodejs.org will be listing your new version as the latest @@ -662,7 +664,7 @@ This script will use the promoted builds and changelog to generate the post. Run * Changes to `master` on the [nodejs.org repository][] will trigger a new build of nodejs.org so your changes should appear a few minutes after pushing. -### 16. Create the release on GitHub +### 17. Create the release on GitHub * Go to the [New release page](https://github.com/nodejs/node/releases/new). * Select the tag version you pushed earlier. @@ -670,11 +672,11 @@ This script will use the promoted builds and changelog to generate the post. Run * For the description, copy the rest of the changelog entry. * Click on the "Publish release" button. -### 17. Cleanup +### 18. Cleanup Close your release proposal PR and delete the proposal branch. -### 18. Announce +### 19. Announce The nodejs.org website will automatically rebuild and include the new version. To announce the build on Twitter through the official @nodejs account, email @@ -691,7 +693,7 @@ announcements. Ping the IRC ops and the other [Partner Communities][] liaisons. -### 19. Celebrate +### 20. Celebrate _In whatever form you do this..._ From 9f68e140529dda953f312d2ae19a595ba5b2b9f0 Mon Sep 17 00:00:00 2001 From: Harshitha KP Date: Fri, 21 Feb 2020 05:38:53 -0500 Subject: [PATCH 43/91] src: elevate v8 namespaces PR-URL: https://github.com/nodejs/node/pull/31901 Reviewed-By: Richard Lau Reviewed-By: Michael Dawson Reviewed-By: James M Snell Reviewed-By: Gus Caplan Reviewed-By: Anna Henningsen --- src/module_wrap.cc | 12 +++++++----- 1 file changed, 7 insertions(+), 5 deletions(-) diff --git a/src/module_wrap.cc b/src/module_wrap.cc index 436a6e98e73fe5..350e395ebf43a2 100644 --- a/src/module_wrap.cc +++ b/src/module_wrap.cc @@ -25,6 +25,7 @@ using node::url::URL_FLAGS_FAILED; using v8::Array; using v8::ArrayBufferView; using v8::Context; +using v8::EscapableHandleScope; using v8::Function; using v8::FunctionCallbackInfo; using v8::FunctionTemplate; @@ -45,6 +46,7 @@ using v8::PrimitiveArray; using v8::Promise; using v8::ScriptCompiler; using v8::ScriptOrigin; +using v8::ScriptOrModule; using v8::String; using v8::UnboundModuleScript; using v8::Undefined; @@ -627,7 +629,7 @@ Maybe GetPackageConfig(Environment* env, std::string pkg_src = source.FromJust(); Isolate* isolate = env->isolate(); - v8::HandleScope handle_scope(isolate); + HandleScope handle_scope(isolate); Local pkg_json; { @@ -899,7 +901,7 @@ void ThrowExportsInvalid(Environment* env, const URL& base) { Local target_string; if (target->IsObject()) { - if (!v8::JSON::Stringify(env->context(), target.As(), + if (!v8::JSON::Stringify(env->context(), target.As(), v8::String::Empty(env->isolate())).ToLocal(&target_string)) return; } else { @@ -977,7 +979,7 @@ Maybe ResolveExportsTarget(Environment* env, Isolate* isolate = env->isolate(); Local context = env->context(); if (target->IsString()) { - Utf8Value target_utf8(isolate, target.As()); + Utf8Value target_utf8(isolate, target.As()); std::string target_str(*target_utf8, target_utf8.length()); Maybe resolved = ResolveExportsTargetString(env, target_str, subpath, pkg_subpath, pjson_url, base); @@ -1440,12 +1442,12 @@ void ModuleWrap::GetPackageType(const FunctionCallbackInfo& args) { static MaybeLocal 
ImportModuleDynamically( Local context, - Local referrer, + Local specifier) { Isolate* iso = context->GetIsolate(); Environment* env = Environment::GetCurrent(context); CHECK_NOT_NULL(env); // TODO(addaleax): Handle nullptr here. - v8::EscapableHandleScope handle_scope(iso); + EscapableHandleScope handle_scope(iso); Local import_callback = env->host_import_module_dynamically_callback();

From b30a6981d36b5eff58fa36aeb6e731fa60b22909 Mon Sep 17 00:00:00 2001 From: Sam Roberts Date: Fri, 14 Feb 2020 11:47:20 -0800 Subject: [PATCH 44/91] deps: move zlib maintenance info to guides

deps/zlib/README.md is not part of the upstream zlib, it is a Node.js specific addition describing how to maintain zlib and should be in doc/guides/.

PR-URL: https://github.com/nodejs/node/pull/31800 Reviewed-By: James M Snell Reviewed-By: Colin Ihrig Reviewed-By: Beth Griggs --- deps/zlib/README.md => doc/guides/maintaining-zlib.md | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename deps/zlib/README.md => doc/guides/maintaining-zlib.md (100%)

diff --git a/deps/zlib/README.md b/doc/guides/maintaining-zlib.md similarity index 100% rename from deps/zlib/README.md rename to doc/guides/maintaining-zlib.md

From 4c6343fdeaf4fef73769e8cf0e2db7d1ec75ac9e Mon Sep 17 00:00:00 2001 From: Sam Roberts Date: Fri, 14 Feb 2020 13:00:14 -0800 Subject: [PATCH 45/91] doc: describe how to update zlib

See: - https://github.com/nodejs/node/pull/31201

PR-URL: https://github.com/nodejs/node/pull/31800 Reviewed-By: James M Snell Reviewed-By: Colin Ihrig Reviewed-By: Beth Griggs --- doc/guides/maintaining-zlib.md | 36 ++++++++++++++++++++++++++++++---- 1 file changed, 32 insertions(+), 4 deletions(-)

diff --git a/doc/guides/maintaining-zlib.md b/doc/guides/maintaining-zlib.md index a12f0a7876a168..c293fdf5d40fc6 100644 --- a/doc/guides/maintaining-zlib.md +++ b/doc/guides/maintaining-zlib.md
@@ -1,6 +1,34 @@
-This copy of zlib comes from the Chromium team's zlib fork which incorporated performance improvements not currently available in standard zlib.
+# Maintaining zlib
-To update this code:
+This copy of zlib comes from the Chromium team's zlib fork which incorporated
+performance improvements not currently available in standard zlib.
-* Clone https://chromium.googlesource.com/chromium/src/third_party/zlib
-* Comment out the `#include "chromeconf.h"` in zconf.h to maintain full compatibility with node addons
+## Updating zlib
+
+Update zlib:
+```shell
+git clone https://chromium.googlesource.com/chromium/src/third_party/zlib
+cp deps/zlib/zlib.gyp deps/zlib/win32/zlib.def deps
+rm -rf deps/zlib zlib/.git
+mv zlib deps/
+mv deps/zlib.gyp deps/zlib/
+mkdir deps/zlib/win32
+mv deps/zlib.def deps/zlib/win32
+sed -i -- 's_^#include "chromeconf.h"_//#include "chromeconf.h"_' deps/zlib/zconf.h
+```
+
+Check that Node.js still builds and passes its tests.
+
+It may be necessary to update deps/zlib/zlib.gyp if any significant changes have
+occurred upstream.
+
+## Committing zlib
+
+Add zlib: `git add --all deps/zlib`
+
+Commit the changes with a message like:
+```text
+deps: update zlib to upstream d7f3ca9
+
+Updated as described in doc/guides/maintaining-zlib.md.
+``` From c27f0d10c49ee113f7cdcfa7d458dc236ac333f4 Mon Sep 17 00:00:00 2001 From: Sam Roberts Date: Fri, 14 Feb 2020 13:10:57 -0800 Subject: [PATCH 46/91] deps: update zlib to upstream d7f3ca9

Updated as described in doc/guides/maintaining-zlib.md.
PR-URL: https://github.com/nodejs/node/pull/31800 Reviewed-By: James M Snell Reviewed-By: Colin Ihrig Reviewed-By: Beth Griggs --- deps/zlib/google/test/data/create_test_zip.sh | 0 1 file changed, 0 insertions(+), 0 deletions(-) mode change 100644 => 100755 deps/zlib/google/test/data/create_test_zip.sh

diff --git a/deps/zlib/google/test/data/create_test_zip.sh b/deps/zlib/google/test/data/create_test_zip.sh old mode 100644 new mode 100755

From 776f379124fbf15743bd83b1f73dbb56dfcade10 Mon Sep 17 00:00:00 2001 From: Gabriel Schulhof Date: Fri, 21 Feb 2020 08:26:29 -0800 Subject: [PATCH 47/91] src: include large pages source unconditionally

Restrict the usage of the C preprocessor directive enabling large pages support to the large pages implementation. This cleans up the code in src/node.cc.

PR-URL: https://github.com/nodejs/node/pull/31904 Reviewed-By: Ben Noordhuis Reviewed-By: David Carlier Reviewed-By: Denys Otrishko --- node.gyp | 6 +-- src/large_pages/node_large_page.cc | 69 +++++++++++++++++++++++------- src/large_pages/node_large_page.h | 3 +- src/node.cc | 20 +++------- 4 files changed, 59 insertions(+), 39 deletions(-)

diff --git a/node.gyp b/node.gyp index 8c7911732b944c..0ac8064697aa36 100644 --- a/node.gyp +++ b/node.gyp
@@ -619,6 +619,8 @@
 'src/histogram.h',
 'src/histogram-inl.h',
 'src/js_stream.h',
+ 'src/large_pages/node_large_page.cc',
+ 'src/large_pages/node_large_page.h',
 'src/memory_tracker.h',
 'src/memory_tracker-inl.h',
 'src/module_wrap.h',
@@ -850,10 +852,6 @@
 'target_arch=="x64" and '
 'node_target_type=="executable"', {
 'defines': [ 'NODE_ENABLE_LARGE_CODE_PAGES=1' ],
- 'sources': [
- 'src/large_pages/node_large_page.cc',
- 'src/large_pages/node_large_page.h'
- ],
 }],
 [ 'use_openssl_def==1', {
 # TODO(bnoordhuis) Make all platforms export the same list of symbols.
 diff --git a/src/large_pages/node_large_page.cc b/src/large_pages/node_large_page.cc index ce58e32e719bc8..a72cb65bb65674 100644 --- a/src/large_pages/node_large_page.cc +++ b/src/large_pages/node_large_page.cc
@@ -21,6 +21,11 @@
 // SPDX-License-Identifier: MIT
 #include "node_large_page.h"
+
+#include // NOLINT(build/include)
+
+// Besides returning ENOTSUP at runtime we do nothing if this define is missing.
+#if defined(NODE_ENABLE_LARGE_CODE_PAGES) && NODE_ENABLE_LARGE_CODE_PAGES
 #include "util.h"
 #include "uv.h"
@@ -35,7 +40,6 @@
 #endif
 #include // readlink
-#include // NOLINT(build/include)
 #include // PATH_MAX
 #include
 #include
@@ -71,7 +75,11 @@ extern char __executable_start;
 } // extern "C"
 #endif // defined(__linux__)
+#endif // defined(NODE_ENABLE_LARGE_CODE_PAGES) && NODE_ENABLE_LARGE_CODE_PAGES
 namespace node {
+#if defined(NODE_ENABLE_LARGE_CODE_PAGES) && NODE_ENABLE_LARGE_CODE_PAGES
+
+namespace {

 struct text_region { char* from;
@@ -103,7 +111,7 @@ inline uintptr_t hugepage_align_down(uintptr_t addr) {
 // 00400000-00452000 r-xp 00000000 08:02 173521 /usr/bin/dbus-daemon
 // This is also handling the case where the first line is not the binary.
-static struct text_region FindNodeTextRegion() { +struct text_region FindNodeTextRegion() { struct text_region nregion; nregion.found_text_region = false; #if defined(__linux__) @@ -263,7 +271,7 @@ static struct text_region FindNodeTextRegion() { } #if defined(__linux__) -static bool IsTransparentHugePagesEnabled() { +bool IsTransparentHugePagesEnabled() { std::ifstream ifs; ifs.open("/sys/kernel/mm/transparent_hugepage/enabled"); @@ -294,6 +302,8 @@ static bool IsSuperPagesEnabled() { } #endif +} // End of anonymous namespace + // Moving the text region to large pages. We need to be very careful. // 1: This function itself should not be moved. // We use a gcc attributes @@ -408,14 +418,26 @@ MoveTextRegionToLargePages(const text_region& r) { if (-1 == munmap(nmem, size)) PrintSystemError(errno); return ret; } +#endif // defined(NODE_ENABLE_LARGE_CODE_PAGES) && NODE_ENABLE_LARGE_CODE_PAGES // This is the primary API called from main. int MapStaticCodeToLargePages() { +#if defined(NODE_ENABLE_LARGE_CODE_PAGES) && NODE_ENABLE_LARGE_CODE_PAGES + bool have_thp = false; +#if defined(__linux__) + have_thp = IsTransparentHugePagesEnabled(); +#elif defined(__FreeBSD__) + have_thp = IsSuperPagesEnabled(); +#elif defined(__APPLE__) + // pse-36 flag is present in recent mac x64 products. + have_thp = true; +#endif + if (!have_thp) + return EACCES; + struct text_region r = FindNodeTextRegion(); - if (r.found_text_region == false) { - PrintWarning("failed to find text region"); - return -1; - } + if (r.found_text_region == false) + return ENOENT; #if defined(__FreeBSD__) if (r.from < reinterpret_cast(&MoveTextRegionToLargePages)) @@ -423,17 +445,32 @@ int MapStaticCodeToLargePages() { #endif return MoveTextRegionToLargePages(r); +#else + return ENOTSUP; +#endif } -bool IsLargePagesEnabled() { -#if defined(__linux__) - return IsTransparentHugePagesEnabled(); -#elif defined(__FreeBSD__) - return IsSuperPagesEnabled(); -#elif defined(__APPLE__) - // pse-36 flag is present in recent mac x64 products. - return true; -#endif +const char* LargePagesError(int status) { + switch (status) { + case ENOTSUP: + return "Mapping to large pages is not supported."; + + case EACCES: + return "Large pages are not enabled."; + + case ENOENT: + return "failed to find text region"; + + case -1: + return "Mapping code to large pages failed. 
Reverting to default page " "size.";
+
+ case 0:
+ return "OK";
+
+ default:
+ return "Unknown error";
+ }
}

} // namespace node diff --git a/src/large_pages/node_large_page.h b/src/large_pages/node_large_page.h index bce505585cf0d0..622cf09ede4e1c 100644 --- a/src/large_pages/node_large_page.h +++ b/src/large_pages/node_large_page.h
@@ -25,10 +25,9 @@
 #if defined(NODE_WANT_INTERNALS) && NODE_WANT_INTERNALS
-
 namespace node {
-bool IsLargePagesEnabled();
 int MapStaticCodeToLargePages();
+const char* LargePagesError(int status);
 } // namespace node
 #endif // NODE_WANT_INTERNALS
 diff --git a/src/node.cc b/src/node.cc index a0398b1a4f8d2c..aec70381c58e67 100644 --- a/src/node.cc +++ b/src/node.cc
@@ -65,9 +65,7 @@
 #include "inspector/worker_inspector.h" // ParentInspectorHandle
 #endif
-#ifdef NODE_ENABLE_LARGE_CODE_PAGES
 #include "large_pages/node_large_page.h"
-#endif
 #ifdef NODE_REPORT
 #include "node_report.h"
@@ -936,25 +934,13 @@ InitializationResult InitializeOncePerProcess(int argc, char** argv) {
 }
 }
-#if defined(NODE_ENABLE_LARGE_CODE_PAGES) && NODE_ENABLE_LARGE_CODE_PAGES
 if (per_process::cli_options->use_largepages == "on" ||
 per_process::cli_options->use_largepages == "silent") {
- if (node::IsLargePagesEnabled()) {
- if (node::MapStaticCodeToLargePages() != 0 &&
- per_process::cli_options->use_largepages != "silent") {
- fprintf(stderr,
- "Mapping code to large pages failed. Reverting to default page "
- "size.\n");
- }
- } else if (per_process::cli_options->use_largepages != "silent") {
- fprintf(stderr, "Large pages are not enabled.\n");
+ int result = node::MapStaticCodeToLargePages();
+ if (per_process::cli_options->use_largepages == "on" && result != 0) {
+ fprintf(stderr, "%s\n", node::LargePagesError(result));
 }
 }
-#else
- if (per_process::cli_options->use_largepages == "on") {
- fprintf(stderr, "Mapping to large pages is not supported.\n");
- }
-#endif // NODE_ENABLE_LARGE_CODE_PAGES
 if (per_process::cli_options->print_version) {
 printf("%s\n", NODE_VERSION);

From e08fef1fdaa0a8114d439956651e50a9de5c6c44 Mon Sep 17 00:00:00 2001 From: Daniel Bevenius Date: Fri, 21 Feb 2020 07:27:29 +0100 Subject: [PATCH 48/91] test: add secp224k1 check in crypto-dh-stateless MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit

This commit adds a check to test-crypto-dh-stateless to avoid an error if the curve secp224k1 is not present. This could occur if node was configured with shared-openssl.
The use case for this was building on RHEL 8.1 which only has the following curves: $ openssl ecparam -list_curves secp224r1 : NIST/SECG curve over a 224 bit prime field secp256k1 : SECG curve over a 256 bit prime field secp384r1 : NIST/SECG curve over a 384 bit prime field secp521r1 : NIST/SECG curve over a 521 bit prime field prime256v1: X9.62/SECG curve over a 256 bit prime field PR-URL: https://github.com/nodejs/node/pull/31715 Reviewed-By: Ben Noordhuis Reviewed-By: Tobias Nießen Reviewed-By: Michael Dawson Reviewed-By: Anna Henningsen Reviewed-By: James M Snell --- test/parallel/test-crypto-dh-stateless.js | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/test/parallel/test-crypto-dh-stateless.js b/test/parallel/test-crypto-dh-stateless.js index f00ee997cfcfdf..b01cea76b221c1 100644 --- a/test/parallel/test-crypto-dh-stateless.js +++ b/test/parallel/test-crypto-dh-stateless.js @@ -196,9 +196,10 @@ for (const [params1, params2] of [ test(crypto.generateKeyPairSync('ec', { namedCurve: 'secp256k1' }), crypto.generateKeyPairSync('ec', { namedCurve: 'secp256k1' })); +const not256k1 = crypto.getCurves().find((c) => /^sec.*(224|384|512)/.test(c)); assert.throws(() => { test(crypto.generateKeyPairSync('ec', { namedCurve: 'secp256k1' }), - crypto.generateKeyPairSync('ec', { namedCurve: 'secp224k1' })); + crypto.generateKeyPairSync('ec', { namedCurve: not256k1 })); }, { name: 'Error', code: 'ERR_OSSL_EVP_DIFFERENT_PARAMETERS' From f293dcf6de3c9e61fe2418c408145c92cc65dc70 Mon Sep 17 00:00:00 2001 From: Joyee Cheung Date: Tue, 18 Feb 2020 23:01:21 +0800 Subject: [PATCH 49/91] tools: add NODE_TEST_NO_INTERNET to the doc builder At the moment the doc builder tries to access the internet for CHANGELOG information and only falls back to local sources after the connection fails or a 5 second timeout. This means that the doc building could take at least 7 minutes on a machine with hijacked connection to Github for useless network attempts. This patch adds a NODE_TEST_NO_INTERNET environment variable to directly bypass these attempts so that docs can be built in reasonable time on a machine like that. PR-URL: https://github.com/nodejs/node/pull/31849 Fixes: https://github.com/nodejs/node/issues/29918 Reviewed-By: Matheus Marchini Reviewed-By: Richard Lau Reviewed-By: Ruben Bridgewater Reviewed-By: Colin Ihrig Reviewed-By: Luigi Pinca --- tools/doc/versions.js | 26 ++++++++++++++++---------- 1 file changed, 16 insertions(+), 10 deletions(-) diff --git a/tools/doc/versions.js b/tools/doc/versions.js index 7a4e2c3ff76b1a..782ce90ee2c616 100644 --- a/tools/doc/versions.js +++ b/tools/doc/versions.js @@ -31,6 +31,8 @@ const getUrl = (url) => { }); }; +const kNoInternet = !!process.env.NODE_TEST_NO_INTERNET; + module.exports = { async versions() { if (_versions) { @@ -42,16 +44,20 @@ module.exports = { const url = 'https://raw.githubusercontent.com/nodejs/node/master/CHANGELOG.md'; let changelog; - try { - changelog = await getUrl(url); - } catch (e) { - // Fail if this is a release build, otherwise fallback to local files. - if (isRelease()) { - throw e; - } else { - const file = path.join(srcRoot, 'CHANGELOG.md'); - console.warn(`Unable to retrieve ${url}. 
Falling back to ${file}.`); - changelog = readFileSync(file, { encoding: 'utf8' }); + const file = path.join(srcRoot, 'CHANGELOG.md'); + if (kNoInternet) { + changelog = readFileSync(file, { encoding: 'utf8' }); + } else { + try { + changelog = await getUrl(url); + } catch (e) { + // Fail if this is a release build, otherwise fallback to local files. + if (isRelease()) { + throw e; + } else { + console.warn(`Unable to retrieve ${url}. Falling back to ${file}.`); + changelog = readFileSync(file, { encoding: 'utf8' }); + } } } const ltsRE = /Long Term Support/i; From c5acf0a13bd1debf2d8779ab6f2baf2639cc5b0f Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Sun, 23 Feb 2020 11:54:39 -0800 Subject: [PATCH 50/91] doc: updated YAML version representation in readline.md All other versions in YAML throughout the docs start with _v_. Fix two cases in `readline.md` that did not. PR-URL: https://github.com/nodejs/node/pull/31924 Reviewed-By: Ruben Bridgewater Reviewed-By: James M Snell --- doc/api/readline.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/doc/api/readline.md b/doc/api/readline.md index 1fb00b9c3121be..f36f0db9db73b1 100644 --- a/doc/api/readline.md +++ b/doc/api/readline.md @@ -352,7 +352,7 @@ async function processLineByLine() { ### `rl.line` * {string|undefined} @@ -387,7 +387,7 @@ process.stdin.on('keypress', (c, k) => { ### `rl.cursor` * {number|undefined} From ae3929e958dfa72bcf9f9efeb53c89f825c228ce Mon Sep 17 00:00:00 2001 From: Joyee Cheung Date: Sat, 8 Feb 2020 01:49:43 +0800 Subject: [PATCH 51/91] vm: implement vm.measureMemory() for per-context memory measurement This patch implements `vm.measureMemory()` with the new `v8::Isolate::MeasureMemory()` API to measure per-context memory usage. This should be experimental, since detailed memory measurement requires further integration with the V8 API that should be available in a future V8 update. PR-URL: https://github.com/nodejs/node/pull/31824 Refs: https://github.com/ulan/performance-measure-memory Reviewed-By: Ben Noordhuis Reviewed-By: Anna Henningsen Reviewed-By: James M Snell Reviewed-By: Denys Otrishko Reviewed-By: Colin Ihrig --- doc/api/errors.md | 8 +++ doc/api/vm.md | 50 ++++++++++++++++++ lib/internal/errors.js | 1 + lib/vm.js | 37 ++++++++++++- src/node_contextify.cc | 44 ++++++++++++++++ test/parallel/test-vm-measure-memory.js | 70 +++++++++++++++++++++++++ 6 files changed, 208 insertions(+), 2 deletions(-) create mode 100644 test/parallel/test-vm-measure-memory.js diff --git a/doc/api/errors.md b/doc/api/errors.md index a7726cd03019cd..15be1fd2061618 100644 --- a/doc/api/errors.md +++ b/doc/api/errors.md @@ -710,6 +710,14 @@ STDERR/STDOUT, and the data's length is longer than the `maxBuffer` option. `Console` was instantiated without `stdout` stream, or `Console` has a non-writable `stdout` or `stderr` stream. + +### `ERR_CONTEXT_NOT_INITIALIZED` + +The vm context passed into the API is not yet initialized. This could happen +when an error occurs (and is caught) during the creation of the +context, for example, when the allocation fails or the maximum call stack +size is reached when the context is created. + ### `ERR_CONSTRUCT_CALL_REQUIRED` diff --git a/doc/api/vm.md b/doc/api/vm.md index bd64b23484eb24..ed676414471b8e 100644 --- a/doc/api/vm.md +++ b/doc/api/vm.md @@ -295,6 +295,56 @@ console.log(globalVar); // 1000 ``` +## `vm.measureMemory([options])` + + + +> Stability: 1 - Experimental + +Measure the memory known to V8 and used by the current execution context +or a specified context. 
+ +* `options` {Object} Optional. + * `mode` {string} Either `'summary'` or `'detailed'`. + **Default:** `'summary'` + * `context` {Object} Optional. A [contextified][] object returned + by `vm.createContext()`. If not specified, measure the memory + usage of the current context where `vm.measureMemory()` is invoked. +* Returns: {Promise} If the memory is successfully measured the promise will + resolve with an object containing information about the memory usage. + +The format of the object that the returned Promise may resolve with is +specific to the V8 engine and may change from one version of V8 to the next. + +The returned result is different from the statistics returned by +`v8.getHeapSpaceStatistics()` in that `vm.measureMemory()` measures +the memory reachable by V8 from a specific context, while +`v8.getHeapSpaceStatistics()` measures the memory used by an instance +of V8 engine, which can switch among multiple contexts that reference +objects in the heap of one engine. + +```js +const vm = require('vm'); +// Measure the memory used by the current context and return the result +// in summary. +vm.measureMemory({ mode: 'summary' }) + // Is the same as vm.measureMemory() + .then((result) => { + // The current format is: + // { total: { jsMemoryEstimate: 2211728, jsMemoryRange: [ 0, 2211728 ] } } + console.log(result); + }); + +const context = vm.createContext({}); +vm.measureMemory({ mode: 'detailed' }, context) + .then((result) => { + // At the moment the detailed format is the same as the summary one. + console.log(result); + }); +``` + ## Class: `vm.Module` -* `authority` {string|URL} +* `authority` {string|URL} The remote HTTP/2 server to connect to. This must + be in the form of a minimal, valid URL with the `http://` or `https://` + prefix, host name, and IP port (if a non-default port is used). Userinfo + (user ID and password), path, querystring, and fragment details in the + URL will be ignored. * `options` {Object} * `maxDeflateDynamicTableSize` {number} Sets the maximum dynamic table size for deflating header fields. **Default:** `4Kib`. From e3258fd1483c5870461483116547168e85518643 Mon Sep 17 00:00:00 2001 From: James M Snell Date: Wed, 5 Feb 2020 18:37:48 -0800 Subject: [PATCH 60/91] doc: update zlib doc Just some general improvements to zlib docs and examples Signed-off-by: James M Snell PR-URL: https://github.com/nodejs/node/pull/31665 Reviewed-By: Anna Henningsen Reviewed-By: Luigi Pinca --- doc/api/zlib.md | 190 ++++++++++++++++++++++++++++++++++++------------ 1 file changed, 144 insertions(+), 46 deletions(-) diff --git a/doc/api/zlib.md b/doc/api/zlib.md index e059cc407df5cd..04465c11352269 100644 --- a/doc/api/zlib.md +++ b/doc/api/zlib.md @@ -4,60 +4,121 @@ > Stability: 2 - Stable -The `zlib` module provides compression functionality implemented using Gzip and -Deflate/Inflate, as well as Brotli. It can be accessed using: +The `zlib` module provides compression functionality implemented using Gzip, +Deflate/Inflate, and Brotli. + +To access it: ```js const zlib = require('zlib'); ``` +Compression and decompression are built around the Node.js [Streams API][]. 
+
+Compressing or decompressing a stream (such as a file) can be accomplished by
-piping the source stream data through a `zlib` stream into a destination stream:
+piping the source stream through a `zlib` `Transform` stream into a destination
+stream:
 
 ```js
-const gzip = zlib.createGzip();
-const fs = require('fs');
-const inp = fs.createReadStream('input.txt');
-const out = fs.createWriteStream('input.txt.gz');
-
-inp.pipe(gzip)
-  .on('error', () => {
-    // handle error
-  })
-  .pipe(out)
-  .on('error', () => {
-    // handle error
+const { createGzip } = require('zlib');
+const { pipeline } = require('stream');
+const {
+  createReadStream,
+  createWriteStream
+} = require('fs');
+
+const gzip = createGzip();
+const source = createReadStream('input.txt');
+const destination = createWriteStream('input.txt.gz');
+
+pipeline(source, gzip, destination, (err) => {
+  if (err) {
+    console.error('An error occurred:', err);
+    process.exitCode = 1;
+  }
+});
+
+// Or, Promisified
+
+const { promisify } = require('util');
+const pipe = promisify(pipeline);
+
+async function do_gzip(input, output) {
+  const gzip = createGzip();
+  const source = createReadStream(input);
+  const destination = createWriteStream(output);
+  await pipe(source, gzip, destination);
+}
+
+do_gzip('input.txt', 'input.txt.gz')
+  .catch((err) => {
+    console.error('An error occurred:', err);
+    process.exitCode = 1;
   });
 ```
 
 It is also possible to compress or decompress data in a single step:
 
 ```js
+const { deflate, unzip } = require('zlib');
+
 const input = '.................................';
-zlib.deflate(input, (err, buffer) => {
-  if (!err) {
-    console.log(buffer.toString('base64'));
-  } else {
-    // handle error
+deflate(input, (err, buffer) => {
+  if (err) {
+    console.error('An error occurred:', err);
+    process.exitCode = 1;
   }
+  console.log(buffer.toString('base64'));
 });
 
 const buffer = Buffer.from('eJzT0yMAAGTvBe8=', 'base64');
-zlib.unzip(buffer, (err, buffer) => {
-  if (!err) {
-    console.log(buffer.toString());
-  } else {
-    // handle error
+unzip(buffer, (err, buffer) => {
+  if (err) {
+    console.error('An error occurred:', err);
+    process.exitCode = 1;
   }
+  console.log(buffer.toString());
 });
+
+// Or, Promisified
+
+const { promisify } = require('util');
+const do_unzip = promisify(unzip);
+
+do_unzip(buffer)
+  .then((buf) => console.log(buf.toString()))
+  .catch((err) => {
+    console.error('An error occurred:', err);
+    process.exitCode = 1;
+  });
 ```
 
-## Threadpool Usage
+## Threadpool Usage and Performance Considerations
+
+All `zlib` APIs, except those that are explicitly synchronous, use the Node.js
+internal threadpool. This can lead to surprising effects and performance
+limitations in some applications.
 
-All zlib APIs, except those that are explicitly synchronous, use libuv's
-threadpool. This can lead to surprising effects in some applications, such as
-subpar performance (which can be mitigated by adjusting the [pool size][])
-and/or unrecoverable and catastrophic memory fragmentation.
+Creating and using a large number of zlib objects simultaneously can cause
+significant memory fragmentation.
+
+```js
+const zlib = require('zlib');
+
+const payload = Buffer.from('This is some data');
+
+// WARNING: DO NOT DO THIS!
+for (let i = 0; i < 30000; ++i) {
+  zlib.deflate(payload, (err, buffer) => {});
+}
+```
+
+In the preceding example, 30,000 deflate instances are created concurrently.
+Because of how some operating systems handle memory allocation and
+deallocation, this may lead to significant memory fragmentation.
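+
+One way to keep this in check (an illustrative sketch, not an official API)
+is to bound how many zlib operations may be in flight at once:
+
+```js
+const zlib = require('zlib');
+const { promisify } = require('util');
+const deflate = promisify(zlib.deflate);
+
+// Illustrative limiter: at most `limit` deflate calls run concurrently.
+async function deflateAll(payloads, limit = 4) {
+  const results = [];
+  let next = 0;
+  async function worker() {
+    while (next < payloads.length) {
+      const i = next++;
+      results[i] = await deflate(payloads[i]);
+    }
+  }
+  await Promise.all(Array.from({ length: limit }, worker));
+  return results;
+}
+```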
+
+It is strongly recommended that the results of compression
+operations be cached to avoid duplication of effort.
 
 ## Compressing HTTP requests and responses
 
@@ -80,6 +141,8 @@ tradeoffs involved in `zlib` usage.
 const zlib = require('zlib');
 const http = require('http');
 const fs = require('fs');
+const { pipeline } = require('stream');
+
 const request = http.get({ host: 'example.com',
                            path: '/',
                            port: 80,
@@ -87,19 +150,26 @@ const request = http.get({ host: 'example.com',
 request.on('response', (response) => {
   const output = fs.createWriteStream('example.com_index.html');
 
+  const onError = (err) => {
+    if (err) {
+      console.error('An error occurred:', err);
+      process.exitCode = 1;
+    }
+  };
+
   switch (response.headers['content-encoding']) {
     case 'br':
-      response.pipe(zlib.createBrotliDecompress()).pipe(output);
+      pipeline(response, zlib.createBrotliDecompress(), output, onError);
       break;
     // Or, just use zlib.createUnzip() to handle both of the following cases:
     case 'gzip':
-      response.pipe(zlib.createGunzip()).pipe(output);
+      pipeline(response, zlib.createGunzip(), output, onError);
       break;
     case 'deflate':
-      response.pipe(zlib.createInflate()).pipe(output);
+      pipeline(response, zlib.createInflate(), output, onError);
      break;
     default:
-      response.pipe(output);
+      pipeline(response, output, onError);
       break;
   }
 });
@@ -112,6 +182,8 @@ request.on('response', (response) => {
 const zlib = require('zlib');
 const http = require('http');
 const fs = require('fs');
+const { pipeline } = require('stream');
+
 http.createServer((request, response) => {
   const raw = fs.createReadStream('index.html');
   // Store both a compressed and an uncompressed version of the resource.
@@ -121,20 +193,32 @@ http.createServer((request, response) => {
     acceptEncoding = '';
   }
 
+  const onError = (err) => {
+    if (err) {
+      // If an error occurs, there's not much we can do because
+      // the server has already sent the 200 response code and
+      // some amount of data has already been sent to the client.
+      // The best we can do is terminate the response immediately
+      // and log the error.
+      response.end();
+      console.error('An error occurred:', err);
+    }
+  };
+
   // Note: This is not a conformant accept-encoding parser.
   // See https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.3
   if (/\bdeflate\b/.test(acceptEncoding)) {
     response.writeHead(200, { 'Content-Encoding': 'deflate' });
-    raw.pipe(zlib.createDeflate()).pipe(response);
+    pipeline(raw, zlib.createDeflate(), response, onError);
   } else if (/\bgzip\b/.test(acceptEncoding)) {
     response.writeHead(200, { 'Content-Encoding': 'gzip' });
-    raw.pipe(zlib.createGzip()).pipe(response);
+    pipeline(raw, zlib.createGzip(), response, onError);
   } else if (/\bbr\b/.test(acceptEncoding)) {
     response.writeHead(200, { 'Content-Encoding': 'br' });
-    raw.pipe(zlib.createBrotliCompress()).pipe(response);
+    pipeline(raw, zlib.createBrotliCompress(), response, onError);
   } else {
     response.writeHead(200, {});
-    raw.pipe(response);
+    pipeline(raw, response, onError);
   }
 }).listen(1337);
 ```
@@ -154,11 +238,11 @@ zlib.unzip(
   // For Brotli, the equivalent is zlib.constants.BROTLI_OPERATION_FLUSH.
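+  // Passing Z_SYNC_FLUSH as `finishFlush` lets unzip() report whatever it
+  // was able to decompress instead of treating incomplete input as an error.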
{ finishFlush: zlib.constants.Z_SYNC_FLUSH }, (err, buffer) => { - if (!err) { - console.log(buffer.toString()); - } else { - // handle error + if (err) { + console.error('An error occurred:', err); + process.exitCode = 1; } + console.log(buffer.toString()); }); ``` @@ -234,14 +318,28 @@ HTTP response to the client: ```js const zlib = require('zlib'); const http = require('http'); +const { pipeline } = require('stream'); http.createServer((request, response) => { // For the sake of simplicity, the Accept-Encoding checks are omitted. response.writeHead(200, { 'content-encoding': 'gzip' }); const output = zlib.createGzip(); - output.pipe(response); + let i; + + pipeline(output, response, (err) => { + if (err) { + // If an error occurs, there's not much we can do because + // the server has already sent the 200 response code and + // some amount of data has already been sent to the client. + // The best we can do is terminate the response immediately + // and log the error. + clearInterval(i); + response.end(); + console.error('An error occurred:', err); + } + }); - setInterval(() => { + i = setInterval(() => { output.write(`The current time is ${Date()}\n`, () => { // The data has been passed to zlib, but the compression algorithm may // have decided to buffer the data for more efficient compression. @@ -399,7 +497,7 @@ changes: -Each zlib-based class takes an `options` object. All options are optional. +Each zlib-based class takes an `options` object. No options are required. Some options are only relevant when compressing and are ignored by the decompression classes. @@ -1058,6 +1156,6 @@ Decompress a chunk of data with [`Unzip`][]. [Brotli parameters]: #zlib_brotli_constants [Memory Usage Tuning]: #zlib_memory_usage_tuning [RFC 7932]: https://www.rfc-editor.org/rfc/rfc7932.txt -[pool size]: cli.html#cli_uv_threadpool_size_size +[Streams API]: stream.md [zlib documentation]: https://zlib.net/manual.html#Constants [zlib.createGzip example]: #zlib_zlib From 6adbfac9b06d44366b3cfb32d6fa2f549336b593 Mon Sep 17 00:00:00 2001 From: Shelley Vohr Date: Mon, 24 Feb 2020 18:50:13 -0800 Subject: [PATCH 61/91] repl: eager-evaluate input in parens PR-URL: https://github.com/nodejs/node/pull/31943 Reviewed-By: Gus Caplan Reviewed-By: James M Snell Reviewed-By: Ruben Bridgewater --- lib/repl.js | 12 ++++++------ test/parallel/test-repl-preview.js | 17 +++++++++++++++++ 2 files changed, 23 insertions(+), 6 deletions(-) diff --git a/lib/repl.js b/lib/repl.js index 3a28fa21be0292..183dcd43499f47 100644 --- a/lib/repl.js +++ b/lib/repl.js @@ -322,12 +322,12 @@ function REPLServer(prompt, let awaitPromise = false; const input = code; - if (/^\s*{/.test(code) && /}\s*$/.test(code)) { - // It's confusing for `{ a : 1 }` to be interpreted as a block - // statement rather than an object literal. So, we first try - // to wrap it in parentheses, so that it will be interpreted as - // an expression. Note that if the above condition changes, - // lib/internal/repl/utils.js needs to be changed to match. + // It's confusing for `{ a : 1 }` to be interpreted as a block + // statement rather than an object literal. So, we first try + // to wrap it in parentheses, so that it will be interpreted as + // an expression. Note that if the above condition changes, + // lib/internal/repl/utils.js needs to be changed to match. 
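+    // For example, `{ b: 1 }['b'] === 1` is wrapped and evaluates to `true`,
+    // while a trailing semicolon (`{ b: 1 }['b'] === 1;`) keeps the
+    // block-statement interpretation, which yields `false`.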
+ if (/^\s*{/.test(code) && !/;\s*$/.test(code)) { code = `(${code.trim()})\n`; wrappedCmd = true; } diff --git a/test/parallel/test-repl-preview.js b/test/parallel/test-repl-preview.js index 82f1a9e507e750..b36b99cca7c40e 100644 --- a/test/parallel/test-repl-preview.js +++ b/test/parallel/test-repl-preview.js @@ -88,6 +88,23 @@ async function tests(options) { '\x1B[36m[Function: koo]\x1B[39m', '\x1B[1G\x1B[0Jrepl > \x1B[8G'], ['a', [1, 2], undefined], + [" { b: 1 }['b'] === 1", [2, 6], '\x1B[33mtrue\x1B[39m', + " { b: 1 }['b']", + '\x1B[90m1\x1B[39m\x1B[22G\x1B[1A\x1B[1B\x1B[2K\x1B[1A ', + '\x1B[90m1\x1B[39m\x1B[23G\x1B[1A\x1B[1B\x1B[2K\x1B[1A=== 1', + '\x1B[90mtrue\x1B[39m\x1B[28G\x1B[1A\x1B[1B\x1B[2K\x1B[1A\r', + '\x1B[33mtrue\x1B[39m', + '\x1B[1G\x1B[0Jrepl > \x1B[8G' + ], + ["{ b: 1 }['b'] === 1;", [2, 7], '\x1B[33mfalse\x1B[39m', + "{ b: 1 }['b']", + '\x1B[90m1\x1B[39m\x1B[21G\x1B[1A\x1B[1B\x1B[2K\x1B[1A ', + '\x1B[90m1\x1B[39m\x1B[22G\x1B[1A\x1B[1B\x1B[2K\x1B[1A=== 1', + '\x1B[90mtrue\x1B[39m\x1B[27G\x1B[1A\x1B[1B\x1B[2K\x1B[1A;', + '\x1B[90mfalse\x1B[39m\x1B[28G\x1B[1A\x1B[1B\x1B[2K\x1B[1A\r', + '\x1B[33mfalse\x1B[39m', + '\x1B[1G\x1B[0Jrepl > \x1B[8G' + ], ['{ a: true }', [2, 3], '{ a: \x1B[33mtrue\x1B[39m }', '{ a: tru\x1B[90me\x1B[39m\x1B[16G\x1B[0Ke }\r', '{ a: \x1B[33mtrue\x1B[39m }', From 49c959d636fdfe02e68002428ac09868a156d757 Mon Sep 17 00:00:00 2001 From: Denys Otrishko Date: Wed, 26 Feb 2020 10:31:29 +0200 Subject: [PATCH 62/91] test: increase timeout in vm-timeout-escape-queuemicrotask It looks like under high load the loop isn't even started and therefore successfully finishes without 'escaping'. After increasing the timeout during parallel run of the test failure rate decreased from 15/1000 to 0/1000. PR-URL: https://github.com/nodejs/node/pull/31966 Refs: https://github.com/nodejs/node/issues/25529 Reviewed-By: Luigi Pinca Reviewed-By: Anna Henningsen Reviewed-By: James M Snell Reviewed-By: Rich Trott --- test/known_issues/test-vm-timeout-escape-queuemicrotask.js | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/test/known_issues/test-vm-timeout-escape-queuemicrotask.js b/test/known_issues/test-vm-timeout-escape-queuemicrotask.js index df0531bae1d9ed..0d3a0b0c5c5814 100644 --- a/test/known_issues/test-vm-timeout-escape-queuemicrotask.js +++ b/test/known_issues/test-vm-timeout-escape-queuemicrotask.js @@ -12,8 +12,8 @@ const NS_PER_MS = 1000000n; const hrtime = process.hrtime.bigint; -const loopDuration = common.platformTimeout(100n); -const timeout = common.platformTimeout(10); +const loopDuration = common.platformTimeout(1000n); +const timeout = common.platformTimeout(100); function loop() { const start = hrtime(); From ab8f060159be0d8149b4ec9f432690e1a10a39a3 Mon Sep 17 00:00:00 2001 From: Denys Otrishko Date: Mon, 24 Feb 2020 14:54:51 +0200 Subject: [PATCH 63/91] test: fix usage of invalid common properties PR-URL: https://github.com/nodejs/node/pull/31933 Reviewed-By: Luigi Pinca Reviewed-By: Shelley Vohr Reviewed-By: Yongsheng Zhang Reviewed-By: Rich Trott Reviewed-By: Anna Henningsen --- test/common/index.mjs | 2 -- 1 file changed, 2 deletions(-) diff --git a/test/common/index.mjs b/test/common/index.mjs index a5774fc008a9b3..96e6699e3c6f99 100644 --- a/test/common/index.mjs +++ b/test/common/index.mjs @@ -37,7 +37,6 @@ const { mustNotCall, printSkipMessage, skip, - ArrayStream, nodeProcessAborted, isAlive, expectWarning, @@ -83,7 +82,6 @@ export { mustNotCall, printSkipMessage, skip, - ArrayStream, nodeProcessAborted, isAlive, 
expectWarning, From f1e76488a7bc4b976d41224d20805b2deb72408c Mon Sep 17 00:00:00 2001 From: Denys Otrishko Date: Mon, 24 Feb 2020 14:55:14 +0200 Subject: [PATCH 64/91] test: validate common property usage `common` contains multiple 'check'(boolean) properties that will be false if mistyped and may lead to errors. This makes sure that the used property exists in the `common`. PR-URL: https://github.com/nodejs/node/pull/31933 Reviewed-By: Luigi Pinca Reviewed-By: Shelley Vohr Reviewed-By: Yongsheng Zhang Reviewed-By: Rich Trott Reviewed-By: Anna Henningsen --- test/common/index.js | 11 ++++++++++- 1 file changed, 10 insertions(+), 1 deletion(-) diff --git a/test/common/index.js b/test/common/index.js index 1e66e32c9c1260..653de4685ca7a0 100644 --- a/test/common/index.js +++ b/test/common/index.js @@ -672,7 +672,7 @@ function invalidArgTypeHelper(input) { return ` Received type ${typeof input} (${inspected})`; } -module.exports = { +const common = { allowGlobals, buildType, canCreateSymLink, @@ -815,3 +815,12 @@ module.exports = { } }; + +const validProperties = new Set(Object.keys(common)); +module.exports = new Proxy(common, { + get(obj, prop) { + if (!validProperties.has(prop)) + throw new Error(`Using invalid common property: '${prop}'`); + return obj[prop]; + } +}); From ca4407105e8b8a8cb1b9461f12589fb7dff7cc61 Mon Sep 17 00:00:00 2001 From: cjihrig Date: Tue, 25 Feb 2020 21:38:34 -0500 Subject: [PATCH 65/91] build: add missing comma in node.gyp MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit This commit adds a missing comma for consistency with the surrounding lines. PR-URL: https://github.com/nodejs/node/pull/31959 Reviewed-By: Jiawen Geng Reviewed-By: Anna Henningsen Reviewed-By: Richard Lau Reviewed-By: Ben Noordhuis Reviewed-By: Luigi Pinca Reviewed-By: Сковорода Никита Андреевич --- node.gyp | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/node.gyp b/node.gyp index 0ac8064697aa36..29dd0ed0c7a16e 100644 --- a/node.gyp +++ b/node.gyp @@ -620,7 +620,7 @@ 'src/histogram-inl.h', 'src/js_stream.h', 'src/large_pages/node_large_page.cc', - 'src/large_pages/node_large_page.h' + 'src/large_pages/node_large_page.h', 'src/memory_tracker.h', 'src/memory_tracker-inl.h', 'src/module_wrap.h', From 3497370d66117f9f734c3307867ff21b7e5956de Mon Sep 17 00:00:00 2001 From: Shelley Vohr Date: Mon, 24 Feb 2020 19:14:15 -0800 Subject: [PATCH 66/91] src: move InternalCallbackScope to StartExecution PR-URL: https://github.com/nodejs/node/pull/31944 Reviewed-By: Anna Henningsen Reviewed-By: Stephen Belanger Reviewed-By: Joyee Cheung --- src/node.cc | 6 ++++++ src/node_main_instance.cc | 9 +-------- src/node_worker.cc | 5 ----- 3 files changed, 7 insertions(+), 13 deletions(-) diff --git a/src/node.cc b/src/node.cc index aec70381c58e67..6b95796e1269b3 100644 --- a/src/node.cc +++ b/src/node.cc @@ -395,6 +395,12 @@ MaybeLocal StartExecution(Environment* env, const char* main_script_id) { ->GetFunction(env->context()) .ToLocalChecked()}; + InternalCallbackScope callback_scope( + env, + Object::New(env->isolate()), + { 1, 0 }, + InternalCallbackScope::kSkipAsyncHooks); + return scope.EscapeMaybe( ExecuteBootstrapper(env, main_script_id, ¶meters, &arguments)); } diff --git a/src/node_main_instance.cc b/src/node_main_instance.cc index d53eaa7329beed..6f240d7e809f8e 100644 --- a/src/node_main_instance.cc +++ b/src/node_main_instance.cc @@ -122,14 +122,7 @@ int NodeMainInstance::Run() { Context::Scope context_scope(env->context()); if (exit_code == 0) { - { 
- InternalCallbackScope callback_scope( - env.get(), - Object::New(isolate_), - { 1, 0 }, - InternalCallbackScope::kSkipAsyncHooks); - LoadEnvironment(env.get()); - } + LoadEnvironment(env.get()); env->set_trace_sync_io(env->options()->trace_sync_io); diff --git a/src/node_worker.cc b/src/node_worker.cc index a5dcec250e7a72..04ae234c3b4cb6 100644 --- a/src/node_worker.cc +++ b/src/node_worker.cc @@ -341,11 +341,6 @@ void Worker::Run() { env_->InitializeInspector(std::move(inspector_parent_handle_)); #endif HandleScope handle_scope(isolate_); - InternalCallbackScope callback_scope( - env_.get(), - Object::New(isolate_), - { 1, 0 }, - InternalCallbackScope::kSkipAsyncHooks); if (!env_->RunBootstrapping().IsEmpty()) { CreateEnvMessagePort(env_.get()); From f71fc9044a4193bb72963bc5f052de038f16d1b4 Mon Sep 17 00:00:00 2001 From: Andrey Pechkurov Date: Mon, 24 Feb 2020 13:00:59 +0300 Subject: [PATCH 67/91] async_hooks: add store arg in AsyncLocalStorage MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit This commit introduces store as the first argument in AsyncLocalStorage's run methods. The change is motivated by the following expectation: most users are going to use a custom object as the store and an extra Map created by the previous implementation is an overhead for their use case. Important note. This is a backwards incompatible change. It was discussed and agreed an incompatible change is ok since the API is still experimental and the modified methods were only added within the last week so usage will be minimal to none. PR-URL: https://github.com/nodejs/node/pull/31930 Reviewed-By: Stephen Belanger Reviewed-By: Vladimir de Turckheim Reviewed-By: Matteo Collina Reviewed-By: Michaël Zasso Reviewed-By: Michael Dawson --- .../async_hooks/async-resource-vs-destroy.js | 6 +-- doc/api/async_hooks.md | 46 ++++++++++--------- lib/async_hooks.js | 14 +++--- .../test-async-local-storage-args.js | 4 +- .../test-async-local-storage-async-await.js | 2 +- ...est-async-local-storage-async-functions.js | 2 +- ...test-async-local-storage-enable-disable.js | 4 +- .../test-async-local-storage-errors-async.js | 2 +- ...est-async-local-storage-errors-sync-ret.js | 2 +- .../test-async-local-storage-http.js | 2 +- .../test-async-local-storage-misc-stores.js | 24 ++++++++++ .../test-async-local-storage-nested.js | 4 +- ...est-async-local-storage-no-mix-contexts.js | 6 +-- .../test-async-local-storage-promises.js | 2 +- 14 files changed, 73 insertions(+), 47 deletions(-) create mode 100644 test/async-hooks/test-async-local-storage-misc-stores.js diff --git a/benchmark/async_hooks/async-resource-vs-destroy.js b/benchmark/async_hooks/async-resource-vs-destroy.js index 84e17ed56d8c61..c9b9a81c5b7c7f 100644 --- a/benchmark/async_hooks/async-resource-vs-destroy.js +++ b/benchmark/async_hooks/async-resource-vs-destroy.js @@ -106,7 +106,7 @@ function buildDestroy(getServe) { function buildAsyncLocalStorage(getServe) { const asyncLocalStorage = new AsyncLocalStorage(); const server = createServer((req, res) => { - asyncLocalStorage.runSyncAndReturn(() => { + asyncLocalStorage.runSyncAndReturn({}, () => { getServe(getCLS, setCLS)(req, res); }); }); @@ -118,12 +118,12 @@ function buildAsyncLocalStorage(getServe) { function getCLS() { const store = asyncLocalStorage.getStore(); - return store.get('store'); + return store.state; } function setCLS(state) { const store = asyncLocalStorage.getStore(); - store.set('store', state); + store.state = state; } function close() { diff --git 
a/doc/api/async_hooks.md b/doc/api/async_hooks.md
index 529c6cc5a83101..965c64218fdaba 100644
--- a/doc/api/async_hooks.md
+++ b/doc/api/async_hooks.md
@@ -893,7 +893,7 @@ function log(...args) {
 }
 
 http.createServer((request, response) => {
-  asyncLocalStorage.run(() => {
+  asyncLocalStorage.run(new Map(), () => {
     const store = asyncLocalStorage.getStore();
     store.set(kReq, request);
     someAsyncOperation((err, result) => {
@@ -943,27 +943,27 @@ in the current process.
 <!-- YAML
 added: REPLACEME
 -->
 
-* Returns: {Map}
+* Returns: {any}
 
 This method returns the current store. If this method is called outside of an
 asynchronous context initialized by calling `asyncLocalStorage.run` or
-`asyncLocalStorage.runAndReturn`, it will return `undefined`.
+`asyncLocalStorage.runSyncAndReturn`, it will return `undefined`.
 
-### `asyncLocalStorage.run(callback[, ...args])`
+### `asyncLocalStorage.run(store, callback[, ...args])`
 
+* `store` {any}
 * `callback` {Function}
 * `...args` {any}
 
-Calling `asyncLocalStorage.run(callback)` will create a new asynchronous
-context.
-Within the callback function and the asynchronous operations from the callback,
-`asyncLocalStorage.getStore()` will return an instance of `Map` known as
-"the store". This store will be persistent through the following
-asynchronous calls.
+Calling `asyncLocalStorage.run(store, callback)` will create a new asynchronous
+context. Within the callback function and the asynchronous operations from
+the callback, `asyncLocalStorage.getStore()` will return the object or
+the primitive value passed into the `store` argument (known as "the store").
+This store will be persistent through the following asynchronous calls.
 
-The callback will be ran asynchronously. Optionally, arguments can be passed
+The callback will be run asynchronously. Optionally, arguments can be passed
 to the function. They will be passed to the callback function.
 
@@ -975,10 +975,11 @@ Also, the stacktrace will be impacted by the asynchronous call.
 
 Example:
 
 ```js
-asyncLocalStorage.run(() => {
-  asyncLocalStorage.getStore(); // Returns a Map
+const store = { id: 1 };
+asyncLocalStorage.run(store, () => {
+  asyncLocalStorage.getStore(); // Returns the store object
   someAsyncOperation(() => {
-    asyncLocalStorage.getStore(); // Returns the same Map
+    asyncLocalStorage.getStore(); // Returns the same object
   });
 });
 asyncLocalStorage.getStore(); // Returns undefined
@@ -1007,20 +1008,21 @@ Also, the stacktrace will be impacted by the asynchronous call.
 
 Example:
 
 ```js
-asyncLocalStorage.run(() => {
-  asyncLocalStorage.getStore(); // Returns a Map
+asyncLocalStorage.run('store value', () => {
+  asyncLocalStorage.getStore(); // Returns 'store value'
   asyncLocalStorage.exit(() => {
     asyncLocalStorage.getStore(); // Returns undefined
   });
-  asyncLocalStorage.getStore(); // Returns the same Map
+  asyncLocalStorage.getStore(); // Returns 'store value'
 });
 ```
 
-### `asyncLocalStorage.runSyncAndReturn(callback[, ...args])`
+### `asyncLocalStorage.runSyncAndReturn(store, callback[, ...args])`
 
+* `store` {any}
 * `callback` {Function}
 * `...args` {any}
 
@@ -1038,9 +1040,10 @@ the context will be exited.
Example: ```js +const store = { id: 2 }; try { - asyncLocalStorage.runSyncAndReturn(() => { - asyncLocalStorage.getStore(); // Returns a Map + asyncLocalStorage.runSyncAndReturn(store, () => { + asyncLocalStorage.getStore(); // Returns the store object throw new Error(); }); } catch (e) { @@ -1073,13 +1076,13 @@ Example: ```js // Within a call to run or runSyncAndReturn try { - asyncLocalStorage.getStore(); // Returns a Map + asyncLocalStorage.getStore(); // Returns the store object or value asyncLocalStorage.exitSyncAndReturn(() => { asyncLocalStorage.getStore(); // Returns undefined throw new Error(); }); } catch (e) { - asyncLocalStorage.getStore(); // Returns the same Map + asyncLocalStorage.getStore(); // Returns the same object or value // The error will be caught here } ``` @@ -1105,8 +1108,9 @@ It cannot be promisified using `util.promisify`. If needed, the `Promise` constructor can be used: ```js +const store = new Map(); // initialize the store new Promise((resolve, reject) => { - asyncLocalStorage.run(() => { + asyncLocalStorage.run(store, () => { someFunction((err, result) => { if (err) { return reject(err); @@ -1135,7 +1139,7 @@ the following pattern should be used: ```js async function fn() { - await asyncLocalStorage.runSyncAndReturn(() => { + await asyncLocalStorage.runSyncAndReturn(new Map(), () => { asyncLocalStorage.getStore().set('key', value); return foo(); // The return value of foo will be awaited }); diff --git a/lib/async_hooks.js b/lib/async_hooks.js index 23f8ddde671e30..3797baf183250a 100644 --- a/lib/async_hooks.js +++ b/lib/async_hooks.js @@ -1,11 +1,9 @@ 'use strict'; const { - Map, NumberIsSafeInteger, ReflectApply, Symbol, - } = primordials; const { @@ -247,14 +245,14 @@ class AsyncLocalStorage { } } - _enter() { + _enter(store) { if (!this.enabled) { this.enabled = true; storageList.push(this); storageHook.enable(); } const resource = executionAsyncResource(); - resource[this.kResourceStore] = new Map(); + resource[this.kResourceStore] = store; } _exit() { @@ -264,8 +262,8 @@ class AsyncLocalStorage { } } - runSyncAndReturn(callback, ...args) { - this._enter(); + runSyncAndReturn(store, callback, ...args) { + this._enter(store); try { return callback(...args); } finally { @@ -289,8 +287,8 @@ class AsyncLocalStorage { } } - run(callback, ...args) { - this._enter(); + run(store, callback, ...args) { + this._enter(store); process.nextTick(callback, ...args); this._exit(); } diff --git a/test/async-hooks/test-async-local-storage-args.js b/test/async-hooks/test-async-local-storage-args.js index 91a3385e6eeb16..04316dff59d71a 100644 --- a/test/async-hooks/test-async-local-storage-args.js +++ b/test/async-hooks/test-async-local-storage-args.js @@ -5,14 +5,14 @@ const { AsyncLocalStorage } = require('async_hooks'); const asyncLocalStorage = new AsyncLocalStorage(); -asyncLocalStorage.run((runArg) => { +asyncLocalStorage.run({}, (runArg) => { assert.strictEqual(runArg, 1); asyncLocalStorage.exit((exitArg) => { assert.strictEqual(exitArg, 2); }, 2); }, 1); -asyncLocalStorage.runSyncAndReturn((runArg) => { +asyncLocalStorage.runSyncAndReturn({}, (runArg) => { assert.strictEqual(runArg, 'foo'); asyncLocalStorage.exitSyncAndReturn((exitArg) => { assert.strictEqual(exitArg, 'bar'); diff --git a/test/async-hooks/test-async-local-storage-async-await.js b/test/async-hooks/test-async-local-storage-async-await.js index 28c8488da62c53..a03f803186bdab 100644 --- a/test/async-hooks/test-async-local-storage-async-await.js +++ 
b/test/async-hooks/test-async-local-storage-async-await.js @@ -12,7 +12,7 @@ async function test() { } async function main() { - await asyncLocalStorage.runSyncAndReturn(test); + await asyncLocalStorage.runSyncAndReturn(new Map(), test); assert.strictEqual(asyncLocalStorage.getStore(), undefined); } diff --git a/test/async-hooks/test-async-local-storage-async-functions.js b/test/async-hooks/test-async-local-storage-async-functions.js index 89ac0be62c7488..a0852bc1098a1a 100644 --- a/test/async-hooks/test-async-local-storage-async-functions.js +++ b/test/async-hooks/test-async-local-storage-async-functions.js @@ -19,7 +19,7 @@ async function testAwait() { await asyncLocalStorage.exitSyncAndReturn(testOut); } -asyncLocalStorage.run(() => { +asyncLocalStorage.run(new Map(), () => { const store = asyncLocalStorage.getStore(); store.set('key', 'value'); testAwait(); // should not reject diff --git a/test/async-hooks/test-async-local-storage-enable-disable.js b/test/async-hooks/test-async-local-storage-enable-disable.js index c30d72eb805d5d..bbba8cde58d7e8 100644 --- a/test/async-hooks/test-async-local-storage-enable-disable.js +++ b/test/async-hooks/test-async-local-storage-enable-disable.js @@ -5,7 +5,7 @@ const { AsyncLocalStorage } = require('async_hooks'); const asyncLocalStorage = new AsyncLocalStorage(); -asyncLocalStorage.runSyncAndReturn(() => { +asyncLocalStorage.runSyncAndReturn(new Map(), () => { asyncLocalStorage.getStore().set('foo', 'bar'); process.nextTick(() => { assert.strictEqual(asyncLocalStorage.getStore().get('foo'), 'bar'); @@ -13,7 +13,7 @@ asyncLocalStorage.runSyncAndReturn(() => { assert.strictEqual(asyncLocalStorage.getStore(), undefined); process.nextTick(() => { assert.strictEqual(asyncLocalStorage.getStore(), undefined); - asyncLocalStorage.runSyncAndReturn(() => { + asyncLocalStorage.runSyncAndReturn(new Map(), () => { assert.notStrictEqual(asyncLocalStorage.getStore(), undefined); }); }); diff --git a/test/async-hooks/test-async-local-storage-errors-async.js b/test/async-hooks/test-async-local-storage-errors-async.js index c782b383e9ca95..b6f0b4fa742f61 100644 --- a/test/async-hooks/test-async-local-storage-errors-async.js +++ b/test/async-hooks/test-async-local-storage-errors-async.js @@ -13,7 +13,7 @@ process.setUncaughtExceptionCaptureCallback((err) => { assert.strictEqual(asyncLocalStorage.getStore().get('hello'), 'node'); }); -asyncLocalStorage.run(() => { +asyncLocalStorage.run(new Map(), () => { const store = asyncLocalStorage.getStore(); store.set('hello', 'node'); setTimeout(() => { diff --git a/test/async-hooks/test-async-local-storage-errors-sync-ret.js b/test/async-hooks/test-async-local-storage-errors-sync-ret.js index f112df2b99dff7..3b5c57a73472f6 100644 --- a/test/async-hooks/test-async-local-storage-errors-sync-ret.js +++ b/test/async-hooks/test-async-local-storage-errors-sync-ret.js @@ -14,7 +14,7 @@ process.setUncaughtExceptionCaptureCallback((err) => { }); try { - asyncLocalStorage.runSyncAndReturn(() => { + asyncLocalStorage.runSyncAndReturn(new Map(), () => { const store = asyncLocalStorage.getStore(); store.set('hello', 'node'); setTimeout(() => { diff --git a/test/async-hooks/test-async-local-storage-http.js b/test/async-hooks/test-async-local-storage-http.js index 9f107148402ec5..c7514d8280df35 100644 --- a/test/async-hooks/test-async-local-storage-http.js +++ b/test/async-hooks/test-async-local-storage-http.js @@ -10,7 +10,7 @@ const server = http.createServer((req, res) => { }); server.listen(0, () => { - asyncLocalStorage.run(() => { + 
asyncLocalStorage.run(new Map(), () => { const store = asyncLocalStorage.getStore(); store.set('hello', 'world'); http.get({ host: 'localhost', port: server.address().port }, () => { diff --git a/test/async-hooks/test-async-local-storage-misc-stores.js b/test/async-hooks/test-async-local-storage-misc-stores.js new file mode 100644 index 00000000000000..56873008dd644f --- /dev/null +++ b/test/async-hooks/test-async-local-storage-misc-stores.js @@ -0,0 +1,24 @@ +'use strict'; +require('../common'); +const assert = require('assert'); +const { AsyncLocalStorage } = require('async_hooks'); + +const asyncLocalStorage = new AsyncLocalStorage(); + +asyncLocalStorage.run(42, () => { + assert.strictEqual(asyncLocalStorage.getStore(), 42); +}); + +const runStore = { foo: 'bar' }; +asyncLocalStorage.run(runStore, () => { + assert.strictEqual(asyncLocalStorage.getStore(), runStore); +}); + +asyncLocalStorage.runSyncAndReturn('hello node', () => { + assert.strictEqual(asyncLocalStorage.getStore(), 'hello node'); +}); + +const runSyncStore = { hello: 'node' }; +asyncLocalStorage.runSyncAndReturn(runSyncStore, () => { + assert.strictEqual(asyncLocalStorage.getStore(), runSyncStore); +}); diff --git a/test/async-hooks/test-async-local-storage-nested.js b/test/async-hooks/test-async-local-storage-nested.js index 38330fff607ce2..1409a8ebc82a04 100644 --- a/test/async-hooks/test-async-local-storage-nested.js +++ b/test/async-hooks/test-async-local-storage-nested.js @@ -6,9 +6,9 @@ const { AsyncLocalStorage } = require('async_hooks'); const asyncLocalStorage = new AsyncLocalStorage(); setTimeout(() => { - asyncLocalStorage.run(() => { + asyncLocalStorage.run(new Map(), () => { const asyncLocalStorage2 = new AsyncLocalStorage(); - asyncLocalStorage2.run(() => { + asyncLocalStorage2.run(new Map(), () => { const store = asyncLocalStorage.getStore(); const store2 = asyncLocalStorage2.getStore(); store.set('hello', 'world'); diff --git a/test/async-hooks/test-async-local-storage-no-mix-contexts.js b/test/async-hooks/test-async-local-storage-no-mix-contexts.js index 561df546d4aa45..3a6b352c94ceee 100644 --- a/test/async-hooks/test-async-local-storage-no-mix-contexts.js +++ b/test/async-hooks/test-async-local-storage-no-mix-contexts.js @@ -7,8 +7,8 @@ const asyncLocalStorage = new AsyncLocalStorage(); const asyncLocalStorage2 = new AsyncLocalStorage(); setTimeout(() => { - asyncLocalStorage.run(() => { - asyncLocalStorage2.run(() => { + asyncLocalStorage.run(new Map(), () => { + asyncLocalStorage2.run(new Map(), () => { const store = asyncLocalStorage.getStore(); const store2 = asyncLocalStorage2.getStore(); store.set('hello', 'world'); @@ -28,7 +28,7 @@ setTimeout(() => { }, 100); setTimeout(() => { - asyncLocalStorage.run(() => { + asyncLocalStorage.run(new Map(), () => { const store = asyncLocalStorage.getStore(); store.set('hello', 'earth'); setTimeout(() => { diff --git a/test/async-hooks/test-async-local-storage-promises.js b/test/async-hooks/test-async-local-storage-promises.js index 3b05d0f1981a3c..0e4968534bc3e2 100644 --- a/test/async-hooks/test-async-local-storage-promises.js +++ b/test/async-hooks/test-async-local-storage-promises.js @@ -12,7 +12,7 @@ async function main() { throw err; }); await new Promise((resolve, reject) => { - asyncLocalStorage.run(() => { + asyncLocalStorage.run(new Map(), () => { const store = asyncLocalStorage.getStore(); store.set('a', 1); next().then(resolve, reject); From d0a00711f89489af94299fa58c814817bb924383 Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Fri, 22 Nov 2019 
19:13:00 +0100 Subject: [PATCH 68/91] stream: invoke buffered write callbacks on error Refs: https://github.com/nodejs/node/pull/30596 Buffered write callbacks were only invoked upon error if `autoDestroy` was invoked. Backport-PR-URL: https://github.com/nodejs/node/pull/31179 PR-URL: https://github.com/nodejs/node/pull/30596 Reviewed-By: Matteo Collina Reviewed-By: Anna Henningsen Reviewed-By: Luigi Pinca Reviewed-By: Rich Trott --- lib/_stream_writable.js | 32 +++++++++++--- test/parallel/test-stream-writable-destroy.js | 43 +++++++++++++++++++ 2 files changed, 69 insertions(+), 6 deletions(-) diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js index 102fbcd98d6109..d2c109c302b40c 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -459,6 +459,11 @@ function onwriteError(stream, state, er, cb) { --state.pendingcb; cb(er); + // Ensure callbacks are invoked even when autoDestroy is + // not enabled. Passing `er` here doesn't make sense since + // it's related to one specific write, not to the buffered + // writes. + errorBuffer(state, new ERR_STREAM_DESTROYED('write')); // This can emit error, but error must always follow cb. errorOrDestroy(stream, er); } @@ -530,9 +535,29 @@ function afterWrite(stream, state, count, cb) { cb(); } + if (state.destroyed) { + errorBuffer(state, new ERR_STREAM_DESTROYED('write')); + } + finishMaybe(stream, state); } +// If there's something in the buffer waiting, then invoke callbacks. +function errorBuffer(state, err) { + if (state.writing || !state.bufferedRequest) { + return; + } + + for (let entry = state.bufferedRequest; entry; entry = entry.next) { + const len = state.objectMode ? 1 : entry.chunk.length; + state.length -= len; + entry.callback(err); + } + state.bufferedRequest = null; + state.lastBufferedRequest = null; + state.bufferedRequestCount = 0; +} + // If there's something in the buffer waiting, then process it function clearBuffer(stream, state) { state.bufferProcessing = true; @@ -782,12 +807,7 @@ const destroy = destroyImpl.destroy; Writable.prototype.destroy = function(err, cb) { const state = this._writableState; if (!state.destroyed) { - for (let entry = state.bufferedRequest; entry; entry = entry.next) { - process.nextTick(entry.callback, new ERR_STREAM_DESTROYED('write')); - } - state.bufferedRequest = null; - state.lastBufferedRequest = null; - state.bufferedRequestCount = 0; + process.nextTick(errorBuffer, state, new ERR_STREAM_DESTROYED('write')); } destroy.call(this, err, cb); return this; diff --git a/test/parallel/test-stream-writable-destroy.js b/test/parallel/test-stream-writable-destroy.js index 4a2c7b0884b417..d67bdae3bba36e 100644 --- a/test/parallel/test-stream-writable-destroy.js +++ b/test/parallel/test-stream-writable-destroy.js @@ -292,3 +292,46 @@ const assert = require('assert'); })); write.uncork(); } + +{ + // Call buffered write callback with error + + const write = new Writable({ + write(chunk, enc, cb) { + process.nextTick(cb, new Error('asd')); + }, + autoDestroy: false + }); + write.cork(); + write.write('asd', common.mustCall((err) => { + assert.strictEqual(err.message, 'asd'); + })); + write.write('asd', common.mustCall((err) => { + assert.strictEqual(err.code, 'ERR_STREAM_DESTROYED'); + })); + write.on('error', common.mustCall((err) => { + assert.strictEqual(err.message, 'asd'); + })); + write.uncork(); +} + +{ + // Ensure callback order. 
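+  // The first write's callback must run normally before the second,
+  // still-buffered write is failed with ERR_STREAM_DESTROYED by the
+  // destroy() call below.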
+ + let state = 0; + const write = new Writable({ + write(chunk, enc, cb) { + // `setImmediate()` is used on purpose to ensure the callback is called + // after `process.nextTick()` callbacks. + setImmediate(cb); + } + }); + write.write('asd', common.mustCall(() => { + assert.strictEqual(state++, 0); + })); + write.write('asd', common.mustCall((err) => { + assert.strictEqual(err.code, 'ERR_STREAM_DESTROYED'); + assert.strictEqual(state++, 1); + })); + write.destroy(); +} From 6a35b0d102440e69d3f7871b9d6f974f34921b9d Mon Sep 17 00:00:00 2001 From: Shelley Vohr Date: Fri, 21 Feb 2020 17:30:02 -0800 Subject: [PATCH 69/91] src: don't run bootstrapper in CreateEnvironment PR-URL: https://github.com/nodejs/node/pull/31910 Reviewed-By: Anna Henningsen Reviewed-By: Joyee Cheung --- src/api/environment.cc | 17 +--------------- test/cctest/test_environment.cc | 35 +++++++++++++++++---------------- 2 files changed, 19 insertions(+), 33 deletions(-) diff --git a/src/api/environment.cc b/src/api/environment.cc index 0096b498e1c511..560845da65d93d 100644 --- a/src/api/environment.cc +++ b/src/api/environment.cc @@ -353,23 +353,8 @@ Environment* CreateEnvironment(IsolateData* isolate_data, Environment::kOwnsProcessState | Environment::kOwnsInspector)); env->InitializeLibuv(per_process::v8_is_profiling); - if (env->RunBootstrapping().IsEmpty()) { + if (env->RunBootstrapping().IsEmpty()) return nullptr; - } - - std::vector> parameters = { - env->require_string(), - FIXED_ONE_BYTE_STRING(env->isolate(), "markBootstrapComplete")}; - std::vector> arguments = { - env->native_module_require(), - env->NewFunctionTemplate(MarkBootstrapComplete) - ->GetFunction(env->context()) - .ToLocalChecked()}; - if (ExecuteBootstrapper( - env, "internal/bootstrap/environment", ¶meters, &arguments) - .IsEmpty()) { - return nullptr; - } return env; } diff --git a/test/cctest/test_environment.cc b/test/cctest/test_environment.cc index 132f7b44f7db62..90c5cff5e09bf4 100644 --- a/test/cctest/test_environment.cc +++ b/test/cctest/test_environment.cc @@ -32,23 +32,24 @@ class EnvironmentTest : public EnvironmentTestFixture { } }; -TEST_F(EnvironmentTest, PreExeuctionPreparation) { - const v8::HandleScope handle_scope(isolate_); - const Argv argv; - Env env {handle_scope, argv}; - - v8::Local context = isolate_->GetCurrentContext(); - - const char* run_script = "process.argv0"; - v8::Local script = v8::Script::Compile( - context, - v8::String::NewFromOneByte(isolate_, - reinterpret_cast(run_script), - v8::NewStringType::kNormal).ToLocalChecked()) - .ToLocalChecked(); - v8::Local result = script->Run(context).ToLocalChecked(); - CHECK(result->IsString()); -} +// TODO(codebytere): re-enable this test. 
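+// (Presumably because the assertion below relies on `process.argv0`, which
+// is populated by the bootstrap steps that CreateEnvironment() no longer
+// runs.)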
+// TEST_F(EnvironmentTest, PreExeuctionPreparation) { +// const v8::HandleScope handle_scope(isolate_); +// const Argv argv; +// Env env {handle_scope, argv}; + +// v8::Local context = isolate_->GetCurrentContext(); + +// const char* run_script = "process.argv0"; +// v8::Local script = v8::Script::Compile( +// context, +// v8::String::NewFromOneByte(isolate_, +// reinterpret_cast(run_script), +// v8::NewStringType::kNormal).ToLocalChecked()) +// .ToLocalChecked(); +// v8::Local result = script->Run(context).ToLocalChecked(); +// CHECK(result->IsString()); +// } TEST_F(EnvironmentTest, AtExitWithEnvironment) { const v8::HandleScope handle_scope(isolate_); From 9e3e6763fa411257f3ebf76929be63434b71a99f Mon Sep 17 00:00:00 2001 From: bcoe Date: Sun, 23 Feb 2020 19:19:15 -0800 Subject: [PATCH 70/91] module: port source map sort logic from chromium Digging in to the delta between V8's source map library, and chromium's the most significant difference that jumped out at me was that we were failing to sort generated columns. Since negative offsets are not restricted in the spec, this can lead to bugs. fixes: #31286 PR-URL: https://github.com/nodejs/node/pull/31927 Fixes: https://github.com/nodejs/node/issues/31286 Reviewed-By: Joyee Cheung Reviewed-By: Rich Trott --- lib/internal/source_map/source_map.js | 21 +++++++++++++++++++-- test/parallel/test-source-map-api.js | 25 +++++++++++++++++++++++++ 2 files changed, 44 insertions(+), 2 deletions(-) diff --git a/lib/internal/source_map/source_map.js b/lib/internal/source_map/source_map.js index c440dffdf81913..acff068be2a6e7 100644 --- a/lib/internal/source_map/source_map.js +++ b/lib/internal/source_map/source_map.js @@ -152,10 +152,12 @@ class SourceMap { * @param {SourceMapV3} mappingPayload */ #parseMappingPayload = () => { - if (this.#payload.sections) + if (this.#payload.sections) { this.#parseSections(this.#payload.sections); - else + } else { this.#parseMap(this.#payload, 0, 0); + } + this.#mappings.sort(compareSourceMapEntry); } /** @@ -321,6 +323,21 @@ function cloneSourceMapV3(payload) { return payload; } +/** + * @param {Array} entry1 source map entry [lineNumber, columnNumber, sourceURL, + * sourceLineNumber, sourceColumnNumber] + * @param {Array} entry2 source map entry. + * @return {number} + */ +function compareSourceMapEntry(entry1, entry2) { + const [lineNumber1, columnNumber1] = entry1; + const [lineNumber2, columnNumber2] = entry2; + if (lineNumber1 !== lineNumber2) { + return lineNumber1 - lineNumber2; + } + return columnNumber1 - columnNumber2; +} + module.exports = { SourceMap }; diff --git a/test/parallel/test-source-map-api.js b/test/parallel/test-source-map-api.js index 2bfbc08809e9a1..60bbb661e1c801 100644 --- a/test/parallel/test-source-map-api.js +++ b/test/parallel/test-source-map-api.js @@ -124,3 +124,28 @@ const { readFileSync } = require('fs'); assert.strictEqual(originalColumn, knownDecodings[column]); } } + +// Test that generated columns are sorted when a negative offset is +// observed, see: https://github.com/mozilla/source-map/pull/92 +{ + function makeMinimalMap(generatedColumns, originalColumns) { + return { + sources: ['test.js'], + // Mapping from the 0th line, ${g}th column of the output file to the 0th + // source file, 0th line, ${column}th column. 
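+      // Each `${g}AA${column}` segment holds four base64 VLQ fields:
+      // [generated column, source index, source line, source column],
+      // where 'A' encodes 0.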
+ mappings: generatedColumns.map((g, i) => `${g}AA${originalColumns[i]}`) + .join(',') + }; + } + // U = 10 + // F = -2 + // A = 0 + // E = 2 + const sourceMap = new SourceMap(makeMinimalMap( + ['U', 'F', 'F'], + ['A', 'E', 'E'] + )); + assert.strictEqual(sourceMap.findEntry(0, 6).originalColumn, 4); + assert.strictEqual(sourceMap.findEntry(0, 8).originalColumn, 2); + assert.strictEqual(sourceMap.findEntry(0, 10).originalColumn, 0); +} From cef550205571de1d1b61ad9ab56fac6970d78188 Mon Sep 17 00:00:00 2001 From: Rusty Conover Date: Mon, 27 Jan 2020 14:03:39 -0500 Subject: [PATCH 71/91] test: remove sequential/test-https-keep-alive-large-write.js Remove a test that made a flawed assumption that a single large buffer write can be interrupted by a timeout event. PR-URL: https://github.com/nodejs/node/pull/31499 Reviewed-By: Anna Henningsen Reviewed-By: Ben Noordhuis Reviewed-By: David Carlier Reviewed-By: Rich Trott Reviewed-By: James M Snell --- .../test-https-keep-alive-large-write.js | 47 ------------------- 1 file changed, 47 deletions(-) delete mode 100644 test/sequential/test-https-keep-alive-large-write.js diff --git a/test/sequential/test-https-keep-alive-large-write.js b/test/sequential/test-https-keep-alive-large-write.js deleted file mode 100644 index 79381ba8735756..00000000000000 --- a/test/sequential/test-https-keep-alive-large-write.js +++ /dev/null @@ -1,47 +0,0 @@ -'use strict'; -const common = require('../common'); -if (!common.hasCrypto) - common.skip('missing crypto'); -const fixtures = require('../common/fixtures'); -const https = require('https'); - -// This test assesses whether long-running writes can complete -// or timeout because the socket is not aware that the backing -// stream is still writing. - -const writeSize = 30000000; -let socket; - -const server = https.createServer({ - key: fixtures.readKey('agent1-key.pem'), - cert: fixtures.readKey('agent1-cert.pem') -}, common.mustCall((req, res) => { - const content = Buffer.alloc(writeSize, 0x44); - - res.writeHead(200, { - 'Content-Type': 'application/octet-stream', - 'Content-Length': content.length.toString(), - 'Vary': 'Accept-Encoding' - }); - - socket = res.socket; - const onTimeout = socket._onTimeout; - socket._onTimeout = common.mustCallAtLeast(() => onTimeout.call(socket), 1); - res.write(content); - res.end(); -})); -server.on('timeout', common.mustNotCall()); - -server.listen(0, common.mustCall(() => { - https.get({ - path: '/', - port: server.address().port, - rejectUnauthorized: false - }, (res) => { - res.once('data', () => { - socket._onTimeout(); - res.on('data', () => {}); - }); - res.on('end', () => server.close()); - }); -})); From b6d33f671a452b1aebcc09adf4c3b9dba29da3c7 Mon Sep 17 00:00:00 2001 From: Rusty Conover Date: Tue, 25 Feb 2020 23:12:51 -0500 Subject: [PATCH 72/91] test: change test to not be sensitive to buffer send size Change the test to not be sensitive to the buffer size causing TCP resets to be received by the client causing the test to fail. The test now reads the entire expected buffer and then checks for the expected event to fire. 
PR-URL: https://github.com/nodejs/node/pull/31499 Reviewed-By: Anna Henningsen Reviewed-By: Ben Noordhuis Reviewed-By: David Carlier Reviewed-By: Rich Trott Reviewed-By: James M Snell --- .../test-tls-close-event-after-write.js | 17 +++++++++++------ 1 file changed, 11 insertions(+), 6 deletions(-) diff --git a/test/parallel/test-tls-close-event-after-write.js b/test/parallel/test-tls-close-event-after-write.js index 31ebc897b14758..57c79e2e5ab72d 100644 --- a/test/parallel/test-tls-close-event-after-write.js +++ b/test/parallel/test-tls-close-event-after-write.js @@ -12,23 +12,22 @@ const tls = require('tls'); const fixtures = require('../common/fixtures'); let cconn = null; let sconn = null; +let read_len = 0; +const buffer_size = 1024 * 1024; function test() { if (cconn && sconn) { cconn.resume(); sconn.resume(); - sconn.end(Buffer.alloc(1024 * 1024)); - cconn.end(); + sconn.end(Buffer.alloc(buffer_size)); } } const server = tls.createServer({ key: fixtures.readKey('agent1-key.pem'), cert: fixtures.readKey('agent1-cert.pem') -}, function(c) { - c.on('close', function() { - server.close(); - }); +}, (c) => { + c.on('close', common.mustCall(() => server.close())); sconn = c; test(); }).listen(0, common.mustCall(function() { @@ -36,6 +35,12 @@ const server = tls.createServer({ rejectUnauthorized: false }, common.mustCall(function() { cconn = this; + cconn.on('data', (d) => { + read_len += d.length; + if (read_len === buffer_size) { + cconn.end(); + } + }); test(); })); })); From 2c0b249098f3c1d0db2b0b02de20366fb9e9b5f8 Mon Sep 17 00:00:00 2001 From: Rusty Conover Date: Fri, 24 Jan 2020 12:44:26 -0500 Subject: [PATCH 73/91] tls: reduce memory copying and number of BIO buffer allocations Avoid copying buffers before passing to SSL_write if there are zero length buffers involved. Only copy the data when the buffer has a non zero length. Send a memory allocation hint to the crypto BIO about how much memory will likely be needed to be allocated by the next call to SSL_write. This makes a single allocation rather than the BIO allocating a buffer for each 16k TLS segment written. This solves a problem with large buffers written over TLS triggering V8's GC. PR-URL: https://github.com/nodejs/node/pull/31499 Reviewed-By: Anna Henningsen Reviewed-By: Ben Noordhuis Reviewed-By: David Carlier Reviewed-By: Rich Trott Reviewed-By: James M Snell --- benchmark/tls/throughput.js | 2 +- src/node_crypto_bio.cc | 7 +++++++ src/node_crypto_bio.h | 16 ++++++++++++++++ src/tls_wrap.cc | 30 ++++++++++++++++++++++++++---- 4 files changed, 50 insertions(+), 5 deletions(-) diff --git a/benchmark/tls/throughput.js b/benchmark/tls/throughput.js index a8f2d19649d04a..727d20e460008d 100644 --- a/benchmark/tls/throughput.js +++ b/benchmark/tls/throughput.js @@ -3,7 +3,7 @@ const common = require('../common.js'); const bench = common.createBenchmark(main, { dur: [5], type: ['buf', 'asc', 'utf'], - size: [2, 1024, 1024 * 1024] + size: [2, 1024, 1024 * 1024, 4 * 1024 * 1024, 16 * 1024 * 1024] }); const fixtures = require('../../test/common/fixtures'); diff --git a/src/node_crypto_bio.cc b/src/node_crypto_bio.cc index fc143043ba56b1..55f5e8a5a37650 100644 --- a/src/node_crypto_bio.cc +++ b/src/node_crypto_bio.cc @@ -438,6 +438,13 @@ void NodeBIO::TryAllocateForWrite(size_t hint) { kThroughputBufferLength; if (len < hint) len = hint; + + // If there is a one time allocation size hint, use it. 
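+  // Consuming the hint here means one Buffer sized for the whole pending
+  // SSL_write() is allocated up front, instead of one per 16 kB TLS record.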
+  if (allocate_hint_ > len) {
+    len = allocate_hint_;
+    allocate_hint_ = 0;
+  }
+
   Buffer* next = new Buffer(env_, len);
 
   if (w == nullptr) {
diff --git a/src/node_crypto_bio.h b/src/node_crypto_bio.h
index 5de943806a9642..333a50848c7efd 100644
--- a/src/node_crypto_bio.h
+++ b/src/node_crypto_bio.h
@@ -96,6 +96,21 @@ class NodeBIO : public MemoryRetainer {
     return length_;
   }
 
+  // Provide a hint about the size of the next pending set of writes. TLS
+  // writes records of a maximum length of 16k of data plus a 5-byte header,
+  // a MAC (up to 20 bytes for SSLv3, TLS 1.0, TLS 1.1, and up to 32 bytes
+  // for TLS 1.2), and padding if a block cipher is used. If there is a
+  // large write this will result in potentially many buffers being
+  // allocated and gc'ed which can cause long pauses. By providing a
+  // guess about the amount of buffer space that will be needed in the
+  // next allocation this overhead is removed.
+  inline void set_allocate_tls_hint(size_t size) {
+    constexpr size_t kThreshold = 16 * 1024;
+    if (size >= kThreshold) {
+      allocate_hint_ = (size / kThreshold + 1) * (kThreshold + 5 + 32);
+    }
+  }
+
   inline void set_eof_return(int num) {
     eof_return_ = num;
   }
@@ -164,6 +179,7 @@
   Environment* env_ = nullptr;
   size_t initial_ = kInitialBufferLength;
   size_t length_ = 0;
+  size_t allocate_hint_ = 0;
   int eof_return_ = -1;
   Buffer* read_head_ = nullptr;
   Buffer* write_head_ = nullptr;
diff --git a/src/tls_wrap.cc b/src/tls_wrap.cc
index 82274fde6db0c1..2f8da61f647f44 100644
--- a/src/tls_wrap.cc
+++ b/src/tls_wrap.cc
@@ -587,6 +587,7 @@ void TLSWrap::ClearIn() {
   AllocatedBuffer data = std::move(pending_cleartext_input_);
   crypto::MarkPopErrorOnReturn mark_pop_error_on_return;
 
+  crypto::NodeBIO::FromBIO(enc_out_)->set_allocate_tls_hint(data.size());
   int written = SSL_write(ssl_.get(), data.data(), data.size());
   Debug(this, "Writing %zu bytes, written = %d", data.size(), written);
   CHECK(written == -1 || written == static_cast<int>(data.size()));
@@ -701,8 +702,15 @@ int TLSWrap::DoWrite(WriteWrap* w,
 
   size_t length = 0;
   size_t i;
-  for (i = 0; i < count; i++)
+  size_t nonempty_i = 0;
+  size_t nonempty_count = 0;
+  for (i = 0; i < count; i++) {
     length += bufs[i].len;
+    if (bufs[i].len > 0) {
+      nonempty_i = i;
+      nonempty_count += 1;
+    }
+  }
 
   // We want to trigger a Write() on the underlying stream to drive the stream
   // system, but don't want to encrypt empty buffers into a TLS frame, so see
@@ -747,20 +755,34 @@ int TLSWrap::DoWrite(WriteWrap* w,
 
   crypto::MarkPopErrorOnReturn mark_pop_error_on_return;
 
   int written = 0;
-  if (count != 1) {
+
+  // It is common for zero length buffers to be written,
+  // don't copy data if there is one buffer with data
+  // and one or more zero length buffers.
+  // _http_outgoing.js writes a zero length buffer
+  // in OutgoingMessage.prototype.end. If there was a large amount
+  // of data supplied to end(), there is no sense allocating
+  // and copying it when it could just be used.
+ + if (nonempty_count != 1) { data = env()->AllocateManaged(length); size_t offset = 0; for (i = 0; i < count; i++) { memcpy(data.data() + offset, bufs[i].base, bufs[i].len); offset += bufs[i].len; } + + crypto::NodeBIO::FromBIO(enc_out_)->set_allocate_tls_hint(length); written = SSL_write(ssl_.get(), data.data(), length); } else { // Only one buffer: try to write directly, only store if it fails - written = SSL_write(ssl_.get(), bufs[0].base, bufs[0].len); + uv_buf_t* buf = &bufs[nonempty_i]; + crypto::NodeBIO::FromBIO(enc_out_)->set_allocate_tls_hint(buf->len); + written = SSL_write(ssl_.get(), buf->base, buf->len); + if (written == -1) { data = env()->AllocateManaged(length); - memcpy(data.data(), bufs[0].base, bufs[0].len); + memcpy(data.data(), buf->base, buf->len); } } From 4d05508aa82b1caf595109a4a3399b2ca5c8945b Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Tobias=20Nie=C3=9Fen?= Date: Mon, 24 Feb 2020 09:25:03 -0400 Subject: [PATCH 74/91] crypto: turn impossible DH errors into assertions PR-URL: https://github.com/nodejs/node/pull/31934 Reviewed-By: Colin Ihrig Reviewed-By: James M Snell Reviewed-By: David Carlier Reviewed-By: Ruben Bridgewater Reviewed-By: Luigi Pinca Reviewed-By: Shelley Vohr Reviewed-By: Anna Henningsen Reviewed-By: Rich Trott --- src/node_crypto.cc | 12 ++---------- 1 file changed, 2 insertions(+), 10 deletions(-) diff --git a/src/node_crypto.cc b/src/node_crypto.cc index a8a067086f62d8..ea719d3c5c9c05 100644 --- a/src/node_crypto.cc +++ b/src/node_crypto.cc @@ -5938,11 +5938,7 @@ void DiffieHellman::ComputeSecret(const FunctionCallbackInfo& args) { ClearErrorOnReturn clear_error_on_return; - if (args.Length() == 0) { - return THROW_ERR_MISSING_ARGS( - env, "Other party's public key argument is mandatory"); - } - + CHECK_EQ(args.Length(), 1); THROW_AND_RETURN_IF_NOT_BUFFER(env, args[0], "Other party's public key"); ArrayBufferViewContents key_buf(args[0].As()); BignumPointer key(BN_bin2bn(key_buf.data(), key_buf.length(), nullptr)); @@ -5993,11 +5989,7 @@ void DiffieHellman::SetKey(const FunctionCallbackInfo& args, char errmsg[64]; - if (args.Length() == 0) { - snprintf(errmsg, sizeof(errmsg), "%s argument is mandatory", what); - return THROW_ERR_MISSING_ARGS(env, errmsg); - } - + CHECK_EQ(args.Length(), 1); if (!Buffer::HasInstance(args[0])) { snprintf(errmsg, sizeof(errmsg), "%s must be a buffer", what); return THROW_ERR_INVALID_ARG_TYPE(env, errmsg); From 8ad64b8e53bf28b3c764499716de671a5a9dfced Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Mon, 6 Jan 2020 15:03:33 +0100 Subject: [PATCH 75/91] stream: support passing generator functions into pipeline() Backport-PR-URL: https://github.com/nodejs/node/pull/31975 PR-URL: https://github.com/nodejs/node/pull/31223 Reviewed-By: Matteo Collina Reviewed-By: Benjamin Gruenbaum Reviewed-By: Rich Trott Reviewed-By: James M Snell Reviewed-By: Anna Henningsen --- doc/api/stream.md | 52 +++- lib/internal/streams/pipeline.js | 213 ++++++++++++-- test/parallel/test-stream-pipeline.js | 397 ++++++++++++++++++++++++++ tools/doc/type-parser.js | 2 + 4 files changed, 633 insertions(+), 31 deletions(-) diff --git a/doc/api/stream.md b/doc/api/stream.md index 92dfe1a42a39bb..1a2d14f865f5e5 100644 --- a/doc/api/stream.md +++ b/doc/api/stream.md @@ -1555,17 +1555,30 @@ const cleanup = finished(rs, (err) => { }); ``` -### `stream.pipeline(...streams, callback)` +### `stream.pipeline(source, ...transforms, destination, callback)` - -* `...streams` {Stream} Two or more streams to pipe between. 
+changes:
+  - version: REPLACEME
+    pr-url: https://github.com/nodejs/node/pull/31223
+    description: Add support for async generators.
+-->
+
+* `source` {Stream|Iterable|AsyncIterable|Function}
+  * Returns: {Iterable|AsyncIterable}
+* `...transforms` {Stream|Function}
+  * `source` {AsyncIterable}
+  * Returns: {AsyncIterable}
+* `destination` {Stream|Function}
+  * `source` {AsyncIterable}
+  * Returns: {AsyncIterable|Promise}
 * `callback` {Function} Called when the pipeline is fully done.
   * `err` {Error}
+  * `val` Resolved value of `Promise` returned by `destination`.
+* Returns: {Stream}
 
-A module method to pipe between streams forwarding errors and properly cleaning
-up and provide a callback when the pipeline is complete.
+A module method to pipe between streams and generators, forwarding errors,
+properly cleaning up, and providing a callback when the pipeline is complete.
 
 ```js
 const { pipeline } = require('stream');
@@ -1608,6 +1621,28 @@ async function run() {
 
 run().catch(console.error);
 ```
 
+The `pipeline` API also supports async generators:
+
+```js
+const pipeline = util.promisify(stream.pipeline);
+const fs = require('fs');
+
+async function run() {
+  await pipeline(
+    fs.createReadStream('lowercase.txt'),
+    async function* (source) {
+      for await (const chunk of source) {
+        yield String(chunk).toUpperCase();
+      }
+    },
+    fs.createWriteStream('uppercase.txt')
+  );
+  console.log('Pipeline succeeded.');
+}
+
+run().catch(console.error);
+```
+
 `stream.pipeline()` will call `stream.destroy(err)` on all streams except:
 * `Readable` streams which have emitted `'end'` or `'close'`.
 * `Writable` streams which have emitted `'finish'` or `'close'`.
@@ -2707,8 +2742,7 @@ const pipeline = util.promisify(stream.pipeline);
 const writable = fs.createWriteStream('./file');
 
 (async function() {
-  const readable = Readable.from(iterable);
-  await pipeline(readable, writable);
+  await pipeline(iterable, writable);
 })();
 ```
 
@@ -2843,7 +2877,7 @@ contain multi-byte characters.
[`stream.cork()`]: #stream_writable_cork [`stream.finished()`]: #stream_stream_finished_stream_options_callback [`stream.pipe()`]: #stream_readable_pipe_destination_options -[`stream.pipeline()`]: #stream_stream_pipeline_streams_callback +[`stream.pipeline()`]: #stream_stream_pipeline_source_transforms_destination_callback [`stream.uncork()`]: #stream_writable_uncork [`stream.unpipe()`]: #stream_readable_unpipe_destination [`stream.wrap()`]: #stream_readable_wrap_stream diff --git a/lib/internal/streams/pipeline.js b/lib/internal/streams/pipeline.js index 92a91c30171af1..e0834171bfb8fc 100644 --- a/lib/internal/streams/pipeline.js +++ b/lib/internal/streams/pipeline.js @@ -5,21 +5,37 @@ const { ArrayIsArray, + SymbolAsyncIterator, + SymbolIterator } = primordials; let eos; const { once } = require('internal/util'); const { + ERR_INVALID_ARG_TYPE, + ERR_INVALID_RETURN_VALUE, ERR_INVALID_CALLBACK, ERR_MISSING_ARGS, ERR_STREAM_DESTROYED } = require('internal/errors').codes; +let EE; +let PassThrough; +let createReadableStreamAsyncIterator; + function isRequest(stream) { return stream && stream.setHeader && typeof stream.abort === 'function'; } +function destroyStream(stream, err) { + // request.destroy just do .end - .abort is what we want + if (isRequest(stream)) return stream.abort(); + if (isRequest(stream.req)) return stream.req.abort(); + if (typeof stream.destroy === 'function') return stream.destroy(err); + if (typeof stream.close === 'function') return stream.close(); +} + function destroyer(stream, reading, writing, callback) { callback = once(callback); @@ -41,19 +57,12 @@ function destroyer(stream, reading, writing, callback) { if (destroyed) return; destroyed = true; - // request.destroy just do .end - .abort is what we want - if (isRequest(stream)) return stream.abort(); - if (isRequest(stream.req)) return stream.req.abort(); - if (typeof stream.destroy === 'function') return stream.destroy(err); + destroyStream(stream, err); callback(err || new ERR_STREAM_DESTROYED('pipe')); }; } -function pipe(from, to) { - return from.pipe(to); -} - function popCallback(streams) { // Streams should never be an empty array. It should always contain at least // a single stream. Therefore optimize for the average case instead of @@ -63,8 +72,89 @@ function popCallback(streams) { return streams.pop(); } +function isPromise(obj) { + return !!(obj && typeof obj.then === 'function'); +} + +function isReadable(obj) { + return !!(obj && typeof obj.pipe === 'function'); +} + +function isWritable(obj) { + return !!(obj && typeof obj.write === 'function'); +} + +function isStream(obj) { + return isReadable(obj) || isWritable(obj); +} + +function isIterable(obj, isAsync) { + if (!obj) return false; + if (isAsync === true) return typeof obj[SymbolAsyncIterator] === 'function'; + if (isAsync === false) return typeof obj[SymbolIterator] === 'function'; + return typeof obj[SymbolAsyncIterator] === 'function' || + typeof obj[SymbolIterator] === 'function'; +} + +function makeAsyncIterable(val) { + if (isIterable(val)) { + return val; + } else if (isReadable(val)) { + // Legacy streams are not Iterable. 
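+    // Wrapping the stream in an async generator lets callers consume it
+    // with for await...of; _fromReadable() also destroys the stream once
+    // iteration finishes (see its finally block).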
+ return _fromReadable(val); + } else { + throw new ERR_INVALID_ARG_TYPE( + 'val', ['Readable', 'Iterable', 'AsyncIterable'], val); + } +} + +async function* _fromReadable(val) { + if (!createReadableStreamAsyncIterator) { + createReadableStreamAsyncIterator = + require('internal/streams/async_iterator'); + } + + try { + if (typeof val.read !== 'function') { + // createReadableStreamAsyncIterator does not support + // v1 streams. Convert it into a v2 stream. + + if (!PassThrough) { + PassThrough = require('_stream_passthrough'); + } + + const pt = new PassThrough(); + val + .on('error', (err) => pt.destroy(err)) + .pipe(pt); + yield* createReadableStreamAsyncIterator(pt); + } else { + yield* createReadableStreamAsyncIterator(val); + } + } finally { + destroyStream(val); + } +} + +async function pump(iterable, writable, finish) { + if (!EE) { + EE = require('events'); + } + try { + for await (const chunk of iterable) { + if (!writable.write(chunk)) { + if (writable.destroyed) return; + await EE.once(writable, 'drain'); + } + } + writable.end(); + } catch (err) { + finish(err); + } +} + function pipeline(...streams) { - const callback = popCallback(streams); + const callback = once(popCallback(streams)); if (ArrayIsArray(streams[0])) streams = streams[0]; @@ -73,25 +163,104 @@ function pipeline(...streams) { } let error; - const destroys = streams.map(function(stream, i) { + const destroys = []; + + function finish(err, val, final) { + if (!error && err) { + error = err; + } + + if (error || final) { + for (const destroy of destroys) { + destroy(error); + } + } + + if (final) { + callback(error, val); + } + } + + function wrap(stream, reading, writing, final) { + destroys.push(destroyer(stream, reading, writing, (err) => { + finish(err, null, final); + })); + } + + let ret; + for (let i = 0; i < streams.length; i++) { + const stream = streams[i]; const reading = i < streams.length - 1; const writing = i > 0; - return destroyer(stream, reading, writing, function(err) { - if (!error) error = err; - if (err) { - for (const destroy of destroys) { - destroy(err); + + if (isStream(stream)) { + wrap(stream, reading, writing, !reading); + } + + if (i === 0) { + if (typeof stream === 'function') { + ret = stream(); + if (!isIterable(ret)) { + throw new ERR_INVALID_RETURN_VALUE( + 'Iterable, AsyncIterable or Stream', 'source', ret); } + } else if (isIterable(stream) || isReadable(stream)) { + ret = stream; + } else { + throw new ERR_INVALID_ARG_TYPE( + 'source', ['Stream', 'Iterable', 'AsyncIterable', 'Function'], + stream); } - if (reading) return; - for (const destroy of destroys) { - destroy(); + } else if (typeof stream === 'function') { + ret = makeAsyncIterable(ret); + ret = stream(ret); + + if (reading) { + if (!isIterable(ret, true)) { + throw new ERR_INVALID_RETURN_VALUE( + 'AsyncIterable', `transform[${i - 1}]`, ret); + } + } else { + if (!PassThrough) { + PassThrough = require('_stream_passthrough'); + } + + const pt = new PassThrough(); + if (isPromise(ret)) { + ret + .then((val) => { + pt.end(val); + finish(null, val, true); + }) + .catch((err) => { + finish(err, null, true); + }); + } else if (isIterable(ret, true)) { + pump(ret, pt, finish); + } else { + throw new ERR_INVALID_RETURN_VALUE( + 'AsyncIterable or Promise', 'destination', ret); + } + + ret = pt; + wrap(ret, true, false, true); } - callback(error); - }); - }); + } else if (isStream(stream)) { + if (isReadable(ret)) { + ret.pipe(stream); + } else { + ret = makeAsyncIterable(ret); + pump(ret, stream, finish); + } + ret = 
stream; + } else { + const name = reading ? `transform[${i - 1}]` : 'destination'; + throw new ERR_INVALID_ARG_TYPE( + name, ['Stream', 'Function'], ret); + } + } - return streams.reduce(pipe); + return ret; } module.exports = pipeline; diff --git a/test/parallel/test-stream-pipeline.js b/test/parallel/test-stream-pipeline.js index f6ee97ba43d053..19fc246e2bf3cd 100644 --- a/test/parallel/test-stream-pipeline.js +++ b/test/parallel/test-stream-pipeline.js @@ -516,3 +516,400 @@ const { promisify } = require('util'); }).on('error', common.mustNotCall()); }); } + +{ + let res = ''; + const w = new Writable({ + write(chunk, encoding, callback) { + res += chunk; + callback(); + } + }); + pipeline(function*() { + yield 'hello'; + yield 'world'; + }(), w, common.mustCall((err) => { + assert.ok(!err); + assert.strictEqual(res, 'helloworld'); + })); +} + +{ + let res = ''; + const w = new Writable({ + write(chunk, encoding, callback) { + res += chunk; + callback(); + } + }); + pipeline(async function*() { + await Promise.resolve(); + yield 'hello'; + yield 'world'; + }(), w, common.mustCall((err) => { + assert.ok(!err); + assert.strictEqual(res, 'helloworld'); + })); +} + +{ + let res = ''; + const w = new Writable({ + write(chunk, encoding, callback) { + res += chunk; + callback(); + } + }); + pipeline(function*() { + yield 'hello'; + yield 'world'; + }, w, common.mustCall((err) => { + assert.ok(!err); + assert.strictEqual(res, 'helloworld'); + })); +} + +{ + let res = ''; + const w = new Writable({ + write(chunk, encoding, callback) { + res += chunk; + callback(); + } + }); + pipeline(async function*() { + await Promise.resolve(); + yield 'hello'; + yield 'world'; + }, w, common.mustCall((err) => { + assert.ok(!err); + assert.strictEqual(res, 'helloworld'); + })); +} + +{ + let res = ''; + pipeline(async function*() { + await Promise.resolve(); + yield 'hello'; + yield 'world'; + }, async function*(source) { + for await (const chunk of source) { + yield chunk.toUpperCase(); + } + }, async function(source) { + for await (const chunk of source) { + res += chunk; + } + }, common.mustCall((err) => { + assert.ok(!err); + assert.strictEqual(res, 'HELLOWORLD'); + })); +} + +{ + pipeline(async function*() { + await Promise.resolve(); + yield 'hello'; + yield 'world'; + }, async function*(source) { + const ret = []; + for await (const chunk of source) { + ret.push(chunk.toUpperCase()); + } + yield ret; + }, async function(source) { + let ret = ''; + for await (const chunk of source) { + ret += chunk; + } + return ret; + }, common.mustCall((err, val) => { + assert.ok(!err); + assert.strictEqual(val, 'HELLOWORLD'); + })); +} + +{ + // AsyncIterable destination is returned and finalizes. + + const ret = pipeline(async function*() { + await Promise.resolve(); + yield 'hello'; + }, async function*(source) { + for await (const chunk of source) { + chunk; + } + }, common.mustCall((err) => { + assert.strictEqual(err, undefined); + })); + ret.resume(); + assert.strictEqual(typeof ret.pipe, 'function'); +} + +{ + // AsyncFunction destination is not returned and error is + // propagated. 
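+  // (In this block the source generator throws; the error must still
+  // reach the completion callback below.)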
+ + const ret = pipeline(async function*() { + await Promise.resolve(); + throw new Error('kaboom'); + }, async function*(source) { + for await (const chunk of source) { + chunk; + } + }, common.mustCall((err) => { + assert.strictEqual(err.message, 'kaboom'); + })); + ret.resume(); + assert.strictEqual(typeof ret.pipe, 'function'); +} + +{ + const s = new PassThrough(); + pipeline(async function*() { + throw new Error('kaboom'); + }, s, common.mustCall((err) => { + assert.strictEqual(err.message, 'kaboom'); + assert.strictEqual(s.destroyed, true); + })); +} + +{ + const s = new PassThrough(); + pipeline(async function*() { + throw new Error('kaboom'); + }(), s, common.mustCall((err) => { + assert.strictEqual(err.message, 'kaboom'); + assert.strictEqual(s.destroyed, true); + })); +} + +{ + const s = new PassThrough(); + pipeline(function*() { + throw new Error('kaboom'); + }, s, common.mustCall((err, val) => { + assert.strictEqual(err.message, 'kaboom'); + assert.strictEqual(s.destroyed, true); + })); +} + +{ + const s = new PassThrough(); + pipeline(function*() { + throw new Error('kaboom'); + }(), s, common.mustCall((err, val) => { + assert.strictEqual(err.message, 'kaboom'); + assert.strictEqual(s.destroyed, true); + })); +} + +{ + const s = new PassThrough(); + pipeline(async function*() { + await Promise.resolve(); + yield 'hello'; + yield 'world'; + }, s, async function(source) { + for await (const chunk of source) { + chunk; + throw new Error('kaboom'); + } + }, common.mustCall((err, val) => { + assert.strictEqual(err.message, 'kaboom'); + assert.strictEqual(s.destroyed, true); + })); +} + +{ + const s = new PassThrough(); + const ret = pipeline(function() { + return ['hello', 'world']; + }, s, async function*(source) { + for await (const chunk of source) { + chunk; + throw new Error('kaboom'); + } + }, common.mustCall((err) => { + assert.strictEqual(err.message, 'kaboom'); + assert.strictEqual(s.destroyed, true); + })); + ret.resume(); + assert.strictEqual(typeof ret.pipe, 'function'); +} + +{ + // Legacy streams without async iterator. + + const s = new PassThrough(); + s.push('asd'); + s.push(null); + s[Symbol.asyncIterator] = null; + let ret = ''; + pipeline(s, async function(source) { + for await (const chunk of source) { + ret += chunk; + } + }, common.mustCall((err) => { + assert.strictEqual(err, undefined); + assert.strictEqual(ret, 'asd'); + assert.strictEqual(s.destroyed, true); + })); +} + +{ + // v1 streams without read(). + + const s = new Stream(); + process.nextTick(() => { + s.emit('data', 'asd'); + s.emit('end'); + }); + s.close = common.mustCall(); + let ret = ''; + pipeline(s, async function(source) { + for await (const chunk of source) { + ret += chunk; + } + }, common.mustCall((err) => { + assert.strictEqual(err, undefined); + assert.strictEqual(ret, 'asd'); + assert.strictEqual(s.destroyed, true); + })); +} + +{ + // v1 error streams without read(). 
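+  // The 'error' is emitted asynchronously, so it has to be forwarded
+  // through the PassThrough wrapper that pipeline() sets up for v1
+  // streams (see _fromReadable in pipeline.js).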
+ + const s = new Stream(); + process.nextTick(() => { + s.emit('error', new Error('kaboom')); + }); + s.destroy = common.mustCall(); + pipeline(s, async function(source) { + }, common.mustCall((err) => { + assert.strictEqual(err.message, 'kaboom'); + })); +} + +{ + const s = new PassThrough(); + assert.throws(() => { + pipeline(function(source) { + }, s, () => {}); + }, (err) => { + assert.strictEqual(err.code, 'ERR_INVALID_RETURN_VALUE'); + assert.strictEqual(s.destroyed, false); + return true; + }); +} + +{ + const s = new PassThrough(); + assert.throws(() => { + pipeline(s, function(source) { + }, s, () => {}); + }, (err) => { + assert.strictEqual(err.code, 'ERR_INVALID_RETURN_VALUE'); + assert.strictEqual(s.destroyed, false); + return true; + }); +} + +{ + const s = new PassThrough(); + assert.throws(() => { + pipeline(s, function(source) { + }, () => {}); + }, (err) => { + assert.strictEqual(err.code, 'ERR_INVALID_RETURN_VALUE'); + assert.strictEqual(s.destroyed, false); + return true; + }); +} + +{ + const s = new PassThrough(); + assert.throws(() => { + pipeline(s, function*(source) { + }, () => {}); + }, (err) => { + assert.strictEqual(err.code, 'ERR_INVALID_RETURN_VALUE'); + assert.strictEqual(s.destroyed, false); + return true; + }); +} + +{ + let res = ''; + pipeline(async function*() { + await Promise.resolve(); + yield 'hello'; + yield 'world'; + }, new Transform({ + transform(chunk, encoding, cb) { + cb(new Error('kaboom')); + } + }), async function(source) { + for await (const chunk of source) { + res += chunk; + } + }, common.mustCall((err) => { + assert.strictEqual(err.message, 'kaboom'); + assert.strictEqual(res, ''); + })); +} + +{ + let res = ''; + pipeline(async function*() { + await Promise.resolve(); + yield 'hello'; + yield 'world'; + }, new Transform({ + transform(chunk, encoding, cb) { + process.nextTick(cb, new Error('kaboom')); + } + }), async function(source) { + for await (const chunk of source) { + res += chunk; + } + }, common.mustCall((err) => { + assert.strictEqual(err.message, 'kaboom'); + assert.strictEqual(res, ''); + })); +} + +{ + let res = ''; + pipeline(async function*() { + await Promise.resolve(); + yield 'hello'; + yield 'world'; + }, new Transform({ + decodeStrings: false, + transform(chunk, encoding, cb) { + cb(null, chunk.toUpperCase()); + } + }), async function(source) { + for await (const chunk of source) { + res += chunk; + } + }, common.mustCall((err) => { + assert.ok(!err); + assert.strictEqual(res, 'HELLOWORLD'); + })); +} + +{ + // Ensure no unhandled rejection from async function. + + pipeline(async function*() { + yield 'hello'; + }, async function(source) { + throw new Error('kaboom'); + }, common.mustCall((err) => { + assert.strictEqual(err.message, 'kaboom'); + })); +} diff --git a/tools/doc/type-parser.js b/tools/doc/type-parser.js index add331016c2204..02b59d37ffd278 100644 --- a/tools/doc/type-parser.js +++ b/tools/doc/type-parser.js @@ -28,6 +28,8 @@ const customTypesMap = { 'AsyncIterator': 'https://tc39.github.io/ecma262/#sec-asynciterator-interface', + 'AsyncIterable': 'https://tc39.github.io/ecma262/#sec-asynciterable-interface', + 'bigint': `${jsDocPrefix}Reference/Global_Objects/BigInt`, 'Iterable': From 313ecaabe5e82f48d3d8a71b234cefe5c577972c Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Fri, 21 Feb 2020 12:50:34 +0100 Subject: [PATCH 76/91] stream: fix broken pipeline error propagation If the destination was an async function any error thrown from that function would be swallowed. 
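By way of illustration, here is a minimal sketch of the scenario this fixes (not part of the patch; the input file name is a placeholder):

```js
'use strict';
const fs = require('fs');
const { pipeline } = require('stream');

pipeline(
  fs.createReadStream('input.txt'), // placeholder source
  // The destination is a plain async function that consumes the source.
  async function(source) {
    for await (const chunk of source) {
      throw new Error('kaboom');
    }
  },
  (err) => {
    // Before this fix the thrown error could be swallowed; with it, the
    // callback observes err.message === 'kaboom'.
    console.error(err);
  }
);
```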
Backport-PR-URL: https://github.com/nodejs/node/pull/31975 PR-URL: https://github.com/nodejs/node/pull/31835 Reviewed-By: Benjamin Gruenbaum Reviewed-By: Matteo Collina Reviewed-By: Denys Otrishko --- lib/internal/streams/pipeline.js | 16 ++++++------ .../parallel/test-stream-pipeline-uncaught.js | 25 +++++++++++++++++++ test/parallel/test-stream-pipeline.js | 6 +---- 3 files changed, 34 insertions(+), 13 deletions(-) create mode 100644 test/parallel/test-stream-pipeline-uncaught.js diff --git a/lib/internal/streams/pipeline.js b/lib/internal/streams/pipeline.js index e0834171bfb8fc..fdc154c32edf5d 100644 --- a/lib/internal/streams/pipeline.js +++ b/lib/internal/streams/pipeline.js @@ -163,9 +163,10 @@ function pipeline(...streams) { } let error; + let value; const destroys = []; - function finish(err, val, final) { + function finish(err, final) { if (!error && err) { error = err; } @@ -177,13 +178,13 @@ function pipeline(...streams) { } if (final) { - callback(error, val); + callback(error, value); } } function wrap(stream, reading, writing, final) { destroys.push(destroyer(stream, reading, writing, (err) => { - finish(err, null, final); + finish(err, final); })); } @@ -229,11 +230,10 @@ function pipeline(...streams) { if (isPromise(ret)) { ret .then((val) => { + value = val; pt.end(val); - finish(null, val, true); - }) - .catch((err) => { - finish(err, null, true); + }, (err) => { + pt.destroy(err); }); } else if (isIterable(ret, true)) { pump(ret, pt, finish); @@ -243,7 +243,7 @@ function pipeline(...streams) { } ret = pt; - wrap(ret, true, false, true); + wrap(ret, false, true, true); } } else if (isStream(stream)) { if (isReadable(ret)) { diff --git a/test/parallel/test-stream-pipeline-uncaught.js b/test/parallel/test-stream-pipeline-uncaught.js new file mode 100644 index 00000000000000..90d141ec44fef1 --- /dev/null +++ b/test/parallel/test-stream-pipeline-uncaught.js @@ -0,0 +1,25 @@ +'use strict'; + +const common = require('../common'); +const { + pipeline, + PassThrough +} = require('stream'); +const assert = require('assert'); + +process.on('uncaughtException', common.mustCall((err) => { + assert.strictEqual(err.message, 'error'); +})); + +// Ensure that pipeline that ends with Promise +// still propagates error to uncaughtException. 
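+// The completion callback itself throws after a clean run, so the only
+// place that 'error' can surface is the 'uncaughtException' handler.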
+const s = new PassThrough(); +s.end('data'); +pipeline(s, async function(source) { + for await (const chunk of source) { + chunk; + } +}, common.mustCall((err) => { + assert.ifError(err); + throw new Error('error'); +})); diff --git a/test/parallel/test-stream-pipeline.js b/test/parallel/test-stream-pipeline.js index 19fc246e2bf3cd..b3d4064c6a9783 100644 --- a/test/parallel/test-stream-pipeline.js +++ b/test/parallel/test-stream-pipeline.js @@ -613,11 +613,9 @@ const { promisify } = require('util'); yield 'hello'; yield 'world'; }, async function*(source) { - const ret = []; for await (const chunk of source) { - ret.push(chunk.toUpperCase()); + yield chunk.toUpperCase(); } - yield ret; }, async function(source) { let ret = ''; for await (const chunk of source) { @@ -754,7 +752,6 @@ const { promisify } = require('util'); }, common.mustCall((err) => { assert.strictEqual(err, undefined); assert.strictEqual(ret, 'asd'); - assert.strictEqual(s.destroyed, true); })); } @@ -775,7 +772,6 @@ const { promisify } = require('util'); }, common.mustCall((err) => { assert.strictEqual(err, undefined); assert.strictEqual(ret, 'asd'); - assert.strictEqual(s.destroyed, true); })); } From 8a2b62e4cda45c853043534273bc7df6828fd25f Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Mon, 24 Feb 2020 23:38:16 +0100 Subject: [PATCH 77/91] stream: ensure pipeline always destroys streams There was an edge case where an incorrect assumption was made in regard to whether eos/finished means that the stream is actually destroyed or not. Backport-PR-URL: https://github.com/nodejs/node/pull/31975 PR-URL: https://github.com/nodejs/node/pull/31940 Reviewed-By: Matteo Collina Reviewed-By: Ruben Bridgewater Reviewed-By: Luigi Pinca --- lib/internal/streams/pipeline.js | 17 +++++------------ test/parallel/test-stream-pipeline.js | 15 ++++++++++++++- 2 files changed, 19 insertions(+), 13 deletions(-) diff --git a/lib/internal/streams/pipeline.js b/lib/internal/streams/pipeline.js index fdc154c32edf5d..df0d7bc85d366c 100644 --- a/lib/internal/streams/pipeline.js +++ b/lib/internal/streams/pipeline.js @@ -38,27 +38,20 @@ function destroyStream(stream, err) { function destroyer(stream, reading, writing, callback) { callback = once(callback); - - let closed = false; - stream.on('close', () => { - closed = true; - }); + let destroyed = false; if (eos === undefined) eos = require('internal/streams/end-of-stream'); eos(stream, { readable: reading, writable: writing }, (err) => { - if (err) return callback(err); - closed = true; - callback(); + if (destroyed) return; + destroyed = true; + destroyStream(stream, err); + callback(err); }); - let destroyed = false; return (err) => { - if (closed) return; if (destroyed) return; destroyed = true; - destroyStream(stream, err); - callback(err || new ERR_STREAM_DESTROYED('pipe')); }; } diff --git a/test/parallel/test-stream-pipeline.js b/test/parallel/test-stream-pipeline.js index b3d4064c6a9783..6bfa1331834968 100644 --- a/test/parallel/test-stream-pipeline.js +++ b/test/parallel/test-stream-pipeline.js @@ -763,7 +763,10 @@ const { promisify } = require('util'); s.emit('data', 'asd'); s.emit('end'); }); - s.close = common.mustCall(); + // 'destroyer' can be called multiple times, + // once from stream wrapper and + // once from iterator wrapper.
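+  // Hence close() is asserted with a lower bound rather than an exact
+  // count.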
+ s.close = common.mustCallAtLeast(1); let ret = ''; pipeline(s, async function(source) { for await (const chunk of source) { @@ -909,3 +912,13 @@ const { promisify } = require('util'); assert.strictEqual(err.message, 'kaboom'); })); } + +{ + const src = new PassThrough({ autoDestroy: false }); + const dst = new PassThrough({ autoDestroy: false }); + pipeline(src, dst, common.mustCall(() => { + assert.strictEqual(src.destroyed, true); + assert.strictEqual(dst.destroyed, true); + })); + src.end(); +} From e6125cd53b9c6813db8153b8079294a0833a7e3c Mon Sep 17 00:00:00 2001 From: Matheus Marchini Date: Tue, 25 Feb 2020 16:00:48 -0800 Subject: [PATCH 78/91] deps: V8: backport f7771e5b0cc4 Original commit message: [runtime] Recompute enumeration indices of dictionaries upon bitfield overflow Otherwise we'll get weird semantics when enumerating objects after many deletes/reinserts. Bug: chromium:1033771 Change-Id: If0a459169c3794a30d9632d09e80da3cfcd4302c Reviewed-on: https://chromium-review.googlesource.com/c/v8/v8/+/1993966 Commit-Queue: Toon Verwaest Reviewed-by: Jakob Kummerow Reviewed-by: Victor Gomes Cr-Commit-Position: refs/heads/master@{#65690} Refs: https://github.com/v8/v8/commit/f7771e5b0cc46ab75fe5291482a727b3f115dcba PR-URL: https://github.com/nodejs/node/pull/31957 Reviewed-By: Anna Henningsen Reviewed-By: Jiawen Geng Reviewed-By: Colin Ihrig Reviewed-By: Myles Borins --- common.gypi | 2 +- deps/v8/src/objects/dictionary-inl.h | 6 ++-- deps/v8/src/objects/dictionary.h | 15 ++++----- deps/v8/src/objects/hash-table.h | 2 +- deps/v8/src/objects/js-objects.cc | 2 +- deps/v8/src/objects/literal-objects.cc | 2 +- deps/v8/src/objects/lookup.cc | 4 +-- deps/v8/src/objects/objects.cc | 46 ++++++++++++++------------ 8 files changed, 40 insertions(+), 39 deletions(-) diff --git a/common.gypi b/common.gypi index b8a37491893c91..0f52e138ca2fe0 100644 --- a/common.gypi +++ b/common.gypi @@ -39,7 +39,7 @@ # Reset this number to 0 on major V8 upgrades. # Increment by one for each non-official patch applied to deps/v8. - 'v8_embedder_string': '-node.28', + 'v8_embedder_string': '-node.29', ##### V8 defaults for Node.js ##### diff --git a/deps/v8/src/objects/dictionary-inl.h b/deps/v8/src/objects/dictionary-inl.h index 18b2ee67a4db59..3a5b8cb3c7e201 100644 --- a/deps/v8/src/objects/dictionary-inl.h +++ b/deps/v8/src/objects/dictionary-inl.h @@ -61,13 +61,13 @@ BaseNameDictionary::BaseNameDictionary(Address ptr) : Dictionary(ptr) {} template -void BaseNameDictionary::SetNextEnumerationIndex(int index) { - DCHECK_NE(0, index); +void BaseNameDictionary::set_next_enumeration_index(int index) { + DCHECK_LT(0, index); this->set(kNextEnumerationIndexIndex, Smi::FromInt(index)); } template -int BaseNameDictionary::NextEnumerationIndex() { +int BaseNameDictionary::next_enumeration_index() { return Smi::ToInt(this->get(kNextEnumerationIndexIndex)); } diff --git a/deps/v8/src/objects/dictionary.h b/deps/v8/src/objects/dictionary.h index 35137c7d945430..eb15a77e33e63b 100644 --- a/deps/v8/src/objects/dictionary.h +++ b/deps/v8/src/objects/dictionary.h @@ -120,10 +120,6 @@ class EXPORT_TEMPLATE_DECLARE(V8_EXPORT_PRIVATE) BaseNameDictionary static const int kObjectHashIndex = kNextEnumerationIndexIndex + 1; static const int kEntryValueIndex = 1; - // Accessors for next enumeration index. 
- inline void SetNextEnumerationIndex(int index); - inline int NextEnumerationIndex(); - inline void SetHash(int hash); inline int Hash() const; @@ -138,6 +134,13 @@ class EXPORT_TEMPLATE_DECLARE(V8_EXPORT_PRIVATE) BaseNameDictionary V8_WARN_UNUSED_RESULT static ExceptionStatus CollectKeysTo( Handle dictionary, KeyAccumulator* keys); + // Allocate the next enumeration index. Possibly updates all enumeration + // indices in the table. + static int NextEnumerationIndex(Isolate* isolate, Handle dictionary); + // Accessors for next enumeration index. + inline int next_enumeration_index(); + inline void set_next_enumeration_index(int index); + // Return the key indices sorted by its enumeration index. static Handle IterationIndices(Isolate* isolate, Handle dictionary); @@ -149,10 +152,6 @@ class EXPORT_TEMPLATE_DECLARE(V8_EXPORT_PRIVATE) BaseNameDictionary Handle storage, KeyCollectionMode mode, KeyAccumulator* accumulator); - // Ensure enough space for n additional elements. - static Handle EnsureCapacity(Isolate* isolate, - Handle dictionary, int n); - V8_WARN_UNUSED_RESULT static Handle AddNoUpdateNextEnumerationIndex( Isolate* isolate, Handle dictionary, Key key, Handle value, PropertyDetails details, int* entry_out = nullptr); diff --git a/deps/v8/src/objects/hash-table.h b/deps/v8/src/objects/hash-table.h index 5cdeb0c0ec4f9d..32013e58e56a6d 100644 --- a/deps/v8/src/objects/hash-table.h +++ b/deps/v8/src/objects/hash-table.h @@ -201,7 +201,7 @@ class EXPORT_TEMPLATE_DECLARE(V8_EXPORT_PRIVATE) HashTable // Ensure enough space for n additional elements. V8_WARN_UNUSED_RESULT static Handle EnsureCapacity( - Isolate* isolate, Handle table, int n, + Isolate* isolate, Handle table, int n = 1, AllocationType allocation = AllocationType::kYoung); // Returns true if this table has sufficient capacity for adding n elements. diff --git a/deps/v8/src/objects/js-objects.cc b/deps/v8/src/objects/js-objects.cc index ea0917f18feb10..4b1d9a4c862246 100644 --- a/deps/v8/src/objects/js-objects.cc +++ b/deps/v8/src/objects/js-objects.cc @@ -2908,7 +2908,7 @@ void MigrateFastToSlow(Isolate* isolate, Handle object, } // Copy the next enumeration index from instance descriptor. - dictionary->SetNextEnumerationIndex(real_size + 1); + dictionary->set_next_enumeration_index(real_size + 1); // From here on we cannot fail and we shouldn't GC anymore. DisallowHeapAllocation no_allocation; diff --git a/deps/v8/src/objects/literal-objects.cc b/deps/v8/src/objects/literal-objects.cc index 98c41cbfb5f49f..827a8b10219290 100644 --- a/deps/v8/src/objects/literal-objects.cc +++ b/deps/v8/src/objects/literal-objects.cc @@ -363,7 +363,7 @@ class ObjectDescriptor { void Finalize(Isolate* isolate) { if (HasDictionaryProperties()) { - properties_dictionary_template_->SetNextEnumerationIndex( + properties_dictionary_template_->set_next_enumeration_index( next_enumeration_index_); computed_properties_ = FixedArray::ShrinkOrEmpty( isolate, computed_properties_, current_computed_index_); diff --git a/deps/v8/src/objects/lookup.cc b/deps/v8/src/objects/lookup.cc index 7f626cc22332e2..0700a6fc921b73 100644 --- a/deps/v8/src/objects/lookup.cc +++ b/deps/v8/src/objects/lookup.cc @@ -634,8 +634,8 @@ void LookupIterator::PrepareTransitionToDataProperty( transition_ = cell; // Assign an enumeration index to the property and update // SetNextEnumerationIndex. 
- int index = dictionary->NextEnumerationIndex(); - dictionary->SetNextEnumerationIndex(index + 1); + int index = GlobalDictionary::NextEnumerationIndex(isolate_, dictionary); + dictionary->set_next_enumeration_index(index + 1); property_details_ = PropertyDetails( kData, attributes, PropertyCellType::kUninitialized, index); PropertyCellType new_type = diff --git a/deps/v8/src/objects/objects.cc b/deps/v8/src/objects/objects.cc index 723023b707947f..1328b517e8ba01 100644 --- a/deps/v8/src/objects/objects.cc +++ b/deps/v8/src/objects/objects.cc @@ -6677,7 +6677,7 @@ void StringTable::EnsureCapacityForDeserialization(Isolate* isolate, int expected) { Handle table = isolate->factory()->string_table(); // We need a key instance for the virtual hash function. - table = StringTable::EnsureCapacity(isolate, table, expected); + table = EnsureCapacity(isolate, table, expected); isolate->heap()->SetRootStringTable(*table); } @@ -6729,7 +6729,7 @@ Handle StringTable::LookupKey(Isolate* isolate, StringTableKey* key) { table = StringTable::CautiousShrink(isolate, table); // Adding new string. Grow table if needed. - table = StringTable::EnsureCapacity(isolate, table, 1); + table = EnsureCapacity(isolate, table); isolate->heap()->SetRootStringTable(*table); return AddKeyNoResize(isolate, key); @@ -6870,7 +6870,7 @@ Handle StringSet::New(Isolate* isolate) { Handle StringSet::Add(Isolate* isolate, Handle stringset, Handle name) { if (!stringset->Has(isolate, name)) { - stringset = EnsureCapacity(isolate, stringset, 1); + stringset = EnsureCapacity(isolate, stringset); uint32_t hash = ShapeT::Hash(isolate, *name); int entry = stringset->FindInsertionEntry(hash); stringset->set(EntryToIndex(entry), *name); @@ -6888,7 +6888,7 @@ Handle ObjectHashSet::Add(Isolate* isolate, Handle key) { int32_t hash = key->GetOrCreateHash(isolate).value(); if (!set->Has(isolate, key, hash)) { - set = EnsureCapacity(isolate, set, 1); + set = EnsureCapacity(isolate, set); int entry = set->FindInsertionEntry(hash); set->set(EntryToIndex(entry), *key); set->ElementAdded(); @@ -7084,7 +7084,7 @@ Handle CompilationCacheTable::PutScript( src = String::Flatten(isolate, src); StringSharedKey key(src, shared, language_mode, kNoSourcePosition); Handle k = key.AsHandle(isolate); - cache = EnsureCapacity(isolate, cache, 1); + cache = EnsureCapacity(isolate, cache); int entry = cache->FindInsertionEntry(key.Hash()); cache->set(EntryToIndex(entry), *k); cache->set(EntryToIndex(entry) + 1, *value); @@ -7116,7 +7116,7 @@ Handle CompilationCacheTable::PutEval( } } - cache = EnsureCapacity(isolate, cache, 1); + cache = EnsureCapacity(isolate, cache); int entry = cache->FindInsertionEntry(key.Hash()); Handle k = isolate->factory()->NewNumber(static_cast(key.Hash())); @@ -7130,7 +7130,7 @@ Handle CompilationCacheTable::PutRegExp( Isolate* isolate, Handle cache, Handle src, JSRegExp::Flags flags, Handle value) { RegExpKey key(src, flags); - cache = EnsureCapacity(isolate, cache, 1); + cache = EnsureCapacity(isolate, cache); int entry = cache->FindInsertionEntry(key.Hash()); // We store the value in the key slot, and compare the search key // to the stored value with a custon IsMatch function during lookups. 
@@ -7192,15 +7192,16 @@ Handle BaseNameDictionary::New( Handle dict = Dictionary::New( isolate, at_least_space_for, allocation, capacity_option); dict->SetHash(PropertyArray::kNoHashSentinel); - dict->SetNextEnumerationIndex(PropertyDetails::kInitialIndex); + dict->set_next_enumeration_index(PropertyDetails::kInitialIndex); return dict; } template -Handle BaseNameDictionary::EnsureCapacity( - Isolate* isolate, Handle dictionary, int n) { - // Check whether there are enough enumeration indices to add n elements. - if (!PropertyDetails::IsValidIndex(dictionary->NextEnumerationIndex() + n)) { +int BaseNameDictionary::NextEnumerationIndex( + Isolate* isolate, Handle dictionary) { + int index = dictionary->next_enumeration_index(); + // Check whether the next enumeration index is valid. + if (!PropertyDetails::IsValidIndex(index)) { // If not, we generate new indices for the properties. int length = dictionary->NumberOfElements(); @@ -7221,11 +7222,12 @@ Handle BaseNameDictionary::EnsureCapacity( dictionary->DetailsAtPut(isolate, index, new_details); } - // Set the next enumeration index. - dictionary->SetNextEnumerationIndex(PropertyDetails::kInitialIndex + - length); + index = PropertyDetails::kInitialIndex + length; } - return HashTable::EnsureCapacity(isolate, dictionary, n); + + // Don't update the next enumeration index here, since we might be looking at + // an immutable empty dictionary. + return index; } template @@ -7274,13 +7276,13 @@ Handle BaseNameDictionary::Add( DCHECK_EQ(0, details.dictionary_index()); // Assign an enumeration index to the property and update // SetNextEnumerationIndex. - int index = dictionary->NextEnumerationIndex(); + int index = Derived::NextEnumerationIndex(isolate, dictionary); details = details.set_index(index); dictionary = AddNoUpdateNextEnumerationIndex(isolate, dictionary, key, value, details, entry_out); // Update enumeration index here in order to avoid potential modification of // the canonical empty dictionary which lives in read only space. - dictionary->SetNextEnumerationIndex(index + 1); + dictionary->set_next_enumeration_index(index + 1); return dictionary; } @@ -7294,7 +7296,7 @@ Handle Dictionary::Add(Isolate* isolate, // Valdate key is absent. SLOW_DCHECK((dictionary->FindEntry(isolate, key) == Dictionary::kNotFound)); // Check whether the dictionary should be extended. - dictionary = Derived::EnsureCapacity(isolate, dictionary, 1); + dictionary = Derived::EnsureCapacity(isolate, dictionary); // Compute the key object. Handle k = Shape::AsHandle(isolate, key); @@ -7644,7 +7646,7 @@ Handle ObjectHashTableBase::Put(Isolate* isolate, } // Check whether the hash table should be extended. - table = Derived::EnsureCapacity(isolate, table, 1); + table = Derived::EnsureCapacity(isolate, table); table->AddEntry(table->FindInsertionEntry(hash), *key, *value); return table; } @@ -7892,8 +7894,8 @@ Handle PropertyCell::PrepareForValue( // Preserve the enumeration index unless the property was deleted or never // initialized. 
 if (cell->value().IsTheHole(isolate)) { - index = dictionary->NextEnumerationIndex(); - dictionary->SetNextEnumerationIndex(index + 1); + index = GlobalDictionary::NextEnumerationIndex(isolate, dictionary); + dictionary->set_next_enumeration_index(index + 1); } else { index = original_details.dictionary_index(); } From 91ce69a55424c1ef1eabb211b99d1f31a30699ff Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Thu, 27 Feb 2020 19:52:54 -0800 Subject: [PATCH 79/91] meta: move Glen Keane to Collaborator Emeritus In email, Glen confirmed that he is happy to move to Emeritus. He also informed me that his handle is no longer thekemkid and is now glentiki. At his request, I've updated the handle in the README in addition to moving him to Emeritus. PR-URL: https://github.com/nodejs/node/pull/31993 Reviewed-By: Myles Borins Reviewed-By: Gireesh Punathil Reviewed-By: Colin Ihrig Reviewed-By: Luigi Pinca --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index f58b49932965e0..0e275981c33375 100644 --- a/README.md +++ b/README.md @@ -409,8 +409,6 @@ For information about the governance of the Node.js project, see **Michaël Zasso** <targos@protonmail.com> (he/him) * [thefourtheye](https://github.com/thefourtheye) - **Sakthipriyan Vairamani** <thechargingvolcano@gmail.com> (he/him) -* [thekemkid](https://github.com/thekemkid) - -**Glen Keane** <glenkeane.94@gmail.com> (he/him) * [TimothyGu](https://github.com/TimothyGu) - **Tiancheng "Timothy" Gu** <timothygu99@gmail.com> (he/him) * [tniessen](https://github.com/tniessen) - @@ -458,6 +456,8 @@ For information about the governance of the Node.js project, see **Alexander Makarenko** <estliberitas@gmail.com> * [firedfox](https://github.com/firedfox) - **Daniel Wang** <wangyang0123@gmail.com> +* [glentiki](https://github.com/glentiki) - +**Glen Keane** <glenkeane.94@gmail.com> (he/him) * [imran-iq](https://github.com/imran-iq) - **Imran Iqbal** <imran@imraniqbal.org> * [imyller](https://github.com/imyller) - From ded3890becc404de323dcc6c37ae9b73efdf7d0f Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Thu, 27 Feb 2020 19:57:56 -0800 Subject: [PATCH 80/91] meta: move maclover7 to Emeritus In email, maclover7 confirmed that they are fine to move to emeritus status.
PR-URL: https://github.com/nodejs/node/pull/31994 Reviewed-By: Myles Borins Reviewed-By: Gireesh Punathil Reviewed-By: Matheus Marchini Reviewed-By: Colin Ihrig Reviewed-By: Ruben Bridgewater Reviewed-By: Luigi Pinca --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index 0e275981c33375..3c8924b2b02b56 100644 --- a/README.md +++ b/README.md @@ -345,8 +345,6 @@ For information about the governance of the Node.js project, see **Luigi Pinca** <luigipinca@gmail.com> (he/him) * [lundibundi](https://github.com/lundibundi) - **Denys Otrishko** <shishugi@gmail.com> (he/him) -* [maclover7](https://github.com/maclover7) - -**Jon Moss** <me@jonathanmoss.me> (he/him) * [mafintosh](https://github.com/mafintosh) - **Mathias Buus** <mathiasbuus@gmail.com> (he/him) * [mcollina](https://github.com/mcollina) - @@ -478,6 +476,8 @@ For information about the governance of the Node.js project, see **Luca Maraschi** <luca.maraschi@gmail.com> (he/him) * [lxe](https://github.com/lxe) - **Aleksey Smolenchuk** <lxe@lxe.co> +* [maclover7](https://github.com/maclover7) - +**Jon Moss** <me@jonathanmoss.me> (he/him) * [matthewloring](https://github.com/matthewloring) - **Matthew Loring** <mattloring@google.com> * [micnic](https://github.com/micnic) - From c801045fcd068c8c6112b247749caa36840ddaf8 Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Thu, 27 Feb 2020 21:04:46 -0800 Subject: [PATCH 81/91] meta: move jbergstroem to emeritus jbergstroem confirmed in email that they should go to collaborator emeritus. They still have their toe in the Build WG and will stay on there. Who knows, maybe they'll be back as a Collaborator before we know it. PR-URL: https://github.com/nodejs/node/pull/31996 Reviewed-By: Gireesh Punathil Reviewed-By: Anna Henningsen Reviewed-By: Colin Ihrig Reviewed-By: Luigi Pinca --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index 3c8924b2b02b56..bdb9ede9bef5cb 100644 --- a/README.md +++ b/README.md @@ -321,8 +321,6 @@ For information about the governance of the Node.js project, see **Jackson Tian** <shyvo1987@gmail.com> * [jasnell](https://github.com/jasnell) - **James M Snell** <jasnell@gmail.com> (he/him) -* [jbergstroem](https://github.com/jbergstroem) - -**Johan Bergström** <bugs@bergstroem.nu> * [jdalton](https://github.com/jdalton) - **John-David Dalton** <john.david.dalton@gmail.com> * [jkrems](https://github.com/jkrems) - @@ -464,6 +462,8 @@ For information about the governance of the Node.js project, see **Isaac Z. Schlueter** <i@izs.me> * [jasongin](https://github.com/jasongin) - **Jason Ginchereau** <jasongin@microsoft.com> +* [jbergstroem](https://github.com/jbergstroem) - +**Johan Bergström** <bugs@bergstroem.nu> * [jhamhader](https://github.com/jhamhader) - **Yuval Brik** <yuval@brik.org.il> * [joshgav](https://github.com/joshgav) - From 3bd8feac0c0317b4c7e685a46bb21693d5059785 Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Thu, 27 Feb 2020 21:07:30 -0800 Subject: [PATCH 82/91] meta: move aqrln to emeritus aqrln confirmed in email that it makes sense to move them to emeritus at this point. 
PR-URL: https://github.com/nodejs/node/pull/31997 Reviewed-By: Gireesh Punathil Reviewed-By: Anna Henningsen Reviewed-By: Colin Ihrig Reviewed-By: Alexey Orlenko Reviewed-By: Denys Otrishko Reviewed-By: Ruben Bridgewater --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index bdb9ede9bef5cb..657fc478a37e5b 100644 --- a/README.md +++ b/README.md @@ -239,8 +239,6 @@ For information about the governance of the Node.js project, see **Anto Aravinth** <anto.aravinth.cse@gmail.com> (he/him) * [apapirovski](https://github.com/apapirovski) - **Anatoli Papirovski** <apapirovski@mac.com> (he/him) -* [aqrln](https://github.com/aqrln) - -**Alexey Orlenko** <eaglexrlnk@gmail.com> (he/him) * [bcoe](https://github.com/bcoe) - **Ben Coe** <bencoe@gmail.com> (he/him) * [bengl](https://github.com/bengl) - @@ -438,6 +436,8 @@ For information about the governance of the Node.js project, see **Andras** <andras@kinvey.com> * [AnnaMag](https://github.com/AnnaMag) - **Anna M. Kedzierska** <anna.m.kedzierska@gmail.com> +* [aqrln](https://github.com/aqrln) - +**Alexey Orlenko** <eaglexrlnk@gmail.com> (he/him) * [brendanashworth](https://github.com/brendanashworth) - **Brendan Ashworth** <brendan.ashworth@me.com> * [calvinmetcalf](https://github.com/calvinmetcalf) - From 38494746a6835945e208d5fa9f85214b2ffe8c68 Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Sat, 22 Feb 2020 22:01:06 +0100 Subject: [PATCH 83/91] test: fix flaky test-gc-net-timeout If the timeout is called in the time between 'end' and 'close' that would cause an EPIPE error, essentially making the test flaky. PR-URL: https://github.com/nodejs/node/pull/31918 Reviewed-By: Anna Henningsen --- test/parallel/test-gc-net-timeout.js | 3 +++ 1 file changed, 3 insertions(+) diff --git a/test/parallel/test-gc-net-timeout.js b/test/parallel/test-gc-net-timeout.js index 51d9b8ca09bbbc..9ba6d2bc1744f9 100644 --- a/test/parallel/test-gc-net-timeout.js +++ b/test/parallel/test-gc-net-timeout.js @@ -12,6 +12,9 @@ function serverHandler(sock) { sock.on('close', function() { clearTimeout(timer); }); + sock.on('end', function() { + clearTimeout(timer); + }); sock.on('error', function(err) { assert.strictEqual(err.code, 'ECONNRESET'); }); From 9a41ced0d186d571e63b4dc992fbbbe3dc1d9792 Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Sun, 23 Feb 2020 03:25:09 -0800 Subject: [PATCH 84/91] build: only lint markdown files that have changed (POSIX-only) Update Makefile so that only markdown files that have changed will be linted. Currently, if one file in doc/api has changed, all files in doc/api are linted. On Windows, the lint-md task currently lints all files regardless of whether any files have changed, and that behavior is unchanged here. A further improvement is that when tools/lint-md.js is rebuilt, the timestamp file is removed so that all files are linted again. This is because rebuilding lint-md.js can introduce new rules or modify existing rules, so re-linting everything helps make sure that accidental breakage doesn't slip by unnoticed.
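In JavaScript terms, the stamp-file gating amounts to roughly the following (an illustrative sketch only, not part of the patch; the real mechanism is the `find ... -newer` rule in the Makefile diff below):

```js
'use strict';
const fs = require('fs');

// Keep only the files modified after the stamp file was last written.
function changedSince(stampPath, files) {
  let stampTime = 0;
  try {
    stampTime = fs.statSync(stampPath).mtimeMs;
  } catch {
    // No stamp yet: treat every file as changed.
  }
  return files.filter((file) => fs.statSync(file).mtimeMs > stampTime);
}

// After a successful lint run the stamp is refreshed, e.g.:
// fs.writeFileSync('tools/.mdlintstamp', '');
```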
PR-URL: https://github.com/nodejs/node/pull/31923 Reviewed-By: Anna Henningsen --- Makefile | 34 +++++++++++++++------------------- 1 file changed, 15 insertions(+), 19 deletions(-) diff --git a/Makefile b/Makefile index 34cdec7f7767b5..d1705fdc49d118 100644 --- a/Makefile +++ b/Makefile @@ -1165,6 +1165,7 @@ bench-addons-clean: .PHONY: lint-md-rollup lint-md-rollup: + $(RM) tools/.*mdlintstamp cd tools/node-lint-md-cli-rollup && npm install cd tools/node-lint-md-cli-rollup && npm run build-node @@ -1177,28 +1178,23 @@ lint-md-clean: lint-md-build: $(warning "Deprecated no-op target 'lint-md-build'") -LINT_MD_DOC_FILES = $(shell find doc -type f -name '*.md') -run-lint-doc-md = tools/lint-md.js -q -f $(LINT_MD_DOC_FILES) -# Lint all changed markdown files under doc/ -tools/.docmdlintstamp: $(LINT_MD_DOC_FILES) - @echo "Running Markdown linter on docs..." - @$(call available-node,$(run-lint-doc-md)) - @touch $@ +ifeq ("$(wildcard tools/.mdlintstamp)","") + LINT_MD_NEWER = +else + LINT_MD_NEWER = -newer tools/.mdlintstamp +endif -LINT_MD_TARGETS = src lib benchmark test tools/doc tools/icu -LINT_MD_ROOT_DOCS := $(wildcard *.md) -LINT_MD_MISC_FILES := $(shell find $(LINT_MD_TARGETS) -type f \ - ! -path '*node_modules*' ! -path 'test/fixtures/*' -name '*.md') \ - $(LINT_MD_ROOT_DOCS) -run-lint-misc-md = tools/lint-md.js -q -f $(LINT_MD_MISC_FILES) -# Lint other changed markdown files maintained by us -tools/.miscmdlintstamp: $(LINT_MD_MISC_FILES) - @echo "Running Markdown linter on misc docs..." - @$(call available-node,$(run-lint-misc-md)) +LINT_MD_TARGETS = doc src lib benchmark test tools/doc tools/icu $(wildcard *.md) +LINT_MD_FILES = $(shell find $(LINT_MD_TARGETS) -type f \ + ! -path '*node_modules*' ! -path 'test/fixtures/*' -name '*.md' \ + $(LINT_MD_NEWER)) +run-lint-md = tools/lint-md.js -q -f --no-stdout $(LINT_MD_FILES) +# Lint all changed markdown files maintained by us +tools/.mdlintstamp: $(LINT_MD_FILES) + @echo "Running Markdown linter..." + @$(call available-node,$(run-lint-md)) @touch $@ -tools/.mdlintstamp: tools/.miscmdlintstamp tools/.docmdlintstamp - .PHONY: lint-md # Lints the markdown documents maintained by us in the codebase. lint-md: | tools/.mdlintstamp From 166579f84bacf8568d87aaea1673a33e639ee422 Mon Sep 17 00:00:00 2001 From: unknown Date: Thu, 27 Feb 2020 12:30:24 -0500 Subject: [PATCH 85/91] doc: add link to sem-ver info MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/31985 Reviewed-By: Anna Henningsen Reviewed-By: Rich Trott Reviewed-By: Richard Lau Reviewed-By: Luigi Pinca Reviewed-By: Michael Dawson Reviewed-By: Gireesh Punathil Reviewed-By: Myles Borins Reviewed-By: Tobias Nießen --- doc/api/documentation.md | 8 +++++--- 1 file changed, 5 insertions(+), 3 deletions(-) diff --git a/doc/api/documentation.md b/doc/api/documentation.md index 14310f7baed68a..7059c4d33e11ca 100644 --- a/doc/api/documentation.md +++ b/doc/api/documentation.md @@ -27,9 +27,10 @@ The stability indices are as follows: -> Stability: 1 - Experimental. The feature is not subject to Semantic Versioning -> rules. Non-backward compatible changes or removal may occur in any future -> release. Use of the feature is not recommended in production environments. +> Stability: 1 - Experimental. The feature is not subject to +> [Semantic Versioning][] rules. Non-backward compatible changes or removal may +> occur in any future release. Use of the feature is not recommended in +> production environments. 
@@ -58,6 +59,7 @@ to the corresponding man pages which describe how the system call works. Most Unix system calls have Windows analogues. Still, behavior differences may be unavoidable. +[Semantic Versioning]: https://semver.org/ [the contributing guide]: https://github.com/nodejs/node/blob/master/CONTRIBUTING.md [the issue tracker]: https://github.com/nodejs/node/issues/new [V8 JavaScript engine]: https://v8.dev/ From 49864d161e6b722cef5e95f0fc9d824d743417f3 Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Fri, 28 Feb 2020 20:37:07 -0800 Subject: [PATCH 86/91] test: fix flaky test-dns-any.js MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Remove google.com from domains tested with ANY queries. Fixes: https://github.com/nodejs/node/issues/31721 PR-URL: https://github.com/nodejs/node/pull/32017 Reviewed-By: Anna Henningsen Reviewed-By: Tobias Nießen Reviewed-By: Luigi Pinca --- test/internet/test-dns-any.js | 22 ---------------------- 1 file changed, 22 deletions(-) diff --git a/test/internet/test-dns-any.js b/test/internet/test-dns-any.js index 3e8eb07e7e6e52..d60f00f09804a7 100644 --- a/test/internet/test-dns-any.js +++ b/test/internet/test-dns-any.js @@ -115,28 +115,6 @@ function processResult(res) { return types; } -TEST(async function test_google(done) { - function validateResult(res) { - const types = processResult(res); - assert.ok( - types.A && types.AAAA && types.MX && types.NS && types.TXT && types.SOA, - `Missing record type, found ${Object.keys(types)}`); - } - - validateResult(await dnsPromises.resolve('google.com', 'ANY')); - - const req = dns.resolve( - 'google.com', - 'ANY', - common.mustCall(function(err, ret) { - assert.ifError(err); - validateResult(ret); - done(); - })); - - checkWrap(req); -}); - TEST(async function test_sip2sip_for_naptr(done) { function validateResult(res) { const types = processResult(res); From cd30dbb0d64acb709b5d3503d71a0acd5c6c5460 Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Fri, 28 Feb 2020 22:12:47 -0800 Subject: [PATCH 87/91] doc: revise --zero-fill-buffers text in buffer.md There was an unclear sentence fragment that needed fixing, so I edited the entire paragraph for clarity. I also removed irrelevant information about behavior before Node.js 8.0.0. That version of Node.js is no longer supported and these docs will never apply to 8.0.0. (At the time of this writing, 10.x is the oldest supported line, and so changes to the docs will never be backported farther than the 10.x docs.) PR-URL: https://github.com/nodejs/node/pull/32019 Reviewed-By: Anna Henningsen Reviewed-By: David Carlier Reviewed-By: Ruben Bridgewater --- doc/api/buffer.md | 15 ++++++--------- 1 file changed, 6 insertions(+), 9 deletions(-) diff --git a/doc/api/buffer.md b/doc/api/buffer.md index 0d7dee9df4528a..ed8da05bc47e7f 100644 --- a/doc/api/buffer.md +++ b/doc/api/buffer.md @@ -123,15 +123,12 @@ added: v5.10.0 --> Node.js can be started using the `--zero-fill-buffers` command line option to -cause all newly allocated `Buffer` instances to be zero-filled upon creation by -default. Before Node.js 8.0.0, this included buffers allocated by `new -Buffer(size)`. Since Node.js 8.0.0, buffers allocated with `new` are always -zero-filled, whether this option is used or not. -[`Buffer.allocUnsafe()`][], [`Buffer.allocUnsafeSlow()`][], and `new -SlowBuffer(size)`. Use of this flag can have a significant negative impact on -performance. 
Use of the `--zero-fill-buffers` option is recommended only when -necessary to enforce that newly allocated `Buffer` instances cannot contain old -data that is potentially sensitive. +cause all newly-allocated `Buffer` instances to be zero-filled upon creation by +default. Without the option, buffers created with [`Buffer.allocUnsafe()`][], +[`Buffer.allocUnsafeSlow()`][], and `new SlowBuffer(size)` are not zero-filled. +Use of this flag can have a significant negative impact on performance. Use the +`--zero-fill-buffers` option only when necessary to enforce that newly allocated +`Buffer` instances cannot contain old data that is potentially sensitive. ```console $ node --zero-fill-buffers From 932563473cf9b1b9c2bca7587f7e3d67e915f5e3 Mon Sep 17 00:00:00 2001 From: Andrey Pechkurov Date: Fri, 28 Feb 2020 10:07:47 +0300 Subject: [PATCH 88/91] test: improve disable AsyncLocalStorage test PR-URL: https://github.com/nodejs/node/pull/31998 Reviewed-By: Anna Henningsen Reviewed-By: Vladimir de Turckheim Reviewed-By: James M Snell --- test/async-hooks/test-async-local-storage-enable-disable.js | 3 +++ 1 file changed, 3 insertions(+) diff --git a/test/async-hooks/test-async-local-storage-enable-disable.js b/test/async-hooks/test-async-local-storage-enable-disable.js index bbba8cde58d7e8..22a3f5f6c8c43f 100644 --- a/test/async-hooks/test-async-local-storage-enable-disable.js +++ b/test/async-hooks/test-async-local-storage-enable-disable.js @@ -9,6 +9,9 @@ asyncLocalStorage.runSyncAndReturn(new Map(), () => { asyncLocalStorage.getStore().set('foo', 'bar'); process.nextTick(() => { assert.strictEqual(asyncLocalStorage.getStore().get('foo'), 'bar'); + process.nextTick(() => { + assert.strictEqual(asyncLocalStorage.getStore(), undefined); + }); asyncLocalStorage.disable(); assert.strictEqual(asyncLocalStorage.getStore(), undefined); process.nextTick(() => { From fbaab7d8541c0f26f0c73c09790d886895c96131 Mon Sep 17 00:00:00 2001 From: Adam Majer Date: Fri, 28 Feb 2020 12:14:18 +0100 Subject: [PATCH 89/91] deps: openssl: cherry-pick 4dcb150ea30f OpenSSL 1.1.1d does not ship with getrandom syscall being predefined on all architectures. So when NodeJS is run with glibc prior to 2.25, where getentropy is unavailable, and the getrandom syscall is unknown, it will fail. PPC64LE or s390 are affected by lack of this definition. Original commit message. 
commit 4dcb150ea30f9bbfa7946e6b39c30a86aca5ed02 Author: Kurt Roeckx Date: Sat Sep 28 14:59:32 2019 +0200 Add defines for __NR_getrandom for all Linux architectures Fixes: https://github.com/openssl/openssl/issues/10015 Reviewed-by: Bernd Edlinger GH: https://github.com/openssl/openssl/pull/10044 Fixes: https://github.com/nodejs/node/issues/31671 PR-URL: https://github.com/nodejs/node/pull/32002 Reviewed-By: Ben Noordhuis Reviewed-By: James M Snell Reviewed-By: Sam Roberts --- deps/openssl/openssl/crypto/rand/rand_unix.c | 52 ++++++++++++++++++-- 1 file changed, 49 insertions(+), 3 deletions(-) diff --git a/deps/openssl/openssl/crypto/rand/rand_unix.c b/deps/openssl/openssl/crypto/rand/rand_unix.c index 69efcdeed752d7..315af610f849d9 100644 --- a/deps/openssl/openssl/crypto/rand/rand_unix.c +++ b/deps/openssl/openssl/crypto/rand/rand_unix.c @@ -282,12 +282,58 @@ static ssize_t sysctl_random(char *buf, size_t buflen) # if defined(OPENSSL_RAND_SEED_GETRANDOM) # if defined(__linux) && !defined(__NR_getrandom) -# if defined(__arm__) && defined(__NR_SYSCALL_BASE) +# if defined(__arm__) # define __NR_getrandom (__NR_SYSCALL_BASE+384) # elif defined(__i386__) # define __NR_getrandom 355 -# elif defined(__x86_64__) && !defined(__ILP32__) -# define __NR_getrandom 318 +# elif defined(__x86_64__) +# if defined(__ILP32__) +# define __NR_getrandom (__X32_SYSCALL_BIT + 318) +# else +# define __NR_getrandom 318 +# endif +# elif defined(__xtensa__) +# define __NR_getrandom 338 +# elif defined(__s390__) || defined(__s390x__) +# define __NR_getrandom 349 +# elif defined(__bfin__) +# define __NR_getrandom 389 +# elif defined(__powerpc__) +# define __NR_getrandom 359 +# elif defined(__mips__) || defined(__mips64) +# if _MIPS_SIM == _MIPS_SIM_ABI32 +# define __NR_getrandom (__NR_Linux + 353) +# elif _MIPS_SIM == _MIPS_SIM_ABI64 +# define __NR_getrandom (__NR_Linux + 313) +# elif _MIPS_SIM == _MIPS_SIM_NABI32 +# define __NR_getrandom (__NR_Linux + 317) +# endif +# elif defined(__hppa__) +# define __NR_getrandom (__NR_Linux + 339) +# elif defined(__sparc__) +# define __NR_getrandom 347 +# elif defined(__ia64__) +# define __NR_getrandom 1339 +# elif defined(__alpha__) +# define __NR_getrandom 511 +# elif defined(__sh__) +# if defined(__SH5__) +# define __NR_getrandom 373 +# else +# define __NR_getrandom 384 +# endif +# elif defined(__avr32__) +# define __NR_getrandom 317 +# elif defined(__microblaze__) +# define __NR_getrandom 385 +# elif defined(__m68k__) +# define __NR_getrandom 352 +# elif defined(__cris__) +# define __NR_getrandom 356 +# elif defined(__aarch64__) +# define __NR_getrandom 278 +# else /* generic */ +# define __NR_getrandom 278 # endif # endif From 1bca7b6c702f765fa50872a8fd6ff228d8ce14e6 Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Sat, 29 Feb 2020 10:07:41 -0800 Subject: [PATCH 90/91] test: move test-inspector-module to parallel test-inspector-module is very fast and seems to be runnable at the same time as other tests. Move from sequential directory to parallel. 
PR-URL: https://github.com/nodejs/node/pull/32025 Reviewed-By: Anna Henningsen Reviewed-By: Ruben Bridgewater Reviewed-By: Luigi Pinca --- test/{sequential => parallel}/test-inspector-module.js | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename test/{sequential => parallel}/test-inspector-module.js (100%) diff --git a/test/sequential/test-inspector-module.js b/test/parallel/test-inspector-module.js similarity index 100% rename from test/sequential/test-inspector-module.js rename to test/parallel/test-inspector-module.js From f6ffdc2fa3ddebba5f9ba755fdf550946eb2f218 Mon Sep 17 00:00:00 2001 From: Shelley Vohr Date: Sat, 29 Feb 2020 15:28:30 -0800 Subject: [PATCH 91/91] 2020-03-04 Version 13.10.0 (Current) Notable changes: * async_hooks * introduce async-context API (vdeturckheim) #26540 * stream * support passing generator functions into pipeline() (Robert Nagy) #31223 * tls * expose SSL\_export\_keying\_material (simon) #31814 * vm * implement vm.measureMemory() for per-context memory measurement (Joyee Cheung) #31824 PR-URL: https://github.com/nodejs/node/pull/32027 --- CHANGELOG.md | 3 +- doc/api/async_hooks.md | 16 ++--- doc/api/errors.md | 2 +- doc/api/stream.md | 2 +- doc/api/tls.md | 2 +- doc/api/vm.md | 2 +- doc/changelogs/CHANGELOG_V13.md | 107 ++++++++++++++++++++++++++++++++ src/node_version.h | 6 +- 8 files changed, 124 insertions(+), 16 deletions(-) diff --git a/CHANGELOG.md b/CHANGELOG.md index 4c48caa007f9f3..61fbc6c8f998a0 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -30,7 +30,8 @@ release. -13.9.0
+13.10.0
+13.9.0
 13.8.0
 13.7.0
 13.6.0
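[Editor's note: the async_hooks.md hunks below re-version the documentation for
the `AsyncLocalStorage` class that ships in this release (#26540). For
orientation, here is a minimal usage sketch of the API as documented for
v13.10.0, assuming the callback-style `run(store, callback)` plus `getStore()`
surface of this release; the HTTP server and port are illustrative only.]

```js
const { AsyncLocalStorage } = require('async_hooks');
const http = require('http');

const asyncLocalStorage = new AsyncLocalStorage();

function logWithId(msg) {
  // getStore() returns the store passed to the closest enclosing run().
  console.log(`${asyncLocalStorage.getStore()}: ${msg}`);
}

let idSeq = 0;
http.createServer((req, res) => {
  // Each request runs in its own context; the id survives async hops.
  asyncLocalStorage.run(idSeq++, () => {
    logWithId('start');
    setImmediate(() => {
      logWithId('finish');
      res.end();
    });
  });
}).listen(8080);
```

Each request observes only its own store value, which is the isolation
property the documentation changes below describe.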
diff --git a/doc/api/async_hooks.md b/doc/api/async_hooks.md
index 965c64218fdaba..59902917169ed1 100644
--- a/doc/api/async_hooks.md
+++ b/doc/api/async_hooks.md
@@ -861,7 +861,7 @@ for (let i = 0; i < 10; i++) {
 ## Class: `AsyncLocalStorage`
 This class is used to create asynchronous state within callbacks and promise
@@ -911,7 +911,7 @@ from each other. It is safe to instantiate this class multiple times.
 ### `new AsyncLocalStorage()`
 Creates a new instance of `AsyncLocalStorage`. Store is only provided within a
@@ -919,7 +919,7 @@ Creates a new instance of `AsyncLocalStorage`. Store is only provided within a
 ### `asyncLocalStorage.disable()`
 This method disables the instance of `AsyncLocalStorage`. All subsequent calls
@@ -940,7 +940,7 @@ in the current process.
 ### `asyncLocalStorage.getStore()`
 * Returns: {any}
@@ -952,7 +952,7 @@ return `undefined`.
 ### `asyncLocalStorage.run(store, callback[, ...args])`
 * `store` {any}
@@ -987,7 +987,7 @@ asyncLocalStorage.getStore(); // Returns undefined
 ### `asyncLocalStorage.exit(callback[, ...args])`
 * `callback` {Function}
@@ -1019,7 +1019,7 @@ asyncLocalStorage.run('store value', () => {
 ### `asyncLocalStorage.runSyncAndReturn(store, callback[, ...args])`
 * `store` {any}
@@ -1054,7 +1054,7 @@ try {
 ### `asyncLocalStorage.exitSyncAndReturn(callback[, ...args])`
 * `callback` {Function}
diff --git a/doc/api/errors.md b/doc/api/errors.md
index 15be1fd2061618..93d9f909193f2b 100644
--- a/doc/api/errors.md
+++ b/doc/api/errors.md
@@ -1869,7 +1869,7 @@ The context must be a `SecureContext`.
 ### `ERR_TLS_INVALID_STATE`
 The TLS socket must be connected and securely established. Ensure the 'secure'
diff --git a/doc/api/stream.md b/doc/api/stream.md
index 1a2d14f865f5e5..09713f20746180 100644
--- a/doc/api/stream.md
+++ b/doc/api/stream.md
@@ -1559,7 +1559,7 @@ const cleanup = finished(rs, (err) => {
diff --git a/doc/api/tls.md b/doc/api/tls.md
index 3341e6e9ea514b..55dbf3b8d42b18 100644
--- a/doc/api/tls.md
+++ b/doc/api/tls.md
@@ -1096,7 +1096,7 @@ for more information.
 ### `tlsSocket.exportKeyingMaterial(length, label[, context])`
 * `length` {number} number of bytes to retrieve from keying material
diff --git a/doc/api/vm.md b/doc/api/vm.md
index ed676414471b8e..91d16be3986d22 100644
--- a/doc/api/vm.md
+++ b/doc/api/vm.md
@@ -298,7 +298,7 @@ console.log(globalVar);
 ## `vm.measureMemory([options])`
 > Stability: 1 - Experimental
diff --git a/doc/changelogs/CHANGELOG_V13.md b/doc/changelogs/CHANGELOG_V13.md
index 3f255fb09d2df3..d5e33d41c23672 100644
--- a/doc/changelogs/CHANGELOG_V13.md
+++ b/doc/changelogs/CHANGELOG_V13.md
@@ -39,6 +39,113 @@
 * [io.js](CHANGELOG_IOJS.md)
 * [Archive](CHANGELOG_ARCHIVE.md)
+
+## 2020-03-04, Version 13.10.0 (Current), @codebytere
+
+### Notable Changes
+
+* **async_hooks**
+  * introduce async-context API (vdeturckheim) [#26540](https://github.com/nodejs/node/pull/26540)
+* **stream**
+  * support passing generator functions into pipeline() (Robert Nagy) [#31223](https://github.com/nodejs/node/pull/31223)
+* **tls**
+  * expose SSL\_export\_keying\_material (simon) [#31814](https://github.com/nodejs/node/pull/31814)
+* **vm**
+  * implement vm.measureMemory() for per-context memory measurement (Joyee Cheung) [#31824](https://github.com/nodejs/node/pull/31824)
+
+### Commits
+
+* [[`f71fc9044a`](https://github.com/nodejs/node/commit/f71fc9044a)] - **async_hooks**: add store arg in AsyncLocalStorage (Andrey Pechkurov) [#31930](https://github.com/nodejs/node/pull/31930)
+* [[`6af9e7e0c3`](https://github.com/nodejs/node/commit/6af9e7e0c3)] - **async_hooks**: executionAsyncResource matches in hooks (Gerhard Stoebich) [#31821](https://github.com/nodejs/node/pull/31821)
+* [[`877ab97286`](https://github.com/nodejs/node/commit/877ab97286)] - **(SEMVER-MINOR)** **async_hooks**: introduce async-context API (vdeturckheim) [#26540](https://github.com/nodejs/node/pull/26540)
+* [[`9a41ced0d1`](https://github.com/nodejs/node/commit/9a41ced0d1)] - **build**: only lint markdown files that have changed (POSIX-only) (Rich Trott) [#31923](https://github.com/nodejs/node/pull/31923)
+* [[`ca4407105e`](https://github.com/nodejs/node/commit/ca4407105e)] - **build**: add missing comma in node.gyp (cjihrig) [#31959](https://github.com/nodejs/node/pull/31959)
+* [[`4dffd0437d`](https://github.com/nodejs/node/commit/4dffd0437d)] - **cli**: --perf-prof only works on Linux (Shelley Vohr) [#31892](https://github.com/nodejs/node/pull/31892)
+* [[`4d05508aa8`](https://github.com/nodejs/node/commit/4d05508aa8)] - **crypto**: turn impossible DH errors into assertions (Tobias Nießen) [#31934](https://github.com/nodejs/node/pull/31934)
+* [[`d0e94fc77e`](https://github.com/nodejs/node/commit/d0e94fc77e)] - **crypto**: fix ieee-p1363 for createVerify (Tobias Nießen) [#31876](https://github.com/nodejs/node/pull/31876)
+* [[`fbaab7d854`](https://github.com/nodejs/node/commit/fbaab7d854)] - **deps**: openssl: cherry-pick 4dcb150ea30f (Adam Majer) [#32002](https://github.com/nodejs/node/pull/32002)
+* [[`e6125cd53b`](https://github.com/nodejs/node/commit/e6125cd53b)] - **deps**: V8: backport f7771e5b0cc4 (Matheus Marchini) [#31957](https://github.com/nodejs/node/pull/31957)
+* [[`c27f0d10c4`](https://github.com/nodejs/node/commit/c27f0d10c4)] - **deps**: update zlib to upstream d7f3ca9 (Sam Roberts) [#31800](https://github.com/nodejs/node/pull/31800)
+* [[`b30a6981d3`](https://github.com/nodejs/node/commit/b30a6981d3)] - **deps**: move zlib maintenance info to guides (Sam Roberts) [#31800](https://github.com/nodejs/node/pull/31800)
+* [[`cd30dbb0d6`](https://github.com/nodejs/node/commit/cd30dbb0d6)] - **doc**: revise --zero-fill-buffers text in buffer.md (Rich Trott) [#32019](https://github.com/nodejs/node/pull/32019)
+* [[`166579f84b`](https://github.com/nodejs/node/commit/166579f84b)] - **doc**: add link to sem-ver info (unknown) [#31985](https://github.com/nodejs/node/pull/31985)
+* [[`e3258fd148`](https://github.com/nodejs/node/commit/e3258fd148)] - **doc**: update zlib doc (James M Snell) [#31665](https://github.com/nodejs/node/pull/31665)
+* [[`8516602ba0`](https://github.com/nodejs/node/commit/8516602ba0)] - **doc**: clarify http2.connect authority details (James M Snell) [#31828](https://github.com/nodejs/node/pull/31828)
+* [[`c5acf0a13b`](https://github.com/nodejs/node/commit/c5acf0a13b)] - **doc**: updated YAML version representation in readline.md (Rich Trott) [#31924](https://github.com/nodejs/node/pull/31924)
+* [[`4c6343fdea`](https://github.com/nodejs/node/commit/4c6343fdea)] - **doc**: describe how to update zlib (Sam Roberts) [#31800](https://github.com/nodejs/node/pull/31800)
+* [[`a46839279f`](https://github.com/nodejs/node/commit/a46839279f)] - **doc**: update releases guide re pushing tags (Myles Borins) [#31855](https://github.com/nodejs/node/pull/31855)
+* [[`15cc9b0126`](https://github.com/nodejs/node/commit/15cc9b0126)] - **doc**: update assert.rejects() docs with a validation function example (Eric Eastwood) [#31271](https://github.com/nodejs/node/pull/31271)
+* [[`2046652b4e`](https://github.com/nodejs/node/commit/2046652b4e)] - **doc**: fix anchor for ERR\_TLS\_INVALID\_CONTEXT (Tobias Nießen) [#31915](https://github.com/nodejs/node/pull/31915)
+* [[`091b4bfe2d`](https://github.com/nodejs/node/commit/091b4bfe2d)] - **doc**: add note about ssh key to releases (Shelley Vohr) [#31856](https://github.com/nodejs/node/pull/31856)
+* [[`3438937a37`](https://github.com/nodejs/node/commit/3438937a37)] - **doc**: fix notable changes for v13.9.0 (Shelley Vohr) [#31857](https://github.com/nodejs/node/pull/31857)
+* [[`672f76d6bd`](https://github.com/nodejs/node/commit/672f76d6bd)] - **doc**: reword possessive form of Node.js in adding-new-napi-api.md (Rich Trott) [#31748](https://github.com/nodejs/node/pull/31748)
+* [[`3eaf37767e`](https://github.com/nodejs/node/commit/3eaf37767e)] - **doc**: reword possessive form of Node.js in http.md (Rich Trott) [#31748](https://github.com/nodejs/node/pull/31748)
+* [[`cb210e6b16`](https://github.com/nodejs/node/commit/cb210e6b16)] - **doc**: reword possessive form of Node.js in process.md (Rich Trott) [#31748](https://github.com/nodejs/node/pull/31748)
+* [[`3969af43b4`](https://github.com/nodejs/node/commit/3969af43b4)] - **doc**: reword possessive form of Node.js in debugger.md (Rich Trott) [#31748](https://github.com/nodejs/node/pull/31748)
+* [[`f9526057b3`](https://github.com/nodejs/node/commit/f9526057b3)] - **doc**: move gireeshpunathil to TSC emeritus (Gireesh Punathil) [#31770](https://github.com/nodejs/node/pull/31770)
+* [[`b07175853f`](https://github.com/nodejs/node/commit/b07175853f)] - **doc**: pronouns for @Fishrock123 (Jeremiah Senkpiel) [#31725](https://github.com/nodejs/node/pull/31725)
+* [[`7f4d6ee8ea`](https://github.com/nodejs/node/commit/7f4d6ee8ea)] - **doc**: move @Fishrock123 to TSC Emeriti (Jeremiah Senkpiel) [#31725](https://github.com/nodejs/node/pull/31725)
+* [[`b177bba555`](https://github.com/nodejs/node/commit/b177bba555)] - **doc**: move @Fishrock123 to a previous releaser (Jeremiah Senkpiel) [#31725](https://github.com/nodejs/node/pull/31725)
+* [[`9e4aad705f`](https://github.com/nodejs/node/commit/9e4aad705f)] - **doc**: fix typos in doc/api/https.md (Jeff) [#31793](https://github.com/nodejs/node/pull/31793)
+* [[`eb2dce8342`](https://github.com/nodejs/node/commit/eb2dce8342)] - **doc**: claim ABI version 82 for Electron 10 (Samuel Attard) [#31778](https://github.com/nodejs/node/pull/31778)
+* [[`db291aaf06`](https://github.com/nodejs/node/commit/db291aaf06)] - **doc**: guide - using valgrind to debug memory leaks (Michael Dawson) [#31501](https://github.com/nodejs/node/pull/31501)
+* [[`aa16d80c05`](https://github.com/nodejs/node/commit/aa16d80c05)] - **doc,crypto**: re-document oaepLabel option (Ben Noordhuis) [#31825](https://github.com/nodejs/node/pull/31825)
+* [[`9079bb42ea`](https://github.com/nodejs/node/commit/9079bb42ea)] - **http2**: make compat finished match http/1 (Robert Nagy) [#24347](https://github.com/nodejs/node/pull/24347)
+* [[`3bd8feac0c`](https://github.com/nodejs/node/commit/3bd8feac0c)] - **meta**: move aqrln to emeritus (Rich Trott) [#31997](https://github.com/nodejs/node/pull/31997)
+* [[`c801045fcd`](https://github.com/nodejs/node/commit/c801045fcd)] - **meta**: move jbergstroem to emeritus (Rich Trott) [#31996](https://github.com/nodejs/node/pull/31996)
+* [[`ded3890bec`](https://github.com/nodejs/node/commit/ded3890bec)] - **meta**: move maclover7 to Emeritus (Rich Trott) [#31994](https://github.com/nodejs/node/pull/31994)
+* [[`91ce69a554`](https://github.com/nodejs/node/commit/91ce69a554)] - **meta**: move Glen Keane to Collaborator Emeritus (Rich Trott) [#31993](https://github.com/nodejs/node/pull/31993)
+* [[`b74c40eda6`](https://github.com/nodejs/node/commit/b74c40eda6)] - **meta**: move not-an-aardvark to emeritus (Rich Trott) [#31928](https://github.com/nodejs/node/pull/31928)
+* [[`61a0d8b6cd`](https://github.com/nodejs/node/commit/61a0d8b6cd)] - **meta**: move julianduque to emeritus (Rich Trott) [#31863](https://github.com/nodejs/node/pull/31863)
+* [[`94a471a422`](https://github.com/nodejs/node/commit/94a471a422)] - **meta**: move eljefedelrodeodeljefe to emeritus (Rich Trott) [#31735](https://github.com/nodejs/node/pull/31735)
+* [[`9e3e6763fa`](https://github.com/nodejs/node/commit/9e3e6763fa)] - **module**: port source map sort logic from chromium (bcoe) [#31927](https://github.com/nodejs/node/pull/31927)
+* [[`b9f3bfe6c8`](https://github.com/nodejs/node/commit/b9f3bfe6c8)] - **module**: disable conditional exports, self resolve warnings (Guy Bedford) [#31845](https://github.com/nodejs/node/pull/31845)
+* [[`bbb6cc733c`](https://github.com/nodejs/node/commit/bbb6cc733c)] - **module**: package "exports" error refinements (Guy Bedford) [#31625](https://github.com/nodejs/node/pull/31625)
+* [[`6adbfac9b0`](https://github.com/nodejs/node/commit/6adbfac9b0)] - **repl**: eager-evaluate input in parens (Shelley Vohr) [#31943](https://github.com/nodejs/node/pull/31943)
+* [[`6a35b0d102`](https://github.com/nodejs/node/commit/6a35b0d102)] - **src**: don't run bootstrapper in CreateEnvironment (Shelley Vohr) [#31910](https://github.com/nodejs/node/pull/31910)
+* [[`3497370d66`](https://github.com/nodejs/node/commit/3497370d66)] - **src**: move InternalCallbackScope to StartExecution (Shelley Vohr) [#31944](https://github.com/nodejs/node/pull/31944)
+* [[`f62967c827`](https://github.com/nodejs/node/commit/f62967c827)] - **src**: enable `StreamPipe` for generic `StreamBase`s (Anna Henningsen) [#31869](https://github.com/nodejs/node/pull/31869)
+* [[`776f379124`](https://github.com/nodejs/node/commit/776f379124)] - **src**: include large pages source unconditionally (Gabriel Schulhof) [#31904](https://github.com/nodejs/node/pull/31904)
+* [[`9f68e14052`](https://github.com/nodejs/node/commit/9f68e14052)] - **src**: elevate v8 namespaces (Harshitha KP) [#31901](https://github.com/nodejs/node/pull/31901)
+* [[`8fa6373e62`](https://github.com/nodejs/node/commit/8fa6373e62)] - **src**: allow unique\_ptrs with custom deleter in memory tracker (Anna Henningsen) [#31870](https://github.com/nodejs/node/pull/31870)
+* [[`88ccb444e3`](https://github.com/nodejs/node/commit/88ccb444e3)] - **src**: move BaseObject subclass dtors/ctors out of node\_crypto.h (Anna Henningsen) [#31872](https://github.com/nodejs/node/pull/31872)
+* [[`98d262e5f3`](https://github.com/nodejs/node/commit/98d262e5f3)] - **src**: inform callback scopes about exceptions in HTTP parser (Anna Henningsen) [#31801](https://github.com/nodejs/node/pull/31801)
+* [[`57302f866e`](https://github.com/nodejs/node/commit/57302f866e)] - **src**: prefer 3-argument Array::New() (Anna Henningsen) [#31775](https://github.com/nodejs/node/pull/31775)
+* [[`8a2b62e4cd`](https://github.com/nodejs/node/commit/8a2b62e4cd)] - **stream**: ensure pipeline always destroys streams (Robert Nagy) [#31940](https://github.com/nodejs/node/pull/31940)
+* [[`313ecaabe5`](https://github.com/nodejs/node/commit/313ecaabe5)] - **stream**: fix broken pipeline error propagation (Robert Nagy) [#31835](https://github.com/nodejs/node/pull/31835)
+* [[`8ad64b8e53`](https://github.com/nodejs/node/commit/8ad64b8e53)] - **(SEMVER-MINOR)** **stream**: support passing generator functions into pipeline() (Robert Nagy) [#31223](https://github.com/nodejs/node/pull/31223)
+* [[`d0a00711f8`](https://github.com/nodejs/node/commit/d0a00711f8)] - **stream**: invoke buffered write callbacks on error (Robert Nagy) [#30596](https://github.com/nodejs/node/pull/30596)
+* [[`1bca7b6c70`](https://github.com/nodejs/node/commit/1bca7b6c70)] - **test**: move test-inspector-module to parallel (Rich Trott) [#32025](https://github.com/nodejs/node/pull/32025)
+* [[`932563473c`](https://github.com/nodejs/node/commit/932563473c)] - **test**: improve disable AsyncLocalStorage test (Andrey Pechkurov) [#31998](https://github.com/nodejs/node/pull/31998)
+* [[`49864d161e`](https://github.com/nodejs/node/commit/49864d161e)] - **test**: fix flaky test-dns-any.js (Rich Trott) [#32017](https://github.com/nodejs/node/pull/32017)
+* [[`38494746a6`](https://github.com/nodejs/node/commit/38494746a6)] - **test**: fix flaky test-gc-net-timeout (Robert Nagy) [#31918](https://github.com/nodejs/node/pull/31918)
+* [[`b6d33f671a`](https://github.com/nodejs/node/commit/b6d33f671a)] - **test**: change test to not be sensitive to buffer send size (Rusty Conover) [#31499](https://github.com/nodejs/node/pull/31499)
+* [[`cef5502055`](https://github.com/nodejs/node/commit/cef5502055)] - **test**: remove sequential/test-https-keep-alive-large-write.js (Rusty Conover) [#31499](https://github.com/nodejs/node/pull/31499)
+* [[`f1e76488a7`](https://github.com/nodejs/node/commit/f1e76488a7)] - **test**: validate common property usage (Denys Otrishko) [#31933](https://github.com/nodejs/node/pull/31933)
+* [[`ab8f060159`](https://github.com/nodejs/node/commit/ab8f060159)] - **test**: fix usage of invalid common properties (Denys Otrishko) [#31933](https://github.com/nodejs/node/pull/31933)
+* [[`49c959d636`](https://github.com/nodejs/node/commit/49c959d636)] - **test**: increase timeout in vm-timeout-escape-queuemicrotask (Denys Otrishko) [#31966](https://github.com/nodejs/node/pull/31966)
+* [[`04eda02d87`](https://github.com/nodejs/node/commit/04eda02d87)] - **test**: add documentation for common.enoughTestCpu (Rich Trott) [#31931](https://github.com/nodejs/node/pull/31931)
+* [[`918c2b67cc`](https://github.com/nodejs/node/commit/918c2b67cc)] - **test**: fix typo in common/index.js (Rich Trott) [#31931](https://github.com/nodejs/node/pull/31931)
+* [[`f89fb2751b`](https://github.com/nodejs/node/commit/f89fb2751b)] - **test**: mark empty udp tests flaky on OS X (Sam Roberts) [#31936](https://github.com/nodejs/node/pull/31936)
+* [[`e08fef1fda`](https://github.com/nodejs/node/commit/e08fef1fda)] - **test**: add secp224k1 check in crypto-dh-stateless (Daniel Bevenius) [#31715](https://github.com/nodejs/node/pull/31715)
+* [[`4fe9e043ef`](https://github.com/nodejs/node/commit/4fe9e043ef)] - **test**: remove common.PORT from assorted pummel tests (Rich Trott) [#31897](https://github.com/nodejs/node/pull/31897)
+* [[`7d5776e119`](https://github.com/nodejs/node/commit/7d5776e119)] - **test**: remove flaky designation for test-net-connect-options-port (Rich Trott) [#31841](https://github.com/nodejs/node/pull/31841)
+* [[`1933efa62f`](https://github.com/nodejs/node/commit/1933efa62f)] - **test**: remove common.PORT from test-net-write-callbacks.js (Rich Trott) [#31839](https://github.com/nodejs/node/pull/31839)
+* [[`87e9014764`](https://github.com/nodejs/node/commit/87e9014764)] - **test**: remove common.PORT from test-net-pause (Rich Trott) [#31749](https://github.com/nodejs/node/pull/31749)
+* [[`3fbd5ab265`](https://github.com/nodejs/node/commit/3fbd5ab265)] - **test**: remove common.PORT from test-tls-server-large-request (Rich Trott) [#31749](https://github.com/nodejs/node/pull/31749)
+* [[`e76ac1d2c9`](https://github.com/nodejs/node/commit/e76ac1d2c9)] - **test**: remove common.PORT from test-net-throttle (Rich Trott) [#31749](https://github.com/nodejs/node/pull/31749)
+* [[`724bf3105b`](https://github.com/nodejs/node/commit/724bf3105b)] - **test**: remove common.PORT from test-net-timeout (Rich Trott) [#31749](https://github.com/nodejs/node/pull/31749)
+* [[`60c71dcad2`](https://github.com/nodejs/node/commit/60c71dcad2)] - **test**: add known issue test for sync writable callback (James M Snell) [#31756](https://github.com/nodejs/node/pull/31756)
+* [[`2c0b249098`](https://github.com/nodejs/node/commit/2c0b249098)] - **tls**: reduce memory copying and number of BIO buffer allocations (Rusty Conover) [#31499](https://github.com/nodejs/node/pull/31499)
+* [[`acb3aff674`](https://github.com/nodejs/node/commit/acb3aff674)] - **(SEMVER-MINOR)** **tls**: expose SSL\_export\_keying\_material (simon) [#31814](https://github.com/nodejs/node/pull/31814)
+* [[`f293dcf6de`](https://github.com/nodejs/node/commit/f293dcf6de)] - **tools**: add NODE\_TEST\_NO\_INTERNET to the doc builder (Joyee Cheung) [#31849](https://github.com/nodejs/node/pull/31849)
+* [[`79b1f04b15`](https://github.com/nodejs/node/commit/79b1f04b15)] - **tools**: sync gyp code base with node-gyp repo (Michaël Zasso) [#30563](https://github.com/nodejs/node/pull/30563)
+* [[`f858f2366c`](https://github.com/nodejs/node/commit/f858f2366c)] - **tools**: update lint-md task to lint for possessives of Node.js (Rich Trott) [#31862](https://github.com/nodejs/node/pull/31862)
+* [[`ae3929e958`](https://github.com/nodejs/node/commit/ae3929e958)] - **(SEMVER-MINOR)** **vm**: implement vm.measureMemory() for per-context memory measurement (Joyee Cheung) [#31824](https://github.com/nodejs/node/pull/31824)
+* [[`a86cb0e480`](https://github.com/nodejs/node/commit/a86cb0e480)] - **vm**: lazily initialize primordials for vm contexts (Joyee Cheung) [#31738](https://github.com/nodejs/node/pull/31738)
+* [[`f2389eba99`](https://github.com/nodejs/node/commit/f2389eba99)] - **worker**: emit runtime error on loop creation failure (Harshitha KP) [#31621](https://github.com/nodejs/node/pull/31621)
+* [[`f87ac90849`](https://github.com/nodejs/node/commit/f87ac90849)] - **worker**: unroll file extension regexp (Anna Henningsen) [#31779](https://github.com/nodejs/node/pull/31779)
+
 ## 2020-02-18, Version 13.9.0 (Current), @codebytere

diff --git a/src/node_version.h b/src/node_version.h
index c1971839673e11..d6f5cd06f7d8b0 100644
--- a/src/node_version.h
+++ b/src/node_version.h
@@ -23,13 +23,13 @@
 #define SRC_NODE_VERSION_H_

 #define NODE_MAJOR_VERSION 13
-#define NODE_MINOR_VERSION 9
-#define NODE_PATCH_VERSION 1
+#define NODE_MINOR_VERSION 10
+#define NODE_PATCH_VERSION 0

 #define NODE_VERSION_IS_LTS 0
 #define NODE_VERSION_LTS_CODENAME ""

-#define NODE_VERSION_IS_RELEASE 0
+#define NODE_VERSION_IS_RELEASE 1

 #ifndef NODE_STRINGIFY
 #define NODE_STRINGIFY(n) NODE_STRINGIFY_HELPER(n)
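[Editor's note: of the notable changes in this release, `vm.measureMemory()`
(#31824) is the newest surface. A short sketch of how a call might look,
assuming the experimental v13.10.0 option shape described in the vm.md hunk
above (`mode: 'summary' | 'detailed'` plus an optional `context`); both calls
return a promise for the measurement result.]

```js
const vm = require('vm');

// Summary measurement of the main (current) context.
vm.measureMemory({ mode: 'summary' })
  .then((result) => console.log(result));

// Detailed, per-context measurement of a contextified sandbox object.
const sandbox = vm.createContext({});
vm.measureMemory({ mode: 'detailed', context: sandbox })
  .then((result) => console.log(result));
```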