
Node.js Technical Steering Committee (TSC) Meeting 2022-12-14 #1319

Closed
mhdawson opened this issue Dec 14, 2022 · 16 comments

@mhdawson
Member

Time

UTC Wed 14-Dec-2022 16:00 (04:00 PM):

| Timezone | Date/Time |
| --- | --- |
| US / Pacific | Wed 14-Dec-2022 08:00 (08:00 AM) |
| US / Mountain | Wed 14-Dec-2022 09:00 (09:00 AM) |
| US / Central | Wed 14-Dec-2022 10:00 (10:00 AM) |
| US / Eastern | Wed 14-Dec-2022 11:00 (11:00 AM) |
| EU / Western | Wed 14-Dec-2022 16:00 (04:00 PM) |
| EU / Central | Wed 14-Dec-2022 17:00 (05:00 PM) |
| EU / Eastern | Wed 14-Dec-2022 18:00 (06:00 PM) |
| Moscow | Wed 14-Dec-2022 19:00 (07:00 PM) |
| Chennai | Wed 14-Dec-2022 21:30 (09:30 PM) |
| Hangzhou | Thu 15-Dec-2022 00:00 (12:00 AM) |
| Tokyo | Thu 15-Dec-2022 01:00 (01:00 AM) |
| Sydney | Thu 15-Dec-2022 03:00 (03:00 AM) |

Or in your local time:

Links

Agenda

Extracted from tsc-agenda labelled issues and pull requests from the nodejs org prior to the meeting.

nodejs/node

  • doc: change fetch from experimental to stable #45684
  • Expose Undici's ProxyAgent and setGlobalDispatcher within Node #43187

Invited

Observers/Guests

Notes

The agenda comes from issues labelled with tsc-agenda across all of the repositories in the nodejs org. Please label any additional issues that should be on the agenda before the meeting starts.

Joining the meeting

Zoom link: https://zoom.us/j/611357642
Regular password

Public participation

We stream our conference call straight to YouTube so anyone can listen to it live; it should start playing at https://www.youtube.com/c/nodejs+foundation/live when we turn it on. There's usually a short cat-herding period at the start of the meeting, and occasionally we have some quick private business to attend to before we can start recording and streaming, so be patient and it should show up.


Invitees

Please use the following emoji reactions in this post to indicate your
availability.

  • 👍 - Attending
  • 👎 - Not attending
  • 😕 - Not sure yet
mhdawson self-assigned this Dec 14, 2022
@mhdawson
Member Author

Sorry this was opened late.

@targos
Member

targos commented Dec 14, 2022

There are only two issues on the agenda and we discussed them last week.

@Trott
Member

Trott commented Dec 14, 2022

I'll be there long enough to at least decide we don't need to have the meeting. 😀

@tniessen
Member

I'll be there but slightly late.

@joyeecheung
Member

joyeecheung commented Dec 14, 2022

I'll be late or may not be able to attend. As an update on startup performance:

  • There has been a gradual regression in startup performance starting from v15; v19 is about 40%+ slower than v14 (a rough measurement sketch follows this list).
  • bootstrap: lazy load non-essential modules node#45659 improved baseline startup by 17-18%
  • build: disable v8 snapshot compression node#45716 improved baseline startup by another 17-18%, reversing the regression, with a 2.9% increase in binary size.
  • bootstrap: optimize modules loaded in the built-in snapshot node#45849 will improve baseline startup by another 10-11%, resulting in an overall 14-15% improvement compared to v14. Startup that loads all of the common built-in modules would still be 4-5% slower, because we simply have more internal modules and subnamespaces now.
  • I am playing with compression of the snapshot/source code/code cache in the binary. Zlib isn't a good choice at this point (as the regression fixed in build: disable v8 snapshot compression node#45716 demonstrated; V8 turned off the compression on desktop by default for the same reason). lz4 and/or zstd might be better. For reference, LLVM uses zstd and Deno uses a combination of both. This would be an internal util. I am leaning towards lz4 because of its faster decompression, its small size, and the fact that it's already in Chromium, so it might be used elsewhere in V8 in the future.
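
For readers who want to reproduce the kind of baseline numbers quoted above, the sketch below is a minimal wall-clock measurement: it spawns `node -e "0"` a number of times and reports the average time per process. The run count and the use of `execFileSync` are illustrative choices, not the methodology behind the figures in the list.

```js
'use strict';
// Rough baseline-startup measurement: spawn `node -e "0"` repeatedly and
// report the average wall-clock time per process. A simple sketch, not
// the benchmark harness behind the numbers quoted above.
const { execFileSync } = require('child_process');

const runs = 100;
const start = process.hrtime.bigint();
for (let i = 0; i < runs; i++) {
  execFileSync(process.execPath, ['-e', '0']);
}
const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
console.log(`average startup: ${(elapsedMs / runs).toFixed(2)} ms over ${runs} runs`);
```

Running the same script against two different binaries (e.g. a v14 build and a v19 build) gives a rough sense of the relative regression, though a proper benchmark would also control for machine noise.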

@GeoffreyBooth
Member

@joyeecheung Can we write down somewhere, perhaps in https://github.com/nodejs/node/tree/main/doc/contributing, how to avoid these regressions in the future? Like a guide to how to make modules snapshottable, how to add them to the snapshot, how and when to lazy-load things, how and when to avoid circular dependencies, etc. cc @nodejs/performance
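
As background for the kind of guidance such a guide could contain, here is a minimal sketch of the lazy-require pattern used to keep non-essential modules out of the startup path; the zlib example and the exported function name are chosen for illustration only.

```js
'use strict';
// Illustrative lazy-require pattern: defer loading a dependency until the
// first call that needs it, so its cost is not paid during startup.
let lazyZlib;  // cached module reference, populated on first use

function gzipSync(buffer) {
  lazyZlib ??= require('zlib');  // loaded only when gzipSync() is first called
  return lazyZlib.gzipSync(buffer);
}

module.exports = { gzipSync };
```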

@tniessen
Member

Did I miss the meeting?

@RaisinTen
Contributor

The meeting just ended

@joyeecheung
Member

joyeecheung commented Dec 14, 2022

how to make modules snapshottable, how to add them to the snapshot, how and when to lazy-load things

See the comments in https://github.com/nodejs/node/blob/main/lib/internal/bootstrap/node.js - I suspect this place is much more noticeable than any doc we have in the repo (I read an old version of it before I started doing any work on startup, and it has also been mentioned in blogs/books). The file itself is one of the oldest in the code base (if we disregard the exact directory it lives in).

how and when to avoid circular dependencies

That should probably go into lib/README.md (which does not yet exist).
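
As a hypothetical illustration of what such a lib/README.md entry on circular dependencies might show: a two-file sketch where a top-level require cycle is avoided by deferring one of the requires into the function that actually needs it. The file names are made up for this example.

```js
// a.js
'use strict';
const b = require('./b');  // safe: b.js does not require a.js at its top level
module.exports = { fromA: () => 'a', useB: () => b.fromB() };

// b.js
'use strict';
// A top-level `require('./a')` here would complete the cycle and hand this
// module a partially initialized copy of a.js's exports. Deferring the
// require into the function that needs it avoids that.
module.exports = {
  fromB: () => 'b',
  useA: () => require('./a').fromA(),
};
```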

@mhdawson
Member Author

PR for minutes - #1320

@BridgeAR
Member

@GeoffreyBooth

how and when to lazy-load things

We have a test that indicates when new modules are added to the startup. I originally wrote it when I started lazy-loading modules to improve startup time (a rough sketch of the idea follows below).

Shall we have a stronger indicator to prevent this in the future?
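
For context, a minimal version of such a check might look like the sketch below, assuming the undocumented process.moduleLoadList property; the actual test in core is more involved, so treat this only as an illustration of the idea.

```js
'use strict';
// Sketch of a guard against accidental growth of the set of modules loaded
// at startup. `process.moduleLoadList` is undocumented and may change, so
// this is purely an illustration, not the real test.
const assert = require('assert');

const MAX_BOOTSTRAP_MODULES = 80;  // hypothetical threshold, not the real number

assert(
  process.moduleLoadList.length <= MAX_BOOTSTRAP_MODULES,
  `startup now loads ${process.moduleLoadList.length} modules, ` +
    `expected at most ${MAX_BOOTSTRAP_MODULES}`,
);
```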

@joyeecheung
Member

joyeecheung commented Dec 14, 2022

We have a test that indicates when new modules are added to the startup

I think the issue is that we have the list, but we still didn't pay much attention when it was extended. Also, the test does not show how the modules were actually loaded (e.g. from the snapshot or compiled at runtime; the latter is much more expensive even with the help of the code cache, and the former is cheaper but not free).

@joyeecheung
Member

I think another cause was that once we stopped tracking the performance of main/releases in general, we stopped noticing regressions like this. To catch nodejs/node#45716 in time we would have needed to run the benchmarks after every V8 update; to catch nodejs/node#45659 and nodejs/node#45849 in time we would have needed constant monitoring of performance, because the regression happened gradually: one module would not significantly slow down startup, but when we gradually doubled the number of internal modules, the impact became significant.

@GeoffreyBooth
Member

See the comments in nodejs/node@main/lib/internal/bootstrap/node.js

I had never heard of this file until now, and I’ve been working in this codebase for something like five years. We could at least link to this file from the contributor guide.

@joyeecheung
Member

joyeecheung commented Dec 14, 2022

I’ve never heard about this file until now, and I’ve been working in this codebase for something like five years now.

You've been missing all the fun ;) The comments even start with Hello, and welcome to hacking node.js! (which is 12 years old)

We could at least link to this file from the contributor guide.

Do you want to write a guide on hacking lib/ in general, or are you asking me to?

@GeoffreyBooth
Member

Do you want to write a guide on hacking lib/ in general, or are you asking me to?

I don’t have the time or knowledge to write a guide on hacking lib in general, but if you or someone else does I think that would be very valuable.

I’m happy to open the PR to the contributor guide to just add a link to this file, unless someone else wants to beat me to it. I’m not sure when I’ll be able to get to it.
