Remove setImmediate from Response #753

Open
mcollina opened this issue May 15, 2024 · 0 comments
Response uses setImmediate() to defer responses, via this snippet:

typeof setImmediate === "undefined" ? (fn: () => any) => fn() : setImmediate;
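
For context, a minimal sketch of what that deferral amounts to; defer, sendDeferred, and their arguments are illustrative names, not the actual identifiers in the h3 source:

import type { ServerResponse } from "node:http";

// Roughly how the deferral primitive is chosen: use setImmediate when it
// exists, otherwise fall back to calling the function synchronously.
const defer =
  typeof setImmediate === "undefined"
    ? (fn: () => void) => fn()
    : (fn: () => void) => setImmediate(fn);

function sendDeferred(res: ServerResponse, body: string) {
  // The actual write happens on a later event-loop turn,
  // not inside the current handler invocation.
  defer(() => {
    res.end(body);
  });
}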

This technique can significantly boost short-term throughput by allowing more than one event to be processed before going back to C++. However, it uses significantly more memory, leading to more GC work, which effectively slows everything down when there is no "spare" CPU available to run the GC (and a huge pile-up of data accumulates in old space). We reverted it in Fastify (#545) after Eran Hammer found the problem. I recommend h3 do the same.
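
The suggested change, in line with the Fastify revert, is to drop the deferral and write the response synchronously. An illustrative sketch only, not the actual h3 internals:

import type { ServerResponse } from "node:http";

// "After" shape: write the response directly instead of scheduling it
// with setImmediate, so no closures (and their captured bodies) pile up
// on the setImmediate queue between event-loop turns.
function sendDirect(res: ServerResponse, body: string) {
  res.end(body);
}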

Server:

'use strict'

const { createServer } = require('node:http')
const { createApp, toNodeListener, eventHandler, setHeader } = require('h3')

const app = createApp()
app.use('/', eventHandler((ev) => {
  return { hello: 'world' }
}))

createServer(toNodeListener(app)).listen(process.env.PORT || 3000)

This is the benchmark result on my laptop:

$ autocannon -c 1000 -d 40 -p 10 http://127.0.0.1:3000
Running 40s test @ http://127.0.0.1:3000
1000 connections with 10 pipelining factor


┌─────────┬───────┬────────┬────────┬─────────┬──────────┬───────────┬──────────┐
│ Stat    │ 2.5%  │ 50%    │ 97.5%  │ 99%     │ Avg      │ Stdev     │ Max      │
├─────────┼───────┼────────┼────────┼─────────┼──────────┼───────────┼──────────┤
│ Latency │ 58 ms │ 200 ms │ 806 ms │ 1004 ms │ 276.4 ms │ 600.84 ms │ 21301 ms │
└─────────┴───────┴────────┴────────┴─────────┴──────────┴───────────┴──────────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬──────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg      │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼──────────┼─────────┤
│ Req/Sec   │ 75,455  │ 75,455  │ 84,735  │ 88,255  │ 84,049.6 │ 3,085.43 │ 75,439  │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼──────────┼─────────┤
│ Bytes/Sec │ 14.1 MB │ 14.1 MB │ 15.8 MB │ 16.5 MB │ 15.7 MB  │ 576 kB   │ 14.1 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴──────────┴──────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 40

3391k requests in 40.08s, 629 MB read
4k errors (2k timeouts)

The actual threshold at which this problem starts showing up is system-dependent. It happens much sooner in a constrained container than on a powerful M3.
