http2: fix session memory accounting after pausing
The ability to pause input processing was added in 8a4a193 but
introduced a session memory accounting mismatch leading to potential
NGHTTP2_ENHANCE_YOUR_CALM errors.

After pausing
(https://github.com/nodejs/node/blob/f36331c1bfa4c4c202346b05dc3bd672f653e4df/src/node_http2.cc#L871),
the early return on line 873 skips the
DecrementCurrentSessionMemory(stream_buf_.len) call below (line 878).

When we later finished processing the input chunk
(https://github.com/nodejs/node/blob/f36331c1bfa4c4c202346b05dc3bd672f653e4df/src/node_http2.cc#L1858),
we were calling DecrementCurrentSessionMemory(stream_buf_offset_) [line
1875], which was a no-op, since we had just set stream_buf_offset_ to 0
[line 1873].

The correct amount to decrement by is still stream_buf_.len, since
that's the amount we skipped previously (line 878).
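
To make the mismatch concrete, here is a minimal, self-contained C++ sketch
of the bookkeeping (a toy model only, not the real node_http2.cc code;
MockSession, ReceiveAndPause and FinishChunk are invented names). Every
received chunk increments the session-memory counter by its length. On the
buggy path, the later decrement uses stream_buf_offset_, which has just been
reset to 0, so the counter never comes back down; once it crosses
maxSessionMemory, nghttp2 starts rejecting new streams with
NGHTTP2_ENHANCE_YOUR_CALM. Decrementing by stream_buf_.len keeps the books
balanced:

#include <cstddef>
#include <cstdio>
#include <initializer_list>

struct MockSession {
  size_t current_session_memory = 0;

  void Increment(size_t n) { current_session_memory += n; }
  void Decrement(size_t n) { current_session_memory -= n; }

  // A chunk arrives and processing pauses: the chunk is accounted for, but
  // the matching decrement is skipped (mirrors the early return after
  // pausing).
  void ReceiveAndPause(size_t chunk_len) {
    Increment(chunk_len);
    // early return here; no Decrement(chunk_len)
  }

  // Later, the rest of the chunk is consumed.
  void FinishChunk(size_t chunk_len, bool fixed) {
    size_t stream_buf_offset = 0;  // just reset to 0, as described above
    if (fixed)
      Decrement(chunk_len);          // fix: decrement by the full chunk length
    else
      Decrement(stream_buf_offset);  // bug: decrement by 0, memory "leaks"
  }
};

int main() {
  const size_t kChunkLen = 200 * 1024;  // 200k per chunk, as in the test below
  for (bool fixed : {false, true}) {
    MockSession session;
    for (int i = 0; i < 5; i++) {
      session.ReceiveAndPause(kChunkLen);
      session.FinishChunk(kChunkLen, fixed);
    }
    std::printf("%s: accounted-but-unreleased memory = %zu bytes\n",
                fixed ? "fixed" : "buggy", session.current_session_memory);
  }
  return 0;
}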

Fixes: #29223
Refs: 164ac5b

PR-URL: #30684
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: Denys Otrishko <shishugi@gmail.com>
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: David Carlier <devnexen@gmail.com>
Michael Lehenbauer authored and MylesBorins committed Dec 17, 2019
1 parent 894aaa2 commit 8f3b8ca
Showing 2 changed files with 59 additions and 1 deletion.
5 changes: 4 additions & 1 deletion src/node_http2.cc
@@ -1937,7 +1937,10 @@ void Http2Session::OnStreamRead(ssize_t nread, const uv_buf_t& buf_) {
     buf = uv_buf_init(new_buf, nread);
     stream_buf_offset_ = 0;
     stream_buf_ab_.Reset();
-    DecrementCurrentSessionMemory(stream_buf_offset_);
+
+    // We have now fully processed the stream_buf_ input chunk (by moving the
+    // remaining part into buf, which will be accounted for below).
+    DecrementCurrentSessionMemory(stream_buf_.len);
   }
 
   // Shrink to the actual amount of used data.
55 changes: 55 additions & 0 deletions test/parallel/test-http2-large-writes-session-memory-leak.js
@@ -0,0 +1,55 @@
'use strict';
const common = require('../common');
if (!common.hasCrypto)
  common.skip('missing crypto');
const fixtures = require('../common/fixtures');
const http2 = require('http2');

// Regression test for https://github.com/nodejs/node/issues/29223.
// There was a "leak" in the accounting of session memory leading
// to streams eventually failing with NGHTTP2_ENHANCE_YOUR_CALM.

const server = http2.createSecureServer({
  key: fixtures.readKey('agent2-key.pem'),
  cert: fixtures.readKey('agent2-cert.pem'),
});

// Simple server that sends 200k and closes the stream.
const data200k = 'a'.repeat(200 * 1024);
server.on('stream', (stream) => {
  stream.write(data200k);
  stream.end();
});

server.listen(0, common.mustCall(() => {
  const client = http2.connect(`https://localhost:${server.address().port}`, {
    ca: fixtures.readKey('agent2-cert.pem'),
    servername: 'agent2',

    // Set maxSessionMemory to 1MB so the leak causes errors faster.
    maxSessionMemory: 1
  });

  // Repeatedly create a new stream and read the incoming data. Even though we
  // only have one stream active at a time, prior to the fix for #29223,
  // session memory would steadily increase and we'd eventually hit the 1MB
  // maxSessionMemory limit and get NGHTTP2_ENHANCE_YOUR_CALM errors trying to
  // create new streams.
  let streamsLeft = 50;
  function newStream() {
    const stream = client.request({ ':path': '/' });

    stream.on('data', () => { });

    stream.on('close', () => {
      if (streamsLeft-- > 0) {
        newStream();
      } else {
        client.destroy();
        server.close();
      }
    });
  }

  newStream();
}));
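
For local reproduction (assuming a Node.js source checkout with a built
./node binary), the regression test above can be run directly, e.g.
./node test/parallel/test-http2-large-writes-session-memory-leak.js; on a
build without this fix, new streams should eventually start failing with
NGHTTP2_ENHANCE_YOUR_CALM, while with the fix the test runs to completion.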
