
Bump readable-stream from 3.6.0 to 4.0.0 #3

Closed

Conversation

dependabot[bot] (Contributor) commented on behalf of github, Jul 1, 2022

Bumps readable-stream from 3.6.0 to 4.0.0.

Release notes

Sourced from readable-stream's releases.

v4.0.0

What's Changed

New Contributors

Full Changelog: nodejs/readable-stream@v3.6.0...v4.0.0

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [readable-stream](https://github.com/nodejs/readable-stream) from 3.6.0 to 4.0.0.
- [Release notes](https://github.com/nodejs/readable-stream/releases)
- [Commits](nodejs/readable-stream@v3.6.0...v4.0.0)

---
updated-dependencies:
- dependency-name: readable-stream
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added dependencies Pull requests that update a dependency file javascript Pull requests that update Javascript code labels Jul 1, 2022
@vweevers (Member) commented Jul 2, 2022

In addition to the Node.js 12 test failure (Level/many-level#7 (comment)), which I intend to "fix" by dropping support for Node.js 12, there are also random failures in the multi-process.js test, which unfortunately also happen on other Node.js versions. So far I'm unable to reproduce them with readable-stream v3, only with v4.

# multiple processes
ok 38 should be deeply equivalent
not ok 39 should be deeply equivalent
  ---
    operator: deepEqual
    expected: |-
      [ [ '00000', '0.6959290688256938' ], [ '00001', '0.09414721663549419' ], [ '00002', '0.7016490817431085' ], [ '00003', '0.04483456956106591' ], [ '00004', '0.9068256992630355' ], [ '00005', '0.2594698226076928' ], [ '00006', '0.6541768373887236' ], [ '00007', '0.7025560544834328' ], [ '00008', '0.6187830617601537' ], [ '00009', '0.0394596046102651' ] ]
    actual: |-
      [ [ '00001', '0.09414721663549419' ], [ '00002', '0.7016490817431085' ], [ '00003', '0.04483456956106591' ], [ '00005', '0.2594698226076928' ], [ '00008', '0.6187830617601537' ], [ '00009', '0.0394596046102651' ] ]

@vweevers (Member) commented Jul 2, 2022

It reproduces more consistently with v4, across all Node.js versions and operating systems: https://github.com/Level/rave-level/actions/runs/2601809114. But it also reproduces with v3 on Mac: https://github.com/Level/rave-level/actions/runs/2601819174. And also with many-level@1: https://github.com/Level/rave-level/actions/runs/2601834432.

In short, this is a pre-existing bug.

@vweevers (Member) commented Jul 2, 2022

To summarize (and correct) the above, there are three separate issues here:

  1. pipeline() of readable-stream@4 on Node.js 12: Premature close error on socket under Node.js 12 nodejs/readable-stream#477
  2. pipeline() of readable-stream@4 on any Node.js version: calls its callback before the streams have emitted 'close', which then leads to side effects
  3. EADDRINUSE errors on Mac, which I now see are distinct from the other cases.

1 and 2 can be fixed by replacing pipeline() with a simpler, stricter function that always waits for 'close' events, instead of guessing which streams will emit 'close'.

As for 3, I have no idea. We can tackle that separately though, because it's unrelated to readable-stream.

@juliangruber (Member) commented:

  1. Can we just drop Node 12 here?
  2. Instead of relying on "close" having been emitted, could we adjust the tests to work before that event?

@vweevers (Member) commented Jul 2, 2022

Can we just drop Node 12 here?

Yeah, we could.

Instead of relying on "close" having been emitted, could we adjust the tests to work before that event?

The issue isn't in the tests. It's a general problem: before 'close' has been emitted, resources haven't been released yet. This problem pops up in scenarios with high contention, like on the LevelDB lock and socket path here. Node core's pipeline() wants to avoid the problem but it can't, because not all ecosystem modules are guaranteed to emit 'close'. In rave-level, all relevant streams are ours (and internal) so we do have that guarantee.

@vweevers (Member) commented Jul 2, 2022

You did give me pause though, and made me look at my test code (with a pipeline replacement) again. I found a potential bug there that led me to wrong conclusions. Will update.

@vweevers (Member) commented Jul 3, 2022

Waiting for 'close' events (or just doing a setTimeout()) merely reduces the chance of it happening. The actual cause is that pipeline(socket, this.createRpcStream({ ref: socket }), socket, cb) sometimes doesn't call cb and leaves the rpc stream in a half-open state (writable side finishes but readable side does not).

This can be seen in the following repro, which also contains a fix that we could make in many-level:

const { pipeline, Duplex, PassThrough, finished } = require('readable-stream')
const net = require('net')
const cp = require('child_process')
const socketPath = process.platform === 'win32' ? '\\\\.\\pipe\\stream-test' : __dirname + '/stream-test.sock'

if (process.argv[2] === 'server') {
  const server = net.createServer(sock => {
    sock.pipe(new PassThrough()).pipe(sock)
  })

  server.listen(socketPath, function () {
    process.send('ready')
  })
} else {
  const child = cp.fork(__filename, ['server'], { timeout: 2e3 })

  child.once('message', function ready () {
    const socket = net.connect(socketPath)
    const encode = new PassThrough()
    const decode = new PassThrough()
    const proxy = Duplex.from({ writable: decode, readable: encode })

    decode.on('data', function (x) {
      console.log('received %s', x)
    })

    // (1) Notice how the encode stream does not end, finish or close
    for (const event of ['close', 'finish', 'end']) {
      socket.on(event, () => console.log('socket', event))
      proxy.on(event, () => console.log('proxy', event))
      encode.on(event, () => console.log('encode', event))
      decode.on(event, () => console.log('decode', event))
    }

    pipeline(socket, proxy, socket, function () {
      // (2) So we never get here
      console.log('done')
    })

    encode.write('x')

    // (3) To fix, uncomment:
    // finished(proxy, { writable: true, readable: false }, () => encode.destroy())
  })
}

@juliangruber (Member) commented:

Ok, great that you were able to reproduce this using core libs only! Do you think it's an issue there, or is adding finished actually the way to go?

@vweevers (Member) commented Jul 4, 2022

I hope to further simplify the repro (removing net and child_process, and reducing the number of duplex streams) and then open an issue in nodejs. It feels like a bug, but I can't yet clearly articulate how I expect pipeline() to behave. It's an odd case because net does not emit an error on connection loss of a unix socket or named pipe, unlike on tcp sockets.

@juliangruber (Member) commented:

Cool, I'm interested in this and will take a stab at simplifying it.

@juliangruber (Member) commented:

I'm seeing the same behavior with tcp sockets, in Codespaces.

@juliangruber (Member) commented:

Example without child processes:

const { pipeline, Duplex, PassThrough, finished } = require('readable-stream')
const net = require('net')

const cons = []
const server = net.createServer(sock => {
  cons.push(sock)
  sock.pipe(new PassThrough()).pipe(sock)
})

server.listen(8000, function () {
  const socket = net.connect(8000)
  const encode = new PassThrough()
  const decode = new PassThrough()
  const proxy = Duplex.from({ writable: decode, readable: encode })

  decode.on('data', function (x) {
    console.log('received %s', x)
  })

  // (1) Notice how the encode stream does not end, finish or close
  for (const event of ['close', 'finish', 'end']) {
    socket.on(event, () => console.log('socket', event))
    proxy.on(event, () => console.log('proxy', event))
    encode.on(event, () => console.log('encode', event))
    decode.on(event, () => console.log('decode', event))
  }

  pipeline(socket, proxy, socket, function () {
    // (2) So we never get here
    console.log('done')
  })

  encode.write('x')

  // (3) To fix, uncomment:
  // finished(proxy, { writable: true, readable: false }, () => encode.destroy())

  setTimeout(() => {
    server.close()
    cons[0].end()
  }, 2000)
})

@juliangruber (Member) commented:

Also removed net code:

const { pipeline, Duplex, PassThrough, finished } = require('readable-stream')

const socket = new PassThrough()
const encode = new PassThrough()
const decode = new PassThrough()
const proxy = Duplex.from({ writable: decode, readable: encode })

decode.on('data', function (x) {
  console.log('received %s', x)
})

// (1) Notice how the encode stream does not end, finish or close
for (const event of ['close', 'finish', 'end']) {
  socket.on(event, () => console.log('socket', event))
  proxy.on(event, () => console.log('proxy', event))
  encode.on(event, () => console.log('encode', event))
  decode.on(event, () => console.log('decode', event))
}

pipeline(socket, proxy, socket, function () {
  // (2) So we never get here
  console.log('done')
})

encode.write('x')

// (3) To fix, uncomment:
// finished(proxy, { writable: true, readable: false }, () => encode.destroy())

setTimeout(() => {
  socket.end()
}, 2000)

@juliangruber (Member) commented:

While it is reproducible like this, I'm not sure it still shows the same issue, because we're not ending the encode stream. It looks more like a design issue here, where we need to start manually propagating stream events.

@vweevers (Member) commented Jul 4, 2022

If you do socket.destroy() on the last example then it does get to console.log('done'), due to a premature close error that triggers everything to be destroyed. What we need here is that same behavior when the socket closes without an error, because either way, the streams are now unusable.

The writable side of our proxy stream behaves as I expect. The readable side, which is connected to a closed socket stream, does not. If we did pipeline(x, proxy), i.e. with proxy as the last stream, then I'd understand it, because the pipeline isn't consuming the readable side of the proxy, and you can chain pipelines this way.

But in pipeline(x, proxy, x) I want the pipeline to fully manage the lifetime of the proxy.

@mcollina (Member) commented Jul 4, 2022

Is there a bug in Node.js core v18 for this? There might be. If you have a simpler repro, just open an issue and we can discuss it there.

@vweevers (Member) commented Jul 4, 2022

Further simplified. Reproduces with Node.js 18.4.0 core streams as well. I'll test it on some more versions and then open an issue.

const { pipeline, Duplex, PassThrough, finished } = require('readable-stream')

const remote = new PassThrough()
const local = new Duplex({
  read () {},
  write (chunk, enc, callback) {
    callback()
  }
})

for (const event of ['close', 'finish', 'end']) {
  remote.on(event, () => console.log('remote', event))
  local.on(event, () => console.log('local', event))
}

pipeline(remote, local, remote, function () {
  // We never get here
  console.log('done')
})

// To fix, uncomment:
// finished(local, { writable: true, readable: false }, () => local.destroy())

setImmediate(() => {
  remote.end()
})

@mcollina (Member) commented Jul 4, 2022

Thx!

@vweevers (Member) commented Jul 4, 2022

nodejs/node#43682

dependabot[bot] (Contributor, Author) commented on behalf of github, Aug 1, 2022

Superseded by #6.

@dependabot dependabot bot closed this Aug 1, 2022
@dependabot dependabot bot deleted the dependabot/npm_and_yarn/readable-stream-4.0.0 branch August 1, 2022 16:08