
Duplex stream pipeline regression in 13 #32955

Closed

mafintosh opened this issue Apr 20, 2020 · 10 comments

mafintosh commented Apr 20, 2020

  • Version: 13.13.0
  • Platform: Mac
  • Subsystem: stream

What steps will reproduce the bug?

pipeline in 13 seems to destroy duplex streams before they get a chance to finish their writable flush (ws.end() -> ws._final) lifecycle.

I managed to boil it down to a pretty straightforward test case:

const stream = require('stream')

// duplex stream similar to a tcp stream etc
const dup = new stream.Duplex({
  write (data, enc, cb) {
    cb()
  },
  destroy () {
    console.log('ws: am getting destroyed')
  },
  final (cb) {
    console.log('ws: flushing writable...')
    setTimeout(function () {
      console.log('ws: done flushing writable...')
      cb()
    }, 1000)
  }
})

// just some sink
const sink = new stream.Writable({
  write (data, enc, cb) {
    cb()
  }
})

// pipe readable side
stream.pipeline(dup, sink, function () { })

dup.write('test')
dup.end()

Running this produces:

ws: am getting destroyed
ws: flushing writable...
ws: done flushing writable...

Notice that the dup stream gets destroyed before it has a chance to finish its writable flush in the _final lifecycle, because pipeline auto-destroys it in 13.
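
For contrast, here is a minimal sketch (mine, not from the issue; a read() is added so the readable side ends cleanly) of the ordering the repro expects: pipe manually and defer destruction until stream.finished reports the duplex fully done, so _final runs to completion first.

const stream = require('stream')

// same duplex as above, plus a read() and a destroy() that
// completes by calling its callback
const dup = new stream.Duplex({
  write (data, enc, cb) {
    cb()
  },
  read () {
    this.push(null)
  },
  final (cb) {
    console.log('ws: flushing writable...')
    setTimeout(function () {
      console.log('ws: done flushing writable...')
      cb()
    }, 1000)
  },
  destroy (err, cb) {
    console.log('ws: am getting destroyed')
    cb(err)
  }
})

const sink = new stream.Writable({
  write (data, enc, cb) {
    cb()
  }
})

// pipe manually and destroy only once both sides have completed
dup.pipe(sink)
stream.finished(dup, function () {
  dup.destroy()
})

dup.write('test')
dup.end()

This prints the two flush lines before 'ws: am getting destroyed', i.e. the lifecycle the repro shows pipeline violating.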

@mafintosh

I think this is related to #31940 as well

mafintosh commented Apr 20, 2020

An equivalent test case using TCP:

const net = require('net')
const pipeline = require('stream').pipeline

net.createServer(function (socket) {
  // echo server
  pipeline(socket, socket, () => {})
  // Node 13 force-destroys the socket before it has a chance to emit 'finish'
  socket.on('finish', function () {
    console.log('finished only printed in node 12')
  })
}).listen(10000, function () {
  const socket = net.connect(10000)
  socket.end()
})

@mcollina

cc @ronag

ronag commented Apr 21, 2020

Seems to work on master. Checking v13-staging.

ronag commented Apr 21, 2020

This is a problem on v13 only, as far as I can tell. Basically, pre-v14 net.Socket had somewhat strange semantics around its writable/readable flags, which short-circuit the writable/readable check that pipeline depends on.
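
For illustration, a hypothetical simplification of the kind of check being described (names are mine, not Node's actual internals):

// Hypothetical simplification: treat cleared readable/writable flags
// as "this stream has completed on its own".
function looksDone (s) {
  return !s.readable && !s.writable
}

// Pre-v14 net.Socket toggled those flags on its own schedule (e.g.
// writable flips to false once end() is called, before 'finish' is
// emitted), so a socket still flushing could satisfy looksDone()
// and be destroyed prematurely.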

ronag commented Apr 21, 2020

Looking into ways to get around this.

mafintosh commented Apr 21, 2020

This happens on master as well, for all streams that don't have autoDestroy enabled (i.e. any readable-stream stream, or any stream that opts out):

const stream = require('stream')

// duplex stream similar to a tcp stream etc
const dup = new stream.Duplex({
  autoDestroy: false,
  write (data, enc, cb) {
    cb()
  },
  read () {
    this.push(null)
  },
  destroy () {
    console.log('ws: am getting destroyed')
  },
  final (cb) {
    console.log('ws: flushing writable...')
    setTimeout(function () {
      console.log('ws: done flushing writable...')
      cb()
    }, 1000)
  }
})

// just some sink
const sink = new stream.Writable({
  write (data, enc, cb) {
    cb()
  }
})

// pipe readable side
stream.pipeline(dup, sink, function () { })

dup.write('test')
dup.end()

The above prints the following:

ws: flushing writable...
ws: am getting destroyed
ws: done flushing writable...

cc @ronag @mcollina

ronag commented Apr 21, 2020

Yes, I see the problem. Will try to prepare a PR by the end of the day. Thanks a lot @mafintosh!

@mafintosh

@ronag I think this is related to #32965

ronag self-assigned this Apr 21, 2020

ronag commented Apr 21, 2020

> @ronag I think this is related to #32965

I think it's unrelated. See comment in #32965.

ronag added a commit to nxtedition/node that referenced this issue Apr 21, 2020
pipeline was too aggressive with destroying Duplex
streams passed as the first argument to pipeline.

Just because a stream is !writable does not mean it
is safe to destroy, unless it has also emitted
'finish'.

Fixes: nodejs#32955
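
A hedged sketch of the corrected condition (illustrative only, not the actual patch): the writable flag flips to false as soon as end() is called, so teardown must additionally wait for 'finish'.

// Illustrative: the writable side is only done once _final has
// completed and 'finish' has been emitted, not merely once end()
// has cleared the writable flag.
function writableSideDone (s) {
  let finished = false
  s.once('finish', function () { finished = true })
  return function () { return !s.writable && finished }
}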
ronag added a commit to nxtedition/node that referenced this issue Apr 21, 2020
This PR logically reverts nodejs#31940 which
has caused lots of unnecessary breakage in the ecosystem.

This PR also aligns better with the actual documented behavior:

`stream.pipeline()` will call `stream.destroy(err)` on all streams except:
  * `Readable` streams which have emitted `'end'` or `'close'`.
  * `Writable` streams which have emitted `'finish'` or `'close'`.

The behavior introduced in nodejs#31940
was much more aggressive in terms of destroying streams. This was
good for avoiding potential resource leaks, but it breaks some
common assumptions in legacy streams.

Furthermore, it makes the code simpler and removes some hacks.

Fixes: nodejs#32954
Fixes: nodejs#32955
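
For reference, an illustrative sketch of the documented rule above (my names, not the actual implementation):

// Track each stream's terminal events; on pipeline teardown, only
// streams that never reached one get destroyed.
function trackTerminalEvents (s) {
  const state = { ended: false, finished: false, closed: false }
  s.once('end', function () { state.ended = true })       // readable side done
  s.once('finish', function () { state.finished = true }) // writable side done
  s.once('close', function () { state.closed = true })
  return state
}

// A readable is exempt after 'end' or 'close', a writable after
// 'finish' or 'close' (a Duplex would need both sides to complete):
function mayDestroy (state) {
  return !state.ended && !state.finished && !state.closed
}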
ronag closed this as completed in 6419e59 Apr 23, 2020
ronag added a commit to nxtedition/node that referenced this issue Apr 23, 2020, with the same message (PR-URL: nodejs#32968, Backport-PR-URL: nodejs#32980, Reviewed-By: Matteo Collina <matteo.collina@gmail.com>, Mathias Buus <mathiasbuus@gmail.com>)

BethGriggs pushed a commit that referenced this issue Apr 27, 2020, with the same message (PR-URL: #32968)

BethGriggs pushed a commit that referenced this issue Apr 28, 2020, with the same message (PR-URL: #32968)