Destroying the response body stream does not abort the request in node-fetch v2 #1762

Open
emcsween opened this issue Jun 28, 2023 · 3 comments

emcsween commented Jun 28, 2023

In node-fetch v3, if the response body stream is destroyed before the request completes, the request is aborted. This is useful when piping the body into another stream using pipeline():

import fs from 'node:fs'
import { pipeline } from 'node:stream/promises'

const response = await fetch(...)
await pipeline(response.body, fs.createWriteStream(...))

If an error occurs, pipeline() destroys response.body, which in turn aborts the request.

Unfortunately, this doesn't work in node-fetch v2.

Reproduction

This script requests a large file and immediately destroys the response body stream:

import { setTimeout } from 'node:timers/promises'
import fetch from 'node-fetch'

console.log("PID:", process.pid)
// Any large file will do
const url = 'https://upload.wikimedia.org/wikipedia/commons/5/58/Sunset_2007-1.jpg'
const response = await fetch(url)
// Destroy the body before consuming it; in v3 this aborts the request
response.body.destroy()
// Keep the process alive so the socket state can be inspected
await setTimeout(100000)

With node-fetch v2, a socket stays open. (I verify this with ss -tnp | grep pid=PID)

With node-fetch v3, this doesn't happen.

Your Environment

| software | version |
| --- | --- |
| node-fetch | 2.6.11 |
| node | 16.14.2 |
| npm | 8.5.0 |
| Operating System | Ubuntu 22.04 |

Additional context

As a workaround, one can use an AbortController to abort the request when the response body is closed. Something like:

const ctrl = new AbortController()
const response = await fetch(url, { signal: ctrl.signal })
response.body.on('close', () => {
  // 'close' also fires after a fully consumed body, so only abort
  // if the body wasn't read to completion
  if (!response.bodyUsed) {
    ctrl.abort()
  }
})

But it would be nicer if node-fetch did that automatically instead.

@emcsween emcsween added the bug label Jun 28, 2023
@Yungser
Yungser commented Jul 28, 2023


@chungquantin
@emcsween Have you figured out how to resolve this issue? I got the same bug.

@emcsween
Author

@chungquantin I use the workaround described in the issue description above.
