
Unexpected Z_BUF_ERROR when using zlib #8701

Closed
olalonde opened this issue Sep 21, 2016 · 11 comments
Labels
zlib Issues and PRs related to the zlib subsystem.

Comments

@olalonde
Contributor

olalonde commented Sep 21, 2016

  • Version: 6.x
  • Platform: Linux
  • Subsystem: zlib

There have been a few Z_BUF_ERROR errors reported on https://github.com/ttezel/twit:

{ Error: unexpected end of file
    at Zlib._handle.onerror (zlib.js:370:17) errno: -5, code: 'Z_BUF_ERROR' }

ttezel/twit#300
ttezel/twit#269
ttezel/twit#252
ttezel/twit#247

Googling for the error code only brings up one of those issues or #2043.

It's not clear to me what that error means... Did the readable stream piping into zlib end too early? Here's the twit code that creates the zlib stream: https://github.com/ttezel/twit/blob/master/lib/streaming-api-connection.js#L89

@addaleax added the zlib (Issues and PRs related to the zlib subsystem.) and unconfirmed labels Sep 21, 2016
@addaleax
Member

@olalonde Okay, first of all, slightly offtopic but still: You’ll probably want to use gunzip.setEncoding('utf8'); rather than calling .toString('utf8') for each decoded chunk. The latter one won’t handle multibyte characters that cross chunk boundaries cleanly.

And then… it’s pretty hard to tell what’s going on or whether this even is a bug in core without a reproducible test case. Could you add console.log debugging to the self.response.on('data') listener to see what data actually gets passed to zlib?

@olalonde
Contributor Author

olalonde commented Sep 21, 2016

Thanks for the suggestion. I was reporting this here to be helpful in case there was a bug in core; I'm not the maintainer or owner of that project. I'll try that out, though. If you think it's not a bug in zlib, feel free to close.

@addaleax
Member

If you think it's not a bug with zlib feel free to close.

I don’t know, but I would love to help find that out! :)

@bnoordhuis
Member

I'll go ahead and close out the issue. We can reopen when new information is available.

@CTassisF

I was having this problem running Node.js v6.9.1 (from NodeSource) on Linux.
Also tried the changes suggested by @addaleax but the problem persisted.
I decided to downgrade Node.js to v4.6.1 (from Debian Stretch official mirror) and now the problem is gone.
I would like to troubleshoot this further, because I think it might be related to Node.js, but I'm out of ideas on what to try. Do you think identifying the latest Node.js version that runs this library without crashing in zlib.js would help?

@bnoordhuis
Member

@CTassisF If you can get git bisect to pinpoint the commit, that would help tremendously, but even knowing the last known-good release could prove quite useful.
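A bisect session along the lines suggested here might look like the following sketch (illustrative only, not meant to be run verbatim; Node builds at each step take a while, and the tags/paths are just the versions reported in this thread):

```shell
# Illustrative git bisect sketch between the reported good/bad releases.
git clone https://github.com/nodejs/node.git && cd node
git bisect start
git bisect bad v6.9.1     # first known-bad release
git bisect good v4.6.1    # last known-good release
# git now checks out a midpoint commit; build and run the reproduction:
./configure && make -j4
./node /path/to/reproduceissue.js && git bisect good || git bisect bad
# Repeat until git prints the first bad commit, then:
git bisect reset
```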

@addaleax
Member

@CTassisF I would also still really recommend trying to log some/all of the data that is being passed to zlib.

@CTassisF

CTassisF commented Oct 26, 2016

@bnoordhuis Thanks for your reply!
Unfortunately, I won't be able to test further, because this zlib.js error was being triggered by an issue in Twitter's streaming backend that was also affecting many other Twitter libraries (including ones in other programming languages) and that was fixed yesterday. There is a post on the Twitter Developer Forum regarding this issue.
I'm back running Node.js v7.0.0 from NodeSource and everything is looking fine. I will keep you posted if it happens again.
Thanks again :)

@dege88

dege88 commented Dec 20, 2016

I also encountered this issue, and can provide you with a test that can reproduce the issue:

While using request to fetch the homepage of a website, I got this strange issue on newer Node versions; everything is fine on 4.6:

$ nvm run --lts=argon reproduceissue.js
Running node LTS "argon" -> v4.6.2 (npm v2.15.11)
working without gzip
working with gzip
$ nvm run --lts=boron reproduceissue.js
Running node LTS "boron" -> v6.9.1 (npm v3.10.8)
not working with gzip
{"errno":-5,"code":"Z_BUF_ERROR"}
working without gzip

Here's the code used to reproduce the issue:

var request = require('request');

request({
	url: 'http://www.mantovanispa.it',
	gzip: true
}, function(error, response, body)
{
	if(!error)
	{
		console.log('working with gzip');
	}
	else
	{
		console.log('not working with gzip');
		console.log(JSON.stringify(error));
	}
});

request({
	url: 'http://www.mantovanispa.it'
}, function(error, response, body)
{
	if(!error)
	{
		console.log('working without gzip');
	}
	else
	{
		console.log('not working without gzip');
		console.log(JSON.stringify(error));
	}
});

The issue seems linked to gzip; maybe this site uses a strange way to handle compression, but on older versions of Node this doesn't happen.

@bnoordhuis can I help you debug this in some other way?

@addaleax
Member

curl -H 'Accept-Encoding: gzip' http://www.mantovanispa.it | gunzip does warn about gzip: stdin: unexpected end of file, so there's definitely something weird going on on the server side.

addaleax added a commit to addaleax/request that referenced this issue Dec 21, 2016
Be explicitly lenient with gzip decompression by always requesting
`zlib` to flush the input data and never explicitly ending the
`zlib` input.

The behavioural difference is that on Node ≥ 6, which has a slightly
stricter gzip decoding process than previous Node versions, malformed
but otherwise acceptable server responses are still properly
decompressed (the most common example being a missing checksum
at the stream end).

This aligns behaviour with cURL, which always uses the `Z_SYNC_FLUSH`
flag for decompression.

On the downside, accidental truncation of a response is no longer
detected on the compression layer.

Ref: nodejs/node#8701 (comment)
@addaleax
Member

@dege88 fyi, I filed request/request#2492 for the request module specifically (which should address your particular case)

addaleax added a commit to addaleax/request that referenced this issue Dec 21, 2016 (same message as above)
Fixes: request#2482
fatboy0112 added a commit to fatboy0112/node-fetch that referenced this issue Sep 22, 2023