Unable to upload a file larger than 64KB #1713
I was about to start writing about something completely different (about how it's better to just use blobs backed by the filesystem), but then I thought 64KB was nothing. It's so small... it would not matter. Then I investigated it further and tried to reproduce it (preferably without node-fetch, if possible). I managed to reproduce it with just:

```js
import { Blob as fetchBlob } from 'node-fetch'
import { Blob } from 'buffer'

const buf = Buffer.from('a'.repeat(64 * 1024 + 2))
const blob = new Blob([buf])
const file = new fetchBlob([blob])

for await (const c of file.stream()) {
  console.log(c)
}
```

(Therefore I'm going to close this, as this issue does not belong here - but feel free to keep commenting.)

This is more or less what's going on under the hood: readable web byte-streams have this kind of step where they detach the underlying ArrayBuffer when you use:

```js
new ReadableStream({
  type: 'bytes',
  async pull (ctrl) {
    ctrl.enqueue(uint8array)
  }
})
```
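You can watch the detach happen with just a few lines (a minimal sketch of my own; it assumes Node 18+ ESM, where ReadableStream is a global and top-level await is available):

```js
// Sketch (assumes Node 18+ ESM): enqueuing into a 'bytes' stream
// transfers, i.e. detaches, the chunk's underlying ArrayBuffer.
const chunk = new Uint8Array(8)

const rs = new ReadableStream({
  type: 'bytes',
  pull (ctrl) {
    ctrl.enqueue(chunk) // this detaches chunk's ArrayBuffer
    ctrl.close()
  }
})

await rs.getReader().read()
console.log(chunk.byteLength) // 0 - the view's buffer was transferred away
```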
And once the buffer gets detached like that, that's the root cause of this error. Streaming NodeJS's own Blob by itself:

```js
import { Blob } from 'buffer'

const buf = Buffer.from('a'.repeat(64 * 1024 + 2))
const blob = new Blob([buf])

for await (const c of blob.stream()) {
  console.log(c)
}
```
all looks good ✅

```js
for await (const c of blob.stream()) {
  console.log(c.buffer) // log the arrayBuffer
}
```
that does not look right... ❌

```
ArrayBuffer {
  [Uint8Contents]: <61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 61 ... 65438 more bytes>,
  byteLength: 65538
}
ArrayBuffer {
  [Uint8Contents]: <61 61>,
  byteLength: 2
}
```

I took an extra look at the NodeJS source code:

```js
stream() {
  if (!isBlob(this))
    throw new ERR_INVALID_THIS('Blob');
  const self = this;
  return new lazyReadableStream({
    async start() {
      this[kState] = await self.arrayBuffer();
      this[kIndex] = 0;
    },
    pull(controller) {
      if (this[kState].byteLength - this[kIndex] <= kMaxChunkSize) {
        controller.enqueue(new Uint8Array(this[kState], this[kIndex]));
        controller.close();
        this[kState] = undefined;
      } else {
        controller.enqueue(new Uint8Array(this[kState], this[kIndex], kMaxChunkSize));
        this[kIndex] += kMaxChunkSize;
      }
    }
  });
}
```

The issue here is that it's using:

```js
controller.enqueue(new Uint8Array(this[kState], this[kIndex], kMaxChunkSize));
```

instead of doing something such as:

```js
controller.enqueue(new Uint8Array(this[kState].slice(start, end)));
```

This is unexpected from a web spec perspective, because every chunk is a view reusing the same underlying ArrayBuffer. So really this isn't an issue with node-fetch.
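To make the failure mode concrete, here is a sketch of my own (variable names are mine) of what goes wrong when two chunks are views into one shared ArrayBuffer and one of them gets enqueued into a byte stream:

```js
// Two views share one ArrayBuffer, like the chunks blob.stream() emits.
const shared = new ArrayBuffer(4)
const first = new Uint8Array(shared, 0, 2)
const rest = new Uint8Array(shared, 2, 2)

const rs = new ReadableStream({
  type: 'bytes',
  pull (ctrl) {
    ctrl.enqueue(first) // detaches `shared`...
    ctrl.close()
  }
})

await rs.getReader().read()
console.log(rest.byteLength) // 0 - ...so the not-yet-sent chunk lost its data too
```

That is why only payloads past the 64KB chunk boundary break: the second chunk shares its buffer with the first, and re-streaming the first empties the second.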
I also don't like:

```js
return new lazyReadableStream({
  async start() {
    this[kState] = await self.arrayBuffer();
    // ...
```

as that is a bad memory hog: it allocates the whole arrayBuffer up front instead of creating some more stream-friendly variant (a sketch of one follows below). But I totally forgot about even posting them to NodeJS earlier. I have always been reluctant to use NodeJS's own Blob implementation and doing things such as…
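For illustration, here's a hedged sketch of such a stream-friendlier variant built only on the public Blob API (the helper name `blobToByteStream` and reusing the 64KB chunk size are my assumptions, not Node's code):

```js
// Sketch: pull chunks lazily via Blob#slice so the whole payload never
// sits in one ArrayBuffer, and every enqueued view owns a fresh buffer
// that is safe to detach.
const kMaxChunkSize = 65536 // 64KB, matching the chunk size in the Node source

function blobToByteStream (blob) {
  let index = 0
  return new ReadableStream({
    type: 'bytes',
    async pull (controller) {
      if (index >= blob.size) {
        controller.close()
        return
      }
      const slice = blob.slice(index, index + kMaxChunkSize)
      const ab = await slice.arrayBuffer() // fresh copy, one chunk at a time
      controller.enqueue(new Uint8Array(ab))
      index += ab.byteLength
    }
  })
}
```

Because each pull slices and copies just one chunk, nothing else shares the enqueued buffer, and the whole payload never has to be materialized at once.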
It will throw the error.
OS: debian 11 amd64
node: 18.10.0
node-fetch: 3.3.0