Does form-data support fs-capacitor stream? #394
I have a similar issue too! :( Did anyone find a solution? It's been over a year and I'm still unable to find many references to an issue like this.
@rheaditi I tested a few different approaches and ended up with this resolver for a file upload:

```js
import FormData from 'form-data';
import {GraphQLUpload} from 'apollo-upload-server';
import rawBody from 'raw-body';
import fetch from 'node-fetch';

export default {
  Upload: GraphQLUpload,
  Mutation: {
    uploadFile: (root, {file: fileUpload}) =>
      fileUpload.then(file =>
        rawBody(file.createReadStream()).then(buffer => {
          const form = new FormData();

          // `FormData` accepts either a readable stream, a buffer or a string
          // for multipart files. `file.createReadStream()` creates a
          // `FileStream`, which is a subclass of `ReadableStream` implemented
          // in `busboy` (a dependency of `apollo-upload-server`). This kind of
          // stream is apparently not discovered correctly by `form-data`,
          // therefore the better option is to use a buffer; however, the
          // metadata needs to be specified explicitly in this case.
          form.append('file', buffer, {
            filename: file.filename,
            contentType: file.mimetype,
            knownLength: buffer.length
          });

          // Something like this; I'm using some abstraction for this part.
          return fetch('/upload', {method: 'POST', body: form});
        })
      )
  }
};
```

Maybe this is helpful to you! I haven't tested with newer versions, though.
Hey @amannn! To get around this, we previously had to store the file to the fs temporarily and create a read stream from the fs file, and this helps us completely avoid that! You, sir, are a life-saver. Thank you so much!!! 🙏
I've been there before as well 😅. Glad I could help you! 🙂
We'd been stuck with this too. 😅 Thank you so much for this @amannn 🙌
Isn't
@gregory Yep, that's true. Ideally we'd just forward the stream to
Got it @amannn. Have a look at jaydenseric/graphql-upload#168, where I created a new option called
Hi everybody. I want to clear up a few things here. The problem is that the

To resolve this, form-data just needs to add support for arbitrary stream data, instead of using duck typing to support only the special cases of the
You got it right @mike-marcacci, that's the issue. Instead of trying to refactor form-data and going down a rabbit hole, I thought enhancing graphql-upload through a two-line change was smarter. If you are resistant to that small change, it looks like the only quick fix is to do what @amannn suggested, but at the cost of loading every single file upload into memory. Let me know if you think this was a stupid idea.
Hi @gregory, one of the problems with using the undocumented
There isn't a way for
I don't think that's an issue, @mike-marcacci, as you'll usually set those manually:

```js
// The metadata needs to be specified explicitly in this case.
form.append('file', buffer, {
  filename: file.filename,
  contentType: file.mimetype,
});
```

As you can see, those will be returned by graphql-upload
Figured it out: it's not an issue of duck typing per se, see node-fetch/node-fetch#707. FormData just purposefully does not support streams of unknown length, unless you use the NaN workaround.
graphql-tools-fork v8.1.0 provides new exports to allow proxying remote file uploads, for those who want to proxy all the things. It successfully sends a server-to-server multipart form request via FormData and node-fetch by extending the FormData class to handle streams of unknowable length. See: https://github.com/yaacovCR/graphql-tools-fork/blob/master/src/links/createServerHttpLink.ts#L15-L63 Feedback/suggestions welcome. I am happy to submit this as a PR to this repository if there would be support for having it merged.
It works for me with additional headers (using node-fetch):

```js
const { filename, createReadStream } = await file

const form = new FormData()
form.append('file', createReadStream(), { filename })

await fetch(FILE_SERVER_URL, {
  method: 'PUT',
  body: form,
  headers: {
    'connection': 'keep-alive',
    'transfer-encoding': 'chunked',
  },
})
```
I'm having a bit of a hard time following this issue. What's the status of it, and is there a reasonable workaround (i.e. one that doesn't involve loading the entire file into memory)?
@lynxtaa, that doesn't work for me. When using those headers from Node v12 on Windows against an Express server on the same machine, I get the following error:
My summary of the issue is that the combination of the existing Node FormData and fetch polyfills does not allow adding arbitrary streams: unless they are one of the specially recognized stream types, the content length will be set improperly. Available workarounds:
I can confirm uploading using FormData +
After further investigation, I think it works for me only because I'm using an Nginx reverse proxy in front of my file-upload server. It somehow solves the issue. Maybe it has something to do with proxy_request_buffering? By default, Nginx buffers the client request body.
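That explanation is plausible: the `proxy_request_buffering` directive (default `on`) makes Nginx read the entire client request body, to memory or a temp file, before forwarding it upstream, so the upstream server receives a body with a definite length regardless of how the client streamed it. A minimal illustrative config (the upstream name is hypothetical):

```nginx
location /upload {
    # Default behaviour, shown explicitly: buffer the whole client body
    # before proxying, so the upstream never sees an unknown-length stream.
    proxy_request_buffering on;

    # Hypothetical upstream for the file-upload server.
    proxy_pass http://file_upload_server;
}
```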
@yaacovCR Appreciate the summary. I have been struggling with this for a while :D

The only way I can send a multipart/form request to the server I am dealing with (the Power BI API) is by loading the entire file into memory or the filesystem and sending its length along with the request. Otherwise, the server response always says the file size is invalid, reporting it as 9223372036854775807 bytes (the max value of an int64), which makes me think there is some funky C# stuff going on as well... Regardless, thanks for the ideas, and I'll keep this thread in mind if I happen to come across any other solutions.
My workaround is using formdata-node:

```js
import FormData from "formdata-node";

// ...

const form = new FormData();
form.set("file", file.createReadStream(), file.filename);

await fetch(url, {
  method: "POST",
  body: form.stream,
  headers: form.headers,
});
```
This issue appears to have been fixed by #382, released in v4.0.0.
I'm using apollo-upload-server for uploading files to the server. In my resolver function, I have a stream. What I want to do is pass this stream as a file (multipart/form-data) to another server. I use promise-request (which under the hood uses form-data) to achieve that. My code looks like the one below, but I'm getting an error:

```
RequestError: Error: socket hang up
    at new RequestError (/Users/igat/Code/projects/botsupply-oracle-backend/node_modules/request-promise-core/lib/errors.js:14:15)
    at Request.plumbing.callback (/Users/igat/Code/projects/botsupply-oracle-backend/node_modules/request-promise-core/lib/plumbing.js:87:29)
    at Request.RP$callback [as _callback] (/Users/igat/Code/projects/botsupply-oracle-backend/node_modules/request-promise-core/lib/plumbing.js:46:31)
    at self.callback (/Users/igat/Code/projects/botsupply-oracle-backend/node_modules/request/request.js:185:22)
    at emitOne (events.js:121:20)
    at Request.emit (events.js:211:7)
    at Request.onRequestError (/Users/igat/Code/projects/botsupply-oracle-backend/node_modules/request/request.js:877:8)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketOnEnd (_http_client.js:423:9)
    at emitNone (events.js:111:20)
    at Socket.emit (events.js:208:7)
    at endReadableNT (_stream_readable.js:1056:12)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9)
```

And as I understand it, apollo-upload-server uses fs-capacitor to deal with the stream. Does form-data support fs-capacitor streams? What is wrong with my code?

Guys, any ideas? I'm really stuck.