Use native Blob on Node.js 15.7.0 #1079
Comments
Are you sure?
We need to make the API just work. Adding an extra option that deviates from the Fetch spec might not be a good idea unless there's a problem we need to avoid.
Good point. I think we'll either want to make Node.js Blobs pass the tests, or adjust ours.
Oh sweet, native blob support! Well, one thing that was brought up was:

```js
// eventually
const blob1 = await fs.promises.readAsBlob('path/to/some/file1')
// blob1 identifies a persistent memory location where blob data is stored.
```

We have that support already; we made it like so, in `fetch-blob/from.js`. Node's Blob seems to live at `buffer`. I guess Node's Blob doesn't yet support `stream()`.

We have now based ours on blob parts, instead of a single buffer container holding all the data like it was before we had blob-from-path. Because of that, our fetch-blob can support other blob-like parts that aren't the same `instanceof` itself, as long as they behave like a Blob and expose a way to read them. This is thanks to it now operating on parts rather than a single buffer container like Node's does. So it would be possible to wrap them like this, if we wanted to:

```js
import FetchBlob from 'fetch-blob'
import blobFromPath from 'fetch-blob/from.js'
import { Blob } from 'buffer'

var native = new Blob()
var fetchBlob = new FetchBlob([native, blobFromPath('./package.json')])
```

If you do want to upload something big, then you want to read the blob as a stream, hence:

```js
fetch(url, {
  method: 'POST',
  body: blob_2gb
})
```

There are some pros and cons to Node's and fetch's Blobs.
So you have to weigh whether you want support for large data. IMO, large data has the better use case for now. We have talked about it before: Blob & File have no real use case in Node.js land if they can only be constructed from other in-memory variables; then you might as well stick with the buffer or string you already have declared as a variable and do things with that. The really good use case starts when you must handle large data and can do so with Blobs backed by the filesystem; the second good use case is creating workers from a Blob. A Blob was first invented to hold data that doesn't necessarily have to live in memory, so that you could read data from it on demand.
I think we can start off with the easiest thing first: support accepting Node's native Blob as input.
A PR/issue for supporting native Blobs, with any kind of stream or async iterator as input, has been made. Still, though, `Buffer.Blob.prototype.stream` is missing; whatever kind of stream they end up implementing, we will support it in the future. EDIT: made it support non-streaming Blobs as well, by slicing/reading them.