The issue of web seeds having poor performance has been raised multiple times in many ways. The root cause is that webtorrent treats web seeds as peers, not as web seeds. The general idea is to keep the `addWebSeed` method but allow it to accept either a string or an abstract web seed: a string would create the default webtorrent web seed, otherwise it would use the developer's custom abstract web seed.
How the abstract web seed would work:

- `origin` property: the URL origin, so webtorrent can recognise which peer the data is coming from, and so developers can, for example, show the URL as a peer in their display/data/information.
- A static method (`spawn`, `create`, or `init`), most likely async, which takes the webtorrent files (maybe the torrent) as a parameter. This would allow the web seed to calculate:
  - a bitfield, specifying which pieces/files the web seed has (enabling partial web seeds)
  - maybe some URL transform methods?
- A `get` method, which takes a webtorrent file object and an HTTP range `[from, to]`, and returns an async iterable yielding the data for that range.

The `get` method would allow a developer who wants a custom web seed to implement their own data-fetching functionality, be that IPFS, RealDebrid, etc. Since it returns an async iterable, web seeds would be streamed, which is also nice for services like the Google API that have a rate limit of 1 request per second.
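The interface described above could look roughly like the following sketch. All the names here (`AbstractWebSeed`, `MemoryWebSeed`, the `create`/`get` signatures) are illustrative assumptions, not an existing webtorrent API:

```javascript
// Hypothetical sketch of the proposed abstract web seed interface.
class AbstractWebSeed {
  // URL origin, so webtorrent (and UIs) can identify where data comes from.
  get origin () {
    throw new Error('not implemented')
  }

  // Async factory: receives the torrent's files (maybe the torrent itself)
  // and would compute a bitfield of which pieces this seed has.
  static async create (files) {
    throw new Error('not implemented')
  }

  // Returns an async iterable yielding chunks of `file` within the
  // inclusive byte range [from, to]. Streaming via an async iterable lets
  // implementations throttle themselves (e.g. rate-limited APIs).
  async * get (file, [from, to]) {
    throw new Error('not implemented')
  }
}

// A trivial in-memory implementation, just to show the shape:
class MemoryWebSeed extends AbstractWebSeed {
  constructor (origin, buffer) {
    super()
    this._origin = origin
    this._buffer = buffer
  }

  get origin () { return this._origin }

  static async create (files, origin, buffer) {
    return new MemoryWebSeed(origin, buffer)
  }

  async * get (file, [from, to]) {
    // Yield the requested range in small chunks, as a real HTTP-backed
    // implementation would.
    const chunkSize = 4
    for (let i = from; i <= to; i += chunkSize) {
      yield this._buffer.subarray(i, Math.min(i + chunkSize, to + 1))
    }
  }
}
```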
With this a developer could do stuff like:
```js
import WebSeed from 'webtorrent/lib/webseed.js'

class MyWebSeed extends WebSeed {
  constructor (...opts) {
    super(...opts)
  }

  urlTransform (file, path) {
    // do some transform, then return the resulting URL string
    return transformedUrl
  }
}
```

and only override the methods they want to customize for their web seeds.
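Internally, `addWebSeed` could dispatch on its argument type; a minimal sketch of that dispatch, assuming a hypothetical `DefaultWebSeed` standing in for webtorrent's built-in HTTP web seed (neither name is real webtorrent code):

```javascript
// Stand-in for webtorrent's built-in HTTP web seed.
class DefaultWebSeed {
  constructor (url) {
    this.origin = new URL(url).origin
  }
}

// Hypothetical dispatch: a string means the default webtorrent web seed,
// anything else is assumed to be a developer-provided abstract web seed.
function resolveWebSeed (urlOrSeed) {
  if (typeof urlOrSeed === 'string') {
    return new DefaultWebSeed(urlOrSeed)
  }
  return urlOrSeed
}
```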
A notable issue is that these would likely need to bypass the speed limiter; otherwise the data would simply accumulate in memory, since browser HTTP streams don't really care whether the consumer is requesting more data or not: they just pull everything at max speed.
Another issue is calculating the ranges: they need to fall within the file selections and within the web seed's bitfield, respect critical requests, etc. This means the algorithm for calculating the most efficient range would overlap with peer interest, wasting some bandwidth, but that's necessary for speed, since a single peer with a poor connection could otherwise kill the speed of streamed playback. For streaming, it should also prioritize the start of a selection rather than the middle or end.
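The range-picking constraints above could be sketched like this. This is not webtorrent's actual picker; `pickRange`, `have`, and `reserved` are illustrative names, with pieces addressed by index:

```javascript
// Hypothetical sketch of picking the next contiguous piece range for a
// web seed. `selection` is [start, end] piece indices, `have(i)` is the
// web seed's bitfield, `reserved(i)` marks pieces already requested
// elsewhere, and `maxLen` caps the range length.
function pickRange (selection, have, reserved, maxLen) {
  // Walk from the start of the selection: for streamed playback the
  // beginning of a selection matters more than its middle or end.
  for (let i = selection[0]; i <= selection[1]; i++) {
    if (!have(i) || reserved(i)) continue
    // Grow a contiguous run of pieces the seed actually has and
    // nothing else has claimed yet.
    let end = i
    while (
      end + 1 <= selection[1] &&
      end - i + 1 < maxLen &&
      have(end + 1) &&
      !reserved(end + 1)
    ) end++
    return [i, end]
  }
  return null // nothing useful this seed can serve
}
```

A real implementation would additionally weigh critical requests and overlap with peer interest, which this sketch ignores.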
Once this is done, we should revert this change to torrent-piece, which amounts to a hack: webtorrent/torrent-piece#1
> Notable issues are that these would likely need to bypass speed limiter, otherwise it would simply keep the data in memory as browser http streams don't really care if the request stream request is requesting more data or not, it just pulls everything at max speed
You can try it out yourself here: https://github.com/Banou26/webtorrent-issue-2467, or otherwise here's a video of that same repo in action on Chrome, showcasing the pulls, then stopping them for 5 seconds and resuming them.