Cache should include the promise of matching active requests #1078
I have some experience in this, as I have worked on this library, which implements a similar feature. For an in-memory cache it is straightforward. For a distributed cache (Redis) it is definitely trickier: you need distributed locking (in async-deco I used redlock), and that could add an overhead per request that I haven't had the chance to measure yet.
I'll close this issue since it will be possible to do this as a part of the
Just adding some extra thoughts. The reason I mentioned the use of dedupe (async-deco) is that in reliable-get (the HTTP GET library I mentioned above) we knew in advance whether we would cache a resource. In that case you can dedupe the entire request, which is a very effective optimisation. In an RFC-compliant cache such as Got's, you only know whether you should cache after getting a response, so you can't apply that optimisation. If you use an in-memory cache you have no latency, so the optimisation is not effective. If you use an external cache (Redis, for example) you probably have multiple Node processes, so the optimisation is only effective in the lucky event that requests for the same resource land on the same process. So in my opinion it is not really worth it.
Got will follow the RFC, but people will be able to change the behavior if they'd like to.
This is not C++; Node.js is single-threaded. Node.js is blazing fast; it's the I/O that's slow.
Thanks @sithmel and @szmarczak. Reading up on the topic and looking at your examples, I have learnt a new term: memoize. It seems simple enough to decorate the got function for my needs, where I don't have too many different requests. Of course, an improved approach would likely use a keyv storage adaptor, but I really didn't look into those too much. Since the

Thanks again
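For anyone following along, here is a minimal sketch of the promise-memoization idea discussed above (a hypothetical helper, not the API of any particular library): the trick is to cache the promise itself, so concurrent calls with the same key share one underlying invocation.

```javascript
// Hypothetical async memoizer: caches the promise, not just the value,
// so parallel callers with the same arguments reuse the in-flight request.
function memoize(fn, keyFn = (...args) => JSON.stringify(args)) {
  const cache = new Map();
  return (...args) => {
    const key = keyFn(...args);
    if (!cache.has(key)) {
      // Store the promise immediately so concurrent callers get the same one.
      const promise = Promise.resolve(fn(...args)).catch((err) => {
        cache.delete(key); // in this sketch, rejections are not cached
        throw err;
      });
      cache.set(key, promise);
    }
    return cache.get(key);
  };
}

// Usage: wrap any promise-returning function, e.g. a got call.
let calls = 0;
const fetchOnce = memoize(async (url) => {
  calls += 1;
  return `body of ${url}`;
});

Promise.all([fetchOnce('https://example.com'), fetchOnce('https://example.com')])
  .then(([a, b]) => {
    console.log(calls);   // 1 — the second call reused the in-flight promise
    console.log(a === b); // true
  });
```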
If you could share a working example of that memoization, that would be great :)
Here you go @rfgamaral https://github.com/5app/memoize
Thank you :)
@MrSwitch Just noticed that you could have used |
@rfgamaral I didn't know about those beforehand. They all do very much the same thing. I needed to set cache expiry on rejected and resolved Promises differently in my app, so I have defined an option useCache(cachedItem){} to customise that on implementation. Sorry, it's not really well documented. The others you referenced look far more established than mine.
@MrSwitch Thank you for that, and sorry to bombard you with so many questions, but "just one more" if you don't mind... Are you using both memoization and Got's

Or maybe you're just using memoization over Got's
@rfgamaral That's the reason why I started this issue. Yes, I'm using both, but it's not ideal because memoize'ing effectively duplicates the cached response. I would have liked to understand the
The benefit of memoizing is that it's a good all-round tool that can be applied higher up the call stack. I hope that helps.
#875 does sound like it would make it possible.
Personally, I'm trying to "replicate"
If I understand everything correctly, I believe I could rely on Got's built-in cache mechanism for resource caching and use

If I implement both mechanisms correctly, I can potentially get something like this:
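To make the two-layer idea concrete, here is a hedged sketch (all names and the cache store are made up; the inner `cachedFetch` stands in for got's own RFC-compliant cache so the example is self-contained): an outer in-flight map deduplicates concurrent calls, while an inner response cache serves repeat requests after the first one settles.

```javascript
// Layer 2: a stand-in response cache (got would manage this itself).
const httpCache = new Map();
// Layer 1: deduplication of concurrent in-flight requests.
const inflight = new Map();

async function cachedFetch(url, fetcher) {
  if (httpCache.has(url)) return httpCache.get(url); // cache hit
  const body = await fetcher(url);
  httpCache.set(url, body);
  return body;
}

function get(url, fetcher) {
  if (!inflight.has(url)) {
    // Store the pending promise; remove it once the request settles.
    const p = cachedFetch(url, fetcher).finally(() => inflight.delete(url));
    inflight.set(url, p);
  }
  return inflight.get(url);
}
```

With this, two simultaneous calls trigger one fetch, and a later call is served from the response cache without touching the in-flight map.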
What do you think?
That's pretty much exactly what I'm doing for the happy path. I added some extra stuff to handle exceptions: https://github.com/5app/memoize/blob/master/README.md#options-memoizehandler-options
What exactly do you mean by "handle exceptions"? What kind of exceptions?
HTTP status codes: we wanted to cache 4xx and 5xx responses too, but for a few seconds rather than minutes.
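The per-outcome expiry described here can be sketched roughly like this (hypothetical names and TTL values throughout, not the actual 5app/memoize implementation): cache the promise optimistically with the success TTL, then shorten the entry's lifetime if it rejects.

```javascript
// Hypothetical memoizer with different TTLs for resolved and rejected
// promises, mirroring "cache 4xx/5xx for seconds, successes for minutes".
function memoizeWithTtl(fn, { okTtlMs = 60000, errTtlMs = 2000 } = {}) {
  const cache = new Map(); // key -> { promise, expires }
  return (key) => {
    const hit = cache.get(key);
    if (hit && hit.expires > Date.now()) return hit.promise;
    const promise = Promise.resolve(fn(key));
    // Optimistically store the entry with the success TTL...
    cache.set(key, { promise, expires: Date.now() + okTtlMs });
    // ...and shorten its lifetime if the promise rejects.
    promise.catch(() => {
      cache.set(key, { promise, expires: Date.now() + errTtlMs });
    });
    return promise;
  };
}
```

The cached rejection is re-thrown to every caller until the shorter TTL expires, at which point the request is retried.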
Alright, thanks for the clarification :) |
What problem are you trying to solve?
Performance improvement: multiple simultaneous requests with caching enabled should await the first request to resolve and then use its cached result.
Describe the feature
This is a contrived example, but let's assume got is called simultaneously twice with the same link, and a third time after the first two have resolved.
As the outcome shows, A and B are both not served from the cache, while C uses the cache. What I'd like to see, ultimately, is a single request whose promise is stored in the cache map and then awaited to serve the cached response.
To get around this, one might have to maintain their own map to await on before making subsequent requests...
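The workaround hinted at above can be sketched as follows (`doRequest` is a hypothetical stand-in for a real got call, so the example is self-contained): a map of in-flight promises, cleared as soon as each request settles.

```javascript
// Track in-flight requests so concurrent callers share one promise.
const inflight = new Map();

function dedupedGet(url, doRequest) {
  if (inflight.has(url)) return inflight.get(url);
  // Remove the entry once the request settles, success or failure.
  const promise = doRequest(url).finally(() => inflight.delete(url));
  inflight.set(url, promise);
  return promise;
}
```

Only concurrent calls are deduplicated; once a request has settled, later calls fall through to whatever caching layer sits underneath (e.g. got's RFC-compliant cache).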