
[FEATURE] Support of Caching PSR-16 #106

Open
legoheld opened this issue Nov 21, 2023 · 4 comments
Labels
enhancement (New feature or request), Needs Triage (This issue needs to be investigated by a maintainer)

Comments

@legoheld

Requirements

It would be nice if the SDK supported caching of flag requests.
For example, accept a psr/simple-cache (PSR-16) compatible cache storage and store evaluation results in it for better performance.
This way, not every provider has to implement its own caching.

I looked into hooks, but as far as I understand it's not possible to implement a caching hook:
the return type is always just an EvaluationContext, so a hook cannot prevent the request from happening.
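
For illustration, this is roughly what the requested behavior looks like when hand-rolled at the call site with a PSR-16 cache. The class and method names (OpenFeatureAPI, getClient, getBooleanValue) follow the OpenFeature spec and PSR-16; exact PHP SDK signatures may differ, and the cache key scheme is made up:

```php
<?php

use OpenFeature\OpenFeatureAPI;
use Psr\SimpleCache\CacheInterface;

// Hand-rolled caching around a single boolean flag evaluation.
function getCachedBooleanFlag(CacheInterface $cache, string $flagKey, bool $default): bool
{
    $cacheKey = 'openfeature.' . $flagKey; // hypothetical key scheme

    $cached = $cache->get($cacheKey);
    if ($cached !== null) {
        return $cached; // cache hit: the provider is never contacted
    }

    $client = OpenFeatureAPI::getInstance()->getClient();
    $value = $client->getBooleanValue($flagKey, $default);

    $cache->set($cacheKey, $value, 60); // arbitrary 60-second TTL
    return $value;
}
```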

@legoheld added the enhancement and Needs Triage labels on Nov 21, 2023
@tcarrio
Member

tcarrio commented Nov 26, 2023

This is likely a larger discussion for the OpenFeature specification. The SDK adheres to the spec and there is no formal caching definition.

The best way to handle this, in my opinion, is to decorate an existing provider with a cache proxy provider: uncached interactions would pass through to the underlying provider, while cache hits would return early (see the sketch below).

Many flagging systems have some form of telemetry or analytics that provide business value around flag evaluations. Caching directly bypasses the provider's logic and can therefore have unintended impacts on those analytics. For this reason I don't believe any OpenFeature SDK provides this officially as part of the client.
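
A minimal sketch of that decorator idea follows. The namespaces and method names (OpenFeature\interfaces\provider\Provider, ResolutionDetails, resolveBooleanValue) follow the spec's provider resolution methods but may not match the PHP SDK exactly, and in practice the wrapper would need to implement the SDK's full Provider interface:

```php
<?php

use OpenFeature\interfaces\flags\EvaluationContext;
use OpenFeature\interfaces\provider\Provider;
use OpenFeature\interfaces\provider\ResolutionDetails;
use Psr\SimpleCache\CacheInterface;

// Cache-proxy provider: wraps any other provider and short-circuits with a
// PSR-16 cache. Caveat from this thread: cache hits never reach the wrapped
// provider, so impressions/analytics data is lost for those evaluations.
class CacheProvider
{
    public function __construct(
        private Provider $inner,
        private CacheInterface $cache,
        private int $ttl = 60, // arbitrary default TTL in seconds
    ) {
    }

    public function resolveBooleanValue(string $flagKey, bool $defaultValue, ?EvaluationContext $context = null): ResolutionDetails
    {
        // A real key scheme would also need to account for the evaluation context.
        $key = 'flag.bool.' . $flagKey;

        $hit = $this->cache->get($key);
        if ($hit instanceof ResolutionDetails) {
            return $hit; // early return: the inner provider is never called
        }

        $details = $this->inner->resolveBooleanValue($flagKey, $defaultValue, $context);
        $this->cache->set($key, $details, $this->ttl);

        return $details;
    }

    // resolveStringValue / resolveIntegerValue / resolveFloatValue / resolveObjectValue
    // would follow the same pattern, delegating to the wrapped provider, as would
    // getMetadata() and getHooks().
}
```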

@beeme1mr
Member

We've talked about this in the past and decided that caching should be a provider concern. It may be possible to create a caching provider that essentially wraps a provider to enable a consistent caching experience.

@legoheld
Author

Funny, I just did some refactoring in the FliptProvider where I made a wrapper CacheProvider that could be used with any other provider.
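
Wiring such a wrapper up could look something like this (the FliptProvider arguments, the $psr16Cache instance, and the setProvider call are placeholders/assumptions, not the actual FliptProvider API):

```php
<?php

// Hypothetical wiring: wrap an existing provider with the CacheProvider
// sketched above, then register it as the global provider.
$provider = new CacheProvider(new FliptProvider(/* ... */), $psr16Cache, 30);

OpenFeatureAPI::getInstance()->setProvider($provider);
```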

@tcarrio
Member

tcarrio commented Dec 4, 2023

That provider could definitely be reused, but provider developers and SDK consumers need to know the trade-offs. If you are caching the response such that the underlying provider is not called at all, you are missing impressions data, which in many cases is used by businesses to understand the performance of the features behind the flags.

If you want to contribute it to the contrib repo or publish it as a general package, that would work. Please do include the caveats, though. Many underlying providers can support caching out of the box while still providing meaningful analytics around usage, but a pattern like the CacheProvider will act as if no flag evaluation occurred.

That's primarily the reason this is not supported in the specification: most vendors did not want a generalized caching mechanism like this. But consumers can decide on that trade-off themselves 🙂
