
configure cache size #8

Open
leonardoanalista opened this issue Jul 3, 2016 · 13 comments

Comments

@leonardoanalista

Is it possible to configure the max cache size?
I took a look at the source and noticed that the simple cache (Object cache) applies no limits.
I'm not comfortable using it knowing that memory could grow without bound and eventually blow up the browser.

@caiogondim
Owner

It is possible.
We just have to be sure not to slow down the critical path (which I assume is the version without a cache limit).
A memoized function without a cache limit should not have to check the cache size.
Other than that, I see it as a nice addition.

I see the API as:

memoize(func, {
  cacheLimit: 256
})

Thoughts about it?

@leonardoanalista
Author

Sounds good to me. cacheLimit is the size in number of entries, right?

Additional suggestion:
If this second parameter becomes an options object, it could, as reselect does, also take a function to override the equality check. The default is obviously ===.

Leonardo

@caiogondim
Owner

About the default equality check: I see it as a responsibility of the cache itself. And you can already pass a custom cache as const memoized = memoize(fn, customCache).

There is no documentation for it yet; I will update the README.
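Since that README example isn't written yet, here is a rough sketch of what a custom cache could look like: a Map-backed object exposing the has/get/set (and optionally delete) interface the memoizer calls. The call shape follows the memoize(fn, customCache) snippet above; the exact option shape is an assumption until the README documents it.

```javascript
// Sketch of a custom cache: any object implementing has/get/set
// (and optionally delete) works. Backed by a Map here, so keys
// are compared with the SameValueZero semantics Map uses.
function mapCache() {
  const store = new Map()
  return {
    has(key) { return store.has(key) },
    get(key) { return store.get(key) },
    set(key, value) { store.set(key, value) },
    delete(key) { store.delete(key) }
  }
}

// Hypothetical usage, following the call shape mentioned above:
// const memoized = memoize(fn, mapCache())
```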

@leonardoanalista
Author

It would be great if you could add some examples of a custom cache.

@caiogondim
Owner

caiogondim commented Jul 7, 2016

Documentation on usage of custom cache and serializer 187d14c

@almirfilho

Isn't the cacheLimit option worth mentioning in the docs?

@caiogondim
Owner

@almirfilho It's not in the source code.
I'm not willing to implement a cacheLimit, since it would add another check to the hot path.
But an example with a custom cache is worth adding.

@caiogondim
Owner

Another idea is to check for opts.cacheLimit up front and provide a different strategy when it's present, so we don't slow down the hot path when it's not used.
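A minimal sketch of that idea (names and shapes are mine, not the library's): choose the cache strategy once, at memoization time, so the returned function never branches on cacheLimit.

```javascript
// Sketch only: the strategy is chosen up front, so the hot path of
// the returned function is identical whether or not a limit was
// requested. `memoizeSketch` and `limitedCache` are hypothetical names.
function memoizeSketch(fn, opts = {}) {
  const cache = opts.cacheLimit == null
    ? new Map()                        // unlimited: no size checks at all
    : limitedCache(opts.cacheLimit)    // sized: evicts oldest on overflow
  return function (arg) {
    if (!cache.has(arg)) cache.set(arg, fn(arg))
    return cache.get(arg)
  }
}

function limitedCache(limit) {
  const map = new Map()
  return {
    has: (key) => map.has(key),
    get: (key) => map.get(key),
    set(key, value) {
      // Map iterates in insertion order, so the first key is the oldest.
      if (map.size >= limit) map.delete(map.keys().next().value)
      map.set(key, value)
    }
  }
}
```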

@Offirmo

Offirmo commented Aug 22, 2018

A very nice feature would be to allow a cache size of 1 to replicate memoize-once (i.e. leveraging immutability without hogging memory)
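A size-1 cache along those lines is trivial to write against the custom-cache interface, with no change to the library. This is only a sketch of the idea (memoize-one itself compares the full argument list, which is skipped here):

```javascript
// One-slot cache: remembers only the most recent key/value pair,
// in the spirit of memoize-one. Sketch only.
function onceCache() {
  let lastKey
  let lastValue
  let filled = false
  return {
    has: (key) => filled && key === lastKey,
    get: (key) => (filled && key === lastKey ? lastValue : undefined),
    set: (key, value) => { lastKey = key; lastValue = value; filled = true }
  }
}
```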

@GlennMatthys

This shouldn't be included in this package, because its API already allows a custom cache handler. A quick implementation of a sized cache could be:

function sizedCache(size)
{
    // Ring buffer of inserted keys, in order. `idx` points at the slot
    // the next insertion will take, which is also the oldest key, so
    // eviction is FIFO.
    const track = new Array(size);
    const store = {};
    let idx = 0;

    return {
        has(key)
        {
            return (key in store);
        },
        get(key)
        {
            return store[key];
        },
        set(key, value)
        {
            store[key] = value;

            // Once the buffer has wrapped around, evict the oldest entry.
            if(track[idx])
                delete store[track[idx]];

            track[idx] = key;

            // Advance, wrapping back to the start of the buffer.
            ++idx;
            if(idx === size)
                idx = 0;
        }
    };
}

Test suite:


describe(
    'sizedCache',
    function()
    {
        it('Works with integers',
            function()
            {
                const cache = sizedCache(3);
                cache.set(1, 'foo');
                expect(cache.has(1)).toBe(true);
                expect(cache.get(1)).toEqual('foo');
                cache.set(2, 'bar');
                cache.set(3, 'baz');
                cache.set(4, 'baa');
                expect(cache.has(1)).toBe(false); expect(cache.get(1)).toEqual(undefined);
                expect(cache.has(2)).toBe(true); expect(cache.get(2)).toEqual('bar');
                expect(cache.has(3)).toBe(true); expect(cache.get(3)).toEqual('baz');
                expect(cache.has(4)).toBe(true); expect(cache.get(4)).toEqual('baa');
            }
        );

        it('Works with strings',
            function()
            {
                const cache = sizedCache(3);
                cache.set('a', 1);
                expect(cache.has('a')).toBe(true); expect(cache.get('a')).toEqual(1);
                cache.set('b', 2);
                cache.set('c', 3);
                cache.set('d', 4);
                cache.set('e', 5);
                expect(cache.has('a')).toBe(false);
                expect(cache.has('b')).toBe(false); expect(cache.get('b')).toEqual(undefined);
                expect(cache.has('c')).toBe(true); expect(cache.get('c')).toEqual(3);
                expect(cache.has('d')).toBe(true); expect(cache.get('d')).toEqual(4);
                expect(cache.has('e')).toBe(true); expect(cache.get('e')).toEqual(5);
            })
    }
);
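To tie it together, here is a stand-in memoizer showing how a cache like sizedCache plugs in. memoizeWith is a hypothetical helper for illustration, not this library's actual API (which takes the cache through its options).

```javascript
// Hypothetical helper: memoize `fn` through any cache object that
// implements has/get/set. Not fast-memoize's real API.
function memoizeWith(cache, fn) {
  return function (arg) {
    if (!cache.has(arg)) cache.set(arg, fn(arg))
    return cache.get(arg)
  }
}

// e.g. const fastFib = memoizeWith(sizedCache(100), slowFib)
```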

@ecerroni

ecerroni commented Feb 5, 2021

Here is a method to set the cache size limit by combining this package with two others, without needing any change to fast-memoize.

@peacechen

The main point is that the memoizer's cache must have a size limit to be usable in production; otherwise it can grow forever, which is obviously a problem.
Making every developer write their own size-limited cache is unrealistic and inconvenient. This library should offer a size-limited cache in addition to the default unlimited one, and let the developer choose which to use.

@aaronbeall

aaronbeall commented Mar 19, 2023

I agree with @peacechen. It doesn't seem reasonable to have an unlimited cache in the first place; this is a huge footgun. I can see having an optional cacheLimit, and if it's null (the default), selecting a cache strategy that doesn't have a limit.


8 participants