Memory usage grows indefinitely when setting same key within eviction interval #311

Open
inliquid opened this issue Mar 24, 2022 · 3 comments

@inliquid

This is effectively a follow-up to #109, which was closed for some reason.

We experienced a bug in a production service when hard limits were removed and the OOM killer terminated the app. The app has to hold some data in cache, which is repeatedly re-read from the DB and re-set at a fixed interval of ~30 min. The keys used to store the data in memory are always the same. What we observed is that after a few hours the memory consumption of our service had grown beyond any limit.

I slightly modified the code snippet from #109 to experiment with and reproduce Bigcache's behavior:

package main

import (
	"strconv"
	"time"

	"github.com/allegro/bigcache/v3"
)

func main() {
	evictionInterval := time.Minute

	cacheCfg := bigcache.DefaultConfig(evictionInterval)
	// cacheCfg.CleanWindow = time.Second
	cacheCfg.Verbose = false
	// cacheCfg.HardMaxCacheSize = 100

	cache, _ := bigcache.NewBigCache(cacheCfg)
	data := []byte("TESTDATATESTDATATESTDATATESTDATATESTDATATESTDATATESTDATA")

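	// Re-set the same 10k keys every 100 ms; observed RSS grows with evictionInterval.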
	for {
		for i := 0; i < 10000; i++ {
			if err := cache.Set(strconv.Itoa(i), data); err != nil {
				panic(err)
			}
		}
		time.Sleep(100 * time.Millisecond)
	}
}

The memory usage growth depends only on evictionInterval.
So, for instance, on my Linux machine, when evictionInterval is set to

  • 1 minute, RSS is ~877M
  • 2 minutes, RSS is ~1680M
  • etc.

So if evictionInterval is big enough and we keep setting data with the same key, we end up with the OOM killer.

It doesn't matter whether GODEBUG=madvdontneed=1 is set or not. I ran with this param, but it doesn't seem to affect anything.

Please note the commented-out // cacheCfg.CleanWindow = time.Second line: I tried setting this param to different values starting from 1 second, and it didn't help.

As a result, the only way to limit memory consumption and prevent OOM is to set HardMaxCacheSize.
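For completeness, here is a minimal sketch of that workaround; the 256 MB limit and the 30 min interval are example values picked for illustration, not recommendations:

package main

import (
	"time"

	"github.com/allegro/bigcache/v3"
)

func main() {
	// Sketch only: the 256 MB limit is an example value, not a recommendation.
	cacheCfg := bigcache.DefaultConfig(30 * time.Minute)
	cacheCfg.HardMaxCacheSize = 256 // limit for the underlying queue, in MB; 0 (the default) means unlimited

	cache, err := bigcache.NewBigCache(cacheCfg)
	if err != nil {
		panic(err)
	}

	// With the hard limit in place the cache cannot grow past roughly 256 MB,
	// even when the same keys are re-set long before eviction runs.
	_ = cache.Set("some-key", []byte("some value"))
}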

@inliquid inliquid added the bug label Mar 24, 2022
@Hojun-Cho
Contributor

Hojun-Cho commented Sep 7, 2022

Hello, I think the way the code works is intentional.

// Config for BigCache
type Config struct {
	...
	// Default value is 0 which means unlimited size. When the limit is higher than 0 and reached then
	// the oldest entries are overridden for the new ones.
	HardMaxCacheSize int
	...
}

DefaultConfig sets HardMaxCacheSize to 0

func DefaultConfig(eviction time.Duration) Config {
	return Config{
		...
		HardMaxCacheSize:   0,
		...
	}
}

@Kaushal28

@Hojun-Cho, how is that intentional? Why should setting the same key increase the memory usage? Shouldn't it overwrite the same key with the new value? If that's intentional, it's a bug. HardMaxCacheSize should not be used as a workaround for this.

@janisz
Collaborator

janisz commented Aug 2, 2023

Yes, it's intentional. The reason it's done like this is to keep it simple. The goal of bigcache was to reduce GC time for read-heavy data.
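To make the trade-off concrete, here is a rough sketch (a toy model, not bigcache's actual code) of an append-only design: overwriting a key appends a new copy and only moves an index pointer, so the old bytes stay in the buffer until an eviction pass reclaims them.

package main

import "fmt"

// appendOnlyCache is a toy model of an append-only byte queue: every Set
// appends, and the previous bytes for the same key stay in the buffer until
// an eviction pass drops them.
type appendOnlyCache struct {
	buf   []byte         // append-only storage, reclaimed only by eviction
	index map[string]int // key -> offset of the newest entry
}

func newAppendOnlyCache() *appendOnlyCache {
	return &appendOnlyCache{index: make(map[string]int)}
}

func (c *appendOnlyCache) Set(key string, value []byte) {
	// Overwriting a key does not reuse the old slot; it appends a fresh copy
	// and only moves the index pointer, so buf keeps growing.
	c.index[key] = len(c.buf)
	c.buf = append(c.buf, value...)
}

func (c *appendOnlyCache) evict() {
	// Eviction (simplified here to "drop everything") is the only point where
	// the stale copies are released.
	c.buf = nil
	c.index = make(map[string]int)
}

func main() {
	c := newAppendOnlyCache()
	value := make([]byte, 64)

	// Setting the same key repeatedly grows the buffer even though only one
	// logical entry exists, mirroring the behavior reported above.
	for i := 0; i < 10000; i++ {
		c.Set("same-key", value)
	}
	fmt.Printf("entries: %d, bytes held: %d\n", len(c.index), len(c.buf))

	c.evict()
	fmt.Printf("after eviction, bytes held: %d\n", len(c.buf))
}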
