
concurrent-memory-cache


Update: IMemoryCache has since been updated to support concurrent async operations, so this library is no longer necessary.

A simple wrapper over MemoryCache to prevent concurrent cache misses.

Purpose

The Microsoft MemoryCache API provides local in-memory caching; however, it did not guarantee atomicity when multiple threads called GetOrCreate() with the same key (at the time this was written), which could lead to multiple cache misses and duplicate data fetches for the same entry.
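The race can be reproduced with a minimal sketch against IMemoryCache directly (the key, value, and delay below are illustrative placeholders):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

class RaceDemo
{
    static async Task Main()
    {
        var cache = new MemoryCache(new MemoryCacheOptions());
        int factoryRuns = 0;

        // Two callers miss the cache at the same time: each runs the
        // (possibly expensive) factory, so the data fetch is duplicated.
        Task<string> Get() => cache.GetOrCreateAsync("key", async entry =>
        {
            Interlocked.Increment(ref factoryRuns);
            await Task.Delay(100); // simulate a slow data fetch
            return "value";
        });

        await Task.WhenAll(Get(), Get());
        Console.WriteLine(factoryRuns); // frequently 2 rather than 1
    }
}
```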

How it works

It is implemented using the same idea as ConcurrentDictionary: the hash map is divided into a fixed number of segments (16 by default), each protected by its own segment lock. Write operations that fall into the same segment are serialized, while writes to different segments proceed concurrently. All read operations are lock-free.
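The segment-locking idea can be sketched as follows. This is a hypothetical illustration, not the library's actual implementation; the class and member names are invented:

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

// Sketch: one lock per segment, chosen by key hash. Reads go straight to
// the underlying cache (lock-free); only the create-on-miss path locks.
public class SegmentLockedCache
{
    private const int SegmentCount = 16;   // fixed number of segments
    private readonly object[] _locks;
    private readonly IMemoryCache _cache;

    public SegmentLockedCache(IMemoryCache cache)
    {
        _cache = cache;
        _locks = new object[SegmentCount];
        for (int i = 0; i < SegmentCount; i++) _locks[i] = new object();
    }

    public TItem GetOrCreate<TItem>(object key, Func<ICacheEntry, TItem> factory)
    {
        // Lock-free fast path: no lock is taken on a cache hit.
        if (_cache.TryGetValue(key, out TItem value)) return value;

        // Map the key to one of the segment locks.
        int segment = (key.GetHashCode() & int.MaxValue) % SegmentCount;
        lock (_locks[segment])
        {
            // Re-check under the lock: another thread holding the same
            // segment lock may have created the entry already.
            if (_cache.TryGetValue(key, out value)) return value;
            return _cache.GetOrCreate(key, factory);
        }
    }
}
```

Keys that hash to different segments never contend, so throughput degrades only when concurrent misses collide on the same segment.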

How to use

  1. Install package:

    dotnet add package ConcurrentCaching
    
  2. Inject caching service:

    services.AddMemoryCache(options =>
    {
        //... Setup e.g. max cache limit for LRU
    });
    services.AddSingleton<IConcurrentMemoryCache, ConcurrentMemoryCache>();
    
  3. Fetch data from cache:

    var item = cache.GetOrCreate<TItem>("<key>", entry =>
    {
        // Fetch data from elsewhere and return it ...
    });
    
    var item = await cache.GetOrCreateAsync<TItem>("<key>", async entry =>
    {
        // Fetch data from elsewhere and return it ...
    });
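Putting the steps together, a consuming service might look like the sketch below. ProductService, Product, and FetchProductAsync are hypothetical names, and the entry parameter is assumed to expose ICacheEntry-style expiration options:

```csharp
using System;
using System.Threading.Tasks;

public record Product(string Id, string Name);

public class ProductService
{
    private readonly IConcurrentMemoryCache _cache;

    // IConcurrentMemoryCache is resolved via the registration in step 2.
    public ProductService(IConcurrentMemoryCache cache) => _cache = cache;

    public Task<Product> GetProductAsync(string id) =>
        _cache.GetOrCreateAsync<Product>($"product:{id}", async entry =>
        {
            entry.SlidingExpiration = TimeSpan.FromMinutes(5); // optional
            return await FetchProductAsync(id); // hypothetical data source
        });

    private Task<Product> FetchProductAsync(string id) =>
        Task.FromResult(new Product(id, "Sample")); // stand-in for a real fetch
}
```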
    
