Comparing changes

isaacs/node-lru-cache, base: v7.12.1, compare: v7.13.0
  • 3 commits
  • 7 files changed
  • 1 contributor

Commits on Jul 12, 2022

  1. 13bff2a (created on GitHub.com and signed with GitHub's verified signature; the key has expired)
  2. changelog (isaacs committed Jul 12, 2022; verified) 311d6c2
  3. 7.13.0 (isaacs committed Jul 12, 2022) c028709
Showing 7 changed files with 73 additions and 10 deletions.
  1. +18 −0 CHANGELOG.md
  2. +8 −1 README.md
  3. +3 −2 index.d.ts
  4. +13 −4 index.js
  5. +2 −2 package-lock.json
  6. +1 −1 package.json
  7. +28 −0 test/fetch.ts
18 changes: 18 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,23 @@
# cringe lorg

## 7.13.0

* Add `forceRefresh` option to trigger a call to the
`fetchMethod` even if the item is found in cache, and not
older than its `ttl`.

## 7.12.0

* Add `fetchContext` option to provide additional information to
the `fetchMethod`
* 7.12.1: Fix bug where adding an item with size greater than
`maxSize` would cause bizarre behavior.

## 7.11.0

* Add 'noDeleteOnStaleGet' option, to suppress behavior where a
`get()` of a stale item would remove it from the cache.

## 7.10.0

* Add `noDeleteOnFetchRejection` option, to suppress behavior
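The changelog entries above can be illustrated with a small standalone sketch. This is not lru-cache itself: `MiniFetchCache` is a hypothetical Map-backed stand-in that mimics the documented `ttl`, `allowStale`, and new `forceRefresh` semantics of `fetch()`.

```javascript
// Illustrative stand-in, NOT lru-cache: a Map-backed sketch of the
// documented fetch() semantics (ttl, allowStale, forceRefresh).
class MiniFetchCache {
  constructor ({ ttl, fetchMethod }) {
    this.ttl = ttl
    this.fetchMethod = fetchMethod
    this.data = new Map() // key -> { value, start }
  }

  isStale (entry) {
    return Date.now() - entry.start > this.ttl
  }

  async fetch (key, { forceRefresh = false, allowStale = false } = {}) {
    const entry = this.data.get(key)
    // fresh hit and no forced refresh: serve the cached value
    if (entry && !forceRefresh && !this.isStale(entry)) {
      return entry.value
    }
    // stale, missing, or refresh forced: call fetchMethod
    const p = this.fetchMethod(key).then(value => {
      this.data.set(key, { value, start: Date.now() })
      return value
    })
    // with allowStale, hand back the old value while the refresh runs
    if (entry && allowStale) {
      return entry.value
    }
    return p
  }
}

const demo = async () => {
  const cache = new MiniFetchCache({ ttl: 100, fetchMethod: async k => k * 2 })
  cache.data.set(1, { value: 999, start: Date.now() })
  console.log(await cache.fetch(1))                         // 999: fresh, cached
  console.log(await cache.fetch(1, { forceRefresh: true })) // 2: refreshed
}
demo()
```

Note how `forceRefresh` changes nothing when the entry is absent or stale; it only matters for a fresh entry, where it forces a `fetchMethod` call that would otherwise be skipped.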
9 changes: 8 additions & 1 deletion README.md
@@ -462,7 +462,7 @@ as in `cache.set(key, undefined)`. Use `cache.has()` to
determine whether a key is present in the cache at all.

### `async fetch(key, { updateAgeOnGet, allowStale, size,
-sizeCalculation, ttl, noDisposeOnSet } = {}) => Promise`
+sizeCalculation, ttl, noDisposeOnSet, forceRefresh } = {}) => Promise`

If the value is in the cache and not stale, then the returned
Promise resolves to the value.
@@ -475,6 +475,13 @@ If called with `allowStale`, and an asynchronous fetch is
currently in progress to reload a stale value, then the former
stale value will be returned.

If called with `forceRefresh`, then the cached item will be
re-fetched, even if it is not stale. However, if `allowStale` is
set, then the old value will still be returned. This is useful
in cases where you want to force a reload of a cached value. If
a background fetch is already in progress, then `forceRefresh`
has no effect.

Multiple fetches for the same `key` will only call `fetchMethod`
a single time, and all will be resolved when the value is
resolved, even if different options are used.
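The coalescing rule in the last paragraph (one `fetchMethod` call shared by all concurrent fetches of a key) can be sketched in isolation. `CoalescingFetcher` is a hypothetical name for illustration, not part of lru-cache's API:

```javascript
// Sketch of fetch coalescing: concurrent fetches for one key share a
// single in-flight promise, so fetchMethod runs once per refresh.
class CoalescingFetcher {
  constructor (fetchMethod) {
    this.fetchMethod = fetchMethod
    this.inflight = new Map() // key -> Promise for the ongoing fetch
  }

  fetch (key) {
    let p = this.inflight.get(key)
    if (!p) {
      // first caller starts the fetch; clean up when it settles
      p = this.fetchMethod(key).finally(() => this.inflight.delete(key))
      this.inflight.set(key, p)
    }
    // later callers get the same promise
    return p
  }
}

const demo = async () => {
  let calls = 0
  const f = new CoalescingFetcher(async k => {
    calls++
    return k.toUpperCase()
  })
  const results = await Promise.all([f.fetch('x'), f.fetch('x'), f.fetch('x')])
  console.log(results, calls) // [ 'X', 'X', 'X' ] 1
}
demo()
```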
5 changes: 3 additions & 2 deletions index.d.ts
@@ -569,10 +569,11 @@ declare namespace LRUCache {
/**
* options which override the options set in the LRUCache constructor
* when making `cache.fetch()` calls.
-* This is the union of GetOptions and SetOptions, plus the
-* `noDeleteOnFetchRejection` and `fetchContext` fields.
+* This is the union of GetOptions and SetOptions, plus
+* `noDeleteOnFetchRejection`, `forceRefresh`, and `fetchContext`
*/
interface FetchOptions<K, V> extends FetcherFetchOptions<K, V> {
+forceRefresh?: boolean
fetchContext?: any
}

17 changes: 13 additions & 4 deletions index.js
@@ -525,7 +525,9 @@ class LRUCache {
for (const i of this.indexes({ allowStale: true })) {
const key = this.keyList[i]
const v = this.valList[i]
-const value = this.isBackgroundFetch(v) ? v.__staleWhileFetching : v
+const value = this.isBackgroundFetch(v)
+  ? v.__staleWhileFetching
+  : v
const entry = { value }
if (this.ttls) {
entry.ttl = this.ttls[i]
@@ -769,10 +771,15 @@ class LRUCache {
// fetch exclusive options
noDeleteOnFetchRejection = this.noDeleteOnFetchRejection,
fetchContext = this.fetchContext,
+forceRefresh = false,
} = {}
) {
if (!this.fetchMethod) {
-return this.get(k, { allowStale, updateAgeOnGet, noDeleteOnStaleGet })
+return this.get(k, {
+  allowStale,
+  updateAgeOnGet,
+  noDeleteOnStaleGet,
+})
}

const options = {
@@ -800,15 +807,17 @@ class LRUCache {
: (v.__returned = v)
}

-if (!this.isStale(index)) {
+// if we force a refresh, that means do NOT serve the cached value,
+// unless we are already in the process of refreshing the cache.
+if (!forceRefresh && !this.isStale(index)) {
this.moveToTail(index)
if (updateAgeOnGet) {
this.updateItemAge(index)
}
return v
}

-// ok, it is stale, and not already fetching
+// ok, it is stale or a forced refresh, and not already fetching.
+// refresh the cache.
const p = this.backgroundFetch(k, index, options, fetchContext)
return allowStale && p.__staleWhileFetching !== undefined
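The branch logic changed in this hunk can be condensed into a small decision helper. `plan` is a hypothetical function for illustration only; it assumes the key is already in the cache and no fetch is currently in flight:

```javascript
// Condensed view of the fetch() branch above: what happens for a key
// already present in the cache, with no background fetch in flight.
function plan ({
  forceRefresh = false,
  isStale = false,
  allowStale = false,
  hasValue = true,
}) {
  // fresh and not forced: short-circuit to the cached value
  if (!forceRefresh && !isStale) {
    return 'serve cached value'
  }
  // stale, or refresh forced: a background fetch starts either way
  return allowStale && hasValue
    ? 'serve stale value while refreshing'
    : 'await the fresh value'
}

console.log(plan({}))                                       // serve cached value
console.log(plan({ forceRefresh: true, allowStale: true })) // serve stale value while refreshing
console.log(plan({ isStale: true }))                        // await the fresh value
```

This makes the documented caveat visible: with both `forceRefresh` and `allowStale` set, the caller still receives the old value, and only the background refresh sees the new one.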
4 changes: 2 additions & 2 deletions package-lock.json

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion package.json
@@ -1,7 +1,7 @@
{
"name": "lru-cache",
"description": "A cache object that deletes the least-recently-used items.",
"version": "7.12.1",
"version": "7.13.0",
"author": "Isaac Z. Schlueter <i@izs.me>",
"keywords": [
"mru",
28 changes: 28 additions & 0 deletions test/fetch.ts
@@ -495,3 +495,31 @@ t.test('fetchContext', async t => {
// if still in cache, doesn't call fetchMethod again
t.strictSame(await cache.fetch('x', { fetchContext: 'ignored' }), ['x', 'default context'])
})

t.test('forceRefresh', async t => {
const cache = new LRU<number, number>({
max: 10,
allowStale: true,
ttl: 100,
fetchMethod: async (k, _, { options }) => {
//@ts-expect-error
t.equal(options.forceRefresh, undefined, 'do not expose forceRefresh')
return k
}
})

// put in some values that don't match what fetchMethod returns
cache.set(1, 100)
cache.set(2, 200)
t.equal(await cache.fetch(1), 100)
// still there, because we're allowing stale, and it's not stale
t.equal(await cache.fetch(1, { forceRefresh: true }), 100)
t.equal(await cache.fetch(1, { forceRefresh: true }), 100)
// if we don't allow stale though, then that means that we wait
// for the background fetch to complete, so we get the updated value.
t.equal(await cache.fetch(1, { allowStale: false }), 1)

cache.set(1, 100)
t.equal(await cache.fetch(1, { allowStale: false }), 100)
t.equal(await cache.fetch(1, { forceRefresh: true, allowStale: false }), 1)
})