
Comparing changes

base repository: isaacs/node-lru-cache
base: v7.14.0
head repository: isaacs/node-lru-cache
compare: v7.14.1

  • 7 commits
  • 6 files changed
  • 5 contributors

Commits on Aug 22, 2022

  1. docs: Fix fetch function signature in README.md

    PR-URL: #251
    Credit: @sebastinez
    Close: #251
    Reviewed-by: @isaacs
    sebastinez authored and isaacs committed Aug 22, 2022

    be57c92
  2. docs: add del() deprecation to docs

    PR-URL: #249
    Credit: @julianlam
    Close: #249
    Reviewed-by: @isaacs
    julianlam authored and isaacs committed Aug 22, 2022

    9bb53af

Commits on Sep 19, 2022

  1. fix #250 remove outdated words

    isaacs committed Sep 19, 2022
    41c7237

Commits on Sep 20, 2022

  1. docs: 'del' does not exist on type (safety-case)

    PR-URL: #254
    Credit: @wangcch
    Close: #254
    Reviewed-by: @isaacs
    wangcch authored and isaacs committed Sep 20, 2022
    fd370b8

Commits on Nov 2, 2022

  1. ff254a7
  2. f351e68
  3. 7.14.1

    isaacs committed Nov 2, 2022
    a63ce28
Showing with 40 additions and 16 deletions.
  1. +1 −0 CHANGELOG.md
  2. +7 −10 README.md
  3. +13 −3 index.js
  4. +2 −2 package-lock.json
  5. +1 −1 package.json
  6. +16 −0 test/size-calculation.ts
1 change: 1 addition & 0 deletions CHANGELOG.md
```diff
@@ -105,6 +105,7 @@ well.
 * `maxAge` option -> `ttl`
 * `length` option -> `sizeCalculation`
 * `length` property -> `size`
+* `del()` method -> `delete()`
 * `prune()` method -> `purgeStale()`
 * `reset()` method -> `clear()`
 * The objects used by `cache.load()` and `cache.dump()` are incompatible
```
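
For anyone migrating, a minimal sketch of the renamed methods listed above, assuming the v7 CommonJS export (not part of this diff):

```js
const LRU = require('lru-cache')
const cache = new LRU({ max: 100 })

cache.set('a', 1)

// v7 names (the v6 names are deprecated):
cache.delete('a')   // formerly cache.del('a')
cache.purgeStale()  // formerly cache.prune()
cache.clear()       // formerly cache.reset()
```
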
17 changes: 7 additions & 10 deletions README.md
```diff
@@ -96,10 +96,10 @@ If you put more stuff in it, then items will fall out.
 
 ### `max`
 
-The maximum number (or size) of items that remain in the cache
-(assuming no TTL pruning or explicit deletions). Note that fewer
-items may be stored if size calculation is used, and `maxSize` is
-exceeded. This must be a positive finite intger.
+The maximum number of items that remain in the cache (assuming no
+TTL pruning or explicit deletions). Note that fewer items may be
+stored if size calculation is used, and `maxSize` is exceeded.
+This must be a positive finite intger.
 
 At least one of `max`, `maxSize`, or `TTL` is required. This
 must be a positive integer if set.
```
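
A minimal sketch of the two limits the revised wording distinguishes, assuming the documented v7 constructor options:

```js
const LRU = require('lru-cache')

// Bounded by item count only:
const byCount = new LRU({ max: 500 })

// Bounded by total calculated size as well; `maxSize` needs a
// sizeCalculation (or explicit sizes passed to set()):
const bySize = new LRU({
  max: 500,
  maxSize: 5000,
  sizeCalculation: (value, key) => value.length,
})
```
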
```diff
@@ -280,9 +280,7 @@ and MAY live in the cache, contributing to its LRU max, long
 after they have expired.
 
 Also, as this cache is optimized for LRU/MRU operations, some of
-the staleness/TTL checks will reduce performance, as they will
-incur overhead by deleting from Map objects rather than simply
-throwing old Map objects away.
+the staleness/TTL checks will reduce performance.
 
 This is not primarily a TTL cache, and does not make strong TTL
 guarantees. There is no pre-emptive pruning of expired items,
```
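
A rough sketch of the TTL behavior described here: expired entries are not pruned pre-emptively, so they may still occupy a slot until they are purged or evicted.

```js
const LRU = require('lru-cache')
const cache = new LRU({ max: 100, ttl: 50 })

cache.set('a', 1)

setTimeout(() => {
  console.log(cache.has('a'))  // false: the entry is stale
  console.log(cache.size)      // likely still 1: nothing is pruned pre-emptively
  cache.purgeStale()           // explicitly drop expired entries
  console.log(cache.size)      // 0
}, 100)
```
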
```diff
@@ -474,8 +472,7 @@ can be confusing when setting values specifically to `undefined`,
 as in `cache.set(key, undefined)`. Use `cache.has()` to
 determine whether a key is present in the cache at all.
 
-### `async fetch(key, { updateAgeOnGet, allowStale, size,
-sizeCalculation, ttl, noDisposeOnSet, forceRefresh } = {}) => Promise`
+### `async fetch(key, { updateAgeOnGet, allowStale, size, sizeCalculation, ttl, noDisposeOnSet, forceRefresh } = {}) => Promise`
 
 If the value is in the cache and not stale, then the returned
 Promise resolves to the value.
```
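
A hedged sketch of `fetch()` used with a `fetchMethod`; `loadItem` is a hypothetical stand-in for whatever slow lookup would normally be cached:

```js
const LRU = require('lru-cache')

// loadItem stands in for the real lookup (database call, HTTP request, etc.)
const loadItem = async key => ({ key, loadedAt: Date.now() })

const cache = new LRU({
  max: 100,
  ttl: 60_000,
  fetchMethod: async key => loadItem(key),
})

const main = async () => {
  // Resolves from the cache when fresh; runs fetchMethod on a miss or stale hit.
  const item = await cache.fetch('some-key')
  // forceRefresh re-runs fetchMethod even when a fresh value is cached.
  const fresh = await cache.fetch('some-key', { forceRefresh: true })
  console.log(item, fresh)
}
main()
```
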
```diff
@@ -736,7 +733,7 @@ const cache = {
     if (cache.timers.has(k)) {
       clearTimeout(cache.timers.get(k))
     }
-    cache.timers.set(k, setTimeout(() => cache.del(k), ttl))
+    cache.timers.set(k, setTimeout(() => cache.delete(k), ttl))
     cache.data.set(k, v)
   },
   get: k => cache.data.get(k),
```
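
If expired entries really must disappear on schedule, a hedged alternative to the hand-rolled timer map shown above is the `ttlAutopurge` option:

```js
const LRU = require('lru-cache')

// ttlAutopurge removes entries as soon as they expire, at the cost of
// keeping a timer per entry.
const cache = new LRU({ max: 100, ttl: 1000, ttlAutopurge: true })
cache.set('session', { user: 'someone' })
```
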
16 changes: 13 additions & 3 deletions index.js
```diff
@@ -379,6 +379,11 @@ class LRUCache {
         this.sizes[index] = 0
       }
       this.requireSize = (k, v, size, sizeCalculation) => {
+        // provisionally accept background fetches.
+        // actual value size will be checked when they return.
+        if (this.isBackgroundFetch(v)) {
+          return 0
+        }
         if (!isPosInt(size)) {
           if (sizeCalculation) {
             if (typeof sizeCalculation !== 'function') {
```
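
A hedged sketch of the case this guards: when `fetch()` is combined with size tracking, the value stored while the fetch is in flight is a background-fetch Promise, which has no meaningful size until it resolves.

```js
const LRU = require('lru-cache')

const cache = new LRU({
  maxSize: 100,
  sizeCalculation: s => s.length,
  fetchMethod: async n => 'x'.repeat(n),
})

const main = async () => {
  // While the fetch is pending, the stored value is the Promise itself;
  // it is provisionally counted as size 0, and the real size is applied
  // once the Promise resolves.
  const value = await cache.fetch(5)
  console.log(value, cache.calculatedSize) // 'xxxxx' 5
}
main()
```
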
```diff
@@ -400,9 +405,11 @@
       }
       this.addItemSize = (index, size) => {
         this.sizes[index] = size
-        const maxSize = this.maxSize - this.sizes[index]
-        while (this.calculatedSize > maxSize) {
-          this.evict(true)
+        if (this.maxSize) {
+          const maxSize = this.maxSize - this.sizes[index]
+          while (this.calculatedSize > maxSize) {
+            this.evict(true)
+          }
         }
         this.calculatedSize += this.sizes[index]
       }
```
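
A small sketch of the behavior the `if (this.maxSize)` guard restores: with `maxEntrySize` set but no `maxSize`, there is no overall size budget, so the size-based eviction loop should not run at all (previously `this.maxSize` was 0, which made the budget negative and evicted valid entries).

```js
const LRU = require('lru-cache')

// No maxSize: only the per-entry maxEntrySize limit and the item-count
// max apply, so nothing should be evicted on size grounds.
const cache = new LRU({
  max: 10,
  maxEntrySize: 10,
  sizeCalculation: s => s.length,
})

cache.set(1, 'xx')
cache.set(2, 'xxx')
console.log(cache.size) // 2: both entries remain
```
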
```diff
@@ -586,6 +593,9 @@ class LRUCache {
     // if the item doesn't fit, don't do anything
     // NB: maxEntrySize set to maxSize by default
     if (this.maxEntrySize && size > this.maxEntrySize) {
+      // have to delete, in case a background fetch is there already.
+      // in non-async cases, this is a no-op
+      this.delete(k)
       return this
     }
     let index = this.size === 0 ? undefined : this.keyMap.get(k)
```
4 changes: 2 additions & 2 deletions package-lock.json

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion package.json
```diff
@@ -1,7 +1,7 @@
 {
   "name": "lru-cache",
   "description": "A cache object that deletes the least-recently-used items.",
-  "version": "7.14.0",
+  "version": "7.14.1",
   "author": "Isaac Z. Schlueter <i@izs.me>",
   "keywords": [
     "mru",
```
16 changes: 16 additions & 0 deletions test/size-calculation.ts
```diff
@@ -242,3 +242,19 @@ t.test('large item falls out of cache because maxEntrySize', t => {
 
   t.end()
 })
+
+t.test('maxEntrySize, no maxSize', async t => {
+  const c = new LRU<number, string>({
+    max: 10,
+    maxEntrySize: 10,
+    sizeCalculation: s => s.length,
+    fetchMethod: async n => 'x'.repeat(n),
+  })
+  t.equal(await c.fetch(2), 'xx')
+  t.equal(c.size, 1)
+  t.equal(await c.fetch(3), 'xxx')
+  t.equal(c.size, 2)
+  t.equal(await c.fetch(11), 'x'.repeat(11))
+  t.equal(c.size, 2)
+  t.equal(c.has(11), false)
+})
```