Comparing changes

base repository: isaacs/node-lru-cache
base: v7.10.2
head repository: isaacs/node-lru-cache
compare: v7.10.3
  • 4 commits
  • 12 files changed
  • 3 contributors

Commits on Jun 27, 2022

  1. docs: fix fetchMethod reference

    kentcdodds authored and isaacs committed Jun 27, 2022

    Verified: created on GitHub.com and signed with GitHub's verified signature (the key has since expired).
    34f5be6
  2. Update breaking info in changelog entry for v7

    Add missing information about breaking change to the function signature
    of the dispose option.
    Mike Tunnicliffe authored and isaacs committed Jun 27, 2022

    57fee05

Commits on Jun 29, 2022

  1. fix: do not dump() background fetches, track start

    If saving a `dump()` to a file or database, and then expecting to load
    it again at a later time in another cache object with `cache.load()`
    then three awful things could happen.
    
    1. If a background fetch was in progress, it would dump the `Promise`
    object in the array.  This is _never_ what you want, and leads to errors
    being thrown when attempting to serialize to JSON.
    
    2. The TTL was being dumped, but the _start_ time was not, meaning that
    you could dump() an entry with only 1ms left on its 1-hour TTL, but when
    reloading it into a new cache, it would be assumed valid for a full
    hour.
    
    3. Dumps did not include stale entries at all, so there was no way for a
    cache to decide whether to allowStale or not later on.  Sometimes, this
    is preferable.  Now, dump() will always output stale entries (since
    they can be resolved accurately for staleness later anyway).
    isaacs committed Jun 29, 2022
    3b2fb03
  2. 7.10.3

    isaacs committed Jun 29, 2022
    b9918de
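The first problem fixed in 3b2fb03 can be sketched with a small self-contained example. The `dumpValue` helper below is illustrative, not the library's API; `__staleWhileFetching` is the property the library attaches to in-flight background fetches, as shown in the index.js diff further down:

```javascript
// Hypothetical helper modeling the fix: when an entry's value is an
// in-flight background fetch (a Promise carrying the previously-known
// value in __staleWhileFetching), dump that stale value instead of the
// Promise itself, so the result is safe to serialize as JSON.
function dumpValue(v) {
  const isBackgroundFetch =
    v instanceof Promise && '__staleWhileFetching' in v
  return isBackgroundFetch ? v.__staleWhileFetching : v
}

// A background fetch in progress: the new value is still resolving,
// the stale value 'old' is what callers (and dumps) should see.
const inFlight = Object.assign(Promise.resolve('new'), {
  __staleWhileFetching: 'old',
})

console.log(dumpValue(inFlight)) // 'old' — JSON-safe
console.log(dumpValue(42)) // 42 — plain values pass through untouched
```

Before the fix, the Promise itself landed in the dumped array, which throws or produces `{}` when serialized.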
Showing with 160 additions and 16 deletions.
  1. +3 −0 CHANGELOG.md
  2. +17 −3 README.md
  3. +2 −0 index.d.ts
  4. +19 −6 index.js
  5. +2 −2 package-lock.json
  6. +1 −1 package.json
  7. +10 −0 tap-snapshots/test/fetch.ts.test.cjs
  8. +13 −0 tap-snapshots/test/map-like.ts.test.cjs
  9. +48 −0 tap-snapshots/test/ttl.ts.test.cjs
  10. +8 −0 test/fetch.ts
  11. +10 −2 test/map-like.ts
  12. +27 −2 test/ttl.ts
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -89,6 +89,9 @@ well.
 * `max` and `maxSize` are now two separate options. (Previously, they were
   a single `max` option, which would be based on either count or computed
   size.)
+* The function assigned to the `dispose` option is now expected to have signature
+  `(value, key, reason)` rather than `(key, value)`, reversing the order of
+  `value` and `key`.

 ## v6 - 2020-07

20 changes: 17 additions & 3 deletions README.md
@@ -67,7 +67,7 @@ const options = {

   // async method to use for cache.fetch(), for
   // stale-while-revalidate type of behavior
-  fetch: async (key, staleValue, { options, signal }) => {}
+  fetchMethod: async (key, staleValue, { options, signal }) => {}
 }

 const cache = new LRU(options)
@@ -391,15 +391,19 @@ moment.

 The total size of items in cache when using size tracking.

-### `set(key, value, [{ size, sizeCalculation, ttl,
-noDisposeOnSet }])`
+### `set(key, value, [{ size, sizeCalculation, ttl, noDisposeOnSet, start }])`

 Add a value to the cache.

 Optional options object may contain `ttl` and `sizeCalculation`
 as described above, which default to the settings on the cache
 object.

+If `start` is provided, then that will set the effective start
+time for the TTL calculation. Note that this must be a previous
+value of `performance.now()` if supported, or a previous value of
+`Date.now()` if not.
+
 Options object may also include `size`, which will prevent calling
 the `sizeCalculation` function and just use the specified number
 if it is a positive integer, and `noDisposeOnSet` which will
@@ -518,6 +522,12 @@ the item found.
 Return an array of `[key, entry]` objects which can be passed to
 `cache.load()`

+The `start` fields are calculated relative to a portable
+`Date.now()` timestamp, even if `performance.now()` is available.
+
+Stale entries are always included in the `dump`, even if
+`allowStale` is false.
+
 Note: this returns an actual array, not a generator, so it can be
 more easily passed around.

@@ -527,6 +537,10 @@ Reset the cache and load in the items in `entries` in the order
 listed. Note that the shape of the resulting cache may be
 different if the same options are not used in both caches.

+The `start` fields are assumed to be calculated relative to a
+portable `Date.now()` timestamp, even if `performance.now()` is
+available.
+
 ### `purgeStale()`

 Delete any stale entries. Returns `true` if anything was
2 changes: 2 additions & 0 deletions index.d.ts
@@ -500,6 +500,7 @@ declare namespace LRUCache {
     size?: number
     sizeCalculation?: SizeCalculator<K, V>
     ttl?: number
+    start?: number
     noDisposeOnSet?: boolean
     noUpdateTTL?: boolean
   }
@@ -555,6 +556,7 @@ declare namespace LRUCache {
     value: V
     ttl?: number
     size?: number
+    start?: number
   }
 }

25 changes: 19 additions & 6 deletions index.js
@@ -285,8 +285,8 @@ class LRUCache {
     this.ttls = new ZeroArray(this.max)
     this.starts = new ZeroArray(this.max)

-    this.setItemTTL = (index, ttl) => {
-      this.starts[index] = ttl !== 0 ? perf.now() : 0
+    this.setItemTTL = (index, ttl, start = perf.now()) => {
+      this.starts[index] = ttl !== 0 ? start : 0
       this.ttls[index] = ttl
       if (ttl !== 0 && this.ttlAutopurge) {
         const t = setTimeout(() => {
@@ -346,7 +346,7 @@
     }
   }
   updateItemAge(index) {}
-  setItemTTL(index, ttl) {}
+  setItemTTL(index, ttl, start) {}
   isStale(index) {
     return false
   }
@@ -510,12 +510,17 @@

   dump() {
     const arr = []
-    for (const i of this.indexes()) {
+    for (const i of this.indexes({ allowStale: true })) {
       const key = this.keyList[i]
-      const value = this.valList[i]
+      const v = this.valList[i]
+      const value = this.isBackgroundFetch(v) ? v.__staleWhileFetching : v
       const entry = { value }
       if (this.ttls) {
         entry.ttl = this.ttls[i]
+        // always dump the start relative to a portable timestamp
+        // it's ok for this to be a bit slow, it's a rare operation.
+        const age = perf.now() - this.starts[i]
+        entry.start = Math.floor(Date.now() - age)
       }
       if (this.sizes) {
         entry.size = this.sizes[i]
@@ -528,6 +533,13 @@
   load(arr) {
     this.clear()
     for (const [key, entry] of arr) {
+      if (entry.start) {
+        // entry.start is a portable timestamp, but we may be using
+        // node's performance.now(), so calculate the offset.
+        // it's ok for this to be a bit slow, it's a rare operation.
+        const age = Date.now() - entry.start
+        entry.start = perf.now() - age
+      }
       this.set(key, entry.value, entry)
     }
   }
@@ -539,6 +551,7 @@
     v,
     {
       ttl = this.ttl,
+      start,
       noDisposeOnSet = this.noDisposeOnSet,
       size = 0,
       sizeCalculation = this.sizeCalculation,
@@ -583,7 +596,7 @@
       this.initializeTTLTracking()
     }
     if (!noUpdateTTL) {
-      this.setItemTTL(index, ttl)
+      this.setItemTTL(index, ttl, start)
     }
     if (this.disposeAfter) {
       while (this.disposed.length) {
4 changes: 2 additions & 2 deletions package-lock.json

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion package.json
@@ -1,7 +1,7 @@
 {
   "name": "lru-cache",
   "description": "A cache object that deletes the least-recently-used items.",
-  "version": "7.10.2",
+  "version": "7.10.3",
   "author": "Isaac Z. Schlueter <i@izs.me>",
   "keywords": [
     "mru",
10 changes: 10 additions & 0 deletions tap-snapshots/test/fetch.ts.test.cjs
@@ -0,0 +1,10 @@
+/* IMPORTANT
+ * This snapshot file is auto-generated, but designed for humans.
+ * It should be checked into source control and tracked carefully.
+ * Re-generate by setting TAP_SNAPSHOT=1 and running tests.
+ * Make sure to inspect the output below. Do not ignore changes!
+ */
+'use strict'
+exports[`test/fetch.ts TAP asynchronous fetching > safe to stringify dump 1`] = `
+[["key",{"value":1,"ttl":5,"start":11}]]
+`
13 changes: 13 additions & 0 deletions tap-snapshots/test/map-like.ts.test.cjs
@@ -51,6 +51,7 @@ Array [
     3,
     Object {
       "size": 1,
+      "start": 0,
       "ttl": 0,
       "value": "3",
     },
@@ -59,6 +60,7 @@ Array [
     5,
     Object {
       "size": 1,
+      "start": 0,
       "ttl": 0,
       "value": "5",
     },
@@ -67,6 +69,7 @@ Array [
     6,
     Object {
       "size": 1,
+      "start": 0,
       "ttl": 0,
       "value": "6",
     },
@@ -75,10 +78,20 @@ Array [
     4,
     Object {
       "size": 1,
+      "start": 0,
       "ttl": 0,
       "value": "new value 4",
     },
   ],
+  Array [
+    7,
+    Object {
+      "size": 1,
+      "start": -10000,
+      "ttl": 1,
+      "value": "stale",
+    },
+  ],
 ]
`

48 changes: 48 additions & 0 deletions tap-snapshots/test/ttl.ts.test.cjs
@@ -0,0 +1,48 @@
+/* IMPORTANT
+ * This snapshot file is auto-generated, but designed for humans.
+ * It should be checked into source control and tracked carefully.
+ * Re-generate by setting TAP_SNAPSHOT=1 and running tests.
+ * Make sure to inspect the output below. Do not ignore changes!
+ */
+'use strict'
+exports[`test/ttl.ts TAP tests using Date.now() set item pre-stale > dump with stale values 1`] = `
+Array [
+  Array [
+    1,
+    Object {
+      "start": 3010,
+      "ttl": 10,
+      "value": 1,
+    },
+  ],
+  Array [
+    2,
+    Object {
+      "start": 2999,
+      "ttl": 10,
+      "value": 2,
+    },
+  ],
+]
+`
+
+exports[`test/ttl.ts TAP tests with perf_hooks.performance.now() set item pre-stale > dump with stale values 1`] = `
+Array [
+  Array [
+    1,
+    Object {
+      "start": 1505,
+      "ttl": 10,
+      "value": 1,
+    },
+  ],
+  Array [
+    2,
+    Object {
+      "start": 1494,
+      "ttl": 10,
+      "value": 2,
+    },
+  ],
+]
+`
8 changes: 8 additions & 0 deletions test/fetch.ts
@@ -59,6 +59,14 @@ t.test('asynchronous fetching', async t => {
   )
   const e = expose(c)
   const v = e.valList[0]
+
+  // should not have any promises or cycles in the dump
+  const dump = c.dump()
+  for (const [_, entry] of dump) {
+    t.type(entry.value, 'number')
+  }
+  t.matchSnapshot(JSON.stringify(dump), 'safe to stringify dump')
+
   t.equal(e.isBackgroundFetch(v), true)
   t.equal(e.backgroundFetch('key', 0), v)
   await v
12 changes: 10 additions & 2 deletions test/map-like.ts
@@ -1,7 +1,15 @@
-if (typeof performance === 'undefined') {
+if (typeof global.performance === 'undefined') {
   global.performance = require('perf_hooks').performance
 }
 import t from 'tap'
+const Clock = require('clock-mock')
+const clock = new Clock()
+const { performance, Date } = global
+// @ts-ignore
+t.teardown(() => Object.assign(global, { performance, Date }))
+global.Date = clock.Date
+global.performance = clock

 import LRU from '../'
 import { expose } from './fixtures/expose'

@@ -40,7 +48,7 @@ t.matchSnapshot(c.dump(), 'dump, new value 4')
 c.set(7, 'stale', { ttl: 1, size: 1 })
 const e = expose(c)
 const idx = e.keyMap.get(7)
-e.starts[idx as number] = performance.now() - 10000
+e.starts[idx as number] = clock.now() - 10000
 const seen: number[] = []
 for (const i of e.indexes()) {
   seen[i] = seen[i] || 0
29 changes: 27 additions & 2 deletions test/ttl.ts
@@ -398,13 +398,38 @@ const runTests = (LRU: typeof LRUCache, t: Tap.Test) => {
     t.end()
   })

+  t.test('set item pre-stale', t => {
+    const c = new LRU({
+      max: 3,
+      ttl: 10,
+      allowStale: true,
+    })
+    c.set(1, 1)
+    t.equal(c.has(1), true)
+    t.equal(c.get(1), 1)
+    c.set(2, 2, { start: clock.now() - 11 })
+    t.equal(c.has(2), false)
+    t.equal(c.get(2), 2)
+    t.equal(c.get(2), undefined)
+    c.set(2, 2, { start: clock.now() - 11 })
+    const dump = c.dump()
+    t.matchSnapshot(dump, 'dump with stale values')
+    const d = new LRU({ max: 3, ttl: 10, allowStale: true })
+    d.load(dump)
+    t.equal(d.has(2), false)
+    t.equal(d.get(2), 2)
+    t.equal(d.get(2), undefined)
+    t.end()
+  })
+
   t.end()
 }

 t.test('tests with perf_hooks.performance.now()', t => {
-  const { performance } = global
+  const { performance, Date } = global
   // @ts-ignore
-  t.teardown(() => (global.performance = performance))
+  t.teardown(() => Object.assign(global, { performance, Date }))
+  global.Date = clock.Date
+  global.performance = clock
   const LRU = t.mock('../', {})
   runTests(LRU, t)