test(queryCache): add failing test for partially undefined key (#3741) #4616
base: main
Conversation
…ack#3741)

The current behavior when using functions such as invalidateQueries or setQueriesData is unexpected when objects with undefined properties are involved. A key containing an object with an undefined property is hashed without the undefined property, yet the property is considered in the partialDeepEqual function. This creates some confusing scenarios, as demonstrated in the discussion on TkDodo's blog (TkDodo/blog-comments#71 (comment)) and in the referenced issue. This commit includes a failing test to demonstrate the expected behavior.
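The mismatch described here can be reproduced outside the library with simplified stand-ins for the default key hasher and the partial matcher (a minimal sketch, not the library's actual implementations of hashQueryKey and partialDeepEqual):

```typescript
// Sketch: the default hash uses JSON.stringify, which drops undefined
// object properties, while a structural subset comparison still
// iterates over those property names.

const hashKey = (key: unknown): string => JSON.stringify(key)

// Simplified partial matcher: every property present in `filter`
// must also match in `key`.
const partialMatch = (key: any, filter: any): boolean => {
  if (key === filter) return true
  if (typeof key !== typeof filter) return false
  if (key && filter && typeof key === 'object') {
    return Object.keys(filter).every((k) => partialMatch(key[k], filter[k]))
  }
  return false
}

const stored = [{ scope: 'todos', a: 1 }]
const filter = [{ scope: 'todos', a: undefined }]

console.log(hashKey(filter))              // '[{"scope":"todos"}]' – `a` is gone
console.log(partialMatch(stored, filter)) // false – `a: undefined` vs 1
```

The hash suggests the filter is equivalent to a key without `a`, but the comparison still sees the `a` property and rejects the match.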
This pull request is automatically built and testable in CodeSandbox. Latest deployment of this branch is based on commit 719c518.
const baseKey = queryKey()
await queryClient.prefetchQuery([...baseKey, { a: 1 }], () => 'data1')
expect(queryCache.findAll([...baseKey, { a: undefined }])).toHaveLength(1)
})
thanks. If I get this right, the more realistic example would be something like this?
test('should return all the queries when key contains object with an undefined property', async () => {
const baseKey = queryKey()
const createKey = (id?: number) => [{ ...baseKey, a: id }]
await queryClient.prefetchQuery(createKey(1), () => 'data1')
await queryClient.prefetchQuery(createKey(), () => 'data-nothing')
expect(queryCache.findAll(createKey())).toHaveLength(2)
})
so when we use the same createKey function to create our keys with optional filters, and then use it e.g. for invalidation, it doesn't match them all if we don't provide the param, but it should?
The thing is, there isn't really a way to make this work with either object or array keys. As soon as you add an entry to the array or object, it can't fuzzily match a non-existing entry.
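Concretely, the two keys such a factory produces differ in a subtle way. A sketch (the `baseKey` here is a stand-in plain object for illustration, not the test helper's queryKey()):

```typescript
// Illustration only: `baseKey` is a hypothetical plain object.
const baseKey = { scope: 'todos' }
const createKey = (id?: number) => [{ ...baseKey, a: id }]

console.log(JSON.stringify(createKey(1))) // '[{"scope":"todos","a":1}]'
console.log(JSON.stringify(createKey()))  // '[{"scope":"todos"}]' – the hash drops `a`
console.log('a' in createKey()[0])        // true – but the property does exist
```

So the hashed form of createKey() looks like a shorter key, while the actual object still carries an `a` property with the value undefined.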
Precisely, I'll edit the test accordingly.
It can also happen when simply using a custom hook with an optional parameter (a key factory is not really required).
In addition, any query created with createKey() will be hashed as if it were created with just [{ ...baseKey }].
A temporary workaround is to wrap the filter value with an object and replace it with an empty object if undefined:
const baseKey = queryKey()
const createKey = (filter?: string) =>
  [{ ...baseKey, filter: filter !== undefined ? { filter } : {} }]
// Instead of: const createKey = (filter?: string) => [{ ...baseKey, filter }]

await queryClient.prefetchQuery(createKey('filter1'), () => 'filtered-data')
await queryClient.prefetchQuery(createKey(), () => 'all-data')
expect(queryCache.findAll(createKey())).toHaveLength(2)
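A standalone sketch of why the wrapper helps (with `baseKey` swapped for a hypothetical plain object so it runs on its own): both branches now produce a defined value, so JSON.stringify keeps the property in the hash.

```typescript
// Hypothetical demo of the wrapper workaround above.
const baseKey = { scope: 'todos' }
const createKey = (filter?: string) =>
  [{ ...baseKey, filter: filter !== undefined ? { filter } : {} }]

console.log(JSON.stringify(createKey('filter1')))
// '[{"scope":"todos","filter":{"filter":"filter1"}}]'
console.log(JSON.stringify(createKey()))
// '[{"scope":"todos","filter":{}}]'
```

And because a subset comparison iterates only the filter's own keys, an empty object partially matches any object, which is why createKey() can still match both cached queries.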
the root problem is that we somehow assume that passing an undefined filter to the createKey function would result in filter not being present in the queryKey. This might be amplified by the devtools showing hashed keys, and in the standard hashing process (JSON.stringify), undefined is removed.
But in an object, undefined as a value will be present because we add a valid key. It's the same for arrays:
const createKey = (filter?: string) => [...baseKey, filter]
this will have the exact same problem. And we can't just remove undefined from arrays because order matters.
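The asymmetry is easy to check directly with JSON.stringify, which is what the default hashing relies on:

```typescript
// undefined is dropped entirely from objects, but kept (serialized as
// null) in arrays, because array positions must be preserved.
console.log(JSON.stringify({ a: undefined }))      // '{}'
console.log(JSON.stringify([undefined]))           // '[null]'
console.log(JSON.stringify(['a', undefined, 'b'])) // '["a",null,"b"]'
```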
Yep, the dev tools showing the hashed keys amplifies the problem. It stems from the fact that the hash is not representative of the key (which is not what you would expect from a hash function).
That means the hash for two different keys can be the same (which means it's an inadequate hash function).
Perhaps it's more expected for tuples to be of the same length regardless of the values we pass inside, while it's unexpected for objects.
I may be wrong, as referencing an out-of-bounds element in an array also produces the value undefined.
So there are a few ways to go about this, I guess:
- Don't change anything and add documentation to help users in this scenario (perhaps even a note on your blog can go a long way)
- Ignore undefined in objects only (not in arrays) when doing comparisons (e.g., through a new opt-in option)
- Fix the hash function to act similarly to the comparison
- Make the dev tools show the key instead of the hash
- Ignore the problem
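As a rough illustration of the second option, a comparison that skips undefined object properties while leaving arrays untouched could look something like this (a hypothetical sketch, not the library's partialDeepEqual):

```typescript
// Hypothetical subset matcher that ignores object properties whose
// filter value is undefined; array entries are compared as-is so
// order-sensitive positions are never skipped.
const partialMatchIgnoringUndefined = (key: any, filter: any): boolean => {
  if (key === filter) return true
  if (typeof key !== typeof filter) return false
  if (key && filter && typeof key === 'object') {
    const keys = Array.isArray(filter)
      ? Object.keys(filter)
      : Object.keys(filter).filter((k) => filter[k] !== undefined)
    return keys.every((k) => partialMatchIgnoringUndefined(key[k], filter[k]))
  }
  return false
}

console.log(
  partialMatchIgnoringUndefined(
    [{ scope: 'todos', a: 1 }],
    [{ scope: 'todos', a: undefined }],
  ),
) // true – `a: undefined` is now treated like an absent filter
```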
What do you think?
Usually users won't pass `undefined` explicitly into an object but rather use it as an optional function parameter (e.g. to a custom hook)