
cache support for useFetch/useAsyncData #15445

Closed
lucassimines opened this issue Nov 11, 2022 · 26 comments · Fixed by #20747

Comments

@lucassimines

Environment


  • Operating System: Darwin
  • Node Version: v16.14.2
  • Nuxt Version: 3.0.0-rc.14-27802701.2f53495
  • Nitro Version: 0.6.2-27796963.837f894
  • Package Manager: npm@8.8.0
  • Builder: vite
  • User Config: -
  • Runtime Modules: -
  • Build Modules: -

Reproduction

https://stackblitz.com/edit/github-ccqtvp?file=pages%2Fabout.vue,pages%2Findex.vue,package.json

As you can see, on the rc.14 edge release, if you click "Go to about page" and then "Go to home", the data is fetched again.
If you switch to rc.13, the data is not fetched again; it loads from the cache.

Describe the bug

Every time useFetch is called, it makes a new request; it does not check for previously fetched cached data, as it did up to rc.13.

Additional context

No response

Logs

No response

@manniL
Member

manniL commented Nov 11, 2022

I think this is intended via nuxt/framework#8885? 🤔

@nearlyheadless

Is there a workaround for this problem? Since the update to version 3.0, every API call with useFetch sends a server request again for me too. Thanks.

@danielroe
Member

You can implement your own custom cache behaviour by using useAsyncData.
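
For illustration, a minimal sketch of that idea (not an official API), reusing whatever a previous navigation left in the payload via useNuxtData; the 'items' key and /api/items endpoint are placeholders:

// Minimal sketch of a custom cache on top of useAsyncData; names are placeholders.
const { data: cached } = useNuxtData('items')

const { data: items } = await useAsyncData('items', () => {
  // Reuse the payload entry from a previous navigation instead of refetching
  if (cached.value) {
    return Promise.resolve(cached.value)
  }
  return $fetch('/api/items')
})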

@lucassimines
Author

lucassimines commented Nov 18, 2022

You can implement your own custom cache behaviour by using useAsyncData.

Do you think it would be worthwhile to add an example to the docs?

I solved it with this code:

const banners = useState<TBanner[]>('banners');
const pendingBanners = useState<boolean>('pendingBanners');

if (!banners.value) {
  const { data, pending } = await useLazyFetch<TBanner[]>('/banners');
  banners.value = data.value;
  pendingBanners.value = pending.value;
  watch(data, (data) => (banners.value = data));
  watch(pending, (pending) => (pendingBanners.value = pending));
}

@vanling

vanling commented Nov 18, 2022

I've wasted some time today trying to figure out why I was seeing data refetched on back-and-forth page navigation, hehe. I thought I was going insane and had broken something in my data fetching. I thought the built-in caching was awesome!

@max3xyz

max3xyz commented Nov 18, 2022

An example for the documentation would be really helpful.
I think caching fetched data is a very useful feature which is often needed.

@DNA

DNA commented Nov 18, 2022

You can implement your own custom cache behaviour by using useAsyncData.

That would be a lot of boilerplate code instead of a one-liner, @danielroe… Is this cache removal a final design decision, or is it related to the ongoing discussion in nuxt/framework#7569? 🙂

@anc95

anc95 commented Nov 27, 2022

Need built-in cache

johannschopplich referenced this issue in johannschopplich/nuxt-api-party Dec 6, 2022
@Arecsu
Contributor

Arecsu commented Dec 6, 2022

Caching fetched data is super useful, and having an easy way to do it instead of writing our own cache system would be nice. It makes me wonder: is there any plan to bring a caching feature to useFetch and useAsyncData, or should I start writing my own caching system?

What I've noticed (maybe I'm wrong) is that the data is indeed cached. If you go back to the previous page, Nuxt uses the previously fetched data to render the page while it makes a new request at the same time. The cached data is then updated if the data from that latest request has changed. This is clear if you use the throttling option in the browser's DevTools to slow the requests down.

I can understand this behavior being the default; I think it's a good idea. Nonetheless, it would be awesome to have an option to not make new requests unless triggered by some event, like a browser refresh, an expiration time, the refresh() function, etc. That would offload the backend and client from making useless requests for data that won't change anytime soon. Maybe this is out of the scope of Nuxt, but I think having an easy-to-use integrated cache system would be nice!
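
As a rough illustration of the expiration idea (not an existing Nuxt option), one could keep a fetch timestamp in useState and only refetch once a TTL has elapsed; the 'banners' key, /api/banners endpoint and 60-second TTL are placeholders:

// Sketch of time-based expiry on top of useAsyncData; all names and the TTL are placeholders.
const TTL = 60 * 1000
const fetchedAt = useState('banners-fetched-at', () => 0)
const { data: cached } = useNuxtData('banners')

const { data: banners } = await useAsyncData('banners', () => {
  // Within the TTL, reuse what is already in the payload
  if (cached.value && Date.now() - fetchedAt.value < TTL) {
    return Promise.resolve(cached.value)
  }
  fetchedAt.value = Date.now()
  return $fetch('/api/banners')
})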

@danielroe danielroe added the 3.x label Jan 19, 2023
@danielroe danielroe transferred this issue from nuxt/framework Jan 19, 2023
@AndrewSparrowww

@danielroe It would be nice to hear your thoughts on this topic, as I believe having a cached version of useAsyncData out of the box was a very useful feature.

@danielroe danielroe changed the title useFetch data is not being cached anymore cache support for useFetch/useAsyncData Feb 3, 2023
@MikeBellika
Contributor

I've implemented my own cached version of useAsyncData. It has the same interface, but the key is used for caching:

import { NuxtApp } from '#app'
import { AsyncData } from 'nuxt/dist/app/composables'
import {
  AsyncDataExecuteOptions,
  AsyncDataOptions,
  KeyOfRes,
  PickFrom,
  _Transform,
} from 'nuxt/dist/app/composables/asyncData'
export function useCachedAsyncData<Data>(
  cacheKey: string,
  handler: (ctx?: NuxtApp) => Promise<Data>,
  options?: AsyncDataOptions<Data>
): AsyncData<
  PickFrom<ReturnType<_Transform<Data>>, KeyOfRes<_Transform<Data>>>,
  Error | null
> {
  // Used to prevent collisions in Nuxt data. Low likelihood that another property in Nuxt data starts with this
  const CACHE_KEY_PREFIX = 'CACHED_ASYNC_DATA'
  const { data: cachedData } = useNuxtData(cacheKey)
  const cacheKeyAsync = `${CACHE_KEY_PREFIX}${cacheKey}`
  const shouldRefresh = ref<boolean>(false)
  const asyncData = useAsyncData<Data, Error>(
    cacheKey,
    async () => {
      await refreshNuxtData(cacheKeyAsync)
      // If we already have data, and we're not being forced to refresh, return cached data
      if (cachedData.value && !shouldRefresh.value) {
        return cachedData.value
      }
      const result = await handler()
      shouldRefresh.value = false
      return result
    },
    options
  )
  const refresh: (
    opts?: AsyncDataExecuteOptions | undefined
  ) => Promise<void> = async (opts?: AsyncDataExecuteOptions) => {
    shouldRefresh.value = true
    await asyncData.refresh(opts)
    shouldRefresh.value = false
  }
  return { ...asyncData, refresh }
}

@vanling

vanling commented Feb 18, 2023

@MikeBellika awesome, I learned a lot from your code :) There is one small issue I'm trying to fix: when using transform, the 'fresh' data works fine, but the returned cache is not being transformed.

I guess we have to run the transform on return cachedData.value
^
edit: scrap that, I think it caches the transformed data and tries to run transform again on the cached data.
edit2: still not sure.. bottom line is, once I try to add the transform option it boinks on the cached value

@bootsmann1995

@MikeBellika do you have a usage example for this setup?

@MikeBellika
Contributor

@MikeBellika awesome, I learned a lot from your code :) There is one small issue I'm trying to fix: when using transform, the 'fresh' data works fine, but the returned cache is not being transformed.

I guess we have to run the transform on return cachedData.value ^ edit: scrap that, I think it caches the transformed data and tries to run transform again on the cached data. edit2: still not sure.. bottom line is, once I try to add the transform option it boinks on the cached value

@vanling I see. I didn't consider the transform option. The transformed data is not cached. I've put this together quickly, I hope it solves your problem.

import { NuxtApp } from '#app';
import { AsyncData } from 'nuxt/dist/app/composables';
import {
  AsyncDataExecuteOptions,
  AsyncDataOptions,
  KeyOfRes,
  PickFrom,
  _Transform,
} from 'nuxt/dist/app/composables/asyncData';
/**
 * Composable for caching data across client requests. Prevents refetching of data when navigating on client side
 * Takes the same arguments as `useAsyncData`, with cacheKey being used to deduplicate requests
 */
export function useCachedAsyncData<Data>(
  cacheKey: string,
  handler: (ctx?: NuxtApp) => Promise<Data>,
  options?: AsyncDataOptions<Data>
): AsyncData<
  PickFrom<ReturnType<_Transform<Data>>, KeyOfRes<_Transform<Data>>>,
  Error | null
> {
  // Used to prevent collisions in Nuxt data. Low likelihood that another property in Nuxt data starts with this
  const CACHE_KEY_PREFIX = 'CACHED_ASYNC_DATA';
  const { data: cachedData } = useNuxtData(cacheKey);
  const cacheKeyAsync = `${CACHE_KEY_PREFIX}${cacheKey}`;
  const shouldRefresh = ref<boolean>(false);

  // We need to cache transformed value to prevent value from being transformed every time.
  const transform = options?.transform;
  // Remove transform from options, so useAsyncData doesn't transform it again
  const optionsWithoutTransform = { ...options, transform: undefined };

  const asyncData = useAsyncData<Data, Error>(
    cacheKey,
    async () => {
      await refreshNuxtData(cacheKeyAsync);
      // If we already have data, and we're not being forced to refresh, return cached data
      if (cachedData.value && !shouldRefresh.value) {
        return cachedData.value;
      }
      const result = await handler();
      shouldRefresh.value = false;
      if (transform) {
        return transform(result);
      }
      return result;
    },
    optionsWithoutTransform
  );
  const refresh: (opts?: AsyncDataExecuteOptions | undefined) => Promise<void> =
    async (opts?: AsyncDataExecuteOptions) => {
      shouldRefresh.value = true;
      await asyncData.refresh(opts);
      shouldRefresh.value = false;
    };
  return { ...asyncData, refresh };
}

@bootsmann1995 Usage should be the same as useAsyncData. I've made a Stackblitz where you can see it in action: https://stackblitz.com/edit/github-npgyaf?file=pages/page1.vue
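
For example, a minimal usage sketch in a page component (the 'products' key and /api/products endpoint are placeholders):

// Called like useAsyncData; the first argument is the key used for caching.
const { data: products, pending, refresh } = await useCachedAsyncData(
  'products',
  () => $fetch('/api/products')
)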

@vanling

vanling commented Feb 22, 2023

@MikeBellika awesome! Thank you. I was looking at how I could fix it myself and never thought about the way you fixed it :) Thanks for the learnings.

@johannschopplich
Contributor

I have a different solution (simplified), which should be more Nuxt-esque:

const cacheKey = computed(() => 'foo')
const cache = true

return useAsyncData<T, FetchError>(
  cacheKey.value,
  async (nuxt) => {
    // Workaround to persist response client-side
    // https://github.com/nuxt/framework/issues/8917
    if ((nuxt!.isHydrating || cache) && cacheKey.value in nuxt!.payload.data)
      return nuxt!.payload.data[cacheKey.value]

    const result = await $fetch(/* endpoint omitted in this simplified example */)

    if (cache)
      nuxt!.payload.data[cacheKey.value] = result

    return result
  },
  _asyncDataOptions,
) as AsyncData<T, FetchError>

👉 I have implemented these additions in nuxt-api-party.

@MikeBellika
Contributor

@danielroe When the team comes to an agreement on how to do this, I would love to contribute.

@AnzhiZhang

Need a built-in cache. The browser has caching; there should be a way to cache on the Nuxt server side as well.
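
For the server side specifically, Nitro's cached handlers may already cover part of this; a rough sketch, assuming a server route at server/api/items.ts and an arbitrary 60-second maxAge:

// server/api/items.ts: sketch of server-side response caching with Nitro's
// defineCachedEventHandler (separate from the client-side payload caching discussed above).
export default defineCachedEventHandler(async () => {
  // Upstream URL and the 60s maxAge are placeholders
  return await $fetch('https://api.example.com/items')
}, { maxAge: 60 })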

@Tummerhore

Unfortunately, the posted solution from @MikeBellika doesn't seem to work correctly when passing the options { lazy: true, server: false }: when the page is rendered on the server, the data stays null. However, after client-side navigation to the page, the data is loaded correctly. My own attempt had the same problem. Does anyone know why this is happening and how it can be fixed?

@bcspragu

bcspragu commented Jun 12, 2023

Just learned about this the hard way when I noticed we were making 37 (!!) calls to our get-current-user endpoint, and none of a few dozen variations on useAsyncData did what I expected.

In our case, we have a useSession composable that, when we wrote the code mid-last year, did what we expected: the first caller hit useSession, the user was loaded, and subsequent components using useSession just got the cached user. I guess the "correct" way to do this would be to load user info at the page level and pass it down into relevant components as props, but for an existing app with hundreds of components and pages, that's a heavy lift. Plus, isn't the whole point of composables to encapsulate relevant state so functionality can be added to pages more composably?

It looks like the behavior (i.e. caching) was removed because of a race condition, which is surprising to me. Is there some architectural limitation that prevented fixing the race condition? Plus (and my JavaScript internals knowledge is quite limited, so feel free to tell me I have no idea what I'm talking about), I'd expect the Node/JavaScript event loop to avoid race conditions in the traditional sense, since all the code that actually sets variables runs in a single thread, and promises aren't preempted mid-execution.

EDIT:

Here's a hideous workaround that does the trick for our specific case:

const currentUser = useState<User | undefined>(`${composableName}.currentUser`, () => undefined)
const resolvers = useState<Array<() => void>>(`${composableName}.resolvers`, () => [])
const loadCurrentUser = (hardRefresh: boolean = false): Promise<void> => {
  // Return the cached user
  if (currentUser.value && !hardRefresh) {
    return Promise.resolve()
  }

  // We're already loading a user, wait with everyone else
  if (resolvers.value.length > 0) {
    return new Promise((resolve) => {
      resolvers.value.push(resolve)
    })
  }

  // We're the first to request a user, so kick off the request and hop in line at the front of the queue.
  return new Promise((resolve) => {
    resolvers.value.push(resolve)
    $graphql.me({})
      .then((resp) => {
        currentUser.value = resp.me.user
        // Let everyone else know we've loaded the user and clear the queue.
        resolvers.value.forEach((fn) => fn())
        resolvers.value = []
      })
  })
}

The main problem I see with this, aside from how verbose it is, is that resolvers is a useState of an array of functions, and that isn't serializable. That said, SSR shouldn't complete until this list is empty, so we're only ever serializing an empty array (or at least I think we should be?)

@talaxasy

@MikeBellika how do you revalidate such a cache after a certain time?

@Sandros94

Sandros94 commented Jun 24, 2023

Just discovered this the hard way. I was trying to access a payload via useNuxtData from a different (much smaller) component, and I was getting a hydration mismatch. While digging, I noticed that it was due to useAsyncData refetching every time, filling the payload too late on the server side, but fast enough on the client side.

UPDATE: The hydration error could be caused by the fact that I have nested layouts (in app.vue and somePage-[slug].vue), but I'm not fully sure.

UPDATE 2: For lack of time, I'll put the rendered component inside a ClientOnly for now, until I have time to investigate more.

@ilyadh

ilyadh commented Oct 10, 2023

I was trying to implement this workaround and it does not work for me (I don't use SSR) because of this bug.

Also, for what it's worth, I think it's not unreasonable to expect some level of built-in caching from useFetch.
Other frameworks cache fetch requests by default, and I think Nuxt should too.
Especially when it was a feature at some point, even if not intentional.

@broox

broox commented Oct 25, 2023

To easily cache requests on the client side, check out the new getCachedData option on useFetch and useAsyncData.

A simple example using the default key:

const nuxtApp = useNuxtApp();
const { data, error, pending } = await useFetch('/api/data', {
  getCachedData(key) {
    return nuxtApp.payload.data[key] || nuxtApp.static.data[key]
  },
});
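
If you also need expiry on top of this, one possible variation (not from the official docs) is to stamp the payload in transform and check its age in getCachedData; the 60-second TTL and /api/data endpoint are just for illustration:

const nuxtApp = useNuxtApp();
const { data } = await useFetch('/api/data', {
  // Wrap the response so the cached entry carries a timestamp
  transform(input) {
    return { value: input, fetchedAt: Date.now() };
  },
  getCachedData(key) {
    const cached = nuxtApp.payload.data[key] || nuxtApp.static.data[key];
    // Returning nothing makes useFetch perform a new request
    if (!cached || Date.now() - cached.fetchedAt > 60 * 1000) {
      return;
    }
    return cached;
  },
});

With this, data.value.value holds the response body and data.value.fetchedAt the time it was fetched.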

@manniL
Member

manniL commented Oct 25, 2023

And for more info, I've created a ~10 min video about the new getCachedData feature ☺️

@Arecsu
Contributor

Arecsu commented Nov 1, 2023

Thank you all so much for making this possible! Especially to @manniL, @danielroe and @darioferderber 🤗
