
How to add caching capabilities #295

Open

Comments

ilijaNL commented Mar 23, 2023

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

4.5.3

Plugin version

^9.0.0

Node.js version

18

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

macosx

Description

First thanks for this library,

I am trying to create a GraphQL proxy with caching capabilities. Therefore I am using fastify.addHook('preHandler', ...) to parse the query and possibly retrieve the result from the cache, and fastify.addHook('onSend', ...) to store it. However, I am stuck on storing the cache value from the onSend hook. In general, how can a cache be implemented inside a proxy using fastify-http-proxy?

Steps to Reproduce

fastify.register(async (fastify) => {
  fastify.addHook("preHandler", async (req) => {
    console.log({ body: req.body });
    // retrieve from cache
  });

  fastify.addHook("onSend", async (req, reply, payload) => {
    // payload is BodyReadable
    console.log({ payload: payload });
    // set the cache
  });

  fastify.register(fastifyProxy, {
    upstream: "http://localhost:8082/v1/graphql",
    proxyPayloads: false,
    prefix: "/v1/graphql",
    // undici: {},
    websocket: true,
  });
});

Expected Behavior

ilijaNL changed the title from "Not consistent behaviour" to "How to add caching capabilities" on Mar 23, 2023
@marcoreni

I managed to do something similar using onRequest and onSend.

onRequest tries to get the request from the cache. If it finds it, it adds a header and returns the response.

onSend checks if the header exist. If it does, it removes it and returns the response. Otherwise, it adds the response to the cache before returning it.

The main issue we encountered was compression handling: you need to decompress and recompress the response body during onSend.
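A minimal sketch of the two hooks described above, assuming an in-memory Map as the store, the request URL as the cache key, and a hypothetical x-cache-hit marker header (none of these names come from fastify-http-proxy):

```javascript
// In-memory store; the key derivation and the "x-cache-hit" marker
// header are illustrative assumptions, not fastify-http-proxy API.
const cache = new Map();

// onRequest: on a cache hit, mark the reply and respond immediately,
// which skips the proxied handler entirely.
async function onRequestHook(req, reply) {
  const cached = cache.get(req.url);
  if (cached !== undefined) {
    reply.header("x-cache-hit", "1");
    reply.send(cached);
  }
}

// onSend: if the marker is present, this reply came from the cache, so
// strip the marker and pass the payload through; otherwise store it.
async function onSendHook(req, reply, payload) {
  if (reply.getHeader("x-cache-hit")) {
    reply.removeHeader("x-cache-hit");
    return payload;
  }
  cache.set(req.url, payload);
  return payload;
}
```

These would be registered with fastify.addHook("onRequest", onRequestHook) and fastify.addHook("onSend", onSendHook). Note that for GraphQL the URL alone is not a usable key, since every query POSTs to the same path.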

@mcollina
Member

I don't think that approach improves the overall latency. You would need to skip calling the handler to have any caching benefit, e.g. sending a response in the preHandler hook.


ilijaNL commented Apr 10, 2023

> I don't think that approach improves the overall latency. You would need to skip calling the handler to have any caching benefit, e.g. sending a response in the preHandler hook.

I understand that, so I need to call reply.send in the preHandler hook. However, my problem is how to cache the origin result, since the reply is a read stream from the origin.

@mcollina
Member

You would need to accumulate the stream in the onSend hook and then send it back.

Also take a look at the cloneable-readable module to "fork" the stream, so you can still stream the response back while you accumulate it.

@mcollina
Member

Note that I recommend not using this module for that use case. GraphQL does not really follow REST semantics, so it's better to cache manually or use a GraphQL-aware cache.
