
Warming the cache #59

Open
sj26 opened this issue Jul 13, 2017 · 2 comments

sj26 commented Jul 13, 2017

We load some records to place into the graphql context before running a query. I'd like to warm the graphql batch loader caches with these records. At the moment I'm doing something like this (with a contrived example):

# app/graphql/record_loader.rb
class RecordLoader < GraphQL::Batch::Loader
  def initialize(record_class)
    @record_class = record_class
    @primary_key = record_class.primary_key
  end

  def perform(keys)
    # Resolve unfulfilled keys in a single query. The primary key is a
    # String, so index via public_send rather than Symbol#to_proc.
    records_by_key = @record_class.where(@primary_key => keys)
      .index_by { |record| record.public_send(@primary_key) }

    # Fulfill found keys with their records, and unfound keys with nil
    keys.each { |key| fulfill(key, records_by_key[key]) }
  end

  def warm(record)
    key = record.public_send(@primary_key)
    promise = cache[cache_key(key)] ||= Promise.new.tap { |promise| promise.source = self }
    promise.fulfill(record) unless promise.fulfilled?
  end
end
# app/graphql/schema.rb
Schema = GraphQL::Schema.define do
  # ...
  use GraphQL::Batch
  use GraphQL::Batch::Warmer, with: -> (query) do
    RecordLoader.for(User).warm(query.context[:current_user])
  end
  # ...
end
# lib/graphql/batch/warmer.rb
module GraphQL::Batch::Warmer
  def self.use(schema_defn, with:)
    schema_defn.instrument(:query, QueryInstrumenter.new(with))
  end

  class QueryInstrumenter
    def initialize(block)
      @block = block
    end

    def before_query(query)
      unless GraphQL::Batch::Executor.current
        raise "Move `use GraphQL::Batch::Warmer` below `use GraphQL::Batch` in your schema, or remove it"
      end

      @block.call(query)
    end

    def after_query(query)
    end
  end
end
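
Stripped of the gem, the warming trick is just: insert an already-fulfilled promise under a key before anything asks for it, so that key never reaches perform. A dependency-free sketch of that idea (MiniLoader and MiniPromise are illustrative names, not graphql-batch classes):

```ruby
# Dependency-free model of cache warming (MiniLoader/MiniPromise are
# illustrative, not graphql-batch API). A loader caches one promise per
# key; warming stores a pre-fulfilled promise so perform never sees it.
class MiniPromise
  attr_reader :value

  def initialize
    @fulfilled = false
  end

  def fulfill(value)
    @value = value
    @fulfilled = true
  end

  def fulfilled?
    @fulfilled
  end
end

class MiniLoader
  attr_reader :performed_keys

  def initialize
    @cache = {}
    @performed_keys = []
  end

  # Normal path: return (and cache) a promise; perform fills it later.
  def load(key)
    @cache[key] ||= MiniPromise.new
  end

  # Warm path: seed an already-fulfilled promise up front.
  def warm(key, value)
    promise = (@cache[key] ||= MiniPromise.new)
    promise.fulfill(value) unless promise.fulfilled?
  end

  # Batch step: only unfulfilled keys need a "database" lookup.
  def perform(fetcher)
    pending = @cache.reject { |_, promise| promise.fulfilled? }
    pending.each { |key, promise| promise.fulfill(fetcher.call(key)) }
    @performed_keys = pending.keys
  end
end

loader = MiniLoader.new
loader.warm(1, "warmed")                 # seed before any query runs
loader.load(1)                           # cache hit on the warmed promise
loader.load(2)                           # cold key: pending promise
loader.perform(->(key) { "db-#{key}" })
loader.performed_keys                    # => [2] -- only the cold key was fetched
```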

This feels like it dips a little too much into this gem's responsibilities. Is there any interest in including some sort of cache-warming facility in this gem?

Update: I've simplified the example.

@dylanahsmith (Contributor)
dataloader has a prime function for this purpose. So if we provided something similar, then instead of

  def warm(record)
    key = record.public_send(@primary_key)
    promise = cache[cache_key(key)] ||= Promise.new.tap { |promise| promise.source = self }
    promise.fulfill(record) unless promise.fulfilled?
  end

you could have

  def warm(record)
    prime(record.id, record)
  end

or just use prime directly from outside of the batch loader with RecordLoader.for(User).prime(record.id, record).
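
The one-liner nature of prime is easier to see in a gem-free sketch (TinyLoader is an illustrative name; a real batch loader would return promises and defer fetching to perform, which this deliberately skips):

```ruby
# Illustrative sketch of prime semantics: seed the cache with a resolved
# value, and lose silently to any entry that is already there.
class TinyLoader
  def initialize
    @cache = {}
  end

  # ||= makes priming a no-op for keys that are already cached.
  def prime(key, value)
    @cache[key] ||= value
  end

  def load(key)
    @cache.fetch(key) { raise "cache miss for #{key.inspect}" }
  end
end

loader = TinyLoader.new
loader.prime(42, "alice")
loader.prime(42, "bob")   # no-op: 42 is already primed
loader.load(42)           # => "alice"
```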


I'm not as sure how the graphql-ruby plugin should work, though. In your example it provides a query object, but graphql-ruby now supports multiplexing queries, and graphql-batch does its setup around all of those queries at once, so passing a single query object might not make sense.

Are you only warming the cache using the context? If so, why not just use the context to get the record you want (e.g. current user) instead of going through a batch loader?

Also, if we allow a proc to be provided to prime the cache, then what should happen after a mutation field is resolved? In that case we clear the cache. Should we be priming the cache after it is cleared? That would assume that anything in the context is updated according to the mutation.

dataloader doesn't have these problems, since its loaders typically live in the context and it isn't integrated with graphql-js to automatically clear the cache after mutation fields are resolved. Instead, it looks like they expect mutations to invalidate cache keys themselves, although that seems like an error-prone approach.

veloutin commented Feb 7, 2022

I suppose I may as well dump this here, in case it can be useful for someone. I attempted this when trying to optimize a query that loads 40k+ records at the first level. Preloaders and promises were eating away at all my 🐏

A couple of notes on this:

  • Use at your own risk
  • This shouldn't really have to depend on GraphQL::Batch::Loader, but I didn't want to rewrite the loader logic.
  • There is no fallback here. You match what you preloaded and miss the rest.
module Loaders
  class PreloadLoader < GraphQL::Batch::Loader
    def initialize(model, key: :id, value: nil, joins: nil, scope: nil)
      query = scope.nil? ? model.all : model.public_send(scope)
      query = query.joins(joins) unless joins.nil?

      # Load everything up front: either a key => value map via pluck,
      # or a key => record map via index_by.
      @map =
        if value.present?
          query.pluck(key, value).to_h
        else
          query.index_by { |record| record.public_send(key) }
        end
      super()
    end

    # Bypass batching entirely: every load is an in-memory hash lookup,
    # and a key outside the preloaded set comes back as nil.
    def load(key)
      @map[key]
    end
  end
end
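
The pattern boils down to: run one eager query, then answer every load from an in-memory hash, with no batching and no fallback. A gem-free reduction (EagerMap is an illustrative name; plain hashes stand in for ActiveRecord rows):

```ruby
# Gem-free reduction of the preload pattern: one upfront load, then
# pure hash lookups. A miss returns nil -- there is no fallback query.
class EagerMap
  def initialize(rows, key:)
    @map = rows.each_with_object({}) { |row, map| map[row[key]] = row }
  end

  def load(key)
    @map[key]
  end
end

users = [{ id: 1, name: "ada" }, { id: 2, name: "bob" }]
loader = EagerMap.new(users, key: :id)
loader.load(1)  # => { id: 1, name: "ada" }
loader.load(3)  # => nil (outside the preloaded set)
```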
