Last week I covered some Ruby on Rails caching basics such as page, action, and fragment caching and how you can set up a cache on your application in development to test it out. Rails offers caching features out of the box so it's simple to dip your toe into the world of caching.
For this post, I wanted to look at Cache Stores, as well as some other caching techniques I didn't cover last week.
Other Rails Caching Techniques
I previously covered page caching (the web server serves the cached page and short-circuits your Rails stack entirely, which makes it a poor fit for before filters like authentication), action caching (the request does hit the Rails stack, so those before filters run before the cached data is served), and fragment caching (the default Rails caching strategy, well suited to fragments and partials).
Here are a few more caching techniques:
Russian Doll Caching
Russian doll caching builds on fragment caching by nesting cached fragments inside other cached fragments. The `touch` option ties the models together so that any change to the nested/inner record's `updated_at` also updates it on the associated/outer record, keeping the outer cache fresh and preventing stale data.
```ruby
class Destination < ApplicationRecord
  has_many :activities
end

class Activity < ApplicationRecord
  belongs_to :destination, touch: true
end
```
```erb
<% cache destination do %>
  <%= render destination.activities %>
<% end %>
```
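The nesting comes from the rendered partial caching each record as well. A minimal sketch, assuming an `app/views/activities/_activity.html.erb` partial (the file name and markup here are illustrative):

```erb
<%# Each activity gets its own fragment, nested inside the
    destination's outer fragment. %>
<% cache activity do %>
  <p><%= activity.name %></p>
<% end %>
```

When a single activity changes, only its inner fragment and the outer destination fragment are rebuilt; the fragments for the other activities are reused as-is.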
SQL Caching
Rails features query caching, which caches the result set returned by each query. The results are stored in an in-memory query cache; if the same query is made again, Rails won't query the database but will pull the result from memory instead. It's important to keep in mind that the query cache is created at the start of an action and destroyed at the end of that action. If you require query results to be stored in a more persistent manner, you can do so with low-level caching.
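Outside of a controller action (in a rake task or the console, for example), the query cache can be enabled explicitly with `ActiveRecord::Base.cache`. A minimal sketch using the `Destination` model from above:

```ruby
# Within a controller action, Rails turns the query cache on automatically.
# Elsewhere, you can opt in for the duration of a block:
ActiveRecord::Base.cache do
  Destination.find(1) # first call hits the database
  Destination.find(1) # identical query is served from the query cache
end
```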
Low-level Caching
There may be a situation where you'd rather cache certain values or a query instead of just view fragments. Rails caching works great for storing any kind of information. This is where low-level caching comes in.
A simple and effective way to implement low-level caching is `Rails.cache.fetch`. This method does both reading and writing to the cache.
- If passed only one argument, the key is fetched and the value from the cache is returned
- If a block is passed in, the block will only be executed in the event of a cache miss (the return value of the block will be written to the cache and attached to the key passed in, and the return value will be returned)
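Those two behaviors can be sketched in plain Ruby. This is a simplified, Hash-backed stand-in to illustrate the read-through semantics, not the real `Rails.cache` implementation:

```ruby
# A tiny, illustrative stand-in for fetch semantics:
# read on a hit; run the block, store, and return on a miss.
class SimpleCache
  def initialize
    @store = {}
  end

  def fetch(key)
    return @store[key] if @store.key?(key) # cache hit: block never runs
    return nil unless block_given?         # bare miss: nothing to compute
    @store[key] = yield                    # miss: compute, store, return
  end
end

cache = SimpleCache.new
calls = 0
first  = cache.fetch("top_activity") { calls += 1; "rafting" } # miss: block runs
second = cache.fetch("top_activity") { calls += 1; "rafting" } # hit: block skipped
```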
```ruby
class Destination < ApplicationRecord
  def top_activity
    Rails.cache.fetch("#{cache_key_with_version}/top_activity", expires_in: 12.hours) do
      Activities::API.find_activity(id)
    end
  end
end
```
When using low-level caching for instance-level information, you will need to generate a cache key. The `cache_key_with_version` method generates a string key based on the model's class name, `id`, and `updated_at` attributes, producing something like `destinations/100-20200626193031061005000/top_activity`.
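To make that key format concrete, here is a simplified, plain-Ruby sketch of how such a versioned key is composed. The helper name is hypothetical; the real `cache_key_with_version` comes from Active Record and handles more cases:

```ruby
# Simplified sketch: collection name, record id, and a
# microsecond-precision updated_at timestamp form the key.
def versioned_cache_key(collection, id, updated_at)
  "#{collection}/#{id}-#{updated_at.utc.strftime('%Y%m%d%H%M%S%6N')}"
end

key = versioned_cache_key("destinations", 100, Time.utc(2020, 6, 26, 19, 30, 31))
```

Any edit that touches `updated_at` yields a new key, so stale entries are simply never read again and can be evicted by the store.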
Cache Stores
As your cached data has to live somewhere, there are many options for cache stores. Rails has a default cache store that's easy to use, though it comes with limitations. If you have caching toggled on (as referenced in the development `config/environments/*.rb` file), the cache is implemented by default with a bounded size of 32 MB. You can pass a `:size` option to the initializer to increase the capacity:

```ruby
config.cache_store = :memory_store, { size: 64.megabytes }
```
For small, low-traffic sites with only a handful of server processes, and for development and test environments, this default cache works well. However, it's not going to cut it for a large application deployment.
MemCacheStore
`MemCacheStore` utilizes Danga's `memcached` server and serves as a centralized cache for your app. By default, Rails uses the bundled `dalli` gem. This is also the most popular cache store choice for production websites thanks to its high performance and the redundancy of a single, shared cache cluster.
The cache is initialized in the relevant `config/environments/*.rb` file. In `config/environments/production.rb`, you can uncomment the following:

```ruby
# Use a different cache store in production.
# config.cache_store = :mem_cache_store
```
You'll need to specify the addresses for all memcached servers in your cluster. If no addresses are provided, the default assumption is that memcached is running on localhost on the default port, which is not an ideal situation for larger websites.
```ruby
config.cache_store = :mem_cache_store, "cache-1.travelexample.com", "cache-2.travelexample.com"
```
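The memcached store also accepts an options hash after the server addresses. A sketch reusing the hostnames above; the `:namespace` value is illustrative, while `:expires_in` and `:compress` are standard options for this store:

```ruby
config.cache_store = :mem_cache_store,
  "cache-1.travelexample.com", "cache-2.travelexample.com",
  { namespace: "travel",  # illustrative key prefix to avoid collisions
    expires_in: 1.day,    # default TTL for cache entries
    compress: true }      # compress larger values before writing
```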
RedisCacheStore
Another cache store option is `RedisCacheStore`. The Redis cache store takes advantage of Redis's support for automatic eviction when it reaches max memory, letting it behave much like a Memcached cache server.
You'll need to add the `redis` gem to your `Gemfile`. For additional speed, you can also add the Ruby wrapper for the faster `hiredis` connection library to your `Gemfile`; the Redis cache store will automatically require and use `hiredis` if it's available. Then add the configuration to the relevant `config/environments/*.rb` file:
```ruby
config.cache_store = :redis_cache_store, { url: ENV['REDIS_URL'] }
```
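The Redis store supports further options worth knowing about in production. A sketch under the assumption that you want cache failures to degrade to misses rather than raise; the timeout values here are illustrative, while the option names (`connect_timeout`, `read_timeout`, `write_timeout`, `reconnect_attempts`, `error_handler`) are supported by the Redis cache store:

```ruby
config.cache_store = :redis_cache_store, {
  url: ENV['REDIS_URL'],
  connect_timeout: 30,    # illustrative timeouts, in seconds
  read_timeout: 0.2,
  write_timeout: 0.2,
  reconnect_attempts: 1,
  # Log cache errors instead of raising, so a Redis outage degrades
  # to cache misses rather than taking requests down with it.
  error_handler: ->(method:, returning:, exception:) {
    Rails.logger.warn("Cache #{method} failed: #{exception.message}")
  }
}
```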
Other Cache Stores
There are other cache stores available, such as `MemoryStore` and `FileStore`, but you can read more about these and get more information on Rails caching in the Rails Guides' Caching With Rails.
Happy caching!
Resources
- Caching With Rails: An Overview - Rails Guides
- Caching Strategies for Rails - Heroku