Your Rails app is slow. You know it. Your users know it. And the worst part is, half the time your app is doing the exact same work over and over again, fetching the same data, rendering the same partials, running the same queries. Caching fixes that. You tell Rails "hey, remember this," and the next time someone asks for it, Rails hands it right back without breaking a sweat.
We're going to walk through every caching strategy Rails gives you out of the box. Fragment caching, Russian doll caching, low-level caching, collection caching, and how to configure your cache store. By the end of this, you'll know exactly which type of caching to use and where.
Enable Caching in Development
Before we do anything, caching is turned off in development by default. You need to flip it on or you're going to sit there wondering why nothing is working.
Run this in your terminal:
bin/rails dev:cache
You should see: Development mode is now being cached.
That's it. Run the same command again to toggle it off. Under the hood, this creates a file called tmp/caching-dev.txt that Rails checks on boot. When it exists, caching is on. When it doesn't, caching is off.
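If you peek at config/environments/development.rb, you can see the toggle at work. The generated block differs a bit between Rails versions, but the shape is roughly this:

```ruby
# config/environments/development.rb (approximate; the generated code
# varies across Rails versions)
if Rails.root.join("tmp/caching-dev.txt").exist?
  config.action_controller.perform_caching = true
  config.cache_store = :memory_store
else
  config.action_controller.perform_caching = false
  config.cache_store = :null_store # accepts writes, always misses on reads
end
```

The null store is what makes caching a no-op when the file is absent: every read is a miss, so your code paths still run, just without any stored results.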
Fragment Caching
Fragment caching is the one you'll use the most. It wraps a chunk of your view in a cache block and serves the stored HTML on every request after the first one.
Let's say you have a products index page. Each product card has a name, price, and description. Here's how you cache each one:
<% @products.each do |product| %>
  <% cache product do %>
    <div class="product-card">
      <h2><%= product.name %></h2>
      <p class="price"><%= number_to_currency(product.price) %></p>
      <p><%= product.description %></p>
    </div>
  <% end %>
<% end %>
That cache product line is doing all the heavy lifting. Rails generates a cache key based on the product's class name, ID, and updated_at timestamp, something like views/products/1-20260315120000. (Rails also mixes a digest of the view template into the key, so editing the template busts the cache too.) When the product gets updated, updated_at changes, the old cache key doesn't match anymore, and Rails renders a fresh version.
The beauty of this is you don't have to manually expire anything. Update the product, the cache invalidates itself.
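To make the key recipe concrete, here's a toy sketch of how such a key could be assembled. The Product struct below is a stand-in for an ActiveRecord model, not the real Rails implementation (which, as noted, also involves a template digest and on modern Rails splits the timestamp out into a separate cache version):

```ruby
# Toy stand-in for an ActiveRecord model (not the real Rails code).
Product = Struct.new(:id, :updated_at) do
  # Stable part of the key: model name plus primary key.
  def cache_key
    "products/#{id}"
  end

  # Version part: changes whenever the record is updated.
  def cache_version
    updated_at.utc.strftime("%Y%m%d%H%M%S")
  end

  def cache_key_with_version
    "#{cache_key}-#{cache_version}"
  end
end

product = Product.new(1, Time.utc(2026, 3, 15, 12, 0, 0))
puts product.cache_key_with_version
# => products/1-20260315120000
```

Updating the record moves updated_at forward, which produces a brand-new key; the old entry is simply never read again and gets evicted by the store eventually.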
Conditional Caching
Sometimes you only want to cache for certain users. Maybe admins see extra buttons that regular users don't. Use cache_if:
<% cache_if !current_user&.admin?, product do %>
  <div class="product-card">
    <h2><%= product.name %></h2>
    <p><%= product.description %></p>
  </div>
<% end %>
This caches the fragment for non-admin users but renders fresh HTML for admins every time. There's also cache_unless if you prefer to think about it the other way around.
Russian Doll Caching
Russian doll caching is fragment caching with nesting. You cache the individual items, and then you cache the container that holds them. When one item changes, only that item's cache gets busted. The outer cache regenerates, but it pulls all the unchanged items from cache instead of re-rendering them.
Here's a real example. You have a project with many tasks:
<% cache @project do %>
  <h1><%= @project.name %></h1>
  <div class="tasks">
    <% @project.tasks.each do |task| %>
      <% cache task do %>
        <div class="task">
          <h3><%= task.title %></h3>
          <p><%= task.description %></p>
          <span class="status"><%= task.status %></span>
        </div>
      <% end %>
    <% end %>
  </div>
<% end %>
There's one critical piece that makes this work. When a task gets updated, the project's cache needs to know about it too. Otherwise the outer cache still serves the stale version. You fix this with touch: true on the association:
class Task < ApplicationRecord
  belongs_to :project, touch: true
end

class Project < ApplicationRecord
  has_many :tasks
end
Now when you update a task, Rails automatically touches the parent project's updated_at field. The project's cache key changes, the outer cache regenerates, and it pulls all the unchanged tasks from their individual caches. Only the one task that changed gets re-rendered.
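Here's a toy illustration of what touch: true buys you, using plain Ruby objects rather than ActiveRecord (the class names mirror the models above, but none of this is the real Rails machinery):

```ruby
# Plain-Ruby sketch of the touch: true effect (not ActiveRecord).
class Project
  attr_accessor :updated_at

  def initialize(updated_at)
    @updated_at = updated_at
  end
end

class Task
  def initialize(project)
    @project = project
  end

  # Saving a task also bumps the parent's timestamp -- the effect of
  # belongs_to :project, touch: true in the real models.
  def save
    # ... persist the task's own changes here ...
    @project.updated_at = Time.now.utc
  end
end

project = Project.new(Time.utc(2020, 1, 1))
Task.new(project).save
project.updated_at > Time.utc(2020, 1, 1) # => true, so the project's
                                          #    cache key changes too
```

Because the project's updated_at moved, its cache key no longer matches, and the outer fragment regenerates on the next request.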
This is where the "Russian doll" name comes from. Caches inside caches inside caches. The inner ones stay warm even when the outer ones expire.
Collection Caching
If you're rendering a collection of partials, Rails has a shortcut that's way faster than looping and caching individually. Instead of the each loop with cache blocks, use the cached: true option on render:
<%= render partial: 'product', collection: @products, cached: true %>
For this to work, your partial (_product.html.erb) must use the local variable product rather than an instance variable like @product.
<%# app/views/products/_product.html.erb %>
<div class="product-card">
  <h2><%= product.name %></h2>
  <p class="price"><%= number_to_currency(product.price) %></p>
  <p><%= product.description %></p>
</div>
The reason this is faster is that Rails fetches all the cache keys at once in a single round trip to the cache store, instead of checking them one at a time. For a page with 50 products, that's 1 cache lookup instead of 50.
Your partial doesn't need any special cache blocks inside it. Rails handles all the caching at the collection level.
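You can see the batching idea in miniature with a toy store. TinyCache below is hypothetical, not the ActiveSupport::Cache API; the real batched read also goes to the cache store in a single network round trip, which is where the big win comes from:

```ruby
# Hypothetical in-memory store illustrating a batched fetch.
class TinyCache
  def initialize
    @data = {}
  end

  # Look up all keys in one pass; run the block only for cache misses.
  def fetch_multi(*keys)
    keys.to_h { |key| [key, @data[key] ||= yield(key)] }
  end
end

cache = TinyCache.new
renders = 0
render = ->(key) { renders += 1; "<div>#{key}</div>" }

cache.fetch_multi("products/1", "products/2", &render)
cache.fetch_multi("products/1", "products/2", &render)
renders # => 2: the second call was served entirely from cache
```

The block (standing in for rendering the partial) only ran for the misses; the second call found both keys already in the store and never rendered at all.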
Low-Level Caching
Fragment caching is for views. Low-level caching is for everything else: expensive database queries, API responses, computed values, anything you want to store and retrieve by a key.
The main method is Rails.cache.fetch. You give it a key and a block. If the key exists in the cache, it returns the cached value and skips the block entirely. If the key doesn't exist, it runs the block, stores the result, and returns it.
class Product < ApplicationRecord
  def competing_price
    Rails.cache.fetch([self, "competing_price"], expires_in: 12.hours) do
      Competitor::API.find_price(self)
    end
  end
end
By passing the object (self) and a string inside an array, Rails automatically manages the versioning for you. If the product is updated, the version changes, and the cache invalidates. The expires_in: 12.hours part ensures that even if the product stays the same, we still refresh the data periodically. Perfect for external API data.
You can also use low-level caching in your controllers:
class DashboardController < ApplicationController
  def index
    @stats = Rails.cache.fetch("dashboard_stats", expires_in: 15.minutes) do
      {
        total_users: User.count,
        active_today: User.where("last_seen_at > ?", 24.hours.ago).count,
        revenue_mtd: Order.where(created_at: Time.current.beginning_of_month..).sum(:total)
      }
    end
  end
end
Three potentially slow queries, cached for 15 minutes. Your dashboard loads instantly for every request in that window.
Read and Write Separately
If you need more control, you can use Rails.cache.read and Rails.cache.write directly:
Rails.cache.write("latest_report", report_data, expires_in: 1.hour)
report = Rails.cache.read("latest_report")
And if you need to manually clear a cache entry:
Rails.cache.delete("dashboard_stats")
Configuring Your Cache Store
Rails needs somewhere to put all this cached data. That's the cache store. You configure it in your environment files.
Memory Store (development default)
config.cache_store = :memory_store, { size: 64.megabytes }
Stores everything in the Rails process memory. Fast, but the cache disappears when you restart the server and can't be shared between processes. Fine for development, not great for production.
Solid Cache (Rails 8 default)
Rails 8 ships with Solid Cache as the default production cache store. It stores cache data in your database instead of needing a separate service like Redis.
config.cache_store = :solid_cache_store
The configuration lives in config/cache.yml. The default settings include a max_age of 60 days and a max_size of 256 megabytes. Solid Cache uses a FIFO (first in, first out) eviction strategy and handles expiry automatically through background tasks triggered by writes.
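For reference, a minimal cache.yml sketch along those lines. Treat the keys and defaults here as assumptions and check the file Rails actually generated in your app, since the layout varies by version:

```yaml
# config/cache.yml (approximate -- the generated file varies by Rails version)
default: &default
  store_options:
    # Entries older than this become eligible for removal.
    max_age: <%= 60.days.to_i %>
    # Target ceiling for the cache's total size on disk.
    max_size: <%= 256.megabytes %>
    namespace: <%= Rails.env %>

development:
  <<: *default

test:
  <<: *default

production:
  <<: *default
```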
The upside is you don't need to run and maintain a separate Redis or Memcached instance. Modern SSDs make the access-time penalty of disk vs RAM insignificant for most caching purposes. You're usually better off keeping a huge cache on disk rather than a small cache in memory.
Redis
If you need something more battle-tested for high-traffic apps, Redis is the classic choice:
config.cache_store = :redis_cache_store, { url: ENV["REDIS_URL"] }
Redis handles eviction automatically when it hits max memory, so it behaves like a proper cache without you worrying about running out of space.
Memcached
config.cache_store = :mem_cache_store, ENV["MEMCACHE_SERVERS"]
Memcached is built specifically for caching. If all you need is a cache and nothing else, it's a solid pick. But most teams these days go with Redis since it can handle caching, background jobs, and other use cases.
SQL Caching
This one is free. You don't have to configure anything. Rails automatically caches the result set of each SQL query for the duration of a single request. If the same query runs twice in one request, Rails hits the database once and serves the second one from memory.
You'll see it in your logs with a CACHE prefix. This is per-request only. It doesn't persist between requests.
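A cached query in the development log looks along these lines (timings illustrative):

```
Product Load (0.4ms)  SELECT "products".* FROM "products" WHERE "products"."id" = ? LIMIT ?
CACHE Product Load (0.0ms)  SELECT "products".* FROM "products" WHERE "products"."id" = ? LIMIT ?
```

The second line never touched the database; Rails served it from the per-request query cache.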
When to Use What
Fragment caching is for chunks of HTML in your views that don't change often. Use it on partials, sidebars, navigation elements, any rendered content that's expensive to generate.
Russian doll caching is for nested content where parent and child records are related. Use it when you have collections inside collections (projects with tasks, posts with comments).
Collection caching is for rendering lists of partials. Use the cached: true option instead of manual cache blocks when you're rendering a collection.
Low-level caching is for anything that's not a view. Expensive queries, external API calls, computed values. Anywhere you'd want to say "remember this for X minutes."
SQL caching happens automatically. You don't have to think about it.
As for cache stores, if you're on Rails 8, Solid Cache is the default and it works great for most apps. If you're handling serious traffic or need sub-millisecond cache reads, go with Redis. Start with fragment caching on your slowest pages. Profile with the Rails logs, see where the time is going, and add caching there. You don't have to cache everything at once.
