reinteractive

Imagine: Rails 8 Solid Trifecta: Cache, Cable, Queue

Credited to: Allan Andal

Rails 8 - Solid Trifecta Comparison

The "Solid Trifecta" is a suite of database-backed solutions—Solid Cache, Solid Cable, and Solid Queue—added in Ruby on Rails 8 to simplify application architecture by reducing the need for external services like Redis and Memcached. These components are built on top of existing database infrastructure to handle caching, WebSocket messaging, and background job processing.

Solid Cache

Solid Cache replaces traditional RAM-based caching systems with disk storage, typically on SSDs or NVMe drives. Because disk space is far cheaper than RAM, caches can be much larger and entries retained much longer, which improves hit rates and overall application performance. As an example, Basecamp has adopted Solid Cache to store 10 terabytes of data with a 60-day retention window, which has resulted in a significant reduction in render times.
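
Application code does not change when switching to Solid Cache: the familiar Rails.cache API is used and only the backing store differs. A minimal sketch, assuming a hypothetical Product model with a related_products association:

```ruby
class ProductsController < ApplicationController
  def show
    @product = Product.find(params[:id])

    # With Solid Cache this entry lives on disk in the database, so a long
    # expiry is affordable and the cache survives application restarts.
    @related = Rails.cache.fetch(["related_products", @product.id], expires_in: 7.days) do
      @product.related_products.limit(10).to_a
    end
  end
end
```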

Comparison: Solid Cache vs. Dalli/Memcached vs. Redis

Overview

| Feature | Solid Cache | Dalli + Memcached | Redis |
| --- | --- | --- | --- |
| Storage Type | Disk-based (local SSD) | In-memory (RAM) | In-memory (RAM) + optional persistence |
| Persistence | Yes (disk-based, survives restarts) | No (data lost on restart) | Yes (via RDB & AOF) |
| Scalability | Scales with disk size | Scales with RAM | Scales with RAM, supports clustering |
| Performance | Slower (disk access) | Very fast | Very fast |
| Concurrency | Good for multi-threaded apps | High concurrency | High concurrency |
| Data Structures | Key-value store only | Key-value store only | Lists, hashes, sets, sorted sets, streams, etc. |
| Best For | Apps needing persistence and local caching | High-speed caching across multiple servers | Caching, real-time analytics, session storage, message queues |

Performance & Scalability

  • Memcached and Redis are much faster than Solid Cache since they store data in-memory rather than on disk.
  • Memcached is simpler and optimized for high-speed key-value lookups, but it lacks persistence.
  • Redis can be persistent (with RDB or AOF) and supports advanced data types, making it more versatile than Memcached.
  • Solid Cache is slower but allows for larger cache sizes (limited by disk, not RAM).

Use Cases

| Use Case | Solid Cache | Dalli + Memcached | Redis |
| --- | --- | --- | --- |
| Persistent caching (survives restart) | ✅ Yes | ❌ No | ✅ Yes |
| Distributed caching (multi-server) | ❌ No | ✅ Yes | ✅ Yes |
| Fastest performance | ❌ No (disk I/O) | ✅ Yes (RAM) | ✅ Yes (RAM) |
| Large dataset caching | ✅ Yes (limited by disk) | ❌ No (limited by RAM) | ✅ Yes (with clustering) |
| Session storage | ✅ Yes | ✅ Yes | ✅ Yes |
| Message queue | ❌ No | ❌ No | ✅ Yes (Pub/Sub, Streams) |
| Complex data structures | ❌ No | ❌ No | ✅ Yes (lists, sets, sorted sets, etc.) |

Ease of Use & Setup

| Feature | Solid Cache | Dalli + Memcached | Redis |
| --- | --- | --- | --- |
| Setup Simplicity | ✅ Easiest (built-in Rails cache store) | ❌ Requires Memcached server | ❌ Requires Redis server |
| Integration with Rails | ✅ Yes (out-of-the-box) | ✅ Yes (via dalli) | ✅ Yes (via redis-rails, supports up to Rails 7 only) |
| Maintenance Overhead | ✅ Low | ✅ Low | ❌ Higher (needs persistence configuration & monitoring) |
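
For reference, switching between the three stores is essentially a one-line change in config/environments/production.rb. The sketch below assumes the relevant servers and gems (dalli, redis) are already installed for the commented-out options:

```ruby
Rails.application.configure do
  # Solid Cache (Rails 8 default): no extra server to run.
  config.cache_store = :solid_cache_store

  # Dalli + Memcached: needs a running memcached server and the dalli gem.
  # config.cache_store = :mem_cache_store, "cache-1.example.com", "cache-2.example.com"

  # Redis: needs a running Redis server and the redis gem.
  # config.cache_store = :redis_cache_store, { url: ENV.fetch("REDIS_URL") }
end
```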

Conclusion: Which is Best?

  • Use Solid Cache if you need persistence, local caching, and simple Rails integration, but can tolerate slower performance.
  • Use Memcached if you need the fastest possible caching performance and don’t need persistence.
  • Use Redis if you need fast caching plus advanced features (persistence, pub/sub, sorted sets, etc.).

💡 Best overall caching solution?

  • If you only need simple, fast caching → Memcached.
  • If you need caching + persistence + advanced features → Redis.
  • If you want a simple Rails-native cache that persists to disk → Solid Cache.

Solid Cable

Solid Cable is a database-backed pub/sub adapter for Action Cable, removing the need for a separate pub/sub server such as Redis to manage WebSocket connections. Messages are relayed between application processes and clients via fast database polling, which delivers near real-time performance. Messages are also retained in the database for at least a day, letting developers review recent live-update history.
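
Because Solid Cable is a drop-in replacement for the Redis pub/sub adapter, channel and broadcast code stays exactly the same; only config/cable.yml changes to adapter: solid_cable. A minimal sketch, with the channel and stream names made up for illustration:

```ruby
# app/channels/chat_channel.rb -- identical whether the backend is Redis or Solid Cable.
class ChatChannel < ApplicationCable::Channel
  def subscribed
    stream_from "chat_#{params[:room_id]}"
  end
end

# Broadcasting from anywhere in the app: with Solid Cable the payload is written
# to a database table and other processes pick it up via fast polling.
ActionCable.server.broadcast("chat_42", { body: "Hello from Solid Cable" })
```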

Comparison: Solid Cable vs. AnyCable vs. Action Cable

Overview

| Feature | Solid Cable | AnyCable | Action Cable |
| --- | --- | --- | --- |
| Performance | Slower than Action Cable | Fastest (gRPC/WebSockets) | Fast (in-memory, event-driven) |
| Concurrency | Lower (relies on database polling) | Very high (gRPC & WebSockets) | High (Puma threads + Redis Pub/Sub for broadcasting) |
| Scalability | Scales with the database | Best for large-scale apps | Scales well with Redis |
| Persistence | Yes (database) | No (in-memory) | No (in-memory) |
| Best For | Lower-traffic apps | Large-scale, distributed WebSockets | High-traffic apps |

Performance & Scalability

  • AnyCable is the fastest since it offloads WebSocket handling to a separate gRPC server (often using Golang).
  • Action Cable is faster than Solid Cable because it is multi-threaded and event-driven.
  • Solid Cable is the slowest because every broadcast involves database reads and writes; however, it is the simplest to set up.

Scalability

| Feature | Solid Cable | AnyCable | Action Cable |
| --- | --- | --- | --- |
| Multi-server scaling | ✅ Yes (database) | ✅ Yes (Redis) | ✅ Yes (Redis) |

Ease of Use & Setup

| Feature | Solid Cable | AnyCable | Action Cable |
| --- | --- | --- | --- |
| Setup Complexity | ✅ Easy (drop-in replacement for Action Cable) | ❌ Complex (requires AnyCable server) | ✅ Easy (built into Rails) |
| Integration with Rails | ✅ Yes | ✅ Yes | ✅ Yes |
| Requires extra infrastructure | ❌ No | ✅ Yes (gRPC server) | ❌ No |
| Works with existing Action Cable code | ✅ Yes | ✅ Yes | ✅ Yes |

Conclusion: Which is Best?

  • Use AnyCable if you need massive scalability and low-latency WebSockets across multiple servers.
  • Use Action Cable if you need better performance than Solid Cable but don’t want to run a separate AnyCable server.
  • Use Solid Cable if you just need basic real-time updates and want the simplest setup.

💡 Best overall WebSockets solution?

  • If you want basic real-time features for a small Rails app without extra infrastructure → Solid Cable.
  • If you need WebSockets at scale with thousands of connections → AnyCable.
  • Otherwise → Action Cable.

Solid Queue

For background job processing, Solid Queue offers a database-driven solution that removes the need for external job runners such as Sidekiq and Resque. It relies on database features such as FOR UPDATE SKIP LOCKED to manage job queues efficiently. It can run either as a Puma plugin or via a dedicated dispatcher process, giving flexible job management options.
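
From the application's point of view Solid Queue is just another Active Job adapter, so job classes and perform_later calls are unchanged. A minimal sketch; the job, mailer, and model names are placeholders:

```ruby
# config/environments/production.rb: config.active_job.queue_adapter = :solid_queue

class InvoiceMailerJob < ApplicationJob
  queue_as :default
  retry_on ActiveRecord::Deadlocked, wait: :polynomially_longer, attempts: 5

  def perform(invoice_id)
    invoice = Invoice.find(invoice_id)
    InvoiceMailer.with(invoice: invoice).created.deliver_now
  end
end

# Enqueuing inserts a row into the solid_queue_* tables; workers claim jobs with
# FOR UPDATE SKIP LOCKED so two workers never pick up the same job.
InvoiceMailerJob.perform_later(invoice.id)
```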

Comparison: Solid Queue vs. Sidekiq vs. Resque

Overview

| Feature | Solid Queue | Sidekiq | Resque |
| --- | --- | --- | --- |
| Performance | High (multi-threaded) | High (multi-threaded) | Moderate (single-threaded) |
| Concurrency | High (multi-threaded) | Very high (multi-threaded) | Moderate (single-threaded) |
| Scalability | Scales well (database-bound) | Excellent (distributed) | Good (distributed) |
| Persistence | Yes (database-backed) | Yes (Redis) | Yes (Redis) |
| Retry Logic | Yes (configurable) | Yes (configurable) | Yes (configurable) |
| Job Prioritization | Yes | Yes | Yes |
| Best For | High-performance background jobs | High-concurrency, real-time jobs | Simple, reliable job processing |

Performance & Scalability

  • Solid Queue runs multi-threaded workers that claim jobs straight from the database, making it faster than Resque (single-threaded) and broadly comparable to Sidekiq for many workloads.
  • Sidekiq is known for its high concurrency with multi-threading, handling jobs in parallel across multiple threads, which makes it extremely fast.
  • Resque uses a single-threaded model (in Ruby), meaning it's slower than both Sidekiq and Solid Queue for high-volume workloads.

| Feature | Solid Queue | Sidekiq | Resque |
| --- | --- | --- | --- |
| Multi-server scaling | ✅ Yes | ✅ Yes | ✅ Yes |
| Pub/Sub support | ✅ Yes | ✅ Yes | ✅ Yes |
| Backing store | Database | Redis | Redis |
| Job retry | ✅ Yes | ✅ Yes | ✅ Yes |
| Concurrency Model | High (multi-threaded) | Very high (multi-threaded) | Low (single-threaded) |
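
The database feature that makes this safe across many concurrent workers is the FOR UPDATE SKIP LOCKED clause mentioned earlier: a competing worker simply skips rows that another transaction has already locked. The snippet below is not Solid Queue's actual implementation, just a simplified Active Record sketch of the claiming pattern, with a hypothetical Job model:

```ruby
# Hypothetical table: jobs(id, queue_name, claimed_at, payload, ...)
class Job < ApplicationRecord; end

# Each worker claims a small batch; SKIP LOCKED means locked rows are skipped
# rather than waited on, so workers never contend for the same job.
def claim_jobs(batch_size: 10)
  Job.transaction do
    jobs = Job.where(claimed_at: nil)
              .order(:id)
              .limit(batch_size)
              .lock("FOR UPDATE SKIP LOCKED")
              .to_a

    Job.where(id: jobs.map(&:id)).update_all(claimed_at: Time.current)
    jobs
  end
end
```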

Use Cases

| Use Case | Solid Queue | Sidekiq | Resque |
| --- | --- | --- | --- |
| Simple background job processing | ✅ Yes | ✅ Yes | ✅ Yes |
| High concurrency, real-time background jobs | ✅ Yes | ✅ Yes | ❌ No |
| Large-scale job processing | ✅ Yes | ✅ Yes | ✅ Yes |
| Job prioritization | ✅ Yes | ✅ Yes | ✅ Yes |
| Queue with multiple workers | ✅ Yes | ✅ Yes | ✅ Yes |
| Handling thousands of jobs per second | ✅ Yes | ✅ Yes | ❌ No |

Ease of Use & Setup

| Feature | Solid Queue | Sidekiq | Resque |
| --- | --- | --- | --- |
| Setup Complexity | ✅ Easy (out-of-the-box integration) | ✅ Moderate (requires Redis and multi-threading setup) | ✅ Moderate (requires Redis and single-threaded workers) |
| Integration with Rails | ✅ Yes | ✅ Yes | ✅ Yes |
| Requires extra infrastructure | ❌ No (uses the application database) | ✅ Yes (Redis, multi-threading setup) | ✅ Yes (Redis) |
| Job Management UI | ❌ No (CLI-based) | ✅ Yes (web UI for monitoring) | ✅ Yes (web UI for monitoring) |

Conclusion: Which is Best?

  • Use Solid Queue if you need high-performance, database-backed job processing without running extra infrastructure such as Redis.
  • Use Sidekiq if you need multi-threaded, high-concurrency job processing with real-time capabilities and distributed scaling.
  • Use Resque if you need simple background job processing and don’t need high concurrency or performance optimization, but still want reliability.

💡 Best overall background job processor?

  • If you need high-performance job processing with Ruby → Solid Queue.
  • If you need extreme concurrency and distributed job processing → Sidekiq.
  • If you need a simple, reliable queue system with less focus on performance → Resque.

Can the Solid Trifecta replace Redis/Memcached?

For many applications, yes. The Solid Trifecta replaces external services like Redis and Memcached with built-in, database-backed solutions. Applications with moderate performance requirements will find these integrated solutions suitable: they provide the necessary features without increasing deployment complexity. For applications with demanding performance requirements, Redis and Memcached remain preferable because their in-memory designs are optimized for raw speed.

Rails 8's Solid Trifecta lets developers build more streamlined systems by handling caching, messaging, and job processing through the existing database. These solutions offer real benefits in simplicity and cost-effectiveness, but teams should evaluate their particular needs to decide whether they can substitute the traditional services before reaching for Redis or Memcached.

Key Takeaways:

New apps should start with fewer dependencies by using Rails 8’s built-in solutions.

Scaling for performance: to keep costs low, reach for specialized tools only when higher speed is actually required as the app scales.

Large applications should continue using high-performing tools like Sidekiq for job processing and Memcached for caching where required.
