Why Valkey?
When Redis moved to a source-available license (RSALv2/SSPLv1) in 2024, former core contributors forked the project as Valkey, a truly open-source (BSD-3-Clause) in-memory data store hosted by the Linux Foundation and backed by AWS, Google, Oracle, Ericsson, and Snap.
Valkey is a drop-in replacement for Redis (fully compatible with Redis 7.2): same commands, same clients, same data structures, but under a permissive open-source license.
Getting Started
Docker
docker run -d --name valkey -p 6379:6379 valkey/valkey:latest
From Source
git clone https://github.com/valkey-io/valkey.git
cd valkey
make
./src/valkey-server
Verify
redis-cli PING
# PONG — your existing Redis CLI works!
Python (Same redis-py Client!)
import redis # Same client, no changes!
r = redis.Redis(host='localhost', port=6379, decode_responses=True)
# All Redis commands work identically
r.set("hello", "from Valkey!")
print(r.get("hello")) # "from Valkey!"
# Hashes
r.hset("user:1", mapping={"name": "Alice", "email": "alice@example.com", "plan": "pro"})
user = r.hgetall("user:1")
print(user) # {'name': 'Alice', 'email': 'alice@example.com', 'plan': 'pro'}
# Sorted sets for leaderboards
r.zadd("scores", {"alice": 1500, "bob": 2200, "charlie": 1800})
top3 = r.zrevrange("scores", 0, 2, withscores=True)
for player, score in top3:
    print(f"{player}: {int(score)}")
# Pub/Sub
import threading
def subscriber():
    sub = redis.Redis(host='localhost', port=6379, decode_responses=True)
    ps = sub.pubsub()
    ps.subscribe("notifications")
    for msg in ps.listen():
        if msg['type'] == 'message':
            print(f"Received: {msg['data']}")
thread = threading.Thread(target=subscriber, daemon=True)
thread.start()
import time; time.sleep(0.5)  # give the subscriber a moment to connect
r.publish("notifications", "New order placed!")
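A common pattern built on these commands is cache-aside: check Valkey first, and compute and store the value on a miss. A minimal sketch (the `get_or_compute` helper and the TTL are illustrative, not part of redis-py; any client with Redis-style `get`/`setex` works):

```python
import json

def get_or_compute(client, key, ttl_seconds, compute):
    """Cache-aside: return the cached value, or compute, cache, and return it."""
    cached = client.get(key)
    if cached is not None:
        return json.loads(cached)       # cache hit
    value = compute()                   # cache miss: do the expensive work
    client.setex(key, ttl_seconds, json.dumps(value))  # store with a TTL
    return value
```

With a live server, `get_or_compute(r, "user:1:profile", 300, load_profile)` would cache the computed profile for five minutes.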
Node.js
import Redis from "ioredis"; // same client library; top-level await needs an ES module
const valkey = new Redis(); // connects to Valkey on 6379
// Streams
await valkey.xadd("orders", "*",
"product", "Widget",
"quantity", "3",
"total", "29.97"
);
// Consumer groups (ignore the error if the group already exists)
await valkey.xgroup("CREATE", "orders", "processors", "$", "MKSTREAM").catch(() => {});
await valkey.xreadgroup("GROUP", "processors", "worker-1", "COUNT", 10, "BLOCK", 5000, "STREAMS", "orders", ">");
// Pipeline for batch operations
const pipeline = valkey.pipeline();
for (let i = 0; i < 1000; i++) {
  pipeline.set(`item:${i}`, JSON.stringify({ id: i, value: Math.random() }));
}
await pipeline.exec();
console.log("Batch inserted 1000 items");
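The same batch-insert idea works from Python. For very large batches it's worth flushing the pipeline in chunks so the client doesn't buffer everything at once; a sketch (the `batch_set` helper and the chunk size are illustrative; `pipeline()` is the standard redis-py API):

```python
def batch_set(client, items, chunk_size=500):
    """Set many key/value pairs, flushing the pipeline every chunk_size commands."""
    pipe = client.pipeline()
    pending = 0
    for key, value in items:
        pipe.set(key, value)
        pending += 1
        if pending >= chunk_size:
            pipe.execute()  # send this chunk in one round trip
            pending = 0
    if pending:
        pipe.execute()      # flush the remainder
```

Chunking trades a few extra round trips for bounded client-side memory, which matters once batches grow past a few thousand items.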
Valkey 8.0 New Features
Valkey isn't just a fork — it's evolving:
- Multi-threaded I/O — better multi-core utilization
- RDMA support (experimental) — kernel-bypass networking for ultra-low latency
- Improved cluster — faster slot migration, better rebalancing
- Active community — 200+ contributors, rapid development
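To benefit from the multi-threaded I/O you opt in via the server configuration. A hedged sketch of the relevant valkey.conf line (the thread count below is illustrative; tune it to your machine rather than copying it blindly):

```conf
# valkey.conf: enable extra I/O threads (a common rule of thumb is cores minus one)
io-threads 4
```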
Migration from Redis
# Step 1: Install Valkey
docker pull valkey/valkey:latest
# Step 2: Copy your Redis data
redis-cli -h old-redis BGSAVE
# Copy dump.rdb to Valkey's data directory
# Step 3: Start Valkey with the dump
docker run -d --name valkey -p 6379:6379 -v "$(pwd)/dump.rdb:/data/dump.rdb" valkey/valkey:latest
# Step 4: Update connection strings
# That's it — same protocol, same port!
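After step 4 it's worth sanity-checking that the data actually made it over. A minimal sketch comparing key counts and a sample of string values between the two instances (`verify_migration` and the hostnames are illustrative; any redis-py-compatible clients work, and non-string keys would need a type-aware comparison such as DUMP):

```python
def verify_migration(old, new, sample_keys):
    """Compare total key counts, then spot-check a handful of string keys."""
    if old.dbsize() != new.dbsize():
        return False
    return all(old.get(k) == new.get(k) for k in sample_keys)
```

For example, `verify_migration(redis.Redis(host="old-redis"), redis.Redis(host="localhost"), ["hello"])` should return True once the dump has loaded.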
When to Choose Valkey
| Scenario | Choose |
|---|---|
| Need permissive license | Valkey (BSD-3) |
| AWS/cloud managed | Valkey (ElastiCache) |
| Existing Redis app | Either — zero migration |
| Enterprise support | Redis (paid) or Valkey (community) |
| Cutting-edge features | Valkey (faster development) |
Need to feed data into your Valkey cache? I build production-ready scrapers. Check out my Apify actors or email spinov001@gmail.com for custom data pipelines.
Redis or Valkey — which are you using? Share below!