DEV Community

Alex Spinov


Redis Has a Free API That Does Way More Than Caching

Redis is an in-memory data store, and its Node.js clients expose far more than simple get/set caching — streams, pub/sub, sorted sets, atomic counters, and server-side Lua scripting.

ioredis: The Full-Featured Client

import Redis from "ioredis";

const redis = new Redis({ host: "localhost", port: 6379 });

// Basic operations
await redis.set("product:123", JSON.stringify({ title: "Widget", price: 29.99 }));
const product = JSON.parse(await redis.get("product:123"));

// Cache with a TTL — expires after one hour (`results` here stands in for whatever payload you're caching)
await redis.setex("cache:search:widget", 3600, JSON.stringify(results));

// Atomic counters
await redis.incr("scrapes:total");        // increment by 1
await redis.incrby("scrapes:amazon", 10); // increment by 10
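These primitives combine into the classic cache-aside pattern: check Redis first, fall back to a loader on a miss, and cache the result with a TTL. A minimal sketch — the helper name, key, and loader below are illustrative, and it works with any client exposing `get`/`setex` (such as the ioredis instance above):

```javascript
// Cache-aside helper. `redis` can be any client with get/setex (e.g. ioredis).
async function cached(redis, key, ttlSeconds, loader) {
  const hit = await redis.get(key);
  if (hit !== null) return JSON.parse(hit); // cache hit: deserialize and return
  const value = await loader();             // cache miss: fetch fresh data
  await redis.setex(key, ttlSeconds, JSON.stringify(value));
  return value;
}

// Usage (assuming `redis` is connected and `fetchProduct` is your own loader):
// const product = await cached(redis, "product:123", 3600, () => fetchProduct(123));
```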

Sorted Sets: Real-Time Rankings

// Add products with price as score
await redis.zadd("products:by_price", 29.99, "widget", 49.99, "gadget", 9.99, "tool");

// Get cheapest 10
const cheapest = await redis.zrange("products:by_price", 0, 9, "WITHSCORES");

// Get products in price range
const mid = await redis.zrangebyscore("products:by_price", 20, 50);

// Leaderboard
const rank = await redis.zrevrank("products:by_price", "gadget"); // 0 = most expensive
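`zadd` takes a flat `score, member, score, member, ...` argument list, which is awkward to hand-write for more than a few items. One way to build it from an array of objects (the `price`/`id` field names are just an assumption matching the examples above):

```javascript
// Build the flat [score, member, score, member, ...] list zadd expects
// from an array of product objects.
function zaddArgs(products) {
  return products.flatMap((p) => [p.price, p.id]);
}

// Usage: await redis.zadd("products:by_price", ...zaddArgs(products));
```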

Pub/Sub: Real-Time Events

// Subscriber — a subscribed connection enters subscriber mode and can't run
// other commands, which is why pub and sub use separate connections
const sub = new Redis();
await sub.subscribe("price-updates", "new-products");
sub.on("message", (channel, message) => {
  const data = JSON.parse(message);
  if (channel === "price-updates") updateUI(data);
});

// Publisher
const pub = new Redis();
await pub.publish("price-updates", JSON.stringify({ product: "widget", oldPrice: 29.99, newPrice: 24.99 }));
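As you add channels, the `if (channel === ...)` chain in the message handler grows. A small dispatcher keeps it tidy and also shields the subscriber from malformed payloads — a sketch, with handler names assumed rather than prescribed:

```javascript
// Returns a (channel, message) handler that routes parsed JSON to a
// per-channel handler map, ignoring unknown channels and bad payloads.
function makeDispatcher(handlers) {
  return (channel, message) => {
    const handler = handlers[channel];
    if (!handler) return; // no handler registered for this channel
    try {
      handler(JSON.parse(message));
    } catch {
      // malformed JSON: skip it rather than crash the subscriber
    }
  };
}

// Usage: sub.on("message", makeDispatcher({ "price-updates": updateUI }));
```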

Streams: Event Log

// Add to stream
await redis.xadd("scrape-events", "*", "url", "https://example.com", "status", "success", "items", "42");

// Read stream
const events = await redis.xrange("scrape-events", "-", "+", "COUNT", 10);

// Consumer groups
await redis.xgroup("CREATE", "scrape-events", "processors", "0", "MKSTREAM");
const messages = await redis.xreadgroup("GROUP", "processors", "worker-1", "COUNT", 5, "BLOCK", 2000, "STREAMS", "scrape-events", ">");
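Two practical notes: once a consumer finishes an entry it should acknowledge it (e.g. `await redis.xack("scrape-events", "processors", id)`), and stream entries come back as flat `["field", "value", ...]` arrays rather than objects. A tiny helper for the latter:

```javascript
// Convert the flat ["field", "value", ...] array that XRANGE/XREADGROUP
// entries carry into a plain object.
function fieldsToObject(flat) {
  const obj = {};
  for (let i = 0; i < flat.length; i += 2) obj[flat[i]] = flat[i + 1];
  return obj;
}

// Usage with the xrange reply above:
// const [id, fields] = events[0];
// const event = fieldsToObject(fields); // e.g. { url: "...", status: "success", items: "42" }
```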

Pipeline: Batch Operations

const pipeline = redis.pipeline();
for (const product of products) {
  pipeline.set(`product:${product.id}`, JSON.stringify(product));
  pipeline.zadd("products:by_price", product.price, product.id);
  pipeline.sadd(`category:${product.category}`, product.id);
}
const results = await pipeline.exec(); // One round trip!
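One gotcha: in ioredis, `pipeline.exec()` resolves to an array of `[err, result]` pairs — one per queued command — and per-command errors don't reject the promise. A small unwrap helper makes failures loud:

```javascript
// exec() resolves to [[err, result], ...]; throw on the first per-command
// error, otherwise return just the results.
function unwrapPipeline(results) {
  for (const [err] of results) {
    if (err) throw err;
  }
  return results.map(([, result]) => result);
}

// Usage: const values = unwrapPipeline(await pipeline.exec());
```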

Lua Scripts: Atomic Operations

const script = `
  local current = tonumber(redis.call('get', KEYS[1]) or '0')
  if current < tonumber(ARGV[1]) then
    redis.call('set', KEYS[1], ARGV[1])
    redis.call('publish', 'price-alerts', KEYS[1] .. ':' .. ARGV[1])
    return 1
  end
  return 0
`;

const updated = await redis.eval(script, 1, "price:widget", "24.99");
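If you call a script repeatedly, ioredis can register it as a first-class command with `defineCommand`, which also uses EVALSHA under the hood to avoid resending the script body. A sketch — the command name `setIfHigher` is illustrative:

```javascript
// A "set if higher" script: updates the key only when the new value exceeds
// the current one, returning 1 on update and 0 otherwise.
const setIfHigherLua = `
  local current = tonumber(redis.call('get', KEYS[1]) or '0')
  if current < tonumber(ARGV[1]) then
    redis.call('set', KEYS[1], ARGV[1])
    return 1
  end
  return 0
`;

// Registration and usage (assuming `redis` is an ioredis instance):
// redis.defineCommand("setIfHigher", { numberOfKeys: 1, lua: setIfHigherLua });
// const updated = await redis.setIfHigher("price:widget", "24.99");
```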

Cache scraped data with Redis? My Apify tools + Redis = lightning-fast data access.

Custom caching solution? Email spinov001@gmail.com
