Alex Spinov

NATS Has a Free API — Cloud-Native Messaging at 18 Million Messages per Second

NATS is a high-performance messaging system that handles 18M+ messages per second. It provides pub/sub, request-reply, and streaming (JetStream) — all in a single 15MB binary.

Free, open source, and a CNCF incubating project. Used by Tesla, GE, and Mastercard.

Why Use NATS?

  • 18M+ msg/sec — among the fastest messaging systems available
  • Tiny footprint — single 15MB binary, starts in milliseconds
  • JetStream — built-in persistent streaming (like Kafka, but simpler)
  • Multi-pattern — pub/sub, request-reply, queue groups
  • Zero config clustering — auto-discovery and routing
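
Subjects in NATS are dot-separated tokens, and subscriptions can use wildcards: `*` matches exactly one token, `>` matches one or more trailing tokens. A minimal Python sketch of that matching logic (a conceptual model for intuition, not the server's actual implementation):

```python
def subject_matches(pattern: str, subject: str) -> bool:
    """Model of NATS subject matching:
    '*' matches exactly one token, '>' matches one or more trailing tokens."""
    p_tokens = pattern.split(".")
    s_tokens = subject.split(".")
    for i, p in enumerate(p_tokens):
        if p == ">":
            # '>' must be the last token and must consume at least one token
            return i == len(p_tokens) - 1 and len(s_tokens) > i
        if i >= len(s_tokens):
            return False
        if p != "*" and p != s_tokens[i]:
            return False
    return len(p_tokens) == len(s_tokens)

print(subject_matches("events.>", "events.user.signup"))        # True
print(subject_matches("events.*", "events.user.signup"))        # False: '*' is one token
print(subject_matches("events.*.signup", "events.user.signup")) # True
```

This is why the `events.>` subscription used later in this post receives both `events.user.signup` and `events.order.placed`.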

Quick Setup

1. Install

# Docker
docker run -p 4222:4222 -p 8222:8222 nats -js -m 8222

# Binary (release asset names include the version, e.g. nats-server-vX.Y.Z-linux-amd64.zip —
# check the releases page for the exact filename if this URL 404s)
curl -L https://github.com/nats-io/nats-server/releases/latest/download/nats-server-linux-amd64.zip -o nats.zip
unzip nats.zip && ./nats-server*/nats-server -js -m 8222

2. Monitor via HTTP API

NATS="http://localhost:8222"

# Server info
curl -s "$NATS/varz" | jq '{version: .version, connections: .connections, messages: .in_msgs, bytes: .in_bytes, mem: .mem}'

# Connection details
curl -s "$NATS/connz" | jq '.connections[] | {cid: .cid, name: .name, ip: .ip, msgs_from: .in_msgs, msgs_to: .out_msgs}'

# Subscriptions
curl -s "$NATS/subsz?subs=true" | jq '{num_subscriptions: .num_subscriptions, subjects: [.subscriptions_list[]?.subject] | unique}'

# JetStream info
curl -s "$NATS/jsz" | jq '{streams: .streams, consumers: .consumers, messages: .messages, bytes: .bytes}'

# Routes (cluster info)
curl -s "$NATS/routez" | jq .
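The counters in /varz (`in_msgs`, `in_bytes`, …) are cumulative, so throughput has to be derived from two samples taken some interval apart. A small sketch of that calculation (the snapshot values below are made up for illustration; in practice you would fetch the JSON from the /varz endpoint above):

```python
def msg_rate(prev: dict, curr: dict, interval_s: float) -> dict:
    """Derive per-second rates from two cumulative /varz snapshots."""
    return {
        "in_msgs_per_s": (curr["in_msgs"] - prev["in_msgs"]) / interval_s,
        "in_bytes_per_s": (curr["in_bytes"] - prev["in_bytes"]) / interval_s,
    }

# Example snapshots taken 10 seconds apart (illustrative values)
prev = {"in_msgs": 1_000_000, "in_bytes": 512_000_000}
curr = {"in_msgs": 1_180_000, "in_bytes": 604_160_000}
print(msg_rate(prev, curr, 10.0))
# {'in_msgs_per_s': 18000.0, 'in_bytes_per_s': 9216000.0}
```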

3. Pub/Sub with NATS CLI

# Install NATS CLI
brew install nats-io/nats-tools/nats

# Subscribe
nats sub "events.>" &

# Publish
nats pub events.user.signup '{"user":"john","plan":"pro"}'
nats pub events.order.placed '{"order_id":123,"total":99.99}'
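Queue groups (listed in the features above but not shown here) let several subscribers share one subject, with each message delivered to only one group member — with the CLI this is typically `nats sub "events.>" --queue workers` run in several terminals. A toy model of the distribution, to illustrate the one-member-per-message property (NATS does not guarantee strict round-robin; the `cycle`-based picker here is just a simplification):

```python
from itertools import cycle

def distribute(messages: list[str], members: list[str]) -> dict[str, list[str]]:
    """Toy model of a queue group: each message goes to exactly one member."""
    assignment: dict[str, list[str]] = {m: [] for m in members}
    picker = cycle(members)  # simplified round-robin stand-in for NATS's picking
    for msg in messages:
        assignment[next(picker)].append(msg)
    return assignment

msgs = [f"events.order.{i}" for i in range(6)]
print(distribute(msgs, ["worker-a", "worker-b", "worker-c"]))
# each worker gets 2 messages; no message is delivered twice
```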

4. JetStream (Persistent Streaming)

# Create a stream
nats stream add EVENTS --subjects="events.>" --retention=limits --max-msgs=1000000

# Create a consumer
nats consumer add EVENTS processor --deliver=all --ack=explicit

# Publish persistent messages
nats pub events.scrape.complete '{"url":"example.com","pages":150}'

# Consume
nats consumer next EVENTS processor --count=10
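With `--retention=limits`, JetStream discards the oldest messages once a configured limit (here `--max-msgs`) is reached. A minimal model of that eviction behavior (a conceptual sketch, not how JetStream stores messages):

```python
from collections import deque

class LimitsStream:
    """Toy model of limits retention: once max_msgs is reached,
    the oldest message is dropped to make room for the new one."""
    def __init__(self, max_msgs: int):
        self.msgs = deque(maxlen=max_msgs)  # deque evicts from the left when full

    def publish(self, msg: str) -> None:
        self.msgs.append(msg)

s = LimitsStream(max_msgs=3)
for i in range(5):
    s.publish(f"msg-{i}")
print(list(s.msgs))  # ['msg-2', 'msg-3', 'msg-4'] — msg-0 and msg-1 were evicted
```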

Python Example

# pip install nats-py
import asyncio
import json

import nats

async def main():
    nc = await nats.connect("nats://localhost:4222")

    # Pub/Sub
    async def handler(msg):
        data = json.loads(msg.data)
        print(f"Received on {msg.subject}: {data}")

    await nc.subscribe("events.>", cb=handler)
    await nc.publish("events.scrape", json.dumps({"url": "example.com"}).encode())

    # Request-Reply
    async def responder(msg):
        await msg.respond(json.dumps({"status": "ok"}).encode())

    await nc.subscribe("api.health", cb=responder)
    response = await nc.request("api.health", b"ping", timeout=5)
    print(f"Response: {response.data.decode()}")

    # JetStream (requires a stream covering the subject, e.g. EVENTS from step 4)
    js = nc.jetstream()
    await js.publish("events.log", json.dumps({"level": "info", "msg": "started"}).encode())

    await asyncio.sleep(1)
    await nc.close()

asyncio.run(main())

Key Monitoring Endpoints

Endpoint   Description
/varz      General server stats (connections, messages, memory)
/connz     Per-connection details
/subsz     Subscription info
/routez    Cluster routes
/jsz       JetStream stats
/healthz   Health check
/accountz  Account info
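
The /connz endpoint is handy for spotting noisy clients. A sketch that ranks connections by inbound message count (the sample payload mirrors the shape of the /connz response shown earlier; the values are made up):

```python
def top_talkers(connz: dict, n: int = 3) -> list[tuple[str, int]]:
    """Rank /connz connections by messages received from the client."""
    conns = connz.get("connections", [])
    ranked = sorted(conns, key=lambda c: c.get("in_msgs", 0), reverse=True)
    # Fall back to the connection id when the client set no name
    return [(c.get("name") or f"cid-{c['cid']}", c["in_msgs"]) for c in ranked[:n]]

sample = {"connections": [
    {"cid": 1, "name": "scraper", "in_msgs": 90_000},
    {"cid": 2, "name": "", "in_msgs": 12_000},
    {"cid": 3, "name": "dashboard", "in_msgs": 400},
]}
print(top_talkers(sample, n=2))
# [('scraper', 90000), ('cid-2', 12000)]
```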

Need custom data extraction or scraping solution? I build production-grade scrapers for any website. Email: Spinov001@gmail.com | My Apify Actors
