Memphis is an open-source, real-time message broker that aims to be simpler to operate than Kafka and more feature-rich than RabbitMQ, with built-in schema validation, a dead-letter station (Memphis's take on the dead letter queue), and a GUI dashboard.
Why Memphis?
- No Zookeeper: Single binary, runs in minutes
- Schema enforcement: Validate messages at the broker level
- Dead letter queue: Automatic poison message handling
- GUI dashboard: Monitor stations, consumers, producers
- Multi-tenancy: Namespace isolation
- Cloud + self-hosted: Both options available
Docker Setup
curl -s https://memphisdev.github.io/memphis-docker/docker-compose.yml -o docker-compose.yml
docker compose up -d
The dashboard is available at http://localhost:9000; log in with the default root / memphis credentials.
Node.js Producer
import { memphis } from 'memphis-dev';

await memphis.connect({
  host: 'localhost',
  username: 'root',
  password: 'memphis',
});

const producer = await memphis.producer({
  stationName: 'orders',
  producerName: 'order-service',
});

await producer.produce({
  message: Buffer.from(JSON.stringify({
    orderId: '12345',
    amount: 99.99,
    items: ['widget-a', 'widget-b'],
  })),
});
Node.js Consumer
const consumer = await memphis.consumer({
  stationName: 'orders',
  consumerName: 'payment-processor',
  consumerGroup: 'payments',
});

consumer.on('message', (message) => {
  const order = JSON.parse(message.getData().toString());
  console.log('Processing order:', order.orderId);
  message.ack();
});

consumer.on('error', (error) => {
  console.error('Consumer error:', error);
});
Python Producer
import asyncio
from memphis import Memphis

async def main():
    memphis = Memphis()
    await memphis.connect(host='localhost', username='root', password='memphis')
    producer = await memphis.producer(station_name='events', producer_name='py-producer')
    await producer.produce(bytearray('{"type": "click", "page": "/home"}'.encode()))
    await memphis.close()

asyncio.run(main())
Schema Enforcement
Attach a JSON Schema to a station:
{
  "type": "object",
  "properties": {
    "orderId": {"type": "string"},
    "amount": {"type": "number", "minimum": 0}
  },
  "required": ["orderId", "amount"]
}
Messages that don't match the schema are automatically rejected.
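The broker performs this enforcement server-side, but the rule is easy to preview locally before a message ever leaves your service. A minimal sketch in plain Python (the `validate_order` helper is hypothetical; it mirrors the required-fields and type checks of the schema above):

```python
import json

def validate_order(raw: bytes) -> bool:
    """Mirror the station schema: orderId (string) and amount (number >= 0) required."""
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError:
        return False
    return (
        isinstance(msg.get("orderId"), str)
        # bool is a subclass of int in Python, so exclude it explicitly
        and isinstance(msg.get("amount"), (int, float))
        and not isinstance(msg.get("amount"), bool)
        and msg["amount"] >= 0
    )

print(validate_order(b'{"orderId": "12345", "amount": 99.99}'))  # True
print(validate_order(b'{"orderId": "12345"}'))                   # False: amount missing
```

Validating at the producer as well as the broker gives you a faster failure signal and keeps bad messages out of the dead letter queue entirely.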
REST API
# Create station
curl -X POST http://localhost:5555/api/v1/stations \
  -H 'Authorization: Bearer TOKEN' \
  -H 'Content-Type: application/json' \
  -d '{"name": "orders", "retention_type": "messages", "retention_value": 1000}'
# Get station info
curl http://localhost:5555/api/v1/stations/orders \
  -H 'Authorization: Bearer TOKEN'
# Produce message
curl -X POST http://localhost:5555/api/v1/produce \
  -H 'Authorization: Bearer TOKEN' \
  -H 'Content-Type: application/json' \
  -d '{"station_name": "orders", "producer_name": "api", "message": "base64-encoded-data"}'
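As the last curl call shows, the REST produce endpoint expects the message payload base64-encoded. A quick sketch of building that request body in Python (the field names follow the curl example above; the `order` payload is illustrative):

```python
import base64
import json

order = {"orderId": "12345", "amount": 99.99}

# The REST API carries the payload as base64 text inside a JSON envelope
encoded = base64.b64encode(json.dumps(order).encode()).decode()

body = {
    "station_name": "orders",
    "producer_name": "api",
    "message": encoded,
}
print(json.dumps(body))
```

The resulting JSON string is what you would pass to `-d` in the curl call (or to any HTTP client) when producing over REST instead of a native SDK.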
Real-World Use Case
A startup replaced their Kafka cluster (3 brokers + Zookeeper, $600/mo in cloud costs) with Memphis. Setup took 10 minutes vs 2 days. The dead letter queue caught 47 malformed messages in the first week that had previously slipped through their Kafka pipeline unnoticed.