# The Communication Problem in Microservices
When you split a monolith into services, every function call becomes a network call. Network calls fail. They're slow. They're asynchronous.
Choosing the right communication pattern determines whether your microservices work together or fight each other.
## Three Patterns

### 1. Synchronous REST
Service A calls Service B, waits for a response.
```typescript
// Order Service calls Inventory Service
async function createOrder(items: OrderItem[]) {
  // Check inventory (synchronous call)
  const res = await fetch('http://inventory-service/api/check', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ items }),
  });
  if (!res.ok) {
    // fetch only rejects on network failure, not on HTTP error statuses
    throw new Error(`Inventory check failed: ${res.status}`);
  }
  const availability = await res.json();

  if (!availability.allAvailable) {
    throw new Error('Some items are out of stock');
  }

  // Create order...
}
```
Good for: Simple request-response. When you need the result immediately.
Problem: If Inventory Service is down, Order Service is down too. Cascading failures.
```typescript
// Mitigation: circuit breaker
import CircuitBreaker from 'opossum';

const checkInventory = new CircuitBreaker(callInventoryService, {
  timeout: 3000,                // fail if a call takes > 3s
  errorThresholdPercentage: 50, // open after a 50% failure rate
  resetTimeout: 30000,          // try again after 30s
});

checkInventory.fallback(() => ({ allAvailable: false, reason: 'inventory-unavailable' }));
```
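To make the breaker mechanics concrete, here is a minimal hand-rolled sketch of the closed/open/half-open state machine that libraries like opossum implement for you (with far more polish). The class name, thresholds, and fallback shape are illustrative, not any library's API:

```typescript
// Minimal circuit breaker: fail fast while a dependency is down,
// periodically let one trial call through to probe recovery.
type BreakerState = 'closed' | 'open' | 'half-open';

class SimpleBreaker<T> {
  private state: BreakerState = 'closed';
  private failures = 0;
  private openedAt = 0;

  constructor(
    private fn: () => Promise<T>,       // the real downstream call
    private fallback: () => T,          // cheap degraded response
    private maxFailures = 3,            // consecutive failures before opening
    private resetMs = 30_000,           // how long to stay open
  ) {}

  async fire(): Promise<T> {
    if (this.state === 'open') {
      if (Date.now() - this.openedAt < this.resetMs) {
        return this.fallback();         // fast-fail: no network call at all
      }
      this.state = 'half-open';         // reset window elapsed: allow one trial
    }
    try {
      const result = await this.fn();
      this.state = 'closed';            // trial (or normal) call succeeded
      this.failures = 0;
      return result;
    } catch {
      this.failures += 1;
      if (this.state === 'half-open' || this.failures >= this.maxFailures) {
        this.state = 'open';            // stop sending traffic downstream
        this.openedAt = Date.now();
      }
      return this.fallback();
    }
  }
}
```

The important behavior: once open, calls return the fallback immediately without touching the failing service, which is what stops the cascade.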
### 2. gRPC
Binary protocol over HTTP/2. Faster than REST, strongly typed.
```protobuf
// inventory.proto
syntax = "proto3";

service InventoryService {
  rpc CheckAvailability (CheckRequest) returns (CheckResponse);
  rpc ReserveItems (ReserveRequest) returns (ReserveResponse);
}

message CheckRequest {
  repeated OrderItem items = 1;
}

message CheckResponse {
  bool all_available = 1;
  repeated string unavailable_ids = 2;
}
```
```typescript
// Client
import { credentials } from '@grpc/grpc-js';
import { InventoryServiceClient } from './generated/inventory_grpc_pb';

const client = new InventoryServiceClient('inventory-service:50051', credentials.createInsecure());

const response = await new Promise((resolve, reject) => {
  client.checkAvailability(request, (error, response) => {
    if (error) reject(error);
    else resolve(response);
  });
});
```
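Wrapping every callback-style call in a Promise by hand gets repetitive. A small generic helper can do it once; the `UnaryCall` shape below is a simplified version of the `(request, callback)` signature grpc-js generates for unary RPCs (real generated methods also accept optional metadata and options):

```typescript
// Simplified unary-call signature: request in, (error, response) callback out.
type UnaryCall<Req, Res> = (
  req: Req,
  cb: (err: Error | null, res?: Res) => void,
) => void;

// Turn a callback-style unary method into an awaitable function.
function promisifyUnary<Req, Res>(
  method: UnaryCall<Req, Res>,
): (req: Req) => Promise<Res> {
  return (req) =>
    new Promise((resolve, reject) => {
      method(req, (err, res) => (err ? reject(err) : resolve(res as Res)));
    });
}
```

With the client above, something like `promisifyUnary(client.checkAvailability.bind(client))` would give you an awaitable `check(request)`.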
Good for: High-performance internal communication, streaming, polyglot microservices.
Not good for: Browser clients (browsers can't speak native gRPC without a grpc-web proxy), simple public APIs, small teams.
### 3. Message Queues (Async)
Services communicate through a broker. No direct dependency.
```typescript
// Order Service publishes an event
import { Queue } from 'bullmq';

const orderQueue = new Queue('orders', { connection: redis });

async function createOrder(data: OrderData) {
  const order = await db.orders.create({ data });

  // Publish event — don't wait for inventory/email/analytics
  await orderQueue.add('order.created', {
    orderId: order.id,
    userId: order.userId,
    items: order.items,
  });

  return order; // Return immediately
}
```

```typescript
// Inventory Service subscribes and processes independently
import { Worker } from 'bullmq';

const worker = new Worker('orders', async (job) => {
  if (job.name === 'order.created') {
    await decrementInventory(job.data.items);
  }
}, { connection: redis });

// Email Service also subscribes
const emailWorker = new Worker('orders', async (job) => {
  if (job.name === 'order.created') {
    await sendConfirmationEmail(job.data.userId, job.data.orderId);
  }
}, { connection: redis });
```

One caveat: BullMQ is a work queue, not pub/sub. Each job is consumed by exactly one worker, so two workers on the same `orders` queue split the jobs between them rather than each receiving a copy. For true fan-out, publish to one queue per consuming service, or use a broker with pub/sub semantics.
Good for: Decoupled workflows, high throughput, resilience to downstream failures.
Not good for: When you need an immediate response (e.g., checking if items are available before accepting an order).
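A second caveat worth designing for: queue delivery is typically at-least-once. If a worker crashes after processing a job but before acknowledging it, the same job is delivered again, so handlers should be idempotent. A minimal sketch of dedupe-by-job-id; in production the dedupe set would live in Redis or the database, not process memory, and `handleOnce` here is a hypothetical helper:

```typescript
// Track which job ids have already been handled so duplicate
// deliveries become no-ops instead of double side effects.
const processedJobs = new Set<string>();

async function handleOnce(
  jobId: string,
  handler: () => Promise<void>,
): Promise<boolean> {
  if (processedJobs.has(jobId)) return false; // duplicate delivery: skip
  await handler();                            // perform the side effect once
  processedJobs.add(jobId);                   // mark done only after success
  return true;
}
```

Marking the job done only after the handler succeeds means a crash mid-handler leads to a retry, never a silently dropped job.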
## The Pattern for Each Use Case

**User-facing request needing immediate response:**
- Synchronous REST or gRPC
- Use circuit breakers + timeouts

**Background processing, notifications, auditing:**
- Message queue (BullMQ, SQS, RabbitMQ)
- Services are fully decoupled

**High-performance internal service calls:**
- gRPC
- Streaming if needed

**Event sourcing / audit trail:**
- Kafka or similar (ordered, persistent, replayable)
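The "replayable" part is the key property: with an ordered, persistent log, current state is just a fold over the event history, so any consumer can rebuild its view from scratch by re-reading the log. A toy sketch, using a hypothetical ledger event shape:

```typescript
// Derive current state by replaying the full event history in order.
type LedgerEvent = { type: 'credit' | 'debit'; amount: number };

function replayBalance(events: LedgerEvent[]): number {
  return events.reduce(
    (balance, e) => (e.type === 'credit' ? balance + e.amount : balance - e.amount),
    0, // state before any events
  );
}
```

This is also why ordering matters in event sourcing: replaying the same events in a different order can produce a different state for non-commutative operations.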
## Service Discovery

```typescript
// Kubernetes: use service names as hostnames
const INVENTORY_URL = process.env.INVENTORY_SERVICE_URL ?? 'http://inventory-service:3001';
const EMAIL_URL = process.env.EMAIL_SERVICE_URL ?? 'http://email-service:3002';
```

```yaml
# Docker Compose: same pattern
services:
  inventory-service:
    ports: ['3001:3001']
  order-service:
    environment:
      INVENTORY_SERVICE_URL: http://inventory-service:3001
```
## The Honest Advice
Don't build microservices unless you have a specific reason:
- Team ownership boundaries (different teams own different services)
- Wildly different scaling requirements
- Technology isolation requirements
A well-structured monolith outperforms poorly designed microservices every time. Microservices are an organizational solution as much as a technical one.
If you do go microservices: start with async messaging for workflows, synchronous calls for user-facing queries, and circuit breakers everywhere.
BullMQ message queue integration and service communication patterns: Whoff Agents AI SaaS Starter Kit.