## Introduction

When a frontend calls microservices directly, it ends up making multiple API requests in a waterfall. A BFF (Backend for Frontend) aggregates those calls and returns a response optimized for the frontend. This article uses Claude Code to generate the design.
## CLAUDE.md BFF Design Rules

```markdown
## API Gateway / BFF Design Rules

### BFF Responsibilities
- Aggregate data from multiple services (N→1 requests)
- Return only the fields the frontend needs (prevent over-fetching)
- Centralize authentication/authorization here
- Response caching (Redis, TTL 1-5 minutes)

### Aggregation Strategy
- Parallel execution: Promise.all for services with no dependencies
- Sequential execution: only when dependencies exist
- Partial failure: return the other data even if one service fails

### Security
- Prohibit direct microservice access from the frontend
- Service-to-service communication via mTLS or API key (internal only)
- Implement rate limiting at the BFF layer
```
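The rules distinguish parallel from sequential execution. As a sketch of the sequential case, suppose (hypothetically, this scenario is not in the article) orders carry product IDs that must then be resolved via a ProductService, while the profile fetch has no dependency and can start immediately. The service names and the `aggregate` helper below are illustrative assumptions:

```typescript
// Hypothetical shapes for illustration only
interface Profile { id: string; name: string }
interface Order { id: string; productIds: string[] }
interface Product { id: string; name: string }

// Fetchers are injected so the dependency structure is explicit:
// products depend on orders (sequential), profile does not (parallel).
async function aggregate(
  fetchProfile: (userId: string) => Promise<Profile>,
  fetchOrders: (userId: string) => Promise<Order[]>,
  fetchProducts: (ids: string[]) => Promise<Product[]>,
  userId: string
) {
  // Independent call: start it now, but do NOT await yet
  const profilePromise = fetchProfile(userId);

  // Sequential part: the product lookup needs the order list first
  const orders = await fetchOrders(userId);
  const productIds = [...new Set(orders.flatMap((o) => o.productIds))];
  const products = await fetchProducts(productIds);

  // The profile fetch has been running in parallel the whole time
  const profile = await profilePromise;
  return { profile, orders, products };
}
```

Starting the independent promise before the sequential chain keeps total latency at `max(profile, orders + products)` rather than the sum of all three.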
## Generate BFF

```
Design a BFF endpoint for the dashboard screen.

Required data:
- User profile (UserService)
- 5 most recent orders (OrderService)
- Unread notification count (NotificationService)

Requirements:
- Call all 3 services in parallel and aggregate
- Return the other data even if one service fails
- Cache the response in Redis (TTL 2 minutes)

Generate file: src/bff/dashboardBff.ts
```
## Generated BFF Implementation
```typescript
// src/bff/dashboardBff.ts
import { redis } from '../lib/redis';   // assumed shared Redis client (node-redis v4)
import { logger } from '../lib/logger'; // assumed shared logger
import type { UserProfile, Order } from '../types'; // assumed shared types

interface DashboardResponse {
  user: UserProfile | null;
  recentOrders: Order[];
  unreadNotificationCount: number;
  errors: string[];
}

async function fetchUserProfile(userId: string): Promise<UserProfile> {
  const res = await fetch(`${process.env.USER_SERVICE_URL}/users/${userId}`, {
    headers: { 'X-Internal-Token': process.env.INTERNAL_API_KEY! },
  });
  if (!res.ok) throw new Error(`UserService: ${res.status}`);
  return res.json();
}

async function fetchRecentOrders(userId: string): Promise<Order[]> {
  const res = await fetch(
    `${process.env.ORDER_SERVICE_URL}/orders?userId=${userId}&limit=5`,
    { headers: { 'X-Internal-Token': process.env.INTERNAL_API_KEY! } }
  );
  if (!res.ok) throw new Error(`OrderService: ${res.status}`);
  const data = await res.json();
  return data.orders;
}

async function fetchUnreadCount(userId: string): Promise<number> {
  const res = await fetch(
    `${process.env.NOTIFICATION_SERVICE_URL}/notifications/unread-count?userId=${userId}`,
    { headers: { 'X-Internal-Token': process.env.INTERNAL_API_KEY! } }
  );
  if (!res.ok) throw new Error(`NotificationService: ${res.status}`);
  const data = await res.json();
  return data.count;
}

export async function getDashboardData(userId: string): Promise<DashboardResponse> {
  const cacheKey = `bff:dashboard:${userId}`;

  // Cache check
  const cached = await redis.get(cacheKey);
  if (cached) {
    logger.debug({ userId, cacheHit: true }, 'BFF cache hit');
    return JSON.parse(cached);
  }

  // Call 3 services in parallel (Promise.allSettled allows partial failure)
  const [userResult, ordersResult, notificationsResult] = await Promise.allSettled([
    fetchUserProfile(userId),
    fetchRecentOrders(userId),
    fetchUnreadCount(userId),
  ]);

  // Collect errors from failed services (inform client of partial failure)
  const errors: string[] = [];
  if (userResult.status === 'rejected') errors.push('user_service_unavailable');
  if (ordersResult.status === 'rejected') errors.push('order_service_unavailable');
  if (notificationsResult.status === 'rejected') errors.push('notification_service_unavailable');

  const response: DashboardResponse = {
    user: userResult.status === 'fulfilled' ? userResult.value : null,
    recentOrders: ordersResult.status === 'fulfilled' ? ordersResult.value : [],
    unreadNotificationCount:
      notificationsResult.status === 'fulfilled' ? notificationsResult.value : 0,
    errors,
  };

  // Only cache when there are no errors (don't cache partial-failure data)
  if (errors.length === 0) {
    await redis.set(cacheKey, JSON.stringify(response), { EX: 120 }); // 2 minutes
  }

  return response;
}
```
```typescript
// src/routes/bff.ts
import { Router } from 'express';
import { authenticate } from '../middleware/authenticate'; // assumed auth middleware
import { getDashboardData } from '../bff/dashboardBff';

const router = Router();
router.get('/bff/dashboard', authenticate, async (req, res) => {
  const data = await getDashboardData(req.user.id);
  // Return partial failures as 200 (frontend decides how to degrade)
  res.status(200).json(data);
});
export default router;
```
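The Security rules above also call for rate limiting at the BFF layer. In production you would likely use a library such as express-rate-limit or a Redis-backed counter shared across replicas; purely as an illustration of the idea, here is a hypothetical in-memory fixed-window limiter (the class name and the 60-requests-per-minute defaults are assumptions, not from the generated code):

```typescript
// Minimal fixed-window rate limiter sketch (in-memory: single instance only).
// Assumed defaults: 60 requests per key per 60-second window.
class FixedWindowLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(private limit = 60, private windowMs = 60_000) {}

  // Returns true if the request is within the limit for the current window.
  allow(key: string, now = Date.now()): boolean {
    const entry = this.counts.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // New key or expired window: start a fresh window
      this.counts.set(key, { windowStart: now, count: 1 });
      return true;
    }
    entry.count += 1;
    return entry.count <= this.limit;
  }
}
```

A route could then reject early with `if (!limiter.allow(req.user.id)) return res.status(429).json({ error: 'rate_limited' });`. Note that an in-memory map only works for a single BFF instance; multiple replicas need shared state (e.g. Redis).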
## Fetch with Timeout

```typescript
// Always set timeouts on service calls
async function fetchWithTimeout<T>(
  url: string,
  options: RequestInit,
  timeoutMs = 3000
): Promise<T> {
  const controller = new AbortController();
  const timeout = setTimeout(() => controller.abort(), timeoutMs);
  try {
    const res = await fetch(url, { ...options, signal: controller.signal });
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return res.json();
  } finally {
    clearTimeout(timeout);
  }
}
```
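The AbortController pattern is fetch-specific. The same deadline idea generalizes to any upstream promise; the `withTimeout` helper below is a hypothetical sketch based on `Promise.race` (note that, unlike AbortController, racing does not cancel the underlying work; it only stops waiting for it):

```typescript
// Race an arbitrary promise against a deadline. Rejects with a labeled error
// on timeout so the caller can map it to a partial-failure entry.
function withTimeout<T>(
  promise: Promise<T>,
  timeoutMs: number,
  label = 'upstream' // illustrative default, used only in the error message
): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const deadline = new Promise<never>((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`${label}: timeout after ${timeoutMs}ms`)),
      timeoutMs
    );
  });
  // Whichever settles first wins; always clear the timer afterwards
  return Promise.race([promise, deadline]).finally(() => clearTimeout(timer));
}
```

This pairs naturally with the `Promise.allSettled` aggregation above: a timed-out service simply shows up as a rejected result.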
## Summary
Design BFF with Claude Code:
- CLAUDE.md — document aggregation responsibilities, parallel execution, partial failure tolerance, cache TTL
- Promise.allSettled — allow partial failure (dashboard doesn't crash if 1 service is down)
- Include errors in response — enables frontend to implement degraded display
- Redis cache — reduce load on backend services
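As a hypothetical sketch of the frontend side of this contract, the error codes returned by the BFF map directly to per-widget render states (the response shape is copied locally here for illustration):

```typescript
// Local copy of the BFF response shape, for illustration
interface DashboardResponse {
  user: { name: string } | null;
  recentOrders: unknown[];
  unreadNotificationCount: number;
  errors: string[];
}

// Map BFF partial-failure codes to per-widget render states,
// so one dead service degrades one widget instead of the whole screen.
function widgetStates(res: DashboardResponse) {
  return {
    profile: res.errors.includes('user_service_unavailable') ? 'fallback' : 'ok',
    orders: res.errors.includes('order_service_unavailable') ? 'fallback' : 'ok',
    notifications: res.errors.includes('notification_service_unavailable') ? 'fallback' : 'ok',
  } as const;
}
```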
Review BFF/API Gateway designs with **Code Review Pack (¥980)** using `/code-review` at prompt-works.jp
myouga (@myougatheaxo) — Axolotl VTuber.