DEV Community

Asad Abdullah Zafar

Posted on • Originally published at kolachitech.com

Shopify Load Balancing: What Every App Developer Needs to Know Before Scaling

Shopify processed $9.3B in BFCM sales in 2023. At that volume, load balancing is not infrastructure trivia — it is the layer that determines whether your app stays up or takes merchants down with it.
Here are the five most critical load balancing decisions for Shopify app developers, with production-ready configurations.

  1. Match the Algorithm to the Workload
    Not all load balancing algorithms suit Shopify's traffic patterns.
| Algorithm | Use Case | Why |
| --- | --- | --- |
| Round Robin | Stateless API workers | Uniform request duration |
| Least Connections | Webhook worker pools | Variable job duration |
| IP Hash | OAuth flows, WebSockets | Session continuity required |
| Weighted Round Robin | Mixed-capacity fleets | Proportional distribution |
    For most Shopify apps, round robin handles API requests and least-connections handles webhook workers. That single distinction prevents the most common load distribution failures.
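To make the distinction concrete, here is a toy in-process sketch of both selection strategies. This is illustrative only (in production, Nginx or your cloud load balancer does this selection); the pool shape and hostnames are hypothetical.

```js
// Round robin: cycle through servers in order, ignoring load.
function roundRobin(servers) {
  let i = -1;
  return () => servers[(i = (i + 1) % servers.length)];
}

// Least connections: pick the server with the fewest in-flight requests,
// which is what you want when job durations vary (webhooks).
function leastConnections(servers) {
  return servers.reduce((best, s) => (s.active < best.active ? s : best));
}

const pool = [
  { host: 'app1.internal', active: 4 },
  { host: 'app2.internal', active: 1 },
  { host: 'app3.internal', active: 7 },
];
console.log(leastConnections(pool).host); // app2.internal, fewest in-flight jobs
```

With uniform request durations the `active` counts stay roughly equal and round robin is indistinguishable from least connections; with long-tail webhook jobs they diverge quickly, which is why the two tiers get different algorithms.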

  2. Stateless Design is a Prerequisite, Not an Optimization
    Before you add a second server, externalize all state:

```js
// ✅ Session in Redis — any instance handles any request
const session = require('express-session');
const { RedisStore } = require('connect-redis'); // v8+; older versions use a default export
const { createClient } = require('redis');

const app = require('express')();
const redisClient = createClient({ url: process.env.REDIS_URL });
redisClient.connect();

app.use(session({
  store: new RedisStore({ client: redisClient }),
  secret: process.env.SESSION_SECRET,
  resave: false,
  saveUninitialized: false,
  cookie: { secure: true, httpOnly: true }
}));
```

Sessions in memory, local file writes, in-process job workers — these all break silently when a load balancer routes a request to an instance that wasn't involved in the previous one.
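The failure mode is easy to see in a toy simulation (illustrative only, nothing Shopify-specific): two simulated instances, with a plain `Map` standing in for Redis as the shared store.

```js
// Shared store: stands in for Redis, visible to every instance.
const shared = new Map();

function instance() {
  const local = new Map(); // in-memory sessions: per-process, invisible to siblings
  return {
    login: (sid, user, store = local) => store.set(sid, user),
    whoami: (sid, store = local) => store.get(sid) ?? null,
  };
}

const app1 = instance();
const app2 = instance();

app1.login('sid-1', 'merchant-a');          // request 1 lands on app1
console.log(app2.whoami('sid-1'));          // null: request 2 landed on app2

app1.login('sid-1', 'merchant-a', shared);  // same flow through the shared store
console.log(app2.whoami('sid-1', shared));  // 'merchant-a'
```

The first lookup returns `null` not because anything errored, but because the state only ever existed inside one process. That is the "breaks silently" part.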

  3. Health Checks Control Traffic Routing
    Your load balancer only routes to healthy instances if it knows which instances are healthy. Expose a proper health endpoint:
```js
app.get('/health', async (req, res) => {
  try {
    // Check every critical dependency, not just process liveness
    await Promise.all([db.query('SELECT 1'), redisClient.ping()]);
    res.status(200).json({ status: 'healthy' });
  } catch (err) {
    res.status(503).json({ status: 'unhealthy', error: err.message });
  }
});
```

Poll every 10–15 seconds. Remove instances after 2 consecutive failures. This one endpoint prevents your load balancer from routing live traffic to instances that have silently lost their database connection.
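The eviction rule ("remove after 2 consecutive failures") can be sketched as a small counter per host. This is a hypothetical helper to make the logic concrete; in practice Nginx (`max_fails`) or your cloud load balancer's unhealthy-threshold setting implements it for you.

```js
// Track consecutive health-check failures per host.
function healthTracker(threshold = 2) {
  const failures = new Map();
  return {
    // ok=true resets the streak; ok=false extends it.
    record(host, ok) {
      failures.set(host, ok ? 0 : (failures.get(host) ?? 0) + 1);
    },
    inRotation(host) {
      return (failures.get(host) ?? 0) < threshold;
    },
  };
}

const tracker = healthTracker(2);
tracker.record('app1:3000', false);
console.log(tracker.inRotation('app1:3000')); // true: one failure is tolerated
tracker.record('app1:3000', false);
console.log(tracker.inRotation('app1:3000')); // false: two consecutive, evicted
tracker.record('app1:3000', true);
console.log(tracker.inRotation('app1:3000')); // true: a passing check resets the count
```

Requiring consecutive failures (rather than evicting on the first one) keeps a single slow health-check response from needlessly draining capacity.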

  4. Nginx Least-Connections Config for Webhook Workers
```nginx
upstream shopify_webhooks {
  least_conn;

  server app1.internal:3000 max_fails=3 fail_timeout=30s;
  server app2.internal:3000 max_fails=3 fail_timeout=30s;
  server app3.internal:3000 max_fails=3 fail_timeout=30s;

  keepalive 32;
}

location /webhooks/ {
  proxy_pass         http://shopify_webhooks;
  proxy_http_version 1.1;            # required for upstream keepalive
  proxy_set_header   Connection "";  # likewise: don't forward "Connection: close"
  proxy_read_timeout 10s;
  proxy_set_header   X-Real-IP $remote_addr;
}
```

`max_fails=3 fail_timeout=30s` takes a server out of rotation for 30 seconds after 3 failed attempts within a 30-second window. `keepalive 32` maintains up to 32 idle persistent connections to the upstreams, eliminating TCP handshake overhead on every webhook.

  5. Circuit Breaking Prevents Cascading Failures
    Load balancing distributes across your healthy instances. Circuit breaking stops failures from external dependencies — the Shopify Admin API, fulfillment APIs — from taking down your entire worker pool.

```js
import CircuitBreaker from 'opossum';

// callShopifyAPI: your async wrapper around the Admin API client
const breaker = new CircuitBreaker(callShopifyAPI, {
  timeout: 5000,                 // a call that hangs 5s counts as a failure
  errorThresholdPercentage: 50,  // open the circuit at a 50% error rate
  resetTimeout: 30000,           // try again (half-open) after 30s
  volumeThreshold: 10,           // ignore rates until 10 calls have been seen
});

// When the circuit is open, serve the last known-good response instead
breaker.fallback((shop, endpoint) => getCachedResponse(shop, endpoint));
```
When Shopify API latency spikes, the circuit opens at 50% error rate. Subsequent calls fail immediately rather than hanging for 5 seconds each, freeing your worker threads for requests that can actually succeed.
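The state machine behind this is small enough to sketch. Here is a deliberately synchronous toy version (opossum does the same around promises, with timeouts and half-open probes); the option names are simplified stand-ins for the config above.

```js
// Counts failures over a rolling call total; once the error rate crosses the
// threshold with enough volume, the circuit opens and calls fail instantly.
function makeBreaker(fn, { threshold = 0.5, volume = 10, resetMs = 30000, now = Date.now } = {}) {
  let calls = 0, failures = 0, openedAt = null;
  return (...args) => {
    if (openedAt !== null) {
      if (now() - openedAt < resetMs) throw new Error('circuit open'); // fail fast
      openedAt = null; calls = 0; failures = 0;                        // half-open: allow a trial
    }
    calls += 1;
    try {
      return fn(...args);
    } catch (err) {
      failures += 1;
      if (calls >= volume && failures / calls >= threshold) openedAt = now();
      throw err;
    }
  };
}

const flaky = () => { throw new Error('api down'); };
const guarded = makeBreaker(flaky, { volume: 3 });
```

After three straight failures the error rate is 100% with enough volume, so the fourth call throws `circuit open` immediately instead of touching the dependency.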

Bonus: Blue-Green Deploys with Nginx Traffic Splitting
```nginx
# Requires upstream blocks named shopify_green and shopify_blue
split_clients '${remote_addr}${request_uri}' $app_version {
  10%   green;   # New version
  *     blue;    # Stable version
}

location / {
  proxy_pass http://shopify_$app_version;
}
```

Start at 10% to the new version. Watch error rates. Increase gradually. Roll back by changing one number. Zero-downtime deploys without Kubernetes.

The Load Balancing Stack Summary
| Layer | Tool / Pattern | Priority |
| --- | --- | --- |
| Edge routing | Fastly / Shopify CDN | Managed by Shopify |
| App tier | Nginx least-connections | P0 |
| Stateless design | Redis sessions + S3 | P0 |
| Health checks | /health endpoint | P0 |
| Circuit breaking | opossum / resilience4j | P1 |
| Multi-region | GeoDNS + read replicas | P1 |
| Blue-green deploys | Nginx split_clients | P2 |

Full guide with Hydrogen/Oxygen edge patterns, multi-region webhook deduplication, and load shedding strategies: https://kolachitech.com/shopify-load-balancing
