DEV Community

RAXXO Studios

Posted on • Originally published at raxxo.shop

5 Cloudflare Workers Patterns I Use for Shopify Edge Logic

  • A/B test buy buttons by rewriting HTML at the edge

  • Geo-aware free shipping banners using CF-IPCountry

  • Signed customer-account URLs with HMAC query params

  • AI-personalized PDP recommendations cached in KV

  • Webhook queue absorbers to flatten Shopify burst spikes

  • When Workers earn their keep, when to stay with origin, the cost math

Custom Shopify backends rarely need a full origin server. Most of the per-request logic I write for storefronts (A/B testing, geo redirects, header rewriting, signed image URLs, OAuth proxying) does not belong on a Node box in Frankfurt. It belongs at the edge, 5 milliseconds from the visitor, in front of the Shopify CDN, with zero cold starts and a free tier that swallows most of my traffic. The piece I see missing from almost every Shopify backend stack I audit is this thin programmable layer between the user and the platform. Cloudflare Workers fill it.

They sit cleanly in front of any Shopify store with a CNAME flattening trick, run V8 isolates instead of containers, and give me KV, D1, Queues, and R2 as bindings without a separate deploy. Below are 5 patterns I use in production, each with Wrangler-compatible TypeScript that runs in 2026. These are the same patterns I reach for after locking down the data layer with the 8 Drizzle patterns and the Hono backend stack.

A/B Testing Buy Buttons at the Edge

The fastest way to ruin a Shopify storefront is to load a third-party A/B testing script that flickers the buy button half a second after paint. The cleaner pattern is to bucket the user at the edge, rewrite the HTML response before it reaches the browser, and stamp a sticky cookie so the variant survives reloads. No flicker, no extra JS, no Optimizely tax.

I use HTMLRewriter to swap the button text and a data attribute. Bucketing is a hash of a per-user cookie ID modulo 100, so a 50/50 split is just bucket < 50. The Worker fetches the upstream Shopify HTML, rewrites the relevant section, and streams the response back. Since Workers stream, the user gets first byte at the same speed as the origin.


```typescript
export interface Env {
  AB_KV: KVNamespace;
  ORIGIN: string;
}

// FNV-1a hash of the user ID, reduced mod 100 for percentage bucketing.
function bucketFor(uid: string): number {
  let h = 2166136261;
  for (let i = 0; i < uid.length; i++) {
    h ^= uid.charCodeAt(i);
    h = Math.imul(h, 16777619);
  }
  return (h >>> 0) % 100;
}

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const url = new URL(request.url);
    const cookie = request.headers.get("cookie") || "";
    let uid = /rx_uid=([^;]+)/.exec(cookie)?.[1];
    const isNew = !uid;
    if (!uid) uid = crypto.randomUUID();

    const variant = bucketFor(uid) < 50 ? "A" : "B";
    const upstream = await fetch(env.ORIGIN + url.pathname + url.search, request);

    const rewritten = new HTMLRewriter()
      .on("button[name=add]", {
        element(el) {
          el.setAttribute("data-variant", variant);
          el.setInnerContent(variant === "A" ? "Add to cart" : "Buy now");
        },
      })
      .transform(upstream);

    // Re-wrap so headers are mutable (the upstream response's headers are not).
    const res = new Response(rewritten.body, rewritten);
    if (isNew) {
      res.headers.append(
        "set-cookie",
        `rx_uid=${uid}; Path=/; Max-Age=31536000; SameSite=Lax; Secure`,
      );
    }
    res.headers.set("x-rx-variant", variant);
    return res;
  },
};
```


I log variant exposure to a downstream analytics endpoint via ctx.waitUntil(...) so the response is never blocked by the log write. The lift on a real test (free shipping threshold copy) was 8.4 percent more checkouts started, with zero added latency. The cookie-based bucketing also survives Shopify's full-page cache, which most third-party tools cannot claim.
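That fire-and-forget log can be sketched like this; the analytics URL and the event shape are my illustrative assumptions, not something from this post:

```typescript
// Hypothetical exposure event; the analytics endpoint below is a placeholder.
interface ExposureEvent {
  uid: string;
  variant: "A" | "B";
  path: string;
  ts: number;
}

function exposureEvent(uid: string, variant: "A" | "B", path: string): ExposureEvent {
  return { uid, variant, path, ts: Date.now() };
}

// Inside the fetch handler, after computing the variant:
// ctx.waitUntil(
//   fetch("https://analytics.example.com/exposure", {
//     method: "POST",
//     headers: { "content-type": "application/json" },
//     body: JSON.stringify(exposureEvent(uid, variant, url.pathname)),
//   }),
// );
```

Because the fetch promise is handed to ctx.waitUntil rather than awaited, the Worker returns the rewritten page immediately and the runtime keeps the isolate alive until the log write settles.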

Geo-Aware Free Shipping Banner

Shopify Markets handles currency and language. It does not handle conditional banners that depend on country, like "Free shipping over 50 EUR in DE, AT, NL." I want that banner injected server-side so it never causes layout shift, and I want the threshold logic centralized, not duplicated across 40 Liquid sections.

Cloudflare gives me request.cf.country and the CF-IPCountry header for free, geolocated at the edge. I read the country, look up a tiny shipping table, and patch the matching shopify-section div in the response. Edge cache key includes the country code so EU and US get separate cached responses, both warm.


```typescript
type Threshold = { eur: number; copy: string };

const SHIPPING: Record<string, Threshold> = {
  DE: { eur: 50, copy: "Free shipping over 50 EUR" },
  AT: { eur: 50, copy: "Free shipping over 50 EUR" },
  NL: { eur: 50, copy: "Free shipping over 50 EUR" },
  US: { eur: 75, copy: "Free shipping over 75 EUR equivalent" },
  GB: { eur: 60, copy: "Free shipping over 60 EUR" },
};

export default {
  async fetch(
    request: Request, env: { ORIGIN: string }, ctx: ExecutionContext,
  ): Promise<Response> {
    const country = request.headers.get("cf-ipcountry") || "US";
    const cfg = SHIPPING[country] || { eur: 75, copy: "Standard shipping" };

    // Cache key includes the country so each geo gets its own cached copy.
    const cacheUrl = new URL(request.url);
    cacheUrl.searchParams.set("__geo", country);
    const cache = caches.default;
    let res = await cache.match(cacheUrl.toString());
    if (res) return res;

    const upstream = await fetch(request.url, request);
    const rewritten = new HTMLRewriter()
      .on("[data-rx-ship-banner]", {
        element(el) {
          el.setInnerContent(cfg.copy);
          el.setAttribute("data-threshold", String(cfg.eur));
        },
      })
      .transform(upstream);

    res = new Response(rewritten.body, rewritten);
    res.headers.set("cache-control", "public, max-age=300");
    res.headers.set("x-rx-country", country);
    // Populate the per-country edge cache without blocking the response.
    ctx.waitUntil(cache.put(cacheUrl.toString(), res.clone()));
    return res;
  },
};
```


Two things matter here. First, the cache key includes the country, so I do not poison the German cache with a US banner. Second, the Liquid section ships an empty `<span data-rx-ship-banner></span>` placeholder. The Worker fills it. If the Worker is bypassed (rare, but it happens during deploys), the page degrades to a blank span, never to the wrong threshold. The full economic argument lives in 5 Postgres extensions for Shopify backends, where I explain why the source of truth still lives in Postgres, with the Worker as a fast read path.

Signed Customer-Account URLs

Shopify's new customer accounts are clean, but anything I bolt on (a loyalty tier page, a referral dashboard, a subscription manager) needs an authenticated link I can email. I do not want to roll a full session system for a page that lives 90 seconds. The right tool is an HMAC-signed URL: shop ID, customer ID, expiry, all signed with a secret only the Worker knows. Tamper with any param and the signature fails.

```typescript
export interface Env {
  HMAC_SECRET: string;
}

// HMAC-SHA256 over `data`, returned as base64url (URL-safe, unpadded).
async function hmac(key: string, data: string): Promise<string> {
  const enc = new TextEncoder();
  const cryptoKey = await crypto.subtle.importKey(
    "raw",
    enc.encode(key),
    { name: "HMAC", hash: "SHA-256" },
    false,
    ["sign", "verify"],
  );
  const sig = await crypto.subtle.sign("HMAC", cryptoKey, enc.encode(data));
  return btoa(String.fromCharCode(...new Uint8Array(sig)))
    .replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");
}

export async function signLink(
  base: string, customerId: string, ttlSeconds: number, env: Env,
): Promise<string> {
  const exp = Math.floor(Date.now() / 1000) + ttlSeconds;
  const payload = `${customerId}.${exp}`;
  const sig = await hmac(env.HMAC_SECRET, payload);
  return `${base}?c=${customerId}&e=${exp}&s=${sig}`;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);
    const c = url.searchParams.get("c");
    const e = url.searchParams.get("e");
    const s = url.searchParams.get("s");
    if (!c || !e || !s) return new Response("missing params", { status: 400 });
    if (Number(e) < Math.floor(Date.now() / 1000)) {
      return new Response("link expired", { status: 410 });
    }
    const expected = await hmac(env.HMAC_SECRET, `${c}.${e}`);
    if (expected !== s) return new Response("bad signature", { status: 403 });
    return new Response(`Hello customer ${c}`, {
      headers: { "content-type": "text/plain" },
    });
  },
};
```

This is JWT-style without the JWT library overhead. Signing is around 0.4 ms in a Worker. The TTL is short, usually 15 minutes, and the secret rotates monthly via a Wrangler secret push. If a link leaks, the blast radius is one customer, one quarter hour. I use the same primitive for unsubscribe links, return labels, and one-click reorder URLs in transactional emails.

AI-Personalized PDP Recommendations

A "you might also like" rail that ships from a static metafield is a waste of the model. I want the rail to read the visitor's last-viewed products from a cookie, ask Claude to rank a candidate set against that history, and inject the result into the PDP HTML before paint. KV caches the response per (visitor, product) tuple for an hour, so the same Anthropic call costs me nothing on the second view.

```typescript
export interface Env {
  RECS_KV: KVNamespace;
  ANTHROPIC_KEY: string;
}

interface RecRequest {
  recentlyViewed: string[];
  current: string;
  candidates: { handle: string; title: string; tags: string[] }[];
}

async function rankWithClaude(req: RecRequest, key: string): Promise<string[]> {
  const body = {
    model: "claude-opus-4-7",
    max_tokens: 200,
    messages: [{
      role: "user",
      content:
        `History: ${req.recentlyViewed.join(", ")}\nCurrent: ${req.current}\n` +
        `Candidates: ${JSON.stringify(req.candidates)}\n` +
        `Return JSON array of 4 best handles, ordered.`,
    }],
  };
  const r = await fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "content-type": "application/json",
      "x-api-key": key,
      "anthropic-version": "2023-06-01",
    },
    body: JSON.stringify(body),
  });
  const data = await r.json() as { content: { text: string }[] };
  // Pull the first JSON array out of the model's reply.
  const match = data.content[0].text.match(/\[[^\]]+\]/);
  return match ? JSON.parse(match[0]) : [];
}

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const url = new URL(request.url);
    if (url.pathname !== "/api/recs") return new Response("not found", { status: 404 });

    const req = await request.json() as RecRequest;
    const cacheKey = `${req.current}:${req.recentlyViewed.slice(0, 3).join(",")}`;
    const cached = await env.RECS_KV.get(cacheKey, "json");
    if (cached) return Response.json(cached);

    const ranked = await rankWithClaude(req, env.ANTHROPIC_KEY);
    ctx.waitUntil(env.RECS_KV.put(cacheKey, JSON.stringify(ranked), {
      expirationTtl: 3600,
    }));
    return Response.json(ranked);
  },
};
```

Two details I learned the hard way. First, KV writes are eventually consistent, so I use ctx.waitUntil to let the response return immediately while the cache populates in the background. Second, I cap candidates at 20 before sending to Claude. Token cost stays predictable (around 0.002 EUR per fresh call, near zero on cache hits), and the cache hit ratio sits above 80 percent on a busy storefront. The same Worker can be called from a Liquid {% render %} snippet via fetch or directly from a hydrated React island, depending on the theme.
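Both guardrails amount to a tiny pure function applied before the Anthropic call. This is my sketch; the function name is illustrative:

```typescript
interface Candidate { handle: string; title: string; tags: string[] }

// Bound the prompt size before it reaches the model: at most 3 history items
// (matching the KV cache key) and at most 20 candidates.
function boundRecRequest(
  recentlyViewed: string[], current: string, candidates: Candidate[],
) {
  return {
    recentlyViewed: recentlyViewed.slice(0, 3), // same slice the cache key uses
    current,
    candidates: candidates.slice(0, 20),        // hard cap before Claude sees it
  };
}
```

Keeping the cap and the cache-key slice in one place means the token budget and the cache hit ratio cannot drift apart as the theme evolves.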

Webhook Queue Absorbers

Shopify webhooks burst. A flash sale produces 400 orders/create calls in 30 seconds. If my origin handler is a Vercel function with a 10 RPS database connection limit, half of those return 5xx and Shopify retries with backoff, doubling the storm. The fix is to make the public handler so cheap that it never fails, then drain the work asynchronously.

A Worker bound to a Cloudflare Queue does this in 30 lines. The Worker validates the HMAC, ACKs the webhook in under 20 ms, and shoves the payload onto the queue. A consumer Worker (or a downstream Hono service) drains the queue at a controlled rate, with retries and a dead-letter queue for poison messages.

```typescript
export interface Env {
  WEBHOOK_QUEUE: Queue;
  SHOPIFY_WEBHOOK_SECRET: string;
}

// Shopify signs the raw request body with HMAC-SHA256, base64-encoded.
async function verifyShopify(body: string, hmacHeader: string, secret: string) {
  const key = await crypto.subtle.importKey(
    "raw", new TextEncoder().encode(secret),
    { name: "HMAC", hash: "SHA-256" }, false, ["sign"],
  );
  const sig = await crypto.subtle.sign("HMAC", key, new TextEncoder().encode(body));
  const b64 = btoa(String.fromCharCode(...new Uint8Array(sig)));
  return b64 === hmacHeader;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const body = await request.text();
    const hmac = request.headers.get("x-shopify-hmac-sha256") || "";
    const topic = request.headers.get("x-shopify-topic") || "unknown";
    if (!(await verifyShopify(body, hmac, env.SHOPIFY_WEBHOOK_SECRET))) {
      return new Response("bad hmac", { status: 401 });
    }
    await env.WEBHOOK_QUEUE.send({ topic, body, receivedAt: Date.now() });
    return new Response("ok", { status: 200 });
  },

  async queue(batch: MessageBatch<{ topic: string; body: string }>, env: Env) {
    for (const msg of batch.messages) {
      try {
        await fetch("https://api.raxxo.shop/internal/webhook-drain", {
          method: "POST",
          headers: { "content-type": "application/json", "x-rx-topic": msg.body.topic },
          body: msg.body.body,
        });
        msg.ack();
      } catch (err) {
        msg.retry({ delaySeconds: 30 });
      }
    }
  },
};
```

The public handler now responds in under 25 ms regardless of origin health. Shopify never sees a 5xx. The consumer handles 100 messages per batch, retries with exponential backoff, and sends poison messages to a DLQ I inspect once a week. Total cost for a million webhooks per month: under 0.50 EUR. Worth it for the ops headroom alone. I touch on the upstream side of this in the Lab overview, where the queue is the boundary between fast public surface and slow work.
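The consumer above retries with a fixed 30-second delay; to make the backoff genuinely exponential, the delay can be derived from the message's delivery attempt count (`attempts` on the Queues message). A sketch, with my own helper name and a 1-hour ceiling as assumptions:

```typescript
// Double the retry delay per delivery attempt: 30 s, 60 s, 120 s, ... up to 1 h.
function backoffSeconds(attempts: number, base = 30, cap = 3600): number {
  return Math.min(base * 2 ** Math.max(attempts - 1, 0), cap);
}

// In the consumer's catch block:
// msg.retry({ delaySeconds: backoffSeconds(msg.attempts) });
```

Combined with a `max_retries` and `dead_letter_queue` setting on the queue consumer, poison messages stop burning retries after a bounded number of attempts and land in the DLQ for inspection.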

Bottom Line

Workers earn their keep when the logic is per-request, latency-sensitive, and stateless or KV-backed. A/B bucketing, geo banners, signed URLs, AI rerank caches, webhook absorbers: all five fit. The free tier covers 100k requests per day, which is plenty for most Shopify stores. Paid tier kicks in at 5 USD per month for 10 million requests, roughly 0.30 EUR per million after that.

Workers do not replace origin. I still keep a Hono service for anything that needs a real database connection, a long-running job, or more than 30 seconds of CPU. Workers are the front door, the origin is the kitchen. The split saves money in two ways. First, the origin handles maybe 5 percent of total traffic because the edge absorbs the rest, so I run a smaller box. Second, the edge layer lets me ship logic without redeploying the origin, which keeps the deploy blast radius small.

If you are still routing every request through a single Node server, the next pattern to add is whichever of these five is currently breaking under load. For most Shopify stores, that is the webhook handler. Start there, deploy a single Worker, and let the others follow when the pain shows up.
