When I first started building web applications, all the real work happened in one central place—a big server in a data center somewhere. If you were in Tokyo and the server was in Virginia, your request had to travel halfway around the world and back. This caused noticeable delays, what we call latency. Today, we can run code much closer to the person using it, on servers at the very edge of the network. This is edge computing. It makes everything feel faster and more responsive.
Cloudflare Workers is a platform that lets you run JavaScript code on this global edge network. It feels similar to other serverless tools, but with one crucial difference: your code runs in hundreds of locations simultaneously, not just in a few central ones. Let me share some practical ways to use it.
I'll start with how you set up the main entry point for your application. Think of a worker as a highly efficient traffic director. Every request passes through it. Your job is to figure out what the request needs and send it to the right place. A common pattern is to inspect the URL path and method.
Here's a basic but powerful structure I often use. It creates a single handler that routes requests to different functions.
```javascript
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);
    const path = url.pathname;
    if (path.startsWith('/api/')) {
      return handleAPI(request, env, ctx, url);
    } else if (path.startsWith('/assets/')) {
      return serveAsset(request, env, ctx);
    } else {
      return handlePage(request, env, ctx);
    }
  }
};

async function handleAPI(request, env, ctx, url) {
  // Your API logic here
  const data = { message: "Hello from the edge" };
  return new Response(JSON.stringify(data), {
    headers: { 'Content-Type': 'application/json' }
  });
}
```
This simple router is your foundation. The env object contains your configuration, and ctx gives you control over the request's lifecycle.
One of the most useful patterns is middleware. This is code that runs before your main handler, perfect for tasks you need to do for every single request, like checking who the user is or logging information. I like to create a pipeline of functions that each prepare the request a little more.
Let's add a middleware that checks for a valid API key on certain routes.
```javascript
async function withAuth(request, env, handler) {
  const apiKey = request.headers.get('X-API-Key');
  // A real check would compare against a stored key in `env`
  if (!apiKey || apiKey !== env.API_KEY) {
    return new Response('Unauthorized', { status: 401 });
  }
  // If authorized, call the original handler
  return handler(request, env);
}

// Now, use it in your router
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);
    if (url.pathname.startsWith('/admin/')) {
      // Apply auth middleware only to admin routes
      return withAuth(request, env, handleAdminRoute);
    }
    return handlePublicRoute(request, env);
  }
};
```
This keeps your security logic in one spot and applies it consistently.
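When you have several cross-cutting concerns, the single wrapper can be generalized into a pipeline where each middleware wraps the next. Here's a minimal sketch; `compose`, `withLogging`, and `handleAdmin` are illustrative names, not part of the Workers API:

```javascript
// Compose middlewares right-to-left around a final handler.
// Each middleware receives a `next` function to continue the chain.
function compose(...middlewares) {
  return (handler) =>
    middlewares.reduceRight(
      (next, mw) => (request, env, ctx) => mw(request, env, ctx, next),
      handler
    );
}

// Example middleware: log every request, then continue.
const withLogging = (request, env, ctx, next) => {
  console.log(`${request.method} ${request.url}`);
  return next(request, env, ctx);
};

// Example middleware: reject requests without a valid API key.
const withApiKey = (request, env, ctx, next) => {
  const apiKey = request.headers.get('X-API-Key');
  if (!apiKey || apiKey !== env.API_KEY) {
    return new Response('Unauthorized', { status: 401 });
  }
  return next(request, env, ctx);
};

// Hypothetical final handler for admin routes.
const handleAdmin = () => new Response('admin ok');

// Build the chained handler once; logging wraps the auth check.
const adminHandler = compose(withLogging, withApiKey)(handleAdmin);
```

The order passed to `compose` is the order of execution, so each concern stays in its own small function while the router only sees one handler.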
For a long time, a major challenge at the edge was managing state—like a user's shopping cart or a game score. Regular edge functions are stateless; they can't remember things between requests in a consistent way. Cloudflare introduced Durable Objects to solve this. Think of a Durable Object as a tiny, stateful server that is globally unique. All requests for a specific piece of state (like a specific user's session) are sent to the same object instance, no matter where in the world they come from.
It's easier to understand with an example. Let's build a simple counter that multiple users can update.
First, you define the Durable Object class.
```javascript
// In your Worker code, define the Durable Object class
export class Counter {
  constructor(state, env) {
    this.state = state;
    // `state.storage` lets you persist data
    this.storage = state.storage;
  }

  async fetch(request) {
    const url = new URL(request.url);
    let value = (await this.storage.get("count")) || 0;
    // Match on the path suffix, since the forwarded request still
    // carries the original '/counter/<name>/...' path
    if (url.pathname.endsWith("/increment")) {
      value++;
      await this.storage.put("count", value);
    } else if (url.pathname.endsWith("/decrement")) {
      value--;
      await this.storage.put("count", value);
    }
    return new Response(value.toString());
  }
}
```
Then, in your main worker, you call it. You need a unique ID to point to a specific counter instance. A user ID or a room name works well.
```javascript
// Your main Worker script
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);
    if (url.pathname.startsWith('/counter/')) {
      // Get the counter name from the path, like '/counter/myRoom'
      const counterName = url.pathname.split('/')[2];
      // Derive a unique ID from the name
      const id = env.COUNTER_NAMESPACE.idFromName(counterName);
      // Get the object instance
      const obj = env.COUNTER_NAMESPACE.get(id);
      // Forward the request to the Durable Object
      return obj.fetch(request);
    }
    return new Response('Not found', { status: 404 });
  }
};
```
Now, if users in different countries visit /counter/myRoom, they all talk to the same Counter object. The count will be consistent for everyone. This is powerful for chat rooms, live scores, or collaborative editing.
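One wiring detail: the `COUNTER_NAMESPACE` binding above has to be declared in your Wrangler configuration, along with a migration that introduces the class. A sketch, assuming the names from the example (your binding name may differ):

```toml
# wrangler.toml (sketch; names match the Counter example above)
[[durable_objects.bindings]]
name = "COUNTER_NAMESPACE"
class_name = "Counter"

[[migrations]]
tag = "v1"
new_classes = ["Counter"]
```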
For simpler data that doesn't need strong consistency, Cloudflare offers KV (Key-Value) storage. It's a global, low-latency cache that's perfect for storing data that changes infrequently, like blog posts, configuration, or user profiles.
The trade-off is that KV is eventually consistent. An update made in one location might take a few seconds to be seen everywhere. That's fine for many use cases. Here's how you cache an expensive API call.
```javascript
export default {
  async fetch(request, env, ctx) {
    const cacheKey = 'expensive_api_data';
    // Try to get the cached value from KV
    const cachedData = await env.MY_KV_NAMESPACE.get(cacheKey, { type: 'json' });
    if (cachedData) {
      // Cache hit! Return it immediately.
      return new Response(JSON.stringify(cachedData), {
        headers: { 'Content-Type': 'application/json' }
      });
    }
    // Cache miss. Fetch the fresh data.
    const freshData = await fetch('https://slow-central-api.com/data').then(r => r.json());
    // Store it in KV for next time, but don't wait for this to finish.
    ctx.waitUntil(
      env.MY_KV_NAMESPACE.put(cacheKey, JSON.stringify(freshData), { expirationTtl: 3600 }) // Cache for 1 hour
    );
    return new Response(JSON.stringify(freshData), {
      headers: { 'Content-Type': 'application/json' }
    });
  }
};
```
The call to ctx.waitUntil() is the key here. It tells the edge runtime, "You can send the response back to the user now, but please finish this cache update in the background." This keeps the user's wait time as short as possible.
Sometimes you need to work with files—images, documents, videos. Cloudflare R2 is object storage with an S3-compatible API, but without egress fees for moving data out. The best part is that you can serve files directly from the edge.
Imagine a user uploads a profile picture. You can store it in R2 and then serve it from locations worldwide.
```javascript
async function handleUpload(request, env) {
  // Assuming the request is a form with a file
  const formData = await request.formData();
  const file = formData.get('avatar');
  const objectKey = `avatars/user_12345.png`;
  // Store the file in R2 (put() accepts a Blob/File, a stream, or an ArrayBuffer)
  await env.MY_R2_BUCKET.put(objectKey, file, {
    httpMetadata: { contentType: file.type }
  });
  return new Response(JSON.stringify({ success: true, key: objectKey }));
}

async function serveFile(request, env) {
  const url = new URL(request.url);
  // Path like /file/avatars/user_12345.png
  const objectKey = url.pathname.replace('/file/', '');
  const object = await env.MY_R2_BUCKET.get(objectKey);
  if (object === null) {
    return new Response('File not found', { status: 404 });
  }
  const headers = new Headers();
  object.writeHttpMetadata(headers);
  headers.set('etag', object.httpEtag);
  return new Response(object.body, { headers });
}
```
You can also transform images on the fly. Cloudflare's Image Resizing feature (a zone-level feature that must be enabled) lets a Worker request a resize on a fetch subrequest via the `cf.image` options, so an image stored in R2 can be resized before it reaches a mobile phone, saving bandwidth. The sketch below assumes the `/file/` route from the previous example serves the original:

```javascript
// Requires Cloudflare Image Resizing to be enabled on the zone.
async function serveResizedImage(request, env) {
  const url = new URL(request.url);
  // Path like /images/avatars/user_12345.png?width=100&height=100
  const objectKey = url.pathname.replace('/images/', '');
  const width = parseInt(url.searchParams.get('width'), 10) || 300;
  const height = parseInt(url.searchParams.get('height'), 10) || 300;
  // Fetch the original through the /file/ route, asking Cloudflare
  // to resize the image in transit before it reaches the client.
  return fetch(`${url.origin}/file/${objectKey}`, {
    cf: { image: { width, height, fit: 'cover' } }
  });
}
```
A website isn't just APIs and files; you often need to serve full web pages. You can use Workers to assemble HTML on the edge, personalizing it for each user with incredibly low delay. This is called Edge Side Rendering (ESR).
Instead of sending a blank HTML file to the browser to be filled by JavaScript, you can send a partially complete page with the user-specific data already in place.
```javascript
async function renderPage(request, env, ctx) {
  // 1. Fetch a static HTML shell (the basic page structure)
  const htmlShell = `
    <!DOCTYPE html>
    <html>
      <head><title>My Site</title></head>
      <body>
        <div id="root"><!--CONTENT_PLACEHOLDER--></div>
        <script src="/app.js"></script>
      </body>
    </html>
  `;

  // 2. Fetch personalized data for this user (super fast from the edge!)
  const userData = await fetchUserData(request, env); // e.g., from KV or a Durable Object

  // 3. Inject that data into the shell
  const initialContent = `
    <h1>Welcome back, ${userData.name}!</h1>
    <p>You have ${userData.notifications} new notifications.</p>
  `;
  const finalHTML = htmlShell.replace('<!--CONTENT_PLACEHOLDER-->', initialContent);

  // 4. Also embed the data for the client-side JavaScript to use
  const htmlWithState = finalHTML.replace('</body>', `
    <script>
      window.__USER_DATA = ${JSON.stringify(userData)};
    </script>
    </body>
  `);

  return new Response(htmlWithState, {
    headers: { 'Content-Type': 'text/html' }
  });
}
```
The user sees their personalized page immediately. Then, your client-side JavaScript (app.js) can take over and make the page interactive. This gives you the best of both worlds: a fast, SEO-friendly first render and a dynamic, app-like experience.
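What app.js does with the embedded state is up to you, but the handoff usually looks something like this. A minimal sketch; `notificationLabel` and the DOM wiring are illustrative, not a framework API:

```javascript
// app.js sketch: pick up the state the Worker embedded as
// window.__USER_DATA and make the page interactive.

// Pure formatting helper, kept separate so it is easy to test.
function notificationLabel(userData) {
  const n = userData.notifications || 0;
  return n === 1 ? '1 new notification' : `${n} new notifications`;
}

// Hydration entry point (browser-only; guarded so the file also
// loads outside a DOM environment, e.g. in unit tests).
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', () => {
    const userData = window.__USER_DATA || { name: 'guest', notifications: 0 };
    // Attach new interactivity without re-rendering the server-sent markup.
    const badge = document.createElement('p');
    badge.textContent = `${userData.name}: ${notificationLabel(userData)}`;
    document.getElementById('root').appendChild(badge);
  });
}
```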
Finally, handling errors gracefully at the edge is critical. A network error in Virginia shouldn't break the experience for a user in Sydney. You can build fallback logic and show user-friendly messages.
```javascript
export default {
  async fetch(request, env, ctx) {
    try {
      // Your main application logic here
      return await handleRequest(request, env, ctx);
    } catch (error) {
      // Log the error for yourself
      console.error(`Edge Error: ${error.message}`, { url: request.url });
      // Show a friendly message to the user
      const userFriendlyPage = `
        <html><body>
          <h1>Something went wrong</h1>
          <p>We're working on it. Please try again shortly.</p>
          <a href="/">Go Home</a>
        </body></html>
      `;
      return new Response(userFriendlyPage, {
        status: 500,
        headers: { 'Content-Type': 'text/html' }
      });
    }
  }
};
```
You can get even more sophisticated by having backup data sources. If your primary database is slow to respond from the edge, you could try to get slightly older data from a fast KV cache instead of showing an error.
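That fallback idea can be sketched as follows. The timeout value, the URL, and the `env.MY_KV_NAMESPACE` binding are assumptions for illustration:

```javascript
// A sketch of the "stale fallback" pattern: try the primary source
// with a timeout, and fall back to possibly-stale KV data on failure.
async function fetchWithFallback(env) {
  try {
    // Abort the primary request if it takes longer than 2 seconds.
    const res = await fetch('https://primary-db-api.example.com/data', {
      signal: AbortSignal.timeout(2000)
    });
    if (!res.ok) throw new Error(`Primary returned ${res.status}`);
    return await res.json();
  } catch (err) {
    // Primary failed or was too slow: serve older data from KV instead.
    const stale = await env.MY_KV_NAMESPACE.get('expensive_api_data', { type: 'json' });
    if (stale !== null) return stale;
    // No fallback available either; let the outer handler respond.
    throw err;
  }
}
```

If both sources fail, the error propagates up to the try/catch wrapper shown earlier, which renders the friendly error page.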
Putting it all together, these techniques let you build applications that feel local to users everywhere. You handle routing and security at the gate, manage live state with Durable Objects, cache common data with KV, serve and transform files from R2, render personalized pages on the edge, and fail gracefully. You write it all in JavaScript, using patterns that are becoming standard, but you deploy it to a network that wraps the entire globe. The result is speed and resilience that was very hard to achieve before.