Building a fast website feels good. Visitors stay longer, search engines smile, and your work just feels more polished. But the rules for what makes a site "fast" have gotten more specific. They're no longer just about a quick loading bar. Now, it's about how the page feels to use. Is it instantly usable? Does it jump around as it loads? Does it respond the moment you tap a button?
These questions are answered by three specific measurements called Core Web Vitals. Think of them as a report card for your site's user experience, and Google uses this report card as a ranking factor. Ignoring them is like building a beautiful store with a sticky door—people might leave before they even see what's inside.
I want to share the practical ways I work on these metrics. This isn't about vague theory; it's about the code and decisions that move the needle. Let's break down the three main grades and how to improve them.
The first grade is about loading speed, but with a twist. It's called Largest Contentful Paint, or LCP. It doesn't measure when the whole page loads. Instead, it marks the moment the largest, most obvious piece of content appears on the screen. This is usually your main image, a big headline, or a key block of text. The goal is to get this to show up within 2.5 seconds.
The biggest culprit for a poor LCP is often a large, unoptimized image. The browser can't paint what it hasn't downloaded. My approach is to be ruthless with images from the start.
// Using a modern framework like Next.js makes this systematic
import Image from 'next/image';

function ProductHero() {
  return (
    <div className="hero">
      <Image
        src="/product-hero.avif"
        alt="Our flagship product"
        width={1200}
        height={630}
        priority // This tells Next.js to preload this image
        quality={80} // Often, 80-85% is indistinguishable from 100%
        sizes="100vw"
        style={{ width: '100%', height: 'auto' }} // Prevents layout shift
      />
    </div>
  );
}
But it's not just about images. The browser needs to discover and fetch all critical resources as fast as possible. I use the <head> of my document like a mission control center, giving the browser a head start.
<head>
  <!-- Tell the browser about critical fonts immediately -->
  <link rel="preload" href="/fonts/Inter.woff2" as="font" type="font/woff2" crossorigin>

  <!-- If my LCP is a CSS background image, preload it -->
  <link rel="preload" as="image" href="hero-bg.webp" imagesrcset="hero-bg-800.webp 800w, hero-bg-1200.webp 1200w">

  <!-- Connect to important third-party domains early -->
  <link rel="preconnect" href="https://my-cdn.example.com">

  <!-- Inline the minimal CSS needed for the initial page view -->
  <style>
    .hero, .primary-headline, .main-nav {
      /* Only the absolute essentials here */
    }
  </style>
</head>
The second grade, First Input Delay (FID), is all about responsiveness. It measures the time from when a user first taps a button, clicks a link, or uses a custom control to when the browser can actually begin processing that action. (Google has since replaced FID with Interaction to Next Paint, or INP, as the official responsiveness vital, but the main-thread advice below applies to both.) A poor score here makes your site feel sluggish and broken. The enemy is heavy JavaScript tasks that "block" the main thread.
I've learned that a lot of my initial JavaScript work isn't needed right away. The key is to defer it and to break up any unavoidable long tasks.
<!-- The `defer` attribute is my best friend for non-critical scripts. -->
<script src="chat-widget.js" defer></script>
<script src="analytics.js" defer></script>

<!-- For critical interactivity, I keep the code lean and separate. -->
<script type="module">
  // This loads as an ES module, which is deferred by default.
  import { setupCoreCart } from './core-cart.js';
  document.addEventListener('DOMContentLoaded', setupCoreCart);
</script>
// When processing large amounts of data, I avoid blocking the thread.
function renderBigList(items) {
  const CHUNK_SIZE = 50;
  let i = 0;

  function processNextChunk() {
    const chunk = items.slice(i, i + CHUNK_SIZE);
    // Do a small amount of work with the chunk...
    appendToDOM(chunk);
    i += CHUNK_SIZE;

    if (i < items.length) {
      // Yielding control back to the browser keeps it responsive.
      setTimeout(processNextChunk, 0);
    }
  }

  processNextChunk();
}
For truly heavy calculations, I move them off the main thread completely using a Web Worker. This keeps the interface silky smooth.
// main.js
const dataWorker = new Worker('./data-processor.js');

calculateButton.addEventListener('click', () => {
  // Immediately show a loading state.
  showSpinner();
  // Send the data to the worker thread.
  dataWorker.postMessage(largeDataset);
});

// Listen for the result from the worker.
dataWorker.addEventListener('message', (event) => {
  hideSpinner();
  displayResults(event.data);
});

// data-processor.js
self.addEventListener('message', (event) => {
  const result = performComplexCalculation(event.data);
  // Send the result back to the main thread.
  self.postMessage(result);
});
The third grade, Cumulative Layout Shift (CLS), is the most visual metric. It quantifies how much your page content jumps around unexpectedly during loading. There's nothing more frustrating than trying to read an article or click a button, only to have it move at the last second. I aim for a CLS score under 0.1.
The most common fix is also the simplest: always include width and height attributes on your images and videos. This simple act reserves the space before the asset loads.
<img
  src="article-image.jpg"
  alt="Descriptive text"
  width="800"
  height="600"
  loading="lazy"
>
For dynamic content like ads, embedded widgets, or banners that load later, you must reserve space. An empty div that suddenly grows pushes everything else down.
.advertisement-container {
  min-height: 280px; /* Reserve space for the tallest expected ad */
  background-color: #fafafa; /* A neutral placeholder color */
  border: 1px dotted #ddd;
}

.widget-embed {
  aspect-ratio: 16 / 9; /* If you know the proportions */
  width: 100%;
}
Fonts are a major source of layout shift. A web font loading after a system font can cause text to reflow. I use font-display: swap and consider using size-adjust to better match fallback fonts.
@font-face {
  font-family: 'BrandFont';
  src: url('brandfont.woff2') format('woff2');
  font-display: swap; /* Use the fallback first, then swap in the web font */
}

/* A metric-adjusted fallback face so the swap barely moves anything */
@font-face {
  font-family: 'BrandFont Fallback';
  src: local('Arial');
  size-adjust: 105%; /* Tune until the fallback occupies the same space as BrandFont */
}
Modern CSS is a huge help. The contain property lets you tell the browser that an element's layout is isolated, preventing changes inside it from affecting the rest of the page.
.news-feed-widget {
  contain: layout style paint;
  /* This box is now mostly independent. */
}
Resource loading is a broad category that underpins everything else. It's the art of telling the browser what's important, in what order, and in what format. My strategy involves a combination of modern HTML attributes and thoughtful prioritization.
The loading attribute for images and iframes is a game-changer. It allows native browser-level lazy loading.
<!-- Load this image only when it's near the viewport -->
<img src="product-gallery-5.jpg" loading="lazy" alt="Alternative view">
<!-- This video poster is important, load it eagerly -->
<img src="video-poster.jpg" loading="eager" alt="Video: How to use">
For the absolute most critical image—often the LCP candidate—I use fetchpriority="high".
<img
  src="main-hero.avif"
  fetchpriority="high"
  alt="Main Hero"
  width="1200"
  height="630">
Using the modern <picture> element with next-gen formats like AVIF and WebP ensures users get the smallest, fastest file their browser supports.
<picture>
  <source type="image/avif" srcset="photo.avif 1x, photo@2x.avif 2x">
  <source type="image/webp" srcset="photo.webp 1x, photo@2x.webp 2x">
  <img src="photo.jpg" srcset="photo@2x.jpg 2x" alt="A description" width="800" height="600">
</picture>
JavaScript execution remains the single biggest obstacle to a fluid experience. My goal is to ship less of it, and to ship it smarter. Code splitting is the first tool in my box. It means breaking my application's JavaScript into smaller pieces and only loading what's needed for the current page or interaction.
In a React application, this is beautifully simple with React.lazy and Suspense.
import React, { Suspense } from 'react';

// These components are split into separate JavaScript files.
const ProductRecommendations = React.lazy(() => import('./ProductRecommendations'));
const CustomerReviews = React.lazy(() => import('./CustomerReviews'));

function ProductPage() {
  return (
    <div>
      <ProductDetails />

      {/* This chunk is fetched only when the component first renders */}
      <Suspense fallback={<div>Loading recommendations...</div>}>
        <ProductRecommendations />
      </Suspense>

      <Suspense fallback={<ReviewsSkeleton />}>
        <CustomerReviews />
      </Suspense>
    </div>
  );
}
I regularly audit my bundles. Tools like Webpack Bundle Analyzer show me exactly what's in my JavaScript packages. Often, I find large libraries where I only use a few functions. In those cases, I can import those functions directly.
// Instead of this:
import _ from 'lodash';
const result = _.chunk(myArray, 2);
// I can do this:
import chunk from 'lodash/chunk';
const result = chunk(myArray, 2);
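To make that audit repeatable, the analyzer can be wired straight into the build. Here is a minimal sketch, assuming the webpack-bundle-analyzer package is installed; the options shown are illustrative, not a prescription.
// webpack.config.js -- a sketch, assuming webpack-bundle-analyzer is installed
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');
module.exports = {
  // ...your existing entry, output, and loader configuration...
  plugins: [
    new BundleAnalyzerPlugin({
      analyzerMode: 'static',             // write an HTML report instead of starting a server
      reportFilename: 'bundle-report.html',
      openAnalyzer: false                 // keep CI runs headless
    })
  ]
};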
For styles, unused CSS is dead weight. I use tools like PurgeCSS in my build process to automatically strip out any CSS selectors that aren't being used in my HTML templates.
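As a concrete illustration, PurgeCSS can run as a PostCSS plugin. This is a sketch assuming the @fullhuman/postcss-purgecss package; the content globs and safelist entries are placeholders, and the exact import form varies slightly between plugin versions.
// postcss.config.js -- a sketch, assuming @fullhuman/postcss-purgecss is installed
const purgecss = require('@fullhuman/postcss-purgecss');
module.exports = {
  plugins: [
    purgecss({
      // Scan these templates for the class names that are actually used.
      content: ['./src/**/*.html', './src/**/*.jsx'],
      // Keep classes that are added dynamically and never appear in templates.
      safelist: ['is-open', 'has-error']
    })
  ]
};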
Finally, monitoring is what turns a one-time fix into a lasting performance culture. You can't manage what you don't measure. I use a combination of real-user data and synthetic testing.
The web-vitals JavaScript library is an easy way to collect real measurements from actual visitors.
// Assumes web-vitals v3; in v4 and later, onFID was removed in favor of onINP.
import {onLCP, onFID, onCLS} from 'web-vitals';

function sendToMyDashboard({name, value, id}) {
  const data = {
    metric: name,
    value: name === 'CLS' ? value : Math.round(value), // CLS is a decimal
    page: location.pathname,
    sessionId: id
  };
  // Send to my internal analytics or a service like Google Analytics
  navigator.sendBeacon('/api/web-vitals', JSON.stringify(data));
}

onLCP(sendToMyDashboard);
onFID(sendToMyDashboard);
onCLS(sendToMyDashboard);
In my development workflow and continuous integration (CI) pipeline, I run automated Lighthouse audits. This catches regressions before they reach users.
I can set performance budgets in my package.json or a dedicated config file; in this custom format, the timing metrics are in milliseconds and the total JavaScript weight is in kilobytes.
{
  "performanceBudgets": {
    "lcp": 2500,
    "fid": 100,
    "cls": 0.1,
    "totalJSWeight": 170
  }
}
Then, in my CI script, a Lighthouse check will fail the build if these budgets are exceeded, forcing a review of the new code. This shifts performance left in the development process, making it everyone's responsibility.
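For reference, here is a minimal sketch of how such budgets could be expressed as Lighthouse CI assertions, assuming the @lhci/cli package and a lighthouserc.js config; the URL and thresholds are illustrative. Since FID cannot be measured in a lab run, total-blocking-time stands in as its proxy.
// lighthouserc.js -- a sketch, assuming @lhci/cli ("lhci autorun") in the pipeline
module.exports = {
  ci: {
    collect: {
      url: ['http://localhost:3000/'], // illustrative URL for the built site
      numberOfRuns: 3
    },
    assert: {
      assertions: {
        'largest-contentful-paint': ['error', { maxNumericValue: 2500 }],
        'cumulative-layout-shift': ['error', { maxNumericValue: 0.1 }],
        'total-blocking-time': ['error', { maxNumericValue: 200 }]
      }
    }
  }
};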
Optimizing for Core Web Vitals isn't a checklist; it's a mindset. It's about thinking about the user's journey from the moment they request your page. By focusing on these specific, user-centric metrics—loading the main content fast, ensuring immediate interactivity, and guaranteeing visual stability—you build experiences that are not just fast, but feel robust and high-quality. The code patterns and strategies here are the practical steps I take to make that feeling a reality. The result is a website that works better for everyone.