Every KB you ship to someone else's site is a KB you didn't ask permission to ship. If a dependency is over 20KB gzipped and used in less than 80% of sessions, it should be lazy-loaded.
When we built IssueCapture — a bug-reporting widget that injects into your customers' production sites — bundle size wasn't a nice-to-have. It was a hard constraint. Nobody gives you a second chance if your widget adds 300KB to their page weight and drags down their LCP.
Our final numbers: 40KB gzipped for the core widget, 13KB for screenshot capture, 107KB for the annotation canvas. The last two load only when the user explicitly needs them. Here's how we structured that.
The problem with eager imports
import html2canvas from 'html2canvas';
import { fabric } from 'fabric';
Both libraries land in your initial bundle whether or not the user ever clicks the annotation tool. Fabric.js alone is 107KB gzipped. The fix is dynamic import().
Dynamic import basics
import() is a function-like syntactic form that returns a promise. The browser only fetches the module when execution reaches that expression.
async function openAnnotationTool() {
const { fabric } = await import('fabric');
const canvas = new fabric.Canvas('c');
}
The chunk gets fetched on demand, cached by the browser, and never shipped to users who don't need it.
How we structured it
let screenshotModule = null;
let annotationModule = null;
async function loadScreenshot() {
if (!screenshotModule) {
screenshotModule = await import('./screenshot');
}
return screenshotModule;
}
async function loadAnnotation() {
if (!annotationModule) {
annotationModule = await import('./annotation');
}
return annotationModule;
}
Caching the module reference means subsequent calls resolve immediately from memory — the await still ticks through a microtask, but there's no second network fetch, so repeat opens stay snappy.
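A variant worth knowing (a sketch, not what we ship): cache the in-flight promise rather than the resolved module, so two clicks that race during the first load share a single request. The once helper name is purely illustrative:

```javascript
// Cache the in-flight promise, not the awaited module: concurrent callers
// during the first load all share the same request.
function once(loader) {
  let promise = null;
  return () => (promise ??= loader());
}

// Illustrative usage — same shape as loadAnnotation() above:
const loadAnnotation = once(() => import('./annotation.js'));
```

The browser's module registry already dedupes repeated import() calls for the same specifier, so this mostly buys you a single place to hang retry or logging logic later.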
Loading states matter
The one place developers get lazy-loading wrong is the loading state. If you fire import() on a button click and show nothing for 500ms, users will click again.
async function handleScreenshotClick() {
setLoadingState('screenshot');
try {
const { captureScreenshot } = await loadScreenshot();
const imageData = await captureScreenshot(document.body);
setScreenshotData(imageData);
} catch (err) {
setError('Screenshot failed to load');
} finally {
setLoadingState(null);
}
}
The spinner is essential. Users who click and see immediate feedback will wait. Users who click and see nothing will leave.
Preload hints for the likely path
If someone opens the bug report modal, there's a decent chance they'll want a screenshot. Start fetching in the background:
function handleModalOpen() {
setModalOpen(true);
import('./screenshot').catch(() => {
// Silently ignore prefetch failures
});
}
Or with an explicit modulepreload hint:
function preloadChunk(src) {
const link = document.createElement('link');
link.rel = 'modulepreload';
link.href = src;
document.head.appendChild(link);
}
The tradeoff: prefetching consumes bandwidth even if the user never triggers the feature. Use it for high-probability paths only. Screenshot capture is used in roughly 60% of our reports, so the prefetch pays off.
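One way to keep that tradeoff honest (a sketch — navigator.connection is the non-standard Network Information API, absent in Safari and Firefox, so every field is treated as optional):

```javascript
// Gate the background prefetch on connection quality. Where the API is
// missing, default to prefetching.
function shouldPrefetch() {
  const conn = typeof navigator !== 'undefined' && navigator.connection;
  if (!conn) return true;              // no signal: assume it's fine
  if (conn.saveData) return false;     // user asked to save data
  const type = conn.effectiveType || '';
  return !type.includes('2g');         // skip on slow-2g / 2g
}

function handleModalOpen() {
  setModalOpen(true); // same modal handler as above
  if (shouldPrefetch()) {
    import('./screenshot').catch(() => {}); // prefetch failures are non-fatal
  }
}
```

The exact thresholds are a judgment call; the point is that a speculative fetch should be the first thing you drop on constrained connections.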
Measuring the actual impact
Lighthouse for LCP and TTI:
npx lighthouse https://example.com \
--output json \
--output-path stdout \
--chrome-flags="--headless" \
| jq '.audits["largest-contentful-paint"].numericValue'
Splitting the screenshot module off the initial bundle improved TTI by roughly 180ms on simulated 4G.
Bundle analyzer (Vite):
import { visualizer } from 'rollup-plugin-visualizer';
export default {
plugins: [
visualizer({ open: true, gzip: true })
]
};
Look for large modules in your entry chunk — those are candidates for lazy-loading.
When lazy-loading is not the answer
For features that nearly all users need in nearly all sessions, just ship them eagerly. The mental overhead of managing async loading states is real. Only take it on when the size savings are meaningful.
We ship a 40KB widget that can grow to 160KB when fully loaded — but most users never see that 160KB. They see 40KB, and the rest loads invisibly when they need it.
Where this pattern breaks down: if your lazy-loaded chunk is on a CDN with a cold cache and the user is on a flaky mobile connection, the loading state becomes the experience. We haven't solved that cleanly yet.
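One mitigation we've sketched but not battle-tested: retry the import with a short backoff, so a transient blip doesn't strand the user on a spinner. Caveat: some browsers cache a failed module resolution, so bare retries of the same specifier can keep failing — this sketch just retries whatever loader you hand it.

```javascript
// Retry a dynamic import a few times with exponential backoff.
// Helps with transient blips; a dead connection still fails, just later.
async function importWithRetry(loader, retries = 2, delayMs = 500) {
  for (let attempt = 0; ; attempt += 1) {
    try {
      return await loader();
    } catch (err) {
      if (attempt >= retries) throw err; // out of retries: surface the error
      await new Promise((resolve) => setTimeout(resolve, delayMs * 2 ** attempt));
    }
  }
}

// Illustrative usage: importWithRetry(() => import('./annotation.js'))
```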