Introduction
In this article, we'll explore how to build a pure frontend image compression system that runs entirely in the browser. Unlike traditional image compression services that upload your photos to remote servers for processing, this approach leverages WebAssembly codecs to compress images locally, ensuring complete privacy while delivering professional-quality results.
Why Browser-Based Compression?
The Privacy Problem
Traditional image compression services follow this pattern:
❌ User uploads image to server
❌ Image travels over internet to unknown infrastructure
❌ Server processes and stores the image temporarily
❌ Compressed image downloads back to user
❌ User has no control over data retention
This creates serious privacy concerns:
- Personal photos may contain sensitive information
- Business documents could leak proprietary content
- Medical images could run afoul of HIPAA and similar regulations
- Legal documents could breach attorney-client privilege

On top of the privacy risk, uploading adds network overhead for large image files.
The Browser-Based Solution
By performing compression locally:
✅ Images never leave the user's device
✅ Zero network transmission of image data
✅ Instant processing without upload delays
✅ Works offline after initial load
✅ Complete data sovereignty and privacy
Architecture Overview
Our system uses @jsquash, a collection of WebAssembly-based image codecs that run directly in the browser.
Core Data Structures
Image File Interface
Tracks all metadata for uploaded and processed images:
```typescript
interface ImageFile {
  id: string;                   // Unique identifier
  file: File;                   // Original file object
  previewUrl: string;           // Blob URL for preview
  compressedUrl?: string;       // Result blob URL
  compressedFileName?: string;  // Generated filename
  originalSize: number;         // File size in bytes
  compressedSize?: number;      // Compressed size
  originalType: string;         // MIME type
}
```
Codecs Interface
Abstraction over all supported image formats:
```typescript
interface Codecs {
  // JPEG codec
  decodeJpeg: (buffer: ArrayBuffer) => Promise<ImageData>;
  encodeJpeg: (imageData: ImageData, options?: { quality: number }) => Promise<ArrayBuffer>;

  // PNG codec
  decodePng: (buffer: ArrayBuffer) => Promise<ImageData>;
  encodePng: (imageData: ImageData) => Promise<ArrayBuffer>;

  // WebP codec
  decodeWebp: (buffer: ArrayBuffer) => Promise<ImageData>;
  encodeWebp: (imageData: ImageData, options?: { quality: number }) => Promise<ArrayBuffer>;

  // AVIF codec
  decodeAvif: (buffer: ArrayBuffer) => Promise<ImageData>;
  encodeAvif: (imageData: ImageData, options?: { quality: number }) => Promise<ArrayBuffer>;
}
```
Supported Formats
```typescript
const COMPRESS_FORMATS = [
  { value: "jpeg", label: "JPEG", ext: "jpg" },
  { value: "png", label: "PNG", ext: "png" },
  { value: "webp", label: "WebP", ext: "webp" },
  { value: "avif", label: "AVIF", ext: "avif" },
];
```
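As a small illustration (not part of the original code), a lookup from a file's MIME type back to its format entry can be sketched like this, re-declaring `COMPRESS_FORMATS` so the snippet runs standalone:

```typescript
const COMPRESS_FORMATS = [
  { value: "jpeg", label: "JPEG", ext: "jpg" },
  { value: "png", label: "PNG", ext: "png" },
  { value: "webp", label: "WebP", ext: "webp" },
  { value: "avif", label: "AVIF", ext: "avif" },
];

// Map a MIME type like "image/webp" to its format entry;
// returns undefined for unsupported types such as "image/gif".
function formatForMime(mime: string) {
  const value = mime.replace(/^image\//, "");
  return COMPRESS_FORMATS.find((f) => f.value === value);
}
```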
Codec Sources
```typescript
// Base URL of the esm.sh CDN that serves the codec modules
const ESM_SH = "https://esm.sh";

const CODEC_URLS = [
  { name: "JPEG", url: `${ESM_SH}/@jsquash/jpeg` },
  { name: "PNG", url: `${ESM_SH}/@jsquash/png` },
  { name: "WebP", url: `${ESM_SH}/@jsquash/webp` },
  { name: "AVIF", url: `${ESM_SH}/@jsquash/avif` },
];
```
Implementation Deep Dive
Step 1: Loading Codecs with Progress Tracking
Dynamic imports allow loading codecs on-demand with progress feedback:
```typescript
export async function loadCodecs(
  onProgress?: (progress: number) => void
): Promise<Codecs> {
  const modules: Record<string, any> = {};

  // Sequential loading with progress updates
  for (let i = 0; i < CODEC_URLS.length; i++) {
    const codec = CODEC_URLS[i];
    // Dynamic import from esm.sh. The eval wrapper hides the import
    // from bundlers, so the URL is resolved in the browser at runtime
    // instead of being rewritten at build time.
    const mod = await eval(`import("${codec.url}")`);
    modules[codec.name] = mod.default || mod;
    if (onProgress) {
      onProgress(Math.round(((i + 1) / CODEC_URLS.length) * 100));
    }
  }

  return {
    decodeJpeg: modules["JPEG"]?.decode,
    encodeJpeg: modules["JPEG"]?.encode,
    decodePng: modules["PNG"]?.decode,
    encodePng: modules["PNG"]?.encode,
    decodeWebp: modules["WebP"]?.decode,
    encodeWebp: modules["WebP"]?.encode,
    decodeAvif: modules["AVIF"]?.decode,
    encodeAvif: modules["AVIF"]?.encode,
  };
}
```
Why sequential loading?
- WASM files are fetched automatically when codecs load
- Prevents overwhelming network with parallel requests
- Provides meaningful progress feedback
- Each codec is ~100-500KB WASM binary
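The sequential-loading-with-progress pattern can be expressed generically, with the percentage arithmetic pulled out into a pure function. This is a sketch with hypothetical helper names, not part of the original code:

```typescript
// Percentage complete after finishing the step at `index` (0-based)
// out of `total` steps, rounded to an integer.
function progressAfter(index: number, total: number): number {
  return Math.round(((index + 1) / total) * 100);
}

// Generic sequential loader: runs each async task in order and
// reports progress after each one completes.
async function loadSequentially<T>(
  tasks: Array<() => Promise<T>>,
  onProgress?: (pct: number) => void
): Promise<T[]> {
  const results: T[] = [];
  for (let i = 0; i < tasks.length; i++) {
    results.push(await tasks[i]());
    onProgress?.(progressAfter(i, tasks.length));
  }
  return results;
}
```

With four codecs, the callback fires with 25, 50, 75, and 100, which maps directly onto a loading bar.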
Step 2: Service Worker for WASM Caching
The Service Worker caches WASM binaries for offline usage:
Service Worker Implementation:
```javascript
// compress-sw.js
const CACHE_NAME = 'compress-wasm-cache-v1';
const ESM_SH = 'https://esm.sh';

// WASM files to cache
const WASM_FILES = [
  // JPEG codec
  `${ESM_SH}/@jsquash/jpeg@1.6.0/es2022/mozjpeg_enc.wasm`,
  `${ESM_SH}/@jsquash/jpeg@1.6.0/es2022/mozjpeg_dec.wasm`,
  // PNG codec
  `${ESM_SH}/@jsquash/png@2.0.0/es2022/png.wasm`,
  // WebP codec
  `${ESM_SH}/@jsquash/webp@1.0.0/es2022/webp.wasm`,
  `${ESM_SH}/@jsquash/webp@1.0.0/es2022/webp_enc.wasm`,
  // AVIF codec
  `${ESM_SH}/@jsquash/avif@1.0.0/es2022/avif.wasm`,
  `${ESM_SH}/@jsquash/avif@1.0.0/es2022/avif_enc.wasm`,
];

// Install: Cache all WASM files
self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => {
      return Promise.all(
        WASM_FILES.map((url) =>
          cache.add(url).catch((err) => {
            console.error(`[Compress SW] Failed to cache ${url}:`, err);
          })
        )
      );
    })
  );
  self.skipWaiting();
});

// Fetch: Serve from cache if available
self.addEventListener('fetch', (event) => {
  const url = new URL(event.request.url);
  // Only handle esm.sh WASM requests
  if (url.hostname === 'esm.sh' && url.pathname.endsWith('.wasm')) {
    event.respondWith(
      caches.match(event.request).then((cachedResponse) => {
        if (cachedResponse) {
          console.log('[Compress SW] Returning cached WASM:', url.pathname);
          return cachedResponse;
        }
        // Fetch from network and cache
        return fetch(event.request).then((networkResponse) => {
          if (networkResponse.status === 200) {
            const responseToCache = networkResponse.clone();
            caches.open(CACHE_NAME).then((cache) => {
              cache.put(event.request, responseToCache);
            });
          }
          return networkResponse;
        });
      })
    );
  }
});
```
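The worker still has to be registered from the page before it can intercept anything. A minimal sketch follows, assuming `compress-sw.js` is served from the site root; the same URL-filtering predicate the worker's fetch handler uses is extracted as a plain function so it can be reasoned about (and tested) in isolation:

```typescript
// Register the compression Service Worker, if the browser supports it.
// globalThis is used so the snippet also type-checks outside DOM contexts.
export function registerCompressSW(): void {
  const nav = (globalThis as any).navigator;
  if (nav && "serviceWorker" in nav) {
    nav.serviceWorker
      .register("/compress-sw.js")
      .catch((err: unknown) => console.error("SW registration failed:", err));
  }
}

// The same filter the worker's fetch handler applies:
// only esm.sh requests for .wasm files are intercepted.
export function isWasmRequest(rawUrl: string): boolean {
  const url = new URL(rawUrl);
  return url.hostname === "esm.sh" && url.pathname.endsWith(".wasm");
}
```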
Step 3: Image Decoding
Decode any supported format to raw ImageData:
```typescript
const decodeImage = async (
  buffer: ArrayBuffer,
  type: string
): Promise<ImageData> => {
  if (!codecs) throw new Error("Codecs not loaded");

  let imageData: ImageData | null = null;
  switch (type) {
    case "image/jpeg":
      imageData = await codecs.decodeJpeg(buffer);
      break;
    case "image/png":
      imageData = await codecs.decodePng(buffer);
      break;
    case "image/webp":
      imageData = await codecs.decodeWebp(buffer);
      break;
    case "image/avif":
      imageData = await codecs.decodeAvif(buffer);
      break;
    default:
      // Unrecognized MIME types fall back to the JPEG decoder
      imageData = await codecs.decodeJpeg(buffer);
  }
  return imageData!;
};
```
Step 4: Image Encoding
Re-encode raw pixels to target format with quality control:
```typescript
const encodeImage = async (
  imageData: ImageData,
  format: string,
  q: number
): Promise<ArrayBuffer> => {
  if (!codecs) throw new Error("Codecs not loaded");

  switch (format) {
    case "jpeg":
      return codecs.encodeJpeg(imageData, { quality: q });
    case "png":
      // PNG uses lossless compression, no quality parameter
      return codecs.encodePng(imageData);
    case "webp":
      return codecs.encodeWebp(imageData, { quality: q });
    case "avif":
      return codecs.encodeAvif(imageData, { quality: q });
    default:
      // Unrecognized format strings fall back to WebP
      return codecs.encodeWebp(imageData, { quality: q });
  }
};
```
Step 5: Compression Pipeline
Complete compression function:
```typescript
const compressImage = async (
  imageFile: ImageFile
): Promise<ImageFile> => {
  try {
    // Step 1: Read file as ArrayBuffer
    const arrayBuffer = await imageFile.file.arrayBuffer();

    // Step 2: Decode to raw pixels
    const imageData = await decodeImage(arrayBuffer, imageFile.originalType);

    // Step 3: Re-encode with target settings
    // (compressFormat and quality come from component state)
    const encodedBuffer = await encodeImage(imageData, compressFormat, quality);

    // Step 4: Create blob with proper MIME type (falling back to WebP)
    const format =
      COMPRESS_FORMATS.find((f) => f.value === compressFormat) ||
      COMPRESS_FORMATS[2];
    const mimeType = `image/${format.value}`;
    const compressedBlob = new Blob([encodedBuffer], { type: mimeType });

    // Step 5: Generate blob URL
    const compressedUrl = URL.createObjectURL(compressedBlob);
    const originalName = imageFile.file.name.replace(/\.[^/.]+$/, "");

    return {
      ...imageFile,
      compressedUrl,
      compressedFileName: `${originalName}_compressed.${format.ext}`,
      compressedSize: compressedBlob.size,
    };
  } catch (error) {
    console.error("Compression failed:", error);
    return imageFile; // Return the original entry unchanged on failure
  }
};
```
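The filename logic in Step 5 can be isolated into a small pure helper. This is a hypothetical refactor that mirrors the regex above, shown standalone:

```typescript
// Strip the last extension and append the compression suffix,
// e.g. "photo.jpg" + "webp" -> "photo_compressed.webp".
// Only the final extension is stripped, so "holiday.photo.png"
// keeps its "holiday.photo" base.
function buildCompressedName(originalName: string, ext: string): string {
  const base = originalName.replace(/\.[^/.]+$/, "");
  return `${base}_compressed.${ext}`;
}
```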
Step 6: Batch Processing
Process multiple images in parallel:
```typescript
const handleCompress = async () => {
  if (images.length === 0 || !codecs) return;

  setCompressing(true);
  try {
    // Process all images in parallel
    const compressed = await Promise.all(
      images.map((img) => compressImage(img))
    );
    setImages(compressed);
  } catch (error) {
    console.error("Compression failed:", error);
  } finally {
    setCompressing(false);
  }
};
```
Why parallel processing?
- Each image compression is independent
- The async decode/encode steps interleave under Promise.all
- No DOM manipulation occurs during processing
Keep in mind that the WASM codecs still execute on the main thread, so a very heavy batch can briefly block the UI (see the Web Workers discussion below).
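For very large batches, firing every compression at once can also spike memory, since each decoded ImageData holds width × height × 4 bytes of raw pixels. A chunked variant (a sketch, not part of the original code) bounds how many images are in flight at a time:

```typescript
// Split an array into consecutive chunks of at most `size` items.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Process items a few at a time instead of all at once;
// `worker` would be compressImage in this article's pipeline.
async function compressInChunks<T, R>(
  items: T[],
  worker: (item: T) => Promise<R>,
  size = 4
): Promise<R[]> {
  const results: R[] = [];
  for (const group of chunk(items, size)) {
    results.push(...(await Promise.all(group.map(worker))));
  }
  return results;
}
```

A chunk size of around 4 keeps peak memory proportional to four decoded images instead of the whole batch.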
Step 7: Memory Management
Proper cleanup prevents memory leaks:
```typescript
const handleRemove = (id: string) => {
  setImages((prev) => {
    const img = prev.find((i) => i.id === id);
    if (img) {
      // Revoke blob URLs to free memory
      URL.revokeObjectURL(img.previewUrl);
      if (img.compressedUrl) {
        URL.revokeObjectURL(img.compressedUrl);
      }
    }
    return prev.filter((i) => i.id !== id);
  });
};

const handleNewImage = () => {
  // Cleanup all blob URLs
  images.forEach((img) => {
    URL.revokeObjectURL(img.previewUrl);
    if (img.compressedUrl) {
      URL.revokeObjectURL(img.compressedUrl);
    }
  });
  setImages([]);
};
```
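The same cleanup should also run when the component unmounts, so blob URLs are not leaked when the user navigates away. A small helper (hypothetical name) gathers every URL a list holds, making that teardown a one-liner:

```typescript
// The subset of ImageFile that carries blob URLs
interface BlobUrlHolder {
  previewUrl: string;
  compressedUrl?: string;
}

// Gather every blob URL an image list holds, so one cleanup pass
// (e.g. a React useEffect teardown) can revoke them all.
function collectBlobUrls(images: BlobUrlHolder[]): string[] {
  return images.flatMap((img) =>
    img.compressedUrl ? [img.previewUrl, img.compressedUrl] : [img.previewUrl]
  );
}

// Hypothetical usage inside the component:
// useEffect(() => {
//   return () => collectBlobUrls(images).forEach((u) => URL.revokeObjectURL(u));
// }, []);
```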
Step 8: Statistics Calculation
Calculate compression savings:
```typescript
const formatFileSize = (bytes: number): string => {
  if (bytes < 1024) return bytes + " B";
  if (bytes < 1024 * 1024) return (bytes / 1024).toFixed(1) + " KB";
  return (bytes / (1024 * 1024)).toFixed(2) + " MB";
};

const getSavings = (original: number, compressed: number): string => {
  const savings = ((original - compressed) / original) * 100;
  return savings.toFixed(1) + "%";
};
```
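A quick worked example shows how the two helpers combine into the per-image summary string (the helpers are repeated so the snippet runs on its own):

```typescript
const formatFileSize = (bytes: number): string => {
  if (bytes < 1024) return bytes + " B";
  if (bytes < 1024 * 1024) return (bytes / 1024).toFixed(1) + " KB";
  return (bytes / (1024 * 1024)).toFixed(2) + " MB";
};

const getSavings = (original: number, compressed: number): string => {
  const savings = ((original - compressed) / original) * 100;
  return savings.toFixed(1) + "%";
};

// A 5,000,000-byte original compressed down to 1,500,000 bytes:
const summary = `${formatFileSize(1_500_000)} (saved ${getSavings(5_000_000, 1_500_000)})`;
console.log(summary); // "1.43 MB (saved 70.0%)"
```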
Format Comparison
Format Characteristics
| Format | Compression | Quality Control | Best For |
|---|---|---|---|
| JPEG | Lossy | Quality 1-100 | Photos, complex images |
| PNG | Lossless | None | Graphics, transparency |
| WebP | Lossy/Lossless | Quality 1-100 | Web, modern browsers |
| AVIF | Lossy/Lossless | Quality 1-100 | Maximum compression |
Typical Compression Ratios
| Original | Format | Quality | Typical Savings |
|---|---|---|---|
| 5MB JPEG | WebP | 75% | 40-60% |
| 5MB JPEG | AVIF | 75% | 60-80% |
| 2MB PNG | WebP | Lossless | 30-50% |
| 2MB PNG | AVIF | 90% | 50-70% |
Technical Stack
| Component | Technology | Purpose |
|---|---|---|
| Framework | React 19 | UI components |
| Build Tool | Next.js 16 | SSR and static generation |
| Styling | Tailwind CSS 4 | Utility-first CSS |
| JPEG Codec | @jsquash/jpeg | MozJPEG encoding/decoding |
| PNG Codec | @jsquash/png | PNG encoding/decoding |
| WebP Codec | @jsquash/webp | WebP encoding/decoding |
| AVIF Codec | @jsquash/avif | AVIF encoding/decoding |
| Module CDN | esm.sh | ES module delivery |
| Caching | Service Worker | WASM offline caching |
| Icons | Lucide React | UI icons |
Why This Architecture?
Why @jsquash?
- Pure WebAssembly: No native dependencies or browser APIs
- Consistent API: Same interface across all formats
- Optimized codecs: Uses industry-standard libraries (mozjpeg, libwebp, libavif)
- Small bundle size: Each codec ~100-500KB WASM
- Cross-platform: Works in any modern browser
Why Service Worker Instead of LocalStorage?
| Feature | Service Worker | LocalStorage |
|---|---|---|
| Size limit | Large origin quota (browser dependent, typically hundreds of MB) | 5-10 MB |
| Binary data | Native support | Requires base64 encoding |
| Offline capability | Automatic fetch interception | Manual cache management |
| Background caching | Supported | Not applicable |
Why Not Use Web Workers?
While Web Workers could offload processing:
- ImageData transfer - Moving large ImageData objects to workers has copy overhead (transferable ArrayBuffers mitigate but don't eliminate it)
- Simpler debugging - Single-threaded code is easier to debug
- Sufficient performance - @jsquash codecs are fast enough that main-thread work stays acceptable for typical batch sizes
Note that WebAssembly executes on whichever thread calls it - here the main thread - so heavy encodes (especially AVIF) can briefly block the UI. For processing hundreds of images, a Worker pool would be beneficial.
Browser Compatibility
Requirements:
- WebAssembly: Chrome 57+, Firefox 52+, Safari 11+, Edge 16+
- Service Worker: Chrome 40+, Firefox 44+, Safari 11.3+, Edge 17+
- ES Modules: Chrome 61+, Firefox 60+, Safari 10.1+, Edge 16+
- Dynamic Import: Chrome 63+, Firefox 67+, Safari 11.1+, Edge 79+
Note: AVIF encoding is CPU-intensive and may be slow on mobile devices. JPEG and WebP offer the best balance of speed and compression.
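Before loading the codecs, the page can feature-detect the two requirements up front. A defensive sketch (helper names are hypothetical; `globalThis` is used so the checks also compile outside DOM typings):

```typescript
// WebAssembly is a hard requirement: without it no codec can run.
function hasWasm(): boolean {
  const wa = (globalThis as any).WebAssembly;
  return typeof wa === "object" && typeof wa.instantiate === "function";
}

// The Service Worker is optional: without it only offline caching is lost.
function hasServiceWorker(): boolean {
  const nav = (globalThis as any).navigator;
  return !!nav && "serviceWorker" in nav;
}

// Hypothetical gate before calling loadCodecs()
function canCompress(): boolean {
  return hasWasm();
}
```

When `hasWasm()` is false the UI should show a plain "browser not supported" message rather than attempting to load the codec modules.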
Try It Yourself
Ready to compress your images with complete privacy? Everything happens entirely in your browser: the files never leave your device.
Conclusion
Browser-based image compression using @jsquash demonstrates the power of modern web technologies. By leveraging WebAssembly and the Service Worker Cache API, we can deliver professional-grade image compression without compromising user privacy.
Key advantages:
- Privacy-first: No image uploads required
- Format flexibility: JPEG, PNG, WebP, and AVIF support
- Offline capable: Works without internet after initial load
- Cost-effective: No server infrastructure needed
- Instant results: No network latency
This architecture is ideal for any application handling sensitive images where privacy, speed, and flexibility are paramount.
Further Reading
- @jsquash Documentation
- WebAssembly Documentation
- Service Worker API MDN
- ImageData API MDN
- MozJPEG GitHub
- WebP Documentation
- AVIF Specification
Want to add powerful image compression to your own web applications? The @jsquash library makes it remarkably simple to deploy professional-grade codecs directly to users' browsers.