<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Gaurav Bhowmick</title>
    <description>The latest articles on DEV Community by Gaurav Bhowmick (@bhowmick773).</description>
    <link>https://dev.to/bhowmick773</link>
    
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/bhowmick773"/>
    <language>en</language>
    <item>
      <title>I replaced server-side image compression with 40 lines of Canvas API code</title>
      <dc:creator>Gaurav Bhowmick</dc:creator>
      <pubDate>Tue, 05 May 2026 01:15:25 +0000</pubDate>
      <link>https://dev.to/bhowmick773/i-replaced-server-side-image-compression-with-40-lines-of-canvas-api-code-3egf</link>
      <guid>https://dev.to/bhowmick773/i-replaced-server-side-image-compression-with-40-lines-of-canvas-api-code-3egf</guid>
      <description>&lt;p&gt;I was building a side project that needed image compression. My first instinct was to look for an API — TinyPNG, Cloudinary, something with a POST endpoint.&lt;br&gt;
Then I looked at the pricing. And the rate limits. And the fact that I’d be uploading user photos to someone else’s infrastructure.&lt;br&gt;
So I tried something different: what if the browser just… did it?&lt;br&gt;
Turns out the Canvas API can compress images surprisingly well. The core logic is about 40 lines:&lt;br&gt;
const fallbacks = [0.6, 0.45, 0.3, 0.2];&lt;br&gt;
let blob = await compress(img, quality, format);&lt;/p&gt;

&lt;p&gt;if (blob.size &amp;gt;= file.size) {&lt;br&gt;
  for (const fq of fallbacks) {&lt;br&gt;
    const attempt = await compress(img, fq, format);&lt;br&gt;
    if (attempt.size &amp;lt; file.size) {&lt;br&gt;
      blob = attempt;&lt;br&gt;
      break;&lt;br&gt;
    }&lt;br&gt;
  }&lt;br&gt;
}&lt;/p&gt;

&lt;p&gt;Try the requested quality first. If the output is bigger, walk down through lower qualities until something sticks. As a last resort, if WebP output is still larger, fall back to JPEG entirely.&lt;br&gt;
Not elegant, but it works on basically everything I’ve thrown at it.&lt;br&gt;
A side effect I didn’t expect&lt;br&gt;
The Canvas API strips all EXIF metadata. When you draw an image to canvas and export it, the output contains zero metadata — no GPS coordinates, no device model, no timestamps.&lt;br&gt;
I originally considered this a bug. Then I realized most people compressing passport photos and ID scans probably want that metadata gone. So I kept it as a feature.&lt;br&gt;
What about PDFs?&lt;br&gt;
I needed PDF-to-image conversion for a separate use case and figured I’d add it to the same tool. pdfjs-dist handles the rendering, jsPDF handles the reverse direction. Both run client-side.&lt;br&gt;
The tricky part was pdfjs-dist v5 breaking everything. Downgraded to v3.11.174 and it worked immediately. Also had to add canvas: false to the webpack config because pdfjs tries to import the Node canvas module during build.&lt;br&gt;
The result&lt;br&gt;
I bundled all of this into a tool called MiniPx (minipx.com). The whole thing is a static Next.js export on Netlify. No backend. Server cost is literally zero.&lt;br&gt;
Features that exist because I kept needing them:&lt;br&gt;
    • Multi-pass compression with format fallback&lt;br&gt;
    • WebP/JPEG/PNG conversion&lt;br&gt;
    • Resize with presets (passport, email, WhatsApp)&lt;br&gt;
    • EXIF stripping (automatic)&lt;br&gt;
    • PDF to image and image to PDF&lt;br&gt;
    • Batch ZIP download via jszip&lt;br&gt;
The entire homepage is one React component at ~400 lines. Heavy libraries (pdfjs, jspdf, jszip) are lazy-loaded so they don’t affect initial page load.&lt;br&gt;
Not saying this replaces server-side compression for production pipelines. But for a utility tool where users compress a few photos? The browser handles it fine.&lt;br&gt;
Source of the compression approach is just the standard Canvas API — no special libraries needed for the image part. The interesting engineering was mostly in the fallback logic and making sure memory gets cleaned up (revokeObjectURL after every operation, reset canvas dimensions after PDF rendering).&lt;br&gt;
If you’re building something similar, the main gotchas:&lt;br&gt;
    1.  Always check if output &amp;gt; input. Canvas compression can make files larger.&lt;br&gt;
    2.  JPEG needs a white background fill before drawImage, or transparent areas turn black.&lt;br&gt;
    3.  Mobile browsers have canvas size limits. Cap your render scale.&lt;br&gt;
    4.  pdfjs-dist v5 is broken for client-side use. Stick with v3.&lt;br&gt;
Happy to answer questions if anyone’s doing something similar.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>bash</category>
      <category>powerapps</category>
      <category>javascript</category>
    </item>
    <item>
      <title>I built an image compressor that never sees your images published</title>
      <dc:creator>Gaurav Bhowmick</dc:creator>
      <pubDate>Sat, 02 May 2026 07:23:48 +0000</pubDate>
      <link>https://dev.to/bhowmick773/i-built-an-image-compressor-that-never-sees-your-imagespublished-3p00</link>
      <guid>https://dev.to/bhowmick773/i-built-an-image-compressor-that-never-sees-your-imagespublished-3p00</guid>
      <description>&lt;p&gt;Every online image compressor I tried had the same problem: they upload your photos to a server.&lt;br&gt;
TinyPNG, iLoveIMG, Compress2Go — they all work the same way. You pick a file, it goes to someone else's computer, gets compressed, comes back. The compression is good. But your photo — with its GPS coordinates, device serial number, and timestamps baked into the EXIF data — just sat on a server you don't control.&lt;br&gt;
I kept thinking: image compression is just math. It's Canvas API, quality parameters, and blob manipulation. There's no reason this needs a server.&lt;br&gt;
&lt;strong&gt;So I built MiniPx&lt;/strong&gt;. It compresses, converts, and resizes images entirely in the browser. Nothing gets uploaded. Ever. Here's how it works under the hood.&lt;br&gt;
The core compression loop&lt;br&gt;
The actual compression happens in about 20 lines. Load the image into a canvas, draw it, export as a blob with a quality parameter:&lt;/p&gt;

&lt;p&gt;javascriptfunction compressAtQuality(img, w, h, fmt, quality) {&lt;br&gt;
  return new Promise((resolve, reject) =&amp;gt; {&lt;br&gt;
    const canvas = document.createElement('canvas');&lt;br&gt;
    canvas.width = w;&lt;br&gt;
    canvas.height = h;&lt;br&gt;
    const ctx = canvas.getContext('2d');&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// White background for JPEG (no transparency support)
if (fmt === 'image/jpeg') {
  ctx.fillStyle = '#fff';
  ctx.fillRect(0, 0, w, h);
}

ctx.drawImage(img, 0, 0, w, h);
canvas.toBlob(
  (blob) =&amp;gt; blob ? resolve(blob) : reject(new Error('No output')),
  fmt,
  fmt === 'image/png' ? undefined : quality
);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;});&lt;br&gt;
}&lt;/p&gt;

&lt;p&gt;That's it. No sharp, no imagemagick, no server-side anything. The browser's built-in JPEG/WebP encoder handles the actual compression.&lt;br&gt;
The problem nobody talks about: when compression makes files bigger&lt;br&gt;
Here's something I didn't expect. If you take a well-optimized JPEG and run it through Canvas at quality 0.65, the output can be larger than the input. The browser re-encodes the entire image from scratch — it doesn't know the original was already compressed.&lt;br&gt;
I hit this constantly during testing. Users would drop a 200KB JPEG and get back a 280KB file. That's embarrassing.&lt;br&gt;
The fix is a fallback chain. If the initial compression produces a bigger file, step down through lower quality levels until you beat the original:&lt;/p&gt;

&lt;p&gt;javascriptlet blob = await compressAtQuality(img, w, h, fmt, quality);&lt;/p&gt;

&lt;p&gt;if (blob.size &amp;gt;= file.size &amp;amp;&amp;amp; fmt !== 'image/png') {&lt;br&gt;
  for (const fallbackQ of [0.6, 0.45, 0.3, 0.2]) {&lt;br&gt;
    if (fallbackQ &amp;gt;= quality) continue;&lt;br&gt;
    const attempt = await compressAtQuality(img, w, h, fmt, fallbackQ);&lt;br&gt;
    if (attempt.size &amp;lt; file.size) {&lt;br&gt;
      blob = attempt;&lt;br&gt;
      break;&lt;br&gt;
    }&lt;br&gt;
    if (attempt.size &amp;lt; blob.size) blob = attempt;&lt;br&gt;
  }&lt;/p&gt;

&lt;p&gt;// Last resort: try a different format entirely&lt;br&gt;
  if (blob.size &amp;gt;= file.size &amp;amp;&amp;amp; fmt === 'image/webp') {&lt;br&gt;
    const jpegFallback = await compressAtQuality(&lt;br&gt;
      img, w, h, 'image/jpeg', Math.min(quality, 0.5)&lt;br&gt;
    );&lt;br&gt;
    if (jpegFallback.size &amp;lt; blob.size) blob = jpegFallback;&lt;br&gt;
  }&lt;br&gt;
}&lt;/p&gt;

&lt;p&gt;Not elegant, but it works. The user always gets a smaller file, even if the format or quality level isn't what they originally picked.&lt;br&gt;
PNG is a special headache&lt;br&gt;
PNG compression through Canvas is basically useless. The browser's PNG encoder produces files that are often 1.5-2x larger than the input because it doesn't do the advanced filtering and dictionary optimization that tools like pngquant use.&lt;br&gt;
My workaround: if a PNG output is significantly larger than the input, quietly try WebP and JPEG alternatives and pick the smallest:&lt;/p&gt;

&lt;p&gt;javascriptif (blob.size &amp;gt; file.size * 1.5 &amp;amp;&amp;amp; fmt === 'image/png') {&lt;br&gt;
  const webpAlt = await compressAtQuality(img, w, h, 'image/webp', quality);&lt;br&gt;
  const jpegAlt = await compressAtQuality(img, w, h, 'image/jpeg', quality);&lt;br&gt;
  const smallest = [blob, webpAlt, jpegAlt].sort((a, b) =&amp;gt; a.size - b.size)[0];&lt;br&gt;
  if (smallest.size &amp;lt; blob.size) blob = smallest;&lt;br&gt;
}&lt;/p&gt;

&lt;p&gt;This means a user who drops a 4MB PNG screenshot might get back a 400KB WebP instead of the 6MB PNG that Canvas would produce. The file extension changes, which is a tradeoff, but a 93% size reduction beats format purity.&lt;br&gt;
HEIC conversion without a server&lt;br&gt;
This was the trickiest part. iPhones save photos as HEIC by default. Most online converters upload them to a server for decoding because browsers don't natively support HEIC — except Safari does.&lt;br&gt;
So MiniPx checks first:&lt;br&gt;
javascriptconst supportsHEICNatively = async () =&amp;gt; {&lt;br&gt;
  return new Promise((resolve) =&amp;gt; {&lt;br&gt;
    const img = new Image();&lt;br&gt;
    img.onload = () =&amp;gt; resolve(true);&lt;br&gt;
    img.onerror = () =&amp;gt; resolve(false);&lt;br&gt;
    img.src = 'data:image/heic;base64,AAAAGGZ0eXBoZWlj';&lt;br&gt;
    setTimeout(() =&amp;gt; resolve(false), 500);&lt;br&gt;
  });&lt;br&gt;
};&lt;/p&gt;

&lt;p&gt;Safari users get zero-dependency HEIC conversion through the same Canvas trick — load the HEIC into an &lt;a href="" class="article-body-image-wrapper"&gt;&lt;img&gt;&lt;/a&gt;, draw to canvas, export as JPEG. No libraries needed.&lt;br&gt;
Chrome and Firefox users get heic2any, which is a WASM-based HEIC decoder. It's about 350KB, which is heavy, so I lazy-load it only when someone actually tries to convert a HEIC file:&lt;br&gt;
javascriptconst heic2any = (await import('heic2any')).default;&lt;br&gt;
return await heic2any({ blob: file, toType: 'image/jpeg', quality: 0.92 });&lt;br&gt;
Safari users never download those 350KB. Chrome users only download them if they actually need HEIC conversion. Everyone else gets the lightweight path.&lt;br&gt;
Stripping EXIF data (the privacy part)&lt;br&gt;
This is maybe the most important feature and it's almost invisible. Photos from phones contain EXIF metadata: GPS coordinates, device model, serial numbers, timestamps, sometimes even your name.&lt;br&gt;
When you re-draw an image through Canvas, the EXIF data doesn't come along. Canvas only sees pixels — it has no concept of metadata. So every image that passes through MiniPx comes out clean. No GPS. No device info. No timestamps.&lt;br&gt;
I added a toggle for this ("Strip EXIF data") but it's on by default. The Canvas re-encoding handles it automatically — there's no extra code needed.&lt;br&gt;
The architecture&lt;br&gt;
MiniPx is a Next.js 15 static site. No API routes. No database. No server functions. The entire thing is pre-rendered HTML + JS served from Netlify's CDN.&lt;br&gt;
Stack:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Next.js 15 (static export)&lt;/li&gt;
&lt;li&gt;5 client components: ImageTool, PDFTool, HEICTool, TrackedCTA, WebVitals&lt;/li&gt;
&lt;li&gt;Everything else is server-rendered (SEO content, schemas, navigation)&lt;/li&gt;
&lt;li&gt;8 dependencies total&lt;/li&gt;
&lt;li&gt;Hosted on Netlify (free tier)
The First Load JS for any page is about 103-106KB. That's the entire app — React, the compressor, the UI, everything. For comparison, TinyPNG's homepage loads 2.4MB of JavaScript.
I'm pretty aggressive about keeping things server-rendered. The tool pages have long-form SEO content, FAQ accordions, and JSON-LD schemas, but all of that renders on the server as static HTML. The only client-side JavaScript is the actual image processing tool.
What I'd do differently
Batch processing is slow. Right now, files are processed sequentially. Web Workers would let me compress multiple images in parallel, but Canvas API doesn't work in Workers. OffscreenCanvas exists but browser support is spotty. I'm keeping an eye on this.
The PNG problem is unsolved. Client-side PNG optimization is genuinely hard. There are WASM ports of pngquant and oxipng, but they add 500KB+ to the bundle. For now, the format-switching fallback works, but it's a hack.
No preview. You can't see the compressed image before downloading it. Adding side-by-side preview would be a better UX, but it means holding two blob URLs in memory per image, which gets expensive with batch uploads.
Try it&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;*&lt;em&gt;MiniPx is free. No signup, no limits, no ads. *&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;If you're building something similar, the key insight is: Canvas + toBlob gives you 90% of what server-side image processing does, with zero infrastructure cost. The other 10% (PNG optimization, HEIC on non-Safari, advanced filters) requires WASM libraries, but you can lazy-load those so most users never pay the cost.&lt;/p&gt;

</description>
      <category>image</category>
      <category>compression</category>
      <category>ai</category>
      <category>webdev</category>
    </item>
    <item>
      <title>I built a free image compressor that never uploads your files</title>
      <dc:creator>Gaurav Bhowmick</dc:creator>
      <pubDate>Sun, 19 Apr 2026 18:13:12 +0000</pubDate>
      <link>https://dev.to/bhowmick773/i-built-a-free-image-compressor-that-never-uploads-your-files-1p1</link>
      <guid>https://dev.to/bhowmick773/i-built-a-free-image-compressor-that-never-uploads-your-files-1p1</guid>
      <description>&lt;p&gt;Every free image compressor I tried had the same problems: they upload your images to some server, limit you to 5 files a day, or force you to create an account.&lt;br&gt;
I just wanted to drop images, compress, download. So I built one.&lt;br&gt;
Meet MiniPx&lt;br&gt;
minipx.com — a free image compressor that runs 100% in your browser. Your images never leave your device.&lt;br&gt;
How it works&lt;br&gt;
The compression engine uses the HTML5 Canvas API. Here's the core idea:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Load the image into a Canvas element&lt;/li&gt;
&lt;li&gt;Re-encode it at a lower quality using canvas.toBlob()&lt;/li&gt;
&lt;li&gt;If the output is larger than the original (happens with PNGs), try progressively lower quality settings&lt;/li&gt;
&lt;li&gt;If WebP still can't beat the original, fall back to JPEG&lt;/li&gt;
&lt;li&gt;Always return the smallest successful result&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Code:&lt;br&gt;
// Simplified version of the multi-pass compression&lt;br&gt;
async function compress(file, quality, format) {&lt;br&gt;
  const img = await loadImage(file);&lt;br&gt;
  const canvas = document.createElement('canvas');&lt;br&gt;
  canvas.width = img.naturalWidth;&lt;br&gt;
  canvas.height = img.naturalHeight;&lt;br&gt;
  const ctx = canvas.getContext('2d');&lt;br&gt;
  ctx.drawImage(img, 0, 0);&lt;/p&gt;

&lt;p&gt;// First pass&lt;br&gt;
  let blob = await canvasToBlob(canvas, format, quality);&lt;/p&gt;

&lt;p&gt;// Smart fallback if output is larger&lt;br&gt;
  if (blob.size &amp;gt;= file.size) {&lt;br&gt;
    for (const q of [0.6, 0.45, 0.3, 0.2]) {&lt;br&gt;
      const attempt = await canvasToBlob(canvas, format, q);&lt;br&gt;
      if (attempt.size &amp;lt; file.size) {&lt;br&gt;
        blob = attempt;&lt;br&gt;
        break;&lt;br&gt;
      }&lt;br&gt;
    }&lt;br&gt;
  }&lt;br&gt;
  return blob;&lt;br&gt;
}&lt;/p&gt;

&lt;p&gt;The key insight: Canvas API quality values aren't linear. Quality 0.92 barely compresses anything. Quality 0.65 (our "Smart" preset) gives 50-70% reduction with no visible quality loss.&lt;br&gt;
Features&lt;/p&gt;

&lt;p&gt;Compress — Gentle (20-40%), Smart (50-70%), Tiny (75-90%)&lt;br&gt;
Convert — WebP, JPEG, PNG in any direction&lt;br&gt;
Resize — Set max width to 1920, 1280, or 800px&lt;br&gt;
Batch — No file limits, process as many as you want&lt;br&gt;
Private — Zero server uploads, works offline&lt;/p&gt;

&lt;p&gt;Tech stack&lt;/p&gt;

&lt;p&gt;Next.js 15 (static export)&lt;br&gt;
React 19&lt;br&gt;
Canvas API for compression&lt;br&gt;
Deployed on Netlify (zero server cost)&lt;/p&gt;

&lt;p&gt;What I learned&lt;br&gt;
Canvas API quality is not what you think. I initially set the "Smart" preset to quality 0.78 thinking that's decent compression. A 3MB PNG screenshot compressed to 3.2MB — larger than the original. Turns out 0.78 is barely any compression for the Canvas encoder. Dropping to 0.65 fixed everything.&lt;br&gt;
The multi-pass approach matters. A single-pass compressor will sometimes produce files larger than the original, especially with PNG screenshots. The fallback chain (try lower quality → try different format) ensures the output is always smaller.&lt;br&gt;
Static sites are underrated for tools. MiniPx costs $0/month to run. No server, no database, no file storage, no bandwidth costs. The entire thing is HTML/CSS/JS served from a CDN.&lt;br&gt;
Try it&lt;br&gt;
minipx.com — would love your feedback. What features would you want added?&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>privacy</category>
      <category>showdev</category>
      <category>webdev</category>
    </item>
  </channel>
</rss>
