Large PDF files are a problem. They're slow to upload, slow to download, and expensive to store. I've dealt with 50MB invoice PDFs that could have been 5MB, and 200MB reports that brought email servers to their knees.
PDF compression fixes this. You can reduce file sizes by 30-70% without noticeable quality loss.
```csharp
// Install via NuGet: Install-Package IronPdf
using IronPdf;

var pdf = PdfDocument.FromFile("large.pdf");
pdf.CompressImages(60); // 60 = JPEG quality
pdf.SaveAs("compressed.pdf");
```
That's it. One method call compresses every image in the PDF.
Why Are PDFs So Large?
Images. A typical business report has:
- Screenshots at full resolution
- Charts exported as high-DPI PNGs
- Logos embedded at print quality
Each image adds megabytes. A 10-page report with 20 images easily hits 30-50MB.
I built a reporting system that generated financial PDFs. Users complained about email attachment size limits. We added compression and reduced average file sizes from 40MB to 12MB — a 70% reduction.
How Do I Compress Images in a PDF?
The CompressImages() method takes a quality parameter (0-100):
```csharp
var pdf = PdfDocument.FromFile("report.pdf");
pdf.CompressImages(80); // 80% quality
pdf.SaveAs("compressed.pdf");
```
Quality guidelines:
- 90-100: High quality, minimal compression (use for photos or detailed graphics)
- 70-89: Medium quality, balanced compression (good for most business documents)
- 50-69: Lower quality, aggressive compression (use for drafts or internal docs)
I use 70-80 for production documents. Users rarely notice the difference, and file sizes drop significantly.
What Quality Setting Should I Use?
Test and compare. Generate PDFs at different quality levels:
```csharp
foreach (int quality in new[] { 90, 80, 70, 60, 50 })
{
    var pdf = PdfDocument.FromFile("original.pdf");
    pdf.CompressImages(quality);
    pdf.SaveAs($"compressed-{quality}.pdf");
}
```
Open them side-by-side and pick the lowest quality that looks acceptable. The sweet spot is usually 70-80.
For images with text (screenshots, diagrams), use higher quality (80-90). For photos or decorative graphics, you can go lower (60-70).
Can I Compress PDFs Generated from HTML?
Yes. Generate the PDF with [ChromePdfRenderer](https://ironpdf.com/blog/videos/how-to-render-html-string-to-pdf-in-csharp-ironpdf/), then compress:

```csharp
var renderer = new ChromePdfRenderer();
var pdf = renderer.RenderHtmlAsPdf("<h1>Report</h1><img src='chart.png' />");
pdf.CompressImages(75);
pdf.SaveAs("report.pdf");
```
This is useful when your HTML includes high-resolution images. IronPDF renders them at full quality, then compression reduces file size.
I use this for dashboards with charts. The charts render at high DPI for clarity, then compression brings file size down to acceptable levels.
What About Table-Heavy PDFs?
PDFs with lots of tables have a different problem: internal structure overhead. Use CompressStructTree():
```csharp
var pdf = PdfDocument.FromFile("tables.pdf");
pdf.CompressStructTree();
pdf.SaveAs("compressed.pdf");
```
This removes metadata from the PDF's internal tree structure. It's especially effective for PDFs generated from HTML tables.
Warning: This can affect text selection and extraction. Test before using in production.
I used this for a data export feature that generated 100-page tables. File sizes dropped by 40% with no visual changes.
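Given that warning, it's worth verifying extraction still works after compression. Here's a sketch of one way to do that, assuming IronPDF's ExtractAllText() method and placeholder file names:

```csharp
using System;
using IronPdf;

// Compare extracted text before and after structure-tree compression.
var original = PdfDocument.FromFile("tables.pdf");
string textBefore = original.ExtractAllText();

original.CompressStructTree();
original.SaveAs("tables-compressed.pdf");

var compressed = PdfDocument.FromFile("tables-compressed.pdf");
string textAfter = compressed.ExtractAllText();

Console.WriteLine(textBefore == textAfter
    ? "Text extraction unchanged"
    : "Warning: extracted text differs after compression");
```

If the strings differ, inspect where they diverge before enabling CompressStructTree() in production.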
Can I Use Both Compression Methods Together?
Yes:
```csharp
var pdf = PdfDocument.FromFile("large.pdf");
pdf.CompressImages(75);
pdf.CompressStructTree();
pdf.SaveAs("compressed.pdf");
```
Combine image compression and structure compression for maximum reduction. I've seen 60% file size reductions with both enabled.
How Do I Control Compression More Precisely?
Use CompressionOptions:
```csharp
var pdf = PdfDocument.FromFile("document.pdf");

var options = new CompressionOptions
{
    CompressImages = true,
    JpegQuality = 80,
    HighQualityImageSubsampling = true,
    ShrinkImages = true,
    RemoveStructureTree = true
};

pdf.Compress(options);
pdf.SaveAs("compressed.pdf");
```
Key options:
- JpegQuality: 0-100, same scale as CompressImages()
- ShrinkImages: scales down images larger than needed for display
- HighQualityImageSubsampling: uses better chroma sampling (4:4:4 vs 4:1:1)
- RemoveStructureTree: same as CompressStructTree()
I use ShrinkImages when users upload unnecessarily large images. A 4000x3000 photo displayed at 800x600 gets scaled down, saving space without visible quality loss.
What's Chroma Subsampling?
JPEG compression can reduce color information more than brightness. The human eye is less sensitive to color detail.
- 4:4:4 (high quality): full color detail preserved
- 4:2:0 (standard): color sampled at half resolution in each dimension
- 4:1:1 (aggressive): color sampled at quarter horizontal resolution
Set HighQualityImageSubsampling = true for 4:4:4. Use it for photos or graphics where color accuracy matters. Disable it for diagrams and screenshots where file size matters more.
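To see why subsampling saves space, here's a back-of-the-envelope sample count for a 1920x1080 image (plain C#, no IronPDF required; the schemes are standard JPEG subsampling, not IronPDF-specific):

```csharp
using System;

// Luma (brightness) is always stored at full resolution; only chroma is reduced.
int width = 1920, height = 1080;
long luma = (long)width * height;

long chroma444 = 2L * width * height;             // two full-resolution chroma planes
long chroma420 = 2L * (width / 2) * (height / 2); // half resolution in each dimension
long chroma411 = 2L * (width / 4) * height;       // quarter horizontal resolution

Console.WriteLine($"4:4:4 samples: {luma + chroma444}"); // 6220800
Console.WriteLine($"4:2:0 samples: {luma + chroma420}"); // 3110400
Console.WriteLine($"4:1:1 samples: {luma + chroma411}"); // 3110400
```

Both 4:2:0 and 4:1:1 halve the raw sample count before entropy coding even starts, which is why disabling high-quality subsampling shrinks files noticeably.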
How Much Compression Can I Expect?
Depends on content:
- Photo-heavy PDFs: 40-70% reduction at quality 70-80
- Screenshot-heavy PDFs: 30-50% reduction
- Table-heavy PDFs: 20-40% reduction with structure compression
- Mixed content: 35-60% reduction
I tested on a 50MB marketing brochure with photos:
- Original: 50MB
- Quality 80: 22MB (56% reduction)
- Quality 70: 16MB (68% reduction)
- Quality 60: 12MB (76% reduction, visible artifacts)
We shipped with quality 75 — 64% reduction, no complaints.
Does Compression Affect PDF Quality?
Image compression is lossy. You're permanently reducing image quality. But:
- At quality 80+, most people can't tell
- At quality 70-79, it's noticeable only on close inspection
- At quality 60-69, artifacts become visible
Structure tree compression is lossless visually, but can break text extraction and accessibility features.
Always test with actual users before deploying compression.
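Alongside user testing, it helps to put numbers on the tradeoff. A sketch, using placeholder file names, that compresses a copy and reports the reduction with System.IO:

```csharp
using System;
using System.IO;
using IronPdf;

// Compress a copy and report the size reduction, keeping the original untouched.
var pdf = PdfDocument.FromFile("original.pdf");
pdf.CompressImages(75);
pdf.SaveAs("original-compressed.pdf");

long before = new FileInfo("original.pdf").Length;
long after = new FileInfo("original-compressed.pdf").Length;
Console.WriteLine($"{before / 1048576.0:F1} MB -> {after / 1048576.0:F1} MB " +
                  $"({100.0 * (before - after) / before:F0}% reduction)");
```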
Can I Compress PDFs in Bulk?
Loop through files:
```csharp
var files = Directory.GetFiles(@"C:\PDFs", "*.pdf");
foreach (var file in files)
{
    var pdf = PdfDocument.FromFile(file);
    pdf.CompressImages(75);
    pdf.SaveAs(file.Replace(".pdf", "-compressed.pdf"));
}
```
I built a nightly job that compressed archived reports. Over 1000 files, we recovered 200GB of storage.
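For a scheduled job like that, you probably want per-file error handling so one corrupt PDF doesn't abort the run. A sketch, with placeholder paths:

```csharp
using System;
using System.IO;
using IronPdf;

foreach (var file in Directory.GetFiles(@"C:\PDFs", "*.pdf"))
{
    try
    {
        var pdf = PdfDocument.FromFile(file);
        pdf.CompressImages(75);

        var output = Path.Combine(
            Path.GetDirectoryName(file)!,
            Path.GetFileNameWithoutExtension(file) + "-compressed.pdf");
        pdf.SaveAs(output);

        long saved = new FileInfo(file).Length - new FileInfo(output).Length;
        Console.WriteLine($"{Path.GetFileName(file)}: saved {saved / 1024} KB");
    }
    catch (Exception ex)
    {
        // Log and continue; one bad file shouldn't stop the nightly run.
        Console.Error.WriteLine($"Failed on {file}: {ex.Message}");
    }
}
```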
Should I Compress Before or After Merging PDFs?
After merging is more efficient:
```csharp
var merged = PdfDocument.Merge(
    PdfDocument.FromFile("doc1.pdf"),
    PdfDocument.FromFile("doc2.pdf"),
    PdfDocument.FromFile("doc3.pdf"));

merged.CompressImages(75);
merged.SaveAs("final.pdf");
```
Compressing once on the merged PDF is faster than compressing three separate PDFs then merging.
Quick Reference
| Method | Use Case | Tradeoff |
|---|---|---|
| CompressImages(quality) | Image-heavy PDFs | Quality vs. size |
| CompressStructTree() | Table-heavy PDFs | May break text extraction |
| Compress(options) | Fine-grained control | More complex API |
| Quality 90-100 | Photos, print | Minimal compression |
| Quality 70-89 | Business docs | Balanced |
| Quality 50-69 | Drafts, internal | Visible artifacts |
Key Principles:
- Test compression on actual content before deploying
- Use 70-80 quality for most business documents
- Combine image and structure compression for maximum reduction
- Always keep originals — compression is irreversible
- Measure file size reduction to justify quality tradeoffs
The complete PDF compression guide covers advanced topics like selective compression and batch processing.
Written by Jacob Mellor, CTO at Iron Software. Jacob created IronPDF and leads a team of 50+ engineers building .NET document processing libraries.