Your users uploaded 200 photos. Now they want to download them all. What do you do?
The naive approach — loop through files, zip them on your server, serve the result — falls apart fast. Memory spikes with large files. Egress fees add up. You need temp storage, cleanup jobs, and error handling for partial failures.
I hit this exact wall building a file-sharing service that's now processed 550K+ files and 10TB+ of archives. After weeks of wrestling with ZIP64, streaming, and Cloudflare Workers' 128MB memory limit, I turned my solution into an API. Here's how you can skip that pain entirely.
The 30-minute version
Step 1: Get an API key
Sign up at eazip.io (free tier, no credit card). Grab your API key from the dashboard.
Step 2: Collect your file URLs
You already have these — they're in your database. S3 presigned URLs, R2 public URLs, any HTTPS endpoint that returns a file.
```javascript
const fileUrls = await db.query(
  'SELECT file_url, original_filename FROM uploads WHERE project_id = ?',
  [projectId]
);
```
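One gotcha worth handling before you build the request: entry names inside a ZIP should be unique, and two uploads can easily share the same `original_filename`. A small helper for that (the name `dedupeFilenames` is mine, not part of any API; it also doesn't re-check against names already containing a suffix, which is fine for typical upload data):

```javascript
// Make filenames unique by appending " (n)" before the extension:
// ["a.jpg", "a.jpg", "b.png"] -> ["a.jpg", "a (1).jpg", "b.png"]
function dedupeFilenames(names) {
  const seen = new Map(); // filename -> times seen so far
  return names.map((name) => {
    const count = seen.get(name) ?? 0;
    seen.set(name, count + 1);
    if (count === 0) return name;
    const dot = name.lastIndexOf('.');
    return dot > 0
      ? `${name.slice(0, dot)} (${count})${name.slice(dot)}`
      : `${name} (${count})`;
  });
}
```

Run it over your `original_filename` column before mapping the rows into the request body.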
Step 3: One API call
```javascript
const response = await fetch('https://api.eazip.io/jobs', {
  method: 'POST',
  headers: {
    'X-API-Key': process.env.EAZIP_API_KEY,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    files: fileUrls.map(f => ({
      url: f.file_url,
      filename: f.original_filename,
    })),
  }),
});

if (!response.ok) {
  throw new Error(`Failed to create ZIP job: ${response.status}`);
}

const { job_id } = await response.json();
```
Step 4: Poll for completion and redirect
```javascript
async function waitForZip(jobId) {
  const MAX_ATTEMPTS = 60;
  for (let i = 0; i < MAX_ATTEMPTS; i++) {
    const res = await fetch(`https://api.eazip.io/jobs/${jobId}`, {
      headers: { 'X-API-Key': process.env.EAZIP_API_KEY },
    });
    const { job } = await res.json();
    if (job.status === 'completed') return job.download_url;
    if (job.status === 'failed') throw new Error('ZIP job failed');
    await new Promise(r => setTimeout(r, 2000)); // wait 2s between polls
  }
  throw new Error('ZIP job timed out');
}

// In your route handler:
const downloadUrl = await waitForZip(job_id);
res.redirect(downloadUrl);
```
That's it. Your users click "Download All" and get a ZIP a few seconds later.
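The fixed 2-second poll above is fine for short jobs. For very large archives you may want exponential backoff so you hit the status endpoint less often; here's a small sketch (the base and cap values are my own arbitrary choices, not eazip requirements):

```javascript
// Compute poll delays: start at baseMs, double each attempt, cap at capMs.
// backoffDelays(5) -> [1000, 2000, 4000, 8000, 15000]
function backoffDelays(attempts, baseMs = 1000, capMs = 15000) {
  const delays = [];
  for (let i = 0; i < attempts; i++) {
    delays.push(Math.min(baseMs * 2 ** i, capMs));
  }
  return delays;
}

// Drop-in for the fixed wait: `await sleep(delays[i])` inside the poll loop.
const sleep = (ms) => new Promise((r) => setTimeout(r, ms));
```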
What you didn't have to build
- ZIP64 support — files over 4GB just work
- Streaming — constant memory regardless of archive size
- Error recovery — if file #500 fails, the job retries from a checkpoint
- Temp storage cleanup — signed URLs expire automatically
- Egress optimization — zero egress fees on Cloudflare's network
When this makes sense
- SaaS with user-uploaded files — "Download all attachments" in a support ticket, bulk photo export from a gallery
- E-commerce — product image packs, digital goods delivery
- Internal tools — compliance teams exporting 6 months of audit logs, database backups
When it doesn't
- If you only serve a handful of small files — just zip them in memory
- If you need real-time streaming to the browser — this is async (job → download URL)
- If you need custom compression settings — eazip uses STORE (no compression) for speed
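For that small-files case, "just zip them in memory" is less code than you might expect: a STORE-format ZIP (no compression, the same format mentioned above) is just headers plus the raw bytes. Here's a rough Node.js sketch for small payloads; `zipStore` and `crc32` are my own helpers, and this deliberately skips ZIP64, timestamps, and streaming, so don't use it past a few MB:

```javascript
// Standard bitwise CRC-32 (polynomial 0xEDB88320), needed by the ZIP headers.
function crc32(buf) {
  let crc = 0xFFFFFFFF;
  for (const byte of buf) {
    crc ^= byte;
    for (let i = 0; i < 8; i++) {
      crc = (crc >>> 1) ^ (0xEDB88320 & -(crc & 1));
    }
  }
  return (crc ^ 0xFFFFFFFF) >>> 0;
}

// Build a STORE-only ZIP from [{ name, data: Buffer }] entries.
function zipStore(files) {
  const chunks = [];   // local headers + file data
  const central = [];  // central directory records
  let offset = 0;      // byte offset of the next local header

  for (const { name, data } of files) {
    const nameBuf = Buffer.from(name, 'utf8');
    const crc = crc32(data);

    const local = Buffer.alloc(30);
    local.writeUInt32LE(0x04034b50, 0);  // local file header signature
    local.writeUInt16LE(20, 4);          // version needed to extract
    local.writeUInt16LE(0, 8);           // method 0 = STORE (flags/time/date left 0)
    local.writeUInt32LE(crc, 14);
    local.writeUInt32LE(data.length, 18); // compressed size (= raw, STORE)
    local.writeUInt32LE(data.length, 22); // uncompressed size
    local.writeUInt16LE(nameBuf.length, 26);
    chunks.push(local, nameBuf, data);

    const cen = Buffer.alloc(46);
    cen.writeUInt32LE(0x02014b50, 0);    // central directory signature
    cen.writeUInt16LE(20, 4);            // version made by
    cen.writeUInt16LE(20, 6);            // version needed
    cen.writeUInt32LE(crc, 16);
    cen.writeUInt32LE(data.length, 20);
    cen.writeUInt32LE(data.length, 24);
    cen.writeUInt16LE(nameBuf.length, 28);
    cen.writeUInt32LE(offset, 42);       // offset of local header
    central.push(Buffer.concat([cen, nameBuf]));

    offset += 30 + nameBuf.length + data.length;
  }

  const cd = Buffer.concat(central);
  const eocd = Buffer.alloc(22);
  eocd.writeUInt32LE(0x06054b50, 0);     // end-of-central-directory signature
  eocd.writeUInt16LE(files.length, 8);   // entries on this disk
  eocd.writeUInt16LE(files.length, 10);  // total entries
  eocd.writeUInt32LE(cd.length, 12);     // central directory size
  eocd.writeUInt32LE(offset, 16);        // central directory offset
  chunks.push(cd, eocd);
  return Buffer.concat(chunks);
}
```

Once the total size of a request's files crosses that "handful of small files" threshold, switch to the async job flow above.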
Try it
eazip.io — free tier: 60 GB-days/month, no credit card.
Building an export feature? I'd love to hear about your use case — drop a comment or reach out.