AWS S3 charges you to store data AND to move it out. The egress fees are the nasty surprise: serving 1 TB of images runs about $90/month in bandwidth alone.
Cloudflare R2 gives you S3-compatible storage with zero egress fees. The free tier includes 10GB storage and 10 million reads/month. Zero. Egress. Fees.
## What You Get for Free
- 10 GB storage — enough for thousands of files
- 1 million writes/month — PUT, POST, DELETE
- 10 million reads/month — GET requests
- Zero egress fees — serve files globally, pay nothing for bandwidth
- S3-compatible API — use any S3 SDK or tool
- Custom domains — serve from your own domain
- Workers integration — process files at the edge
## Quick Start
### 1. Create a Bucket

In the Cloudflare Dashboard, go to R2 > Create bucket. Then copy your Account ID and generate an API token with R2 read and write permissions — you'll need both for the examples below.
### 2. Node.js (AWS SDK)

```bash
npm install @aws-sdk/client-s3
```
```javascript
import { readFile } from "node:fs/promises";
import { S3Client, PutObjectCommand, GetObjectCommand } from "@aws-sdk/client-s3";

const r2 = new S3Client({
  region: "auto",
  endpoint: "https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com",
  credentials: {
    accessKeyId: "YOUR_ACCESS_KEY",
    secretAccessKey: "YOUR_SECRET_KEY",
  },
});

// Upload a file (read it into a Buffer first)
const fileBuffer = await readFile("photo.jpg");
await r2.send(new PutObjectCommand({
  Bucket: "my-bucket",
  Key: "images/photo.jpg",
  Body: fileBuffer,
  ContentType: "image/jpeg",
}));

// Download a file
const response = await r2.send(new GetObjectCommand({
  Bucket: "my-bucket",
  Key: "images/photo.jpg",
}));
const data = await response.Body.transformToByteArray();
```
### 3. Python (boto3)

```python
import boto3

r2 = boto3.client(
    "s3",
    endpoint_url="https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com",
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
    region_name="auto",
)

# Upload
r2.upload_file("photo.jpg", "my-bucket", "images/photo.jpg")

# Download
r2.download_file("my-bucket", "images/photo.jpg", "downloaded.jpg")

# List files
objects = r2.list_objects_v2(Bucket="my-bucket", Prefix="images/")
for obj in objects.get("Contents", []):
    print(f"{obj['Key']} — {obj['Size']} bytes")
```
### 4. CLI (rclone or AWS CLI)

```bash
# AWS CLI
aws s3 cp photo.jpg s3://my-bucket/images/ \
  --endpoint-url https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com

# List
aws s3 ls s3://my-bucket/ \
  --endpoint-url https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com
```
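For rclone, a remote in your `rclone.conf` looks roughly like this (the remote name `r2` is my choice; the keys and account ID are the same placeholders as above):

```ini
[r2]
type = s3
provider = Cloudflare
access_key_id = YOUR_ACCESS_KEY
secret_access_key = YOUR_SECRET_KEY
endpoint = https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com
```

After that, `rclone copy photo.jpg r2:my-bucket/images/` does the same job as the AWS CLI command above.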
### 5. Serve Files via Custom Domain

Connect a custom domain in the Cloudflare Dashboard and your files are instantly served via Cloudflare CDN — cached at 300+ edge locations worldwide.

```
https://cdn.yourdomain.com/images/photo.jpg
```

Global delivery. Zero egress costs. Automatic caching.
## The Egress Math
| Provider | 1TB egress/month |
|---|---|
| AWS S3 | $90 |
| Google Cloud Storage | $120 |
| Azure Blob | $87 |
| Cloudflare R2 | $0 |
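The table follows from each provider's approximate list price per GB of egress (treat the non-R2 numbers as ballpark figures, not quotes):

```python
# Approximate egress prices in USD per GB (assumptions from public pricing pages)
price_per_gb = {
    "AWS S3": 0.09,
    "Google Cloud Storage": 0.12,
    "Azure Blob": 0.087,
    "Cloudflare R2": 0.0,
}

egress_gb = 1000  # 1 TB served per month

for provider, price in price_per_gb.items():
    print(f"{provider}: ${egress_gb * price:.2f}/month")
```

Scale `egress_gb` up and the gap only widens — R2's column stays at zero.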
## Real-World Use Case
A developer hosting a podcast told me: "Each episode is 50MB. With 10K downloads per episode on S3, I was paying $45/month just in bandwidth. Moved to R2, same URLs, same S3 API — bandwidth bill dropped to $0. I only pay $0.36/month for storage."
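The numbers in that quote hold up. A back-of-envelope sketch, assuming S3's ~$0.09/GB egress and R2's $0.015/GB-month storage (the 24 GB back-catalog size is my inference from the $0.36 figure):

```python
episode_mb = 50
downloads = 10_000

# S3 bandwidth per episode: 50 MB x 10K downloads = 500 GB of egress
egress_gb = episode_mb * downloads / 1000
s3_bandwidth_cost = egress_gb * 0.09

# R2 storage at $0.015 per GB-month; $0.36/month implies ~24 GB stored
catalog_gb = 24
r2_storage_cost = catalog_gb * 0.015

print(f"S3 bandwidth per episode: ${s3_bandwidth_cost:.2f}")
print(f"R2 storage for the catalog: ${r2_storage_cost:.2f}")
```

Same files, same traffic: the only line item that survives the move is a few dozen cents of storage.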
## Free Plan Limits
| Feature | Free Tier |
|---|---|
| Storage | 10 GB |
| Reads (GET) | 10M/month |
| Writes (PUT) | 1M/month |
| Egress | Always free |
| S3 API | Compatible |
| Custom domains | Yes |
## The Bottom Line
If you are serving files to users — images, videos, downloads, static assets — R2 saves you real money. S3 compatibility means migration takes minutes, not days.
Need to store scraped data? Check out my web scraping tools on Apify — extract data from any website and export to any storage.
Building something custom? Email me at spinov001@gmail.com