MinIO is a free, open-source object storage system that is API-compatible with Amazon S3. Run your own S3 — on your hardware, at your scale.
## What Is MinIO?
MinIO is a high-performance object storage system built for cloud-native workloads. It implements the S3 API, so any tool that works with AWS S3 works with MinIO.
Key features:
- 100% S3 API compatible
- High performance (benchmarked by MinIO at up to 325 GiB/s reads and 165 GiB/s writes on a 32-node NVMe cluster)
- Erasure coding for data protection
- Encryption at rest and in transit
- Versioning and immutable objects
- Bucket replication
- Identity management (LDAP, OIDC)
- Kubernetes native
- Single binary deployment
## Quick Start
### Docker
```bash
docker run -p 9000:9000 -p 9001:9001 \
  -e MINIO_ROOT_USER=minioadmin \
  -e MINIO_ROOT_PASSWORD=minioadmin \
  -v minio-data:/data \
  minio/minio server /data --console-address ":9001"
```
- API: http://localhost:9000
- Console: http://localhost:9001
### Binary

```bash
wget https://dl.min.io/server/minio/release/linux-amd64/minio
chmod +x minio
./minio server /data
```
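Once the server is up, you can verify it programmatically: MinIO exposes an unauthenticated liveness probe at `/minio/health/live`. A quick sketch with the Python standard library (the port assumes the default setup above):

```python
import urllib.request
import urllib.error

# MinIO's documented liveness endpoint; returns 200 when the server is healthy
url = "http://localhost:9000/minio/health/live"

try:
    with urllib.request.urlopen(url, timeout=2) as resp:
        print("MinIO is up, HTTP status:", resp.status)
except (urllib.error.URLError, OSError) as exc:
    print("MinIO not reachable:", exc)
```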
## Use With AWS SDK
Since MinIO is S3-compatible, use any AWS SDK:
### JavaScript
```javascript
import { S3Client, PutObjectCommand, GetObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({
  endpoint: "http://localhost:9000",
  region: "us-east-1",
  credentials: { accessKeyId: "minioadmin", secretAccessKey: "minioadmin" },
  forcePathStyle: true // required for MinIO: path-style URLs instead of virtual-hosted
});

// Upload
await s3.send(new PutObjectCommand({
  Bucket: "my-bucket",
  Key: "data/report.json",
  Body: JSON.stringify({ results: [1, 2, 3] })
}));

// Download -- Body is a stream in SDK v3; read it with transformToString()
const obj = await s3.send(new GetObjectCommand({
  Bucket: "my-bucket",
  Key: "data/report.json"
}));
const body = await obj.Body.transformToString();
```
### Python (boto3)
```python
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",
    region_name="us-east-1",  # any value works; boto3 just needs a region set
    aws_access_key_id="minioadmin",
    aws_secret_access_key="minioadmin",
)

# Upload
s3.upload_file("local-file.csv", "my-bucket", "data/file.csv")

# List objects -- "Contents" is absent when no keys match the prefix
response = s3.list_objects_v2(Bucket="my-bucket", Prefix="data/")
for obj in response.get("Contents", []):
    print(obj["Key"])
```
### CLI (mc)
```bash
# Install MinIO Client
wget https://dl.min.io/client/mc/release/linux-amd64/mc
chmod +x mc

# Configure
mc alias set local http://localhost:9000 minioadmin minioadmin

# Create bucket
mc mb local/my-bucket

# Upload files (flags go before the paths)
mc cp --recursive data/ local/my-bucket/data/

# List files
mc ls local/my-bucket/
```
## Cost Comparison
| Service | 1 TB storage / month | 100 GB egress |
|---|---|---|
| AWS S3 | $23 | $9 |
| Google Cloud | $20 | $12 |
| Azure Blob | $18 | $8.70 |
| MinIO (self-hosted) | $0 | $0 |
| MinIO + Hetzner VPS | $5 | $0 |

Self-hosted figures cover software licensing only; hardware, power, and bandwidth remain your own costs.
## Use Cases
- Data lakes: Store petabytes of analytics data
- Backups: S3-compatible backup target for restic, Borg, Velero
- ML/AI: Store training data and model artifacts
- Media: Images, videos, documents at scale
- Logs: Centralized log storage
## Enterprise Features (Free)
- Erasure coding: Data survives drive failures
- Bitrot protection: Automatic data integrity checks
- Encryption: Server-side encryption with external KMS
- Versioning: Keep every version of every object
- Lifecycle rules: Auto-delete or transition objects
- Replication: Site-to-site, bucket-to-bucket
## Who Uses MinIO?

With 49K+ GitHub stars, MinIO is used by:
- Every major cloud-native project
- AI/ML teams for data storage
- Companies replacing expensive S3 bills
- DevOps for artifact and backup storage
## Get Started
- Run single binary or Docker container
- Use any S3-compatible tool or SDK
- Store unlimited data on your hardware
Your own Amazon S3. Free forever.
Need to store scraped web data? Check out my web scraping tools on Apify — extract and store data from any website. Custom solutions: spinov001@gmail.com