## What is MinIO?
MinIO is a high-performance, S3-compatible object storage server and one of the most popular self-hosted alternatives to AWS S3, used in production by companies like Snowflake, VMware, and Nutanix.

It implements the core S3 API, so most existing AWS SDK code works with minimal changes: point it at your MinIO endpoint and enable path-style addressing.
## Quick Start
```bash
# Docker (demo credentials — use strong values in production)
docker run -p 9000:9000 -p 9001:9001 \
  -e MINIO_ROOT_USER=admin \
  -e MINIO_ROOT_PASSWORD=password123 \
  minio/minio server /data --console-address ":9001"

# Or download the binary directly
wget https://dl.min.io/server/minio/release/linux-amd64/minio
chmod +x minio
./minio server /data
```
Console: http://localhost:9001 | API: http://localhost:9000
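For anything beyond a throwaway test, put `/data` on a persistent volume so your objects survive container restarts. A minimal docker-compose sketch (service and volume names are illustrative):

```yaml
services:
  minio:
    image: minio/minio
    command: server /data --console-address ":9001"
    environment:
      MINIO_ROOT_USER: admin
      MINIO_ROOT_PASSWORD: password123   # change this
    ports:
      - "9000:9000"
      - "9001:9001"
    volumes:
      - minio-data:/data

volumes:
  minio-data:
```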
## S3-Compatible API

Use the standard AWS SDK and point it at your MinIO server. Note `forcePathStyle: true`: MinIO serves buckets as URL paths rather than subdomains.

### JavaScript
```javascript
import { S3Client, PutObjectCommand, GetObjectCommand, ListObjectsV2Command } from "@aws-sdk/client-s3";

const s3 = new S3Client({
  endpoint: "http://localhost:9000",
  region: "us-east-1",
  credentials: {
    accessKeyId: "admin",
    secretAccessKey: "password123",
  },
  forcePathStyle: true, // required: MinIO uses path-style bucket URLs
});

// Upload a file
await s3.send(new PutObjectCommand({
  Bucket: "my-bucket",
  Key: "photos/vacation.jpg",
  Body: fileBuffer,
  ContentType: "image/jpeg",
}));

// Download a file
const { Body } = await s3.send(new GetObjectCommand({
  Bucket: "my-bucket",
  Key: "photos/vacation.jpg",
}));
const data = await Body.transformToByteArray();

// List objects (Contents is undefined when nothing matches)
const { Contents } = await s3.send(new ListObjectsV2Command({
  Bucket: "my-bucket",
  Prefix: "photos/",
}));
(Contents ?? []).forEach(obj => console.log(obj.Key, obj.Size));
```
### Python
```python
import boto3

s3 = boto3.client(
    's3',
    endpoint_url='http://localhost:9000',
    aws_access_key_id='admin',
    aws_secret_access_key='password123',
)

# Upload
s3.upload_file('local.pdf', 'my-bucket', 'documents/file.pdf')

# Download
s3.download_file('my-bucket', 'documents/file.pdf', 'downloaded.pdf')

# List ('Contents' is absent when the bucket is empty)
response = s3.list_objects_v2(Bucket='my-bucket')
for obj in response.get('Contents', []):
    print(obj['Key'], obj['Size'])
```
## MinIO Client (mc)
```bash
# Configure an alias for the server
mc alias set myminio http://localhost:9000 admin password123

# Create bucket
mc mb myminio/my-bucket

# Upload
mc cp myfile.txt myminio/my-bucket/
mc cp --recursive ./photos/ myminio/my-bucket/photos/

# Download
mc cp myminio/my-bucket/myfile.txt ./

# List
mc ls myminio/my-bucket/

# Mirror (one-way sync; add --remove to also delete extraneous remote objects)
mc mirror ./local-dir myminio/my-bucket/backup/

# Allow anonymous downloads under a prefix
mc anonymous set download myminio/my-bucket/public/
```
## Presigned URLs

Presigned URLs let clients upload or download objects directly, without routing file traffic through your server or handing out credentials:
```javascript
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
// Reuses the `s3` client and command imports from above

// Upload URL (valid for 1 hour)
const uploadUrl = await getSignedUrl(s3, new PutObjectCommand({
  Bucket: "my-bucket",
  Key: "user-uploads/file.pdf",
}), { expiresIn: 3600 });

// Download URL
const downloadUrl = await getSignedUrl(s3, new GetObjectCommand({
  Bucket: "my-bucket",
  Key: "user-uploads/file.pdf",
}), { expiresIn: 3600 });
```
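Under the hood, a presigned URL is an AWS Signature V4 query-string signature, which MinIO verifies the same way S3 does. A minimal sketch of the signing-key derivation step (values are illustrative, not tied to the examples above):

```python
import hashlib
import hmac


def sigv4_signing_key(secret_key: str, date: str, region: str, service: str) -> bytes:
    """Derive the SigV4 signing key: a chain of HMAC-SHA256 operations."""
    def sign(key: bytes, msg: str) -> bytes:
        return hmac.new(key, msg.encode(), hashlib.sha256).digest()

    k_date = sign(("AWS4" + secret_key).encode(), date)  # date as YYYYMMDD
    k_region = sign(k_date, region)
    k_service = sign(k_region, service)
    return sign(k_service, "aws4_request")


# The derived key then signs the canonical request; the resulting hex digest
# travels in the URL as the X-Amz-Signature query parameter.
key = sigv4_signing_key("password123", "20240101", "us-east-1", "s3")
print(key.hex())
```

Because the signature is computed from your secret key, the server can validate the URL without any stored per-URL state, and the URL stops working once the embedded expiry passes.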
## Event Notifications

MinIO can publish bucket events to external targets. A webhook target must be configured first (e.g. via the `MINIO_NOTIFY_WEBHOOK_ENABLE_1` and `MINIO_NOTIFY_WEBHOOK_ENDPOINT_1` environment variables); the `1` in the ARN below is that target's ID.

```bash
# Notify on new objects via webhook
mc event add myminio/my-bucket arn:minio:sqs::1:webhook --event put

# Other supported targets: Kafka, NATS, Redis, PostgreSQL, MySQL, Elasticsearch
```
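Webhook payloads follow the standard S3 event notification JSON shape. A sketch of a handler extracting the bucket and object key (the sample payload below is hypothetical but mirrors that structure):

```python
import json


def parse_minio_event(body: str) -> list[tuple[str, str]]:
    """Extract (bucket, key) pairs from an S3-style event notification.

    Note: object keys may arrive URL-encoded; unquote them if needed.
    """
    event = json.loads(body)
    return [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in event.get("Records", [])
    ]


# Hypothetical payload in the shape posted to the webhook endpoint
sample = json.dumps({
    "EventName": "s3:ObjectCreated:Put",
    "Records": [{
        "eventName": "s3:ObjectCreated:Put",
        "s3": {
            "bucket": {"name": "my-bucket"},
            "object": {"key": "photos/vacation.jpg", "size": 52417},
        },
    }],
})
print(parse_minio_event(sample))  # [('my-bucket', 'photos/vacation.jpg')]
```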
## MinIO vs S3

| Feature | MinIO | AWS S3 |
|---|---|---|
| S3 API | Core API compatible | Native |
| Self-host | Yes | No |
| Cost | Free software (you pay for hardware and ops) | Pay per GB stored and transferred |
| Performance | High (depends on your hardware and network) | High (managed) |
| Encryption | Yes | Yes |
| Versioning | Yes | Yes |
| Replication | Yes | Yes |
Need object storage or data infrastructure setup?
📧 spinov001@gmail.com
🔧 My tools on Apify Store
S3 or self-hosted storage? What do you use?