MinIO is self-hosted, S3-compatible object storage: its API works with the standard AWS S3 SDKs, so existing S3 code carries over unchanged.
S3-Compatible API
Use any AWS S3 SDK — it just works:
import { S3Client, PutObjectCommand, GetObjectCommand, ListObjectsV2Command } from '@aws-sdk/client-s3'
const s3 = new S3Client({
endpoint: 'http://localhost:9000',
region: 'us-east-1',
credentials: { accessKeyId: 'minioadmin', secretAccessKey: 'minioadmin' },
forcePathStyle: true // required for MinIO: buckets are served at path-style URLs, not subdomains
})
// Upload file
await s3.send(new PutObjectCommand({
Bucket: 'my-bucket',
Key: 'uploads/photo.jpg',
Body: fileBuffer,
ContentType: 'image/jpeg'
}))
// Download file
const response = await s3.send(new GetObjectCommand({
Bucket: 'my-bucket',
Key: 'uploads/photo.jpg'
}))
// List files
const list = await s3.send(new ListObjectsV2Command({
Bucket: 'my-bucket',
Prefix: 'uploads/'
}))
Presigned URLs — Direct Upload/Download
import { getSignedUrl } from '@aws-sdk/s3-request-presigner'
// Generate upload URL (client uploads directly to MinIO)
const uploadUrl = await getSignedUrl(s3, new PutObjectCommand({
Bucket: 'my-bucket',
Key: `uploads/${userId}/${fileName}` // fileName: the client-supplied name for the upload
}), { expiresIn: 3600 })
// Generate download URL
const downloadUrl = await getSignedUrl(s3, new GetObjectCommand({
Bucket: 'my-bucket',
Key: 'uploads/report.pdf'
}), { expiresIn: 3600 })
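One constraint to know: SigV4 presigned URLs are capped at 7 days (604,800 seconds), and MinIO enforces the same limit as AWS. A small guard avoids a confusing signing error when the expiry comes from user input (the `clampExpiry` helper is my own):

```javascript
const MAX_PRESIGN_SECONDS = 7 * 24 * 60 * 60 // 604800: SigV4 upper bound

// Clamp a requested expiry into the range SigV4 accepts (1 second .. 7 days).
function clampExpiry(seconds) {
  return Math.min(Math.max(Math.floor(seconds), 1), MAX_PRESIGN_SECONDS)
}
```

Then pass `{ expiresIn: clampExpiry(requestedSeconds) }` to `getSignedUrl`.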
Event Notifications
MinIO sends events when objects change:
# Configure webhook notification
mc admin config set myminio notify_webhook:1 \
endpoint="http://localhost:3000/api/minio-webhook" \
queue_limit="0"
# Restart the server to apply the config change
mc admin service restart myminio
# Set bucket notification
mc event add myminio/my-bucket arn:minio:sqs::1:webhook --event put,delete
// Handle webhook
export async function POST(request) {
const event = await request.json()
for (const record of event.Records) {
console.log(`${record.eventName}: ${record.s3.object.key}`)
if (record.eventName === 's3:ObjectCreated:Put') {
await processUploadedFile(record.s3.object.key)
}
}
// Acknowledge receipt so MinIO does not retry the delivery
return Response.json({ ok: true })
}
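One gotcha with S3-style event payloads: the object key arrives URL-encoded, with spaces as `+` (MinIO follows AWS S3's event format here). Decode it before touching storage, or keys with spaces and parentheses will not match:

```javascript
// S3-style events URL-encode the object key:
// 'my photo (1).jpg' arrives as 'my+photo+%281%29.jpg'.
function decodeEventKey(rawKey) {
  return decodeURIComponent(rawKey.replace(/\+/g, ' '))
}
```

In the handler above, use `decodeEventKey(record.s3.object.key)` instead of the raw value.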
MinIO Client SDK
import * as Minio from 'minio'
import fs from 'node:fs'
const client = new Minio.Client({
endPoint: 'localhost',
port: 9000,
useSSL: false,
accessKey: 'minioadmin',
secretKey: 'minioadmin'
})
// Create bucket
await client.makeBucket('backups', 'us-east-1')
// Upload with metadata
await client.putObject('backups', 'db-2024-01-01.sql.gz', fileStream, size, {
'Content-Type': 'application/gzip',
'X-Backup-Type': 'daily'
})
// Stream download
const stream = await client.getObject('backups', 'db-2024-01-01.sql.gz')
stream.pipe(fs.createWriteStream('./restore.sql.gz'))
Real-World Use Case
A startup was paying $500/month for S3 storing 2TB of user uploads. They deployed MinIO on a $40/month dedicated server with 4TB NVMe. Same S3 API, same SDK, zero code changes. Annual savings: $5,520. When they needed to go multi-region, they used MinIO's built-in replication — still cheaper than S3.
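The savings figure is easy to verify, using the numbers from the example above:

```javascript
// Annual savings when replacing a monthly S3 bill with a flat server cost.
function annualSavings(oldMonthly, newMonthly) {
  return (oldMonthly - newMonthly) * 12
}

annualSavings(500, 40) // → 5520
```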
MinIO is S3 without the AWS bill.
Build Smarter Data Pipelines
Need to scrape websites, extract APIs, or automate data collection? Check out my ready-to-use scrapers on Apify — no coding required.
Custom scraping solution? Email me at spinov001@gmail.com — fast turnaround, fair prices.