LocalStack emulates 80+ AWS services on your laptop. Test Lambda, S3, DynamoDB, SQS, SNS, and more — completely offline, completely free. Your existing AWS SDK code works without changes.
## Why LocalStack?
- 80+ AWS services — S3, Lambda, DynamoDB, SQS, SNS, API Gateway...
- No AWS account needed — fully local, no cloud costs
- Same API — use standard AWS SDKs and CLI
- Fast feedback — test in seconds, not minutes
- CI-friendly — Docker container, works in any CI
## Quick Start
```shell
# Docker
docker run -d --name localstack \
  -p 4566:4566 -p 4510-4559:4510-4559 \
  localstack/localstack

# Or via pip
pip install localstack
localstack start

# Or Docker Compose (full example in the Docker Compose section below)
```
## AWS CLI Against LocalStack

```shell
# Configure the endpoint and dummy credentials (LocalStack accepts any values)
export AWS_ENDPOINT_URL=http://localhost:4566
export AWS_DEFAULT_REGION=us-east-1
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test
```

Recent AWS CLI v2 releases read `AWS_ENDPOINT_URL` automatically; the explicit `--endpoint-url` flag used below also works on older versions.
```shell
# S3
aws --endpoint-url=$AWS_ENDPOINT_URL s3 mb s3://my-bucket
aws --endpoint-url=$AWS_ENDPOINT_URL s3 cp file.txt s3://my-bucket/
aws --endpoint-url=$AWS_ENDPOINT_URL s3 ls s3://my-bucket/
```
```shell
# DynamoDB
aws --endpoint-url=$AWS_ENDPOINT_URL dynamodb create-table \
  --table-name users \
  --key-schema AttributeName=id,KeyType=HASH \
  --attribute-definitions AttributeName=id,AttributeType=S \
  --billing-mode PAY_PER_REQUEST

aws --endpoint-url=$AWS_ENDPOINT_URL dynamodb put-item \
  --table-name users \
  --item '{"id": {"S": "1"}, "name": {"S": "Alice"}}'
```
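Note the shape of `--item`: the low-level DynamoDB API wraps every value in a type descriptor such as `{"S": "1"}` (string) or `{"N": "42"}` (number). A minimal sketch of that marshalling for a few common types (helper names are mine; boto3's resource layer, shown later, does this for you):

```python
def to_attribute_value(value):
    """Wrap a Python value in DynamoDB's low-level type descriptor."""
    if isinstance(value, bool):      # check bool first: bool is an int subclass
        return {"BOOL": value}
    if isinstance(value, str):
        return {"S": value}
    if isinstance(value, (int, float)):
        return {"N": str(value)}     # DynamoDB transmits numbers as strings
    raise TypeError(f"unsupported type: {type(value).__name__}")

def to_item(obj):
    """Convert a plain dict into a low-level DynamoDB item."""
    return {key: to_attribute_value(val) for key, val in obj.items()}

# to_item({"id": "1", "name": "Alice"})
#   -> {"id": {"S": "1"}, "name": {"S": "Alice"}}
```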
```shell
# SQS
aws --endpoint-url=$AWS_ENDPOINT_URL sqs create-queue --queue-name my-queue
aws --endpoint-url=$AWS_ENDPOINT_URL sqs send-message \
  --queue-url http://localhost:4566/000000000000/my-queue \
  --message-body '{"event": "user_signup"}'
```
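On the receiving side, a consumer polls, processes, and deletes each message. A sketch of that loop against LocalStack (the queue URL and handler logic are illustrative, and `drain_queue` assumes boto3 is installed):

```python
import json

def handle_message(body: str) -> str:
    """Process one message body; returns the event name (illustrative)."""
    event = json.loads(body)
    return event.get("event", "unknown")

def drain_queue(queue_url: str, endpoint: str = "http://localhost:4566") -> None:
    import boto3  # imported lazily so handle_message stays dependency-free
    sqs = boto3.client("sqs", endpoint_url=endpoint)
    while True:
        resp = sqs.receive_message(
            QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=2
        )
        messages = resp.get("Messages", [])
        if not messages:
            break  # queue drained
        for msg in messages:
            handle_message(msg["Body"])
            # delete only after successful processing
            sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```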
```shell
# Lambda
aws --endpoint-url=$AWS_ENDPOINT_URL lambda create-function \
  --function-name hello \
  --runtime python3.12 \
  --handler handler.lambda_handler \
  --zip-file fileb://function.zip \
  --role arn:aws:iam::000000000000:role/lambda-role  # dummy role; LocalStack doesn't enforce IAM by default

aws --endpoint-url=$AWS_ENDPOINT_URL lambda invoke \
  --function-name hello \
  --cli-binary-format raw-in-base64-out \
  --payload '{"name": "World"}' output.json
```

(With AWS CLI v2, `--cli-binary-format raw-in-base64-out` lets you pass raw JSON as the payload; without it the CLI expects base64.)
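For completeness, a `handler.py` matching the `handler.lambda_handler` setting above might look like this (the body is illustrative; zip it as `function.zip` before running `create-function`):

```python
import json

def lambda_handler(event, context):
    """Entry point named in --handler (module.function)."""
    name = event.get("name", "stranger")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

After `lambda invoke`, `output.json` will contain the returned dict.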
## Python (boto3 — No Code Changes!)
```python
import boto3

# Just change the endpoint
s3 = boto3.client('s3', endpoint_url='http://localhost:4566')
dynamodb = boto3.resource('dynamodb', endpoint_url='http://localhost:4566')
sqs = boto3.client('sqs', endpoint_url='http://localhost:4566')

# Use exactly like real AWS
s3.create_bucket(Bucket='my-bucket')
s3.put_object(Bucket='my-bucket', Key='data.json', Body='{"hello": "world"}')

# DynamoDB (the resource layer takes plain dicts, no type descriptors)
table = dynamodb.Table('users')
table.put_item(Item={'id': '1', 'name': 'Alice', 'email': 'alice@example.com'})
result = table.get_item(Key={'id': '1'})
print(result['Item'])  # {'id': '1', 'name': 'Alice', 'email': 'alice@example.com'}
```
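To avoid repeating the endpoint in every client, you can centralize the settings. The helper name and env-var fallback below are my own convention, not a boto3 feature:

```python
import os

LOCALSTACK_ENDPOINT = "http://localhost:4566"

def localstack_kwargs():
    """Shared kwargs for any boto3 client/resource pointed at LocalStack."""
    return {
        "endpoint_url": os.environ.get("AWS_ENDPOINT_URL", LOCALSTACK_ENDPOINT),
        "region_name": "us-east-1",
        "aws_access_key_id": "test",      # LocalStack accepts any credentials
        "aws_secret_access_key": "test",
    }

# Usage: s3 = boto3.client("s3", **localstack_kwargs())
```

Because it reads `AWS_ENDPOINT_URL` first, the same code works unchanged inside Docker Compose, where the endpoint is `http://localstack:4566`.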
## Docker Compose

```yaml
version: '3'
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4566:4566"
    environment:
      - SERVICES=s3,sqs,dynamodb,lambda
      - DEBUG=1
    volumes:
      # seed resources on startup via an init hook
      - "./init-aws.sh:/etc/localstack/init/ready.d/init-aws.sh"
      # required so LocalStack can spawn Lambda containers
      - "/var/run/docker.sock:/var/run/docker.sock"

  app:
    build: .
    environment:
      - AWS_ENDPOINT_URL=http://localstack:4566
      - AWS_DEFAULT_REGION=us-east-1
      - AWS_ACCESS_KEY_ID=test
      - AWS_SECRET_ACCESS_KEY=test
    depends_on:
      - localstack
```
## Free vs Pro
| Feature | Free | Pro |
|---|---|---|
| Core services | S3, SQS, DynamoDB, Lambda, SNS, etc. | All 80+ |
| API Gateway | REST API | REST + HTTP + WebSocket |
| IAM | Basic | Full policy enforcement |
| Persistence | No | Yes |
| Cloud Pods | No | Yes |
## Resources
Need AWS testing or data automation? Check my Apify actors or email spinov001@gmail.com.