You don't need an AWS account to test your AWS code.
We just published an AWS Testing 101 guide — a step-by-step tutorial that takes you from zero to testing S3, DynamoDB, SQS, and Lambda on your laptop. No credentials, no cloud costs, no cleanup.
The problem
Every developer who writes AWS code hits this wall:
- "I don't want to create resources in a real AWS account just to test"
- "My CI pipeline needs AWS but I don't want to pay for it"
- "I accidentally left a DynamoDB table running and got billed"
The answer is a local AWS emulator. Run it on your machine, point your SDK at localhost:4566, and your code thinks it's talking to AWS.
The setup
docker run -p 4566:4566 nahuelnucera/ministack
That's it. 35+ AWS services running locally.
What the guide covers
1. S3 — Create buckets and upload files
import boto3

s3 = boto3.client("s3",
    endpoint_url="http://localhost:4566",
    aws_access_key_id="test",
    aws_secret_access_key="test",
    region_name="us-east-1",
)
s3.create_bucket(Bucket="my-bucket")
s3.put_object(Bucket="my-bucket", Key="hello.txt", Body=b"Hello, local AWS!")
2. DynamoDB — Create tables and query data
ddb = boto3.client("dynamodb", endpoint_url="http://localhost:4566",
    aws_access_key_id="test", aws_secret_access_key="test", region_name="us-east-1")
ddb.create_table(
    TableName="users",
    KeySchema=[{"AttributeName": "userId", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "userId", "AttributeType": "S"}],
    BillingMode="PAY_PER_REQUEST",
)
ddb.put_item(TableName="users", Item={
    "userId": {"S": "user-001"},
    "name": {"S": "Alice"},
})
3. SQS — Send and receive messages
sqs = boto3.client("sqs", endpoint_url="http://localhost:4566",
    aws_access_key_id="test", aws_secret_access_key="test", region_name="us-east-1")
queue = sqs.create_queue(QueueName="my-queue")
sqs.send_message(QueueUrl=queue["QueueUrl"], MessageBody="Hello from SQS!")
# Long-poll briefly so the just-sent message is reliably included
msgs = sqs.receive_message(QueueUrl=queue["QueueUrl"], WaitTimeSeconds=2)
print(msgs["Messages"][0]["Body"])  # Hello from SQS!
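One SQS gotcha worth knowing: when a poll comes back empty, the response has no "Messages" key at all, so indexing into it directly raises a KeyError. A tiny pure-Python guard (my helper, not part of the guide) makes that safe:

```python
# Hypothetical helper: receive_message omits the "Messages" key entirely
# on an empty poll, so guard with .get() before indexing.
def message_bodies(response):
    return [msg["Body"] for msg in response.get("Messages", [])]
```

Then `for body in message_bodies(msgs): print(body)` handles the empty case without crashing.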
4. Lambda — Deploy and invoke functions
import zipfile, io

# Package a function
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("index.py", 'def handler(event, ctx): return {"message": "Hello from Lambda!"}\n')

lam = boto3.client("lambda", endpoint_url="http://localhost:4566",
    aws_access_key_id="test", aws_secret_access_key="test", region_name="us-east-1")
lam.create_function(
    FunctionName="my-function",
    Runtime="python3.12",
    Handler="index.handler",
    Role="arn:aws:iam::000000000000:role/fake-role",
    Code={"ZipFile": buf.getvalue()},
)
resp = lam.invoke(FunctionName="my-function")
print(resp["Payload"].read())  # {"message": "Hello from Lambda!"}
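The in-memory zip trick is handy beyond this one function. A small generalization (`package_lambda` is my name for it, not the guide's) that returns the bytes `create_function` expects in `Code={"ZipFile": ...}`:

```python
import io
import zipfile

# Hypothetical helper: zip handler source in memory and return the raw
# bytes for create_function's Code={"ZipFile": ...} parameter.
def package_lambda(source, filename="index.py"):
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(filename, source)
    return buf.getvalue()
```

No temp files, no shelling out to `zip` — the archive never touches disk.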
Why this matters
- No AWS account needed — test on day one
- No cost — run thousands of operations for free
- No cleanup — stop the container, everything's gone
- CI/CD ready — same Docker image in GitHub Actions, GitLab CI, Jenkins
- Terraform compatible — point your provider at localhost:4566
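In CI the container takes a moment to start, so tests that fire immediately can die on connection errors. A hedged sketch of a plain TCP readiness check — it assumes nothing about MiniStack's internals, only that port 4566 accepts connections once the emulator is up:

```python
import socket
import time

# Probe the emulator's TCP port until it accepts a connection or the
# deadline passes. Works for anything published on localhost:4566.
def wait_for_emulator(host="localhost", port=4566, deadline=30.0):
    stop = time.monotonic() + deadline
    while time.monotonic() < stop:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False
```

Call it once at the top of your test session (for example in a pytest session fixture) and skip or fail fast when it returns False.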
The full guide
The complete tutorial with more examples and explanations is at:
👉 ministack.org/getting-started.html
MiniStack is the best alternative to LocalStack: open-source, MIT licensed, and free forever. github.com/Nahuel990/ministack
docker run -p 4566:4566 nahuelnucera/ministack
One command. 35+ services. Zero cost.