🎯 LAB: Event-Driven Image Processing Architecture
What Students Will Build
User → ALB → EC2 (Web Tier) → S3
EC2 publishes event → SNS → SQS
Worker EC2 reads SQS → processes file → stores result in S3
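Before touching AWS, the decoupling idea above can be simulated in-process — a producer pushes to a queue it never shares code with, and a worker drains it later. A minimal sketch (`queue.Queue` standing in for SQS; names are illustrative, not AWS APIs):

```python
import queue

# In-process stand-in for the lab's flow:
# publish() plays the role of sns.publish, the Queue plays SQS,
# and worker_poll() plays the worker's receive loop.
q = queue.Queue()

def publish(message):
    q.put(message)

def worker_poll():
    processed = []
    while not q.empty():
        processed.append(q.get())
    return processed

publish("File uploaded: test-image.png")
print(worker_poll())  # ['File uploaded: test-image.png']
```

The publisher finishes immediately whether or not a worker is running — that is the decoupling the real SNS → SQS pair provides.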
🌐 Final Architecture
🟢 PHASE 1 — Create S3 Bucket
Step 1 — Open S3
- AWS Console → search S3
- Click Create bucket
Step 2 — Configure
Bucket name:
student-upload-bucket-<yourname>
Region:
us-east-2
Uncheck:
Block all public access
Check the confirmation box. (This is for lab convenience only — keep public access blocked in production.)
Click:
Create bucket
🟢 PHASE 2 — Create SNS Topic
- Search SNS
- Click Create topic
- Type: Standard
- Name:
file-upload-topic
- Click Create topic
- Copy ARN → save it
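You will paste this ARN in several later phases, so it helps to know its shape: SNS topic ARNs follow the fixed format `arn:aws:sns:<region>:<account-id>:<topic-name>`. A quick sketch of pulling out the pieces (the account id below is a made-up placeholder):

```python
# Split an SNS topic ARN into its components.
# The account id is a placeholder -- yours will differ.
arn = "arn:aws:sns:us-east-2:123456789012:file-upload-topic"
prefix, partition, service, region, account, topic = arn.split(":")
print(service, region, topic)  # sns us-east-2 file-upload-topic
```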
🟢 PHASE 3 — Create SQS Queue
- Search SQS
- Click Create queue
- Type: Standard
- Name:
file-processing-queue
- Click Create queue
- Copy ARN
🟢 PHASE 4 — Subscribe SQS to SNS
- Go back to SNS
- Click topic:
file-upload-topic
- Click Create subscription
- Protocol: Amazon SQS
- Endpoint: paste SQS ARN
- Click Create subscription
🟢 PHASE 5 — Allow SNS → SQS
- Go to SQS → file-processing-queue
- Click Edit access policy
- Choose Advanced
- Paste:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Allow-SNS",
      "Effect": "Allow",
      "Principal": { "Service": "sns.amazonaws.com" },
      "Action": "sqs:SendMessage",
      "Resource": "YOUR_SQS_ARN",
      "Condition": {
        "ArnEquals": {
          "aws:SourceArn": "YOUR_SNS_ARN"
        }
      }
    }
  ]
}
```
Replace both ARNs with the ones you saved, then Save. The `Condition` block matters: without it, any SNS topic in any account could send to your queue.
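If you prefer generating the policy over hand-editing JSON, a small Python sketch can do the substitution (both ARNs below are made-up placeholders — substitute your own):

```python
import json

# Placeholder ARNs -- replace with the ones you copied earlier.
sqs_arn = "arn:aws:sqs:us-east-2:123456789012:file-processing-queue"
sns_arn = "arn:aws:sns:us-east-2:123456789012:file-upload-topic"

# Build the same access policy as above, with the ARNs filled in.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "Allow-SNS",
        "Effect": "Allow",
        "Principal": {"Service": "sns.amazonaws.com"},
        "Action": "sqs:SendMessage",
        "Resource": sqs_arn,
        "Condition": {"ArnEquals": {"aws:SourceArn": sns_arn}},
    }],
}

print(json.dumps(policy, indent=2))
```

Paste the printed JSON into the access policy editor.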
🟢 PHASE 6 — Create Web EC2 (Publisher)
- Go to EC2
- Click Launch Instance
Name:
web-server
AMI:
Amazon Linux 2
Instance type:
t2.micro
Create key pair.
Security Group:
Allow:
- HTTP (80)
- SSH (22)
Click Launch.
🟢 PHASE 7 — Install Web App
SSH into the instance, then install and start Apache:

```shell
sudo yum update -y
sudo yum install httpd -y
sudo systemctl start httpd
sudo yum install aws-cli -y
```

(The AWS CLI ships preinstalled on Amazon Linux 2, so the last command may report it is already installed.)
Create upload page:
sudo nano /var/www/html/index.html
Paste:

```html
<h1>Upload Simulation</h1>
<form method="POST" action="/upload">
  <input type="text" name="filename" placeholder="Enter file name">
  <button type="submit">Upload</button>
</form>
```

Save. (The form is only a visual placeholder — nothing handles `/upload`; the actual event is published by the script in Phase 8.)
🟢 PHASE 8 — Install Python Publisher Script
boto3 is not preinstalled for Python 3, so install both:

```shell
sudo yum install python3 -y
pip3 install boto3 --user
nano publisher.py
```
Paste:

```python
import boto3

sns = boto3.client('sns', region_name='us-east-2')

response = sns.publish(
    TopicArn='YOUR_TOPIC_ARN',
    Message='File uploaded: test-image.png'
)
print("Message sent")
```
Replace the topic ARN with the one you saved, then run:

```shell
python3 publisher.py
```

It publishes an event to SNS, which fans it out to the subscribed queue.
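A possible refinement, not required for the lab: publish structured JSON instead of a bare string, so the worker can parse fields rather than text. The field and bucket names below are illustrative, not an AWS-defined schema:

```python
import json

# Build a structured event payload (field names are illustrative).
event = {
    "event": "file_uploaded",
    "bucket": "student-upload-bucket-yourname",
    "key": "test-image.png",
}
message = json.dumps(event)
print(message)
```

Pass `message` as the `Message=` argument to `sns.publish`, and `json.loads` it on the worker side.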
🟢 PHASE 9 — Create Worker EC2
Launch a second instance (same AMI, instance type, and key pair):
Name:
worker-server
Install Python 3, boto3, and the AWS CLI as in Phases 7–8.
🟢 PHASE 10 — Worker Script (Consumer)
On worker:
nano worker.py
Paste:

```python
import boto3
import time

sqs = boto3.client('sqs', region_name='us-east-2')
queue_url = 'YOUR_QUEUE_URL'

while True:
    messages = sqs.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=1,
        WaitTimeSeconds=5
    )
    if 'Messages' in messages:
        for message in messages['Messages']:
            print("Processing:", message['Body'])
            sqs.delete_message(
                QueueUrl=queue_url,
                ReceiptHandle=message['ReceiptHandle']
            )
    time.sleep(3)
```
Replace the queue URL (copy the URL — not the ARN — from the SQS console), then run:

```shell
python3 worker.py
```

The worker now polls the queue continuously, using long polling (`WaitTimeSeconds=5`) to avoid hammering the API with empty receives.
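One wrinkle to expect: by default, SNS wraps each message delivered to SQS in a JSON envelope, and your original text sits in the envelope's `Message` field. The worker can unwrap it itself — a sketch, using an abbreviated example envelope (real ones carry more fields, such as `MessageId` and `Signature`):

```python
import json

# Abbreviated example of the envelope SNS wraps around messages
# delivered to SQS. The ARN/account id are placeholders.
body = '''{
  "Type": "Notification",
  "TopicArn": "arn:aws:sns:us-east-2:123456789012:file-upload-topic",
  "Message": "File uploaded: test-image.png"
}'''

envelope = json.loads(body)
print("Processing:", envelope["Message"])
```

Alternatively, enable "Raw message delivery" on the SNS subscription and the body arrives as the bare text.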
🟢 PHASE 11 — Test Flow
- Run the publisher on the web server.
- The worker prints the message body. By default this is the SNS JSON envelope, with your text in its "Message" field (enable Raw message delivery on the subscription to receive the plain text instead):

```
Processing: {"Type": "Notification", ..., "Message": "File uploaded: test-image.png", ...}
```
Students will SEE event-driven processing live.
🟢 OPTIONAL — Add ALB + ASG
Create:
- Target Group
- Launch Template
- Auto Scaling Group
- Application Load Balancer
Attach the launch template to the ASG and register the ASG with the target group.
Now traffic reaches the web tier through the ALB.
🎓 What Students Learn
• Decoupling architecture
• Event-driven systems
• Async processing
• Worker scaling
• Production DevOps pattern