Thu Kha Kyawe

Day 46: Event-Driven Processing with Amazon S3 and Lambda

Lab Information

The DevOps team is automating file management between two S3 buckets. The task is to create a public S3 bucket for file uploads and a private S3 bucket for securely storing those files. A Lambda function will be triggered automatically whenever a file is uploaded to the public bucket and will copy that file to the private bucket. Logs of each operation will be stored in a DynamoDB table, including the source bucket, destination bucket, and the object key of the copied file. This gives the team better security and visibility for file transfers.

Create a public S3 bucket named xfusion-public-24772. Ensure that the bucket allows public access to its objects.
Create a private S3 bucket named xfusion-private-31548. Ensure that the bucket does not allow public access.
Create a Lambda function named xfusion-copyfunction. This function should be triggered by uploads to the public S3 bucket and should copy the uploaded file to the private bucket. Create the necessary policies and a role named lambda_execution_role. Attach these policies to the role, and then link this role to the Lambda function.
lambda-function.py is already present under the /root/ directory on the AWS client host; replace the REPLACE-WITH-YOUR-DYNAMODB-TABLE and REPLACE-WITH-YOUR-PRIVATE-BUCKET placeholders with your actual values.
Create a DynamoDB table named xfusion-S3CopyLogs with a partition key LogID (string). This table will store logs generated by the Lambda function, including details such as source bucket name, destination bucket name, and object key.
For testing, upload the file sample.zip located in the /root directory on the client host to the public S3 bucket. The Lambda function should trigger and copy the file to the private bucket.
Verify that the file has been successfully copied to the private bucket by checking the private bucket in the S3 console.
Verify that a log entry has been created in the DynamoDB table containing the file copy details.

Lab Solutions

Architecture

Public S3 Bucket
      │ (upload event)
      ▼
Lambda Function
      ├── Copy file → Private S3 Bucket
      └── Log entry → DynamoDB

Step 1: Create Public S3 Bucket

Go to S3 → Create bucket

Configure:

Bucket name:

xfusion-public-24772

Region: the same region you will use for the Lambda function

Block Public Access: ❌ Uncheck Block all public access

Acknowledge the warning

Create bucket

Allow Public Read (Required)

Open bucket → Permissions

Add this Bucket Policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::xfusion-public-24772/*"
    }
  ]
}
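
If you prefer the CLI, the same setup can be done from the aws-client host. A minimal sketch, assuming us-east-1 (other regions also need --create-bucket-configuration LocationConstraint=<region>) and the policy above saved as policy.json:

# Create the bucket
aws s3api create-bucket --bucket xfusion-public-24772

# Turn off Block Public Access so the bucket policy can take effect
aws s3api put-public-access-block \
  --bucket xfusion-public-24772 \
  --public-access-block-configuration \
  BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false

# Attach the public-read bucket policy
aws s3api put-bucket-policy --bucket xfusion-public-24772 --policy file://policy.json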

Step 2: Create Private S3 Bucket

Go to S3 → Create bucket

Configure:

Bucket name:

xfusion-private-31548

Block Public Access: ✅ Keep enabled

Create bucket

✅ This bucket remains private
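
The CLI equivalent, with all four Block Public Access settings enforced explicitly:

# Create the bucket (Block Public Access is on by default for new buckets)
aws s3api create-bucket --bucket xfusion-private-31548

aws s3api put-public-access-block \
  --bucket xfusion-private-31548 \
  --public-access-block-configuration \
  BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true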

Step 3: Create DynamoDB Table

Go to DynamoDB → Create table

Configure:

Table name:

xfusion-S3CopyLogs

Partition key:

LogID (String)

Leave defaults

Create table

Wait until status = ACTIVE
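
A CLI sketch of the same table (on-demand billing is assumed here; the console defaults work just as well):

aws dynamodb create-table \
  --table-name xfusion-S3CopyLogs \
  --attribute-definitions AttributeName=LogID,AttributeType=S \
  --key-schema AttributeName=LogID,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST

# Block until the table reaches ACTIVE
aws dynamodb wait table-exists --table-name xfusion-S3CopyLogs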

Step 4: Create IAM Role for Lambda
4.1 Create Role

Go to IAM → Roles → Create role

Trusted entity:

AWS service

Lambda

Next

4.2 Attach Policies

Attach these AWS managed policies:

AWSLambdaBasicExecutionRole

AmazonS3FullAccess

AmazonDynamoDBFullAccess

(FullAccess policies are fine for a lab; in production, scope the role down to the two buckets and the log table.)

Role name:

lambda_execution_role

Create role
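
The same role can be built from the CLI. A sketch, assuming the trust policy below is saved as trust-policy.json:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}

aws iam create-role \
  --role-name lambda_execution_role \
  --assume-role-policy-document file://trust-policy.json

# Note: AWSLambdaBasicExecutionRole lives under the service-role/ path
aws iam attach-role-policy --role-name lambda_execution_role \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
aws iam attach-role-policy --role-name lambda_execution_role \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam attach-role-policy --role-name lambda_execution_role \
  --policy-arn arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess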

Step 5: Prepare Lambda Code

On the aws-client host:

vi /root/lambda-function.py

Replace the placeholder values with:
DYNAMODB_TABLE = "xfusion-S3CopyLogs"
DESTINATION_BUCKET = "xfusion-private-31548"

✅ Save the file
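
The lab ships its own lambda-function.py, so your copy may differ; the sketch below is a minimal handler matching the behavior the lab describes (copy each uploaded object, then write a log item). The attribute names SourceBucket/DestinationBucket/ObjectKey and the random UUID used for LogID are assumptions:

import uuid
from urllib.parse import unquote_plus

import boto3

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")

# Values filled in during Step 5
DYNAMODB_TABLE = "xfusion-S3CopyLogs"
DESTINATION_BUCKET = "xfusion-private-31548"


def lambda_handler(event, context):
    table = dynamodb.Table(DYNAMODB_TABLE)
    for record in event["Records"]:
        source_bucket = record["s3"]["bucket"]["name"]
        # S3 event keys arrive URL-encoded; decode before using them
        object_key = unquote_plus(record["s3"]["object"]["key"])

        # Copy the uploaded object into the private bucket
        s3.copy_object(
            Bucket=DESTINATION_BUCKET,
            Key=object_key,
            CopySource={"Bucket": source_bucket, "Key": object_key},
        )

        # Log the transfer; a random UUID is an assumed LogID scheme
        table.put_item(
            Item={
                "LogID": str(uuid.uuid4()),
                "SourceBucket": source_bucket,
                "DestinationBucket": DESTINATION_BUCKET,
                "ObjectKey": object_key,
            }
        )

    return {"statusCode": 200, "body": "Copy and log complete"}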

Step 6: Create Lambda Function

6.1 Create Function

Go to Lambda → Create function

Choose Author from scratch

Configure:

Function name:

xfusion-copyfunction

Runtime: Python 3.9

Execution role: Use existing role

Role: lambda_execution_role

Create function

6.2 Upload Code

Open the function

Replace the default code in the console editor with the updated contents of lambda-function.py

Click Deploy
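
If you would rather create the function from the CLI, a sketch (replace <ACCOUNT-ID> with your account ID; the file is renamed because the Python runtime resolves the handler from the module name):

cp /root/lambda-function.py lambda_function.py
zip function.zip lambda_function.py

aws lambda create-function \
  --function-name xfusion-copyfunction \
  --runtime python3.9 \
  --role arn:aws:iam::<ACCOUNT-ID>:role/lambda_execution_role \
  --handler lambda_function.lambda_handler \
  --zip-file fileb://function.zip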

Step 7: Add S3 Trigger to Lambda

In Lambda → Add trigger

Select S3

Configure:

Bucket: xfusion-public-24772

Event type: PUT

Acknowledge permission prompt

Add trigger
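
Under the hood, the console trigger does two things: it grants S3 permission to invoke the function and writes a notification configuration on the bucket. A CLI sketch, with <REGION> and <ACCOUNT-ID> as placeholders:

# Allow S3 to invoke the function
aws lambda add-permission \
  --function-name xfusion-copyfunction \
  --statement-id s3-invoke \
  --action lambda:InvokeFunction \
  --principal s3.amazonaws.com \
  --source-arn arn:aws:s3:::xfusion-public-24772

# Point the bucket's PUT events at the function
aws s3api put-bucket-notification-configuration \
  --bucket xfusion-public-24772 \
  --notification-configuration '{
    "LambdaFunctionConfigurations": [{
      "LambdaFunctionArn": "arn:aws:lambda:<REGION>:<ACCOUNT-ID>:function:xfusion-copyfunction",
      "Events": ["s3:ObjectCreated:Put"]
    }]
  }'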

Step 8: Test the Setup
Upload Test File

From aws-client:

aws s3 cp /root/sample.zip s3://xfusion-public-24772/
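
To confirm the function actually fired, tail its CloudWatch log group (AWS CLI v2):

aws logs tail /aws/lambda/xfusion-copyfunction --since 5m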

Step 9: Verify File Copy

Go to S3 → xfusion-private-31548

Confirm:

sample.zip

✅ File copied successfully
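
The same check from the CLI:

aws s3 ls s3://xfusion-private-31548/
# Expect a line ending in sample.zip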

Step 10: Verify DynamoDB Logs

Go to DynamoDB → Tables → xfusion-S3CopyLogs

Click Explore table items

You should see an entry with:

Source bucket

Destination bucket

Object key (sample.zip)
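
Or scan the table from the CLI:

aws dynamodb scan --table-name xfusion-S3CopyLogs --max-items 5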


Resources & Next Steps
📦 Full Code Repository: KodeKloud Learning Labs
📖 More Deep Dives: Whispering Cloud Insights - Read other technical articles
💬 Join Discussion: DEV Community - Share your thoughts and questions
💼 Let's Connect: LinkedIn - I'd love to connect with you

Credits
• All labs are from: KodeKloud
• Thank you for providing these valuable resources.
