Lester Diaz Perez

☁️ AWS - Email notification system

Technologies:

☁️ AWS: πŸͺ£ S3 bucket, ⚑️ Lambda function, πŸ”” SNS, πŸ“¬ SQS
πŸ€– IaC: πŸ—οΈ Terraform
πŸ”„ CD: πŸ› οΈ GitLab

Project summary: an email notification system built on AWS. When a file is uploaded to an S3 bucket, the upload event triggers a Lambda function, which pushes the object's metadata to an SQS queue and notifies the SNS topic's subscribers that a file has been uploaded.

πŸ“Œ Requirements

🎯 Workflow

1️⃣ Source code πŸ“„
2️⃣ Lambda function ⚑️
3️⃣ Deploy to AWS through GitLab πŸš€


1️⃣ Source code πŸ“„

GitLab project

Structure

terraform
└── terraform
    β”œβ”€β”€ bucket
    β”‚   β”œβ”€β”€ main.tf
    β”‚   β”œβ”€β”€ output.tf
    β”‚   └── variables.tf
    β”œβ”€β”€ lambda
    β”‚   β”œβ”€β”€ main.tf
    β”‚   β”œβ”€β”€ output.tf
    β”‚   β”œβ”€β”€ variables.tf
    β”‚   β”œβ”€β”€ lambda_assume_role_policy.json
    β”‚   β”œβ”€β”€ lambda_policy.json
    β”‚   β”œβ”€β”€ lambda_function.py
    β”‚   └── lambda_function.zip
    β”œβ”€β”€ sns
    β”‚   β”œβ”€β”€ main.tf
    β”‚   β”œβ”€β”€ output.tf
    β”‚   └── variables.tf
    β”œβ”€β”€ sqs
    β”‚   β”œβ”€β”€ main.tf
    β”‚   β”œβ”€β”€ output.tf
    β”‚   └── variables.tf
    β”œβ”€β”€ main.tf
    └── providers.tf

The code is organized into modules, one per AWS service (bucket, lambda, sns, sqs). Each module has a single purpose, which keeps the cloud environment easier to manage and scale. In addition, using Terraform to orchestrate these modules from the root main.tf keeps deployments consistent and repeatable, in line with configuration-management best practices.
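
For illustration, here is a minimal sketch of how the root main.tf might wire these modules together; the module input and output names are assumptions, not the project's actual variables:

# Root main.tf sketch; module inputs/outputs are assumed, not taken from the repository
module "bucket" {
  source      = "./bucket"
  bucket_name = "notificationsystem-bucket"   # hypothetical name
}

module "sns" {
  source     = "./sns"
  topic_name = "notificationsystem-topic"     # hypothetical name
}

module "sqs" {
  source     = "./sqs"
  queue_name = "notification"                 # matches the queue name used by the Lambda below
}

module "lambda" {
  source        = "./lambda"
  bucket_id     = module.bucket.bucket_id     # assumed output of the bucket module
  sns_topic_arn = module.sns.topic_arn        # assumed output of the sns module
  sqs_queue_url = module.sqs.queue_url        # assumed output of the sqs module
}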

2️⃣ Lambda function ⚑️

import json
import boto3

# Initialize AWS clients
s3_client = boto3.client('s3')
sns_client = boto3.client('sns')
sqs_client = boto3.client('sqs')

def lambda_handler(event, context):
    # Define SNS topic ARN and SQS queue URL
    sns_topic_arn = 'arn:aws:sns:us-east-2:087243254862:notifcationsystem-bucket'
    sqs_queue_url = 'https://sqs.us-east-2.amazonaws.com/087243254862/notification'

    # Process S3 event records
    for record in event['Records']:
        # Print the entire event for debugging purposes
        print(event)

        # Extract S3 bucket and object information from the event record
        s3_bucket = record['s3']['bucket']['name']
        s3_key = record['s3']['object']['key']

        # Example: Prepare metadata to send to SQS
        metadata = {
            'bucket': s3_bucket,
            'key': s3_key,
            'timestamp': record['eventTime']
        }

        # Send metadata to SQS queue
        sqs_response = sqs_client.send_message(
            QueueUrl=sqs_queue_url,
            MessageBody=json.dumps(metadata)
        )

        # Example: Prepare notification message to send to SNS
        notification_message = f"New file uploaded to S3 bucket '{s3_bucket}' with key '{s3_key}'"

        # Publish notification message to SNS topic
        sns_response = sns_client.publish(
            TopicArn=sns_topic_arn,
            Message=notification_message,
            Subject="File Upload Notification"
        )

    # Return a success response
    return {
        'statusCode': 200,
        'body': json.dumps('Processing complete')
    }

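As the summary says, the upload itself is what triggers the Lambda. In Terraform that link is usually an aws_s3_bucket_notification plus an aws_lambda_permission; here is a minimal sketch, with resource names (aws_s3_bucket.uploads, aws_lambda_function.notification) that are assumptions rather than the project's actual code:

# Sketch only: the referenced bucket and function resources are assumed names
resource "aws_lambda_permission" "allow_s3" {
  statement_id  = "AllowExecutionFromS3"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.notification.function_name
  principal     = "s3.amazonaws.com"
  source_arn    = aws_s3_bucket.uploads.arn
}

resource "aws_s3_bucket_notification" "uploads" {
  bucket = aws_s3_bucket.uploads.id

  lambda_function {
    lambda_function_arn = aws_lambda_function.notification.arn
    events              = ["s3:ObjectCreated:*"]   # fire on any object upload
  }

  depends_on = [aws_lambda_permission.allow_s3]
}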

3️⃣ GitLab workflow πŸš€

  • Add a pipeline to the project (a minimal .gitlab-ci.yml sketch follows below).

  • πŸ”‘ Add the AWS access key and secret key as CI/CD variables (secrets) in GitLab; create a dedicated IAM user for this purpose.

  • ▢️ Run the pipeline.
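
For reference, a .gitlab-ci.yml along these lines would cover the steps above; the image tag, stage names, and paths are assumptions, not the project's actual pipeline:

# Sketch of a Terraform deploy pipeline; image tag, stages, and paths are assumed
image:
  name: hashicorp/terraform:1.7.5
  entrypoint: [""]             # override the image entrypoint so GitLab can run shell commands

stages:
  - validate
  - plan
  - apply

# AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY come from the GitLab CI/CD variables added above;
# the AWS provider reads them from the environment automatically.
before_script:
  - cd terraform/terraform     # root module location, per the structure shown earlier
  - terraform init             # a real pipeline would also configure a remote backend for state

validate:
  stage: validate
  script:
    - terraform validate

plan:
  stage: plan
  script:
    - terraform plan -out=tfplan
  artifacts:
    paths:
      - terraform/terraform/tfplan

apply:
  stage: apply
  script:
    - terraform apply tfplan   # applying a saved plan does not prompt for approval
  when: manual                 # require a manual click before deploying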


