Janis Adhikari
Cloud Resume Challenge: A Cloud-Native DevOps Approach

Table of Contents

  1. Cloud Architecture Overview
  2. Infrastructure as Code with Terraform
  3. Serverless Backend with AWS Lambda
  4. Visitor Data Storage with AWS DynamoDB
  5. API Gateway for Serverless API Integration
  6. Global Hosting with AWS S3 and CloudFront
  7. Security and HTTPS with Cloudflare and CloudFront
  8. CI/CD Pipeline with GitHub Actions
  9. Conclusion

1. Cloud Architecture Overview

This project demonstrates a cloud-native architecture that integrates multiple AWS services, focusing on scalability, automation, and security. The main components of the architecture are:

  • AWS S3 for static website hosting.
  • AWS Lambda for serverless computing to handle visitor tracking.
  • AWS DynamoDB for storing visitor count data.
  • AWS API Gateway to trigger Lambda functions.
  • AWS CloudFront and Cloudflare for content delivery and HTTPS security.

The design is highly scalable, with minimal management required. As the website grows, the cloud services automatically scale to handle increased traffic.


2. Infrastructure as Code with Terraform

Using Terraform for Infrastructure as Code (IaC) allows me to define, deploy, and manage the entire cloud infrastructure in a repeatable and version-controlled manner.

I created Terraform scripts to automate the provisioning of all the cloud resources:

  • AWS Lambda: A Python-based Lambda function is defined and deployed via Terraform to track visitor count.
  • AWS DynamoDB: The visitor count is stored in a DynamoDB table, which is also provisioned with Terraform.
  • API Gateway: Terraform is used to configure API Gateway for seamless integration with Lambda functions.

This IaC approach allows for easy replication, rollback, and management of the infrastructure across multiple environments.
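As a rough illustration of what those scripts contain, here is a minimal Terraform sketch of the DynamoDB table and Lambda function (resource names, the runtime, and file paths are illustrative, not necessarily the project's actual values):

```hcl
# Sketch under assumed names -- "visitor-count", "visitor-counter",
# and lambda.zip are illustrative.
resource "aws_dynamodb_table" "visitor_count" {
  name         = "visitor-count"
  billing_mode = "PAY_PER_REQUEST" # on-demand: no capacity planning
  hash_key     = "visitor_id"

  attribute {
    name = "visitor_id"
    type = "S"
  }
}

resource "aws_lambda_function" "visitor_counter" {
  function_name = "visitor-counter"
  runtime       = "python3.12"
  handler       = "app.lambda_handler"
  filename      = "lambda.zip"                 # zipped Python source
  role          = aws_iam_role.lambda_exec.arn # execution role defined elsewhere
}
```

Because these resources are plain text in version control, a `terraform plan` shows exactly what would change before anything is applied.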


3. Serverless Backend with AWS Lambda

AWS Lambda is a serverless compute service that runs code in response to events and automatically manages the underlying compute resources.

In this project:

  • The Lambda function is responsible for tracking and updating the visitor count every time the website is accessed.
  • The function is triggered by HTTP requests from API Gateway, which is configured to invoke the Lambda when the frontend calls the API endpoint.

Lambda scales automatically, so the application can handle a growing number of visitors without any manual capacity management.
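A sketch of such a handler is shown below. The table name, key value, and the `build_response` helper are assumptions for illustration; the core idea is DynamoDB's atomic `ADD` update, which increments the counter and returns the new value in a single call (`count` is aliased via `ExpressionAttributeNames` because it is a DynamoDB reserved word):

```python
import json


def build_response(count):
    # Shape an API Gateway proxy response; the CORS header lets the
    # S3-hosted frontend call this endpoint from a different origin.
    return {
        "statusCode": 200,
        "headers": {
            "Content-Type": "application/json",
            "Access-Control-Allow-Origin": "*",
        },
        "body": json.dumps({"count": count}),
    }


def lambda_handler(event, context):
    import boto3  # provided by the Lambda runtime

    table = boto3.resource("dynamodb").Table("visitor-count")
    # Atomically increment the counter and read back the new value.
    result = table.update_item(
        Key={"visitor_id": "resume"},
        UpdateExpression="ADD #c :one",
        ExpressionAttributeNames={"#c": "count"},
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    return build_response(int(result["Attributes"]["count"]))
```

Doing the increment in one `update_item` call avoids a read-then-write race when two visitors hit the site at the same time.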


4. Visitor Data Storage with AWS DynamoDB

For storing the visitor count, I used AWS DynamoDB, a fully managed NoSQL database:

  • Scalability: DynamoDB handles large amounts of read and write operations with low latency and automatically scales based on usage, which is ideal for an application with variable traffic.
  • Data Model: The table is designed to store a single item representing the visitor count, with attributes such as visitor_id (partition key) and count (the number of visitors).

This simple design allows the Lambda function to query, update, and track the visitor count efficiently.
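Conceptually, the table holds just one item, along these lines (attribute names are the assumed ones from the data model above):

```json
{
  "visitor_id": "resume",
  "count": 1234
}
```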


5. API Gateway for Serverless API Integration

AWS API Gateway is used to expose a REST API that serves as the entry point for the frontend to interact with the backend:

  • Event Trigger: The API Gateway triggers the Lambda function whenever an HTTP request is made by a user visiting the site.
  • Serverless: API Gateway, combined with Lambda, ensures that there are no servers to manage, and the system scales automatically based on demand.

API Gateway also decouples the frontend from the backend: the static site only needs to know one HTTP endpoint, while the Lambda behind it can change freely.
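For illustration, a Terraform sketch of this wiring might look like the following. Note this uses an HTTP API (API Gateway v2) for brevity, which is an assumption — the same pattern applies to a REST API:

```hcl
resource "aws_apigatewayv2_api" "visitor_api" {
  name          = "visitor-api"
  protocol_type = "HTTP"
}

resource "aws_apigatewayv2_integration" "lambda" {
  api_id                 = aws_apigatewayv2_api.visitor_api.id
  integration_type       = "AWS_PROXY"
  integration_uri        = aws_lambda_function.visitor_counter.invoke_arn
  payload_format_version = "2.0"
}

resource "aws_apigatewayv2_route" "get_count" {
  api_id    = aws_apigatewayv2_api.visitor_api.id
  route_key = "GET /count" # illustrative path
  target    = "integrations/${aws_apigatewayv2_integration.lambda.id}"
}

# Grant API Gateway permission to invoke the function.
resource "aws_lambda_permission" "apigw" {
  statement_id  = "AllowAPIGatewayInvoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.visitor_counter.function_name
  principal     = "apigateway.amazonaws.com"
  source_arn    = "${aws_apigatewayv2_api.visitor_api.execution_arn}/*/*"
}
```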


6. Global Hosting with AWS S3 and CloudFront

To host the static resume website, I used AWS S3:

  • S3 Bucket: The frontend files (HTML, CSS, JavaScript) are uploaded to an S3 bucket, where the static website is hosted.
  • Scalability: S3 is highly durable and scalable, keeping the resume available even under traffic spikes.

To further enhance performance, I utilized AWS CloudFront, a CDN that caches content at edge locations:

  • Global Distribution: CloudFront caches the website globally, reducing latency and providing faster access to users worldwide.
  • Automatic Scaling: CloudFront scales automatically based on traffic, ensuring efficient delivery of content during traffic spikes.
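A minimal Terraform sketch of this hosting layer might look like the following (the bucket name and origin ID are illustrative; a production setup would typically also restrict bucket access to CloudFront):

```hcl
resource "aws_s3_bucket" "site" {
  bucket = "resume-site-bucket" # illustrative name
}

resource "aws_cloudfront_distribution" "cdn" {
  enabled             = true
  default_root_object = "index.html"

  origin {
    domain_name = aws_s3_bucket.site.bucket_regional_domain_name
    origin_id   = "s3-site"
  }

  default_cache_behavior {
    target_origin_id       = "s3-site"
    viewer_protocol_policy = "redirect-to-https" # force HTTPS at the edge
    allowed_methods        = ["GET", "HEAD"]
    cached_methods         = ["GET", "HEAD"]

    forwarded_values {
      query_string = false
      cookies {
        forward = "none"
      }
    }
  }

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  viewer_certificate {
    cloudfront_default_certificate = true
  }
}
```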

7. Security and HTTPS with Cloudflare and CloudFront

To secure the website and ensure encrypted communication, I used Cloudflare and CloudFront:

  • Cloudflare: I routed my custom domain (janisadhi.me) through Cloudflare for DNS management and additional security features like DDoS protection.
  • CloudFront: By using CloudFront, I applied HTTPS security to the website, ensuring that all data between the client and the server is encrypted.

This two-layered approach to security improves the reliability and safety of the website.


8. CI/CD Pipeline with GitHub Actions

To automate the deployment of the frontend and backend, I set up CI/CD pipelines using GitHub Actions:

  • Frontend Deployment: Every time changes are made to the frontend code (HTML, CSS, JavaScript), GitHub Actions automatically uploads the updated files to the S3 bucket.
  • Backend Deployment: For the Lambda function, GitHub Actions automatically updates the function whenever code changes are made in the backend repository.

This automation ensures that changes are deployed quickly and consistently, without any manual intervention. Additionally, it reduces the risk of human error during the deployment process.
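The frontend pipeline can be sketched as a workflow like the one below (the branch, paths, bucket name, and secret names are assumptions for illustration):

```yaml
# Illustrative frontend deployment workflow.
name: Deploy frontend
on:
  push:
    branches: [main]
    paths: ["website/**"]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: Sync site to S3
        run: aws s3 sync website/ s3://resume-site-bucket --delete
```

The `paths` filter keeps the workflow from running on backend-only commits, and `--delete` removes files from the bucket that no longer exist in the repository.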


9. Conclusion

This project demonstrates a cloud-native DevOps solution leveraging AWS services, Terraform, and GitHub Actions to automate the infrastructure provisioning, deployment, and scaling of a simple website. Here's a recap of the cloud and DevOps aspects:

  • Infrastructure as Code: Managed the cloud infrastructure using Terraform, ensuring repeatability and version control.
  • Serverless Backend: Utilized AWS Lambda to handle the backend logic without managing servers.
  • Scalable and Secure Hosting: Hosted the website on AWS S3 and CloudFront, ensuring high availability and security.
  • Automation with CI/CD: Set up automated deployment pipelines using GitHub Actions for both frontend and backend.

This project allowed me to integrate various cloud-native technologies and DevOps practices to create a scalable, secure, and automated system. If you're interested in replicating this project, you can find the code and configuration files on GitHub.
