<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Sharon</title>
    <description>The latest articles on DEV Community by Sharon (@sharon_enam).</description>
    <link>https://dev.to/sharon_enam</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2634184%2F2d4ed6c6-60c8-48b0-b514-6820230105bc.jpg</url>
      <title>DEV Community: Sharon</title>
      <link>https://dev.to/sharon_enam</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/sharon_enam"/>
    <language>en</language>
    <item>
      <title>How I Built a Web Vulnerability Scanner - OpenEye</title>
      <dc:creator>Sharon</dc:creator>
      <pubDate>Sun, 28 Sep 2025 22:22:53 +0000</pubDate>
      <link>https://dev.to/sharon_enam/how-to-build-a-web-vulnerabilty-scanner-openeye-2c5a</link>
      <guid>https://dev.to/sharon_enam/how-to-build-a-web-vulnerabilty-scanner-openeye-2c5a</guid>
      <description>&lt;p&gt;Creating secure web applications is no easy feat. Vulnerabilities like SQL injection, XSS, or CSRF are still among the most common attack vectors, yet not every developer has the time or the skills to run deep security scans.&lt;/p&gt;

&lt;p&gt;So I decided to solve this problem by building &lt;strong&gt;OpenEye&lt;/strong&gt;: a modern, cloud-hosted, and user-friendly web vulnerability scanner that leverages OWASP ZAP under the hood but wraps it with a clean Django-based interface, making vulnerability scanning accessible to both non-technical users and professionals who just want clear, concise output.&lt;/p&gt;

&lt;p&gt;This blog covers the concept, architecture, implementation, security considerations, and deployment of the project.&lt;/p&gt;

&lt;h2&gt;
  
  
  Concept: Making Security Scanning Accessible
&lt;/h2&gt;

&lt;p&gt;The idea was simple. What if anyone, not just security experts, could run a reliable vulnerability scan against their own website and instantly see a structured report highlighting risks by severity?&lt;/p&gt;

&lt;p&gt;With OpenEye, users log in, enter a target URL, and initiate a scan. The system then spins up a dedicated ZAP container, performs both active and passive scanning, and outputs results grouped by severity levels — critical, high, medium, and low.&lt;/p&gt;
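
&lt;p&gt;That grouping step can be sketched in a few lines of Python. This is an illustrative sketch, not OpenEye's exact code: the &lt;code&gt;risk&lt;/code&gt; field name comes from ZAP's alert JSON, while the "Critical" bucket and the fallback default are assumptions.&lt;/p&gt;

```python
from collections import defaultdict

# Severity buckets as the report displays them. Note: ZAP itself reports
# High/Medium/Low/Informational, so the "Critical" bucket is an assumption here.
SEVERITY_LEVELS = ["Critical", "High", "Medium", "Low"]

def group_alerts_by_severity(alerts):
    """Group raw ZAP alert dicts by their 'risk' field for the report view."""
    grouped = defaultdict(list)
    for alert in alerts:
        # Fall back to "Low" when an alert carries no risk field (assumption)
        grouped[alert.get("risk", "Low")].append(alert)
    return {level: grouped.get(level, []) for level in SEVERITY_LEVELS}
```

&lt;p&gt;Feeding this the alert list returned by ZAP gives the dashboard a dict it can render section by section.&lt;/p&gt;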

&lt;p&gt;Users can also revisit past scans through their personal history panel, ensuring that important findings aren't lost in a sea of logs.&lt;/p&gt;

&lt;p&gt;The frontend is built with Tailwind CSS, JavaScript, and Django templates. It's simple and intuitive; even a third-grader wouldn't get lost.&lt;/p&gt;

&lt;p&gt;The goal was to reduce friction: whether you're a developer or someone without a technical background, you can quickly run a scan and make sense of the findings.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why OWASP ZAP?
&lt;/h2&gt;

&lt;p&gt;Initially, I considered building my own scanning engine, but that would be reinventing the wheel. OWASP ZAP is an industry-standard DAST (Dynamic Application Security Testing) tool, and it already:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Detects SQLi, XSS, CSRF, authentication/session flaws, and misconfigurations&lt;/li&gt;
&lt;li&gt;Provides a JSON API for integration&lt;/li&gt;
&lt;li&gt;Has a robust active and passive scanning mechanism&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By embedding ZAP inside Docker containers, each scan runs in isolation, preventing cross-contamination and ensuring resource efficiency.&lt;/p&gt;
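
&lt;p&gt;One way to picture that per-scan isolation: the backend can build a fresh &lt;code&gt;docker run&lt;/code&gt; command for each scan, with a unique container name and host port. The naming and port-allocation scheme below is illustrative, not OpenEye's exact setup:&lt;/p&gt;

```python
def zap_container_args(scan_id, host_port):
    """Build the argument list for launching an isolated, per-scan ZAP daemon.

    Container name and host port are illustrative assumptions; `--rm` cleans the
    container up when the scan's container is stopped.
    """
    return [
        "docker", "run", "-d", "--rm",
        "--name", "zap-scan-" + str(scan_id),   # one container per scan
        "-p", str(host_port) + ":8080",          # unique host port per scan
        "ghcr.io/zaproxy/zaproxy:stable",
        "zap.sh", "-daemon", "-host", "0.0.0.0", "-port", "8080",
    ]
```

&lt;p&gt;Handing this list to &lt;code&gt;subprocess.run()&lt;/code&gt; starts a throwaway scanner that can't interfere with any other user's scan.&lt;/p&gt;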

&lt;h2&gt;
  
  
  Architecture Overview
&lt;/h2&gt;

&lt;p&gt;Here's how the system fits together:&lt;/p&gt;

&lt;p&gt;The app has a frontend and backend built with Django, PostgreSQL (via Supabase) as the database for storing scan history, authentication with AWS Cognito, and the core functionality: OWASP ZAP running in Docker containers. After testing locally, I hosted the app on an AWS EC2 instance so users could easily access it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fll9k2kuzgw6aeawayx8b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fll9k2kuzgw6aeawayx8b.png" alt="High level view of OpenEye" width="800" height="131"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I've said a lot; deep breaths. Now let's take it slowly and walk through the implementation step by step.&lt;/p&gt;

&lt;h3&gt;
  
  
  Django Project Setup
&lt;/h3&gt;

&lt;p&gt;I'll assume you already know how to set up and run a Django project. If not, no worries at all; a quick Google search will turn up a ton of resources. Using Django isn't compulsory; use any framework that works for you. I chose it because I wanted to learn a little more Django.&lt;/p&gt;

&lt;p&gt;If you'll be working entirely locally, you can just spin up a PostgreSQL database on your machine. Otherwise, use a cloud-hosted database. Supabase was my choice because it's free and easy to use. Create the necessary tables and fields to store your scan information; it's all up to you, and you're free to store whatever you like.&lt;/p&gt;

&lt;h3&gt;
  
  
  OWASP ZAP Integration
&lt;/h3&gt;

&lt;p&gt;One of the best parts about OWASP ZAP is that it exposes a REST API out of the box. That means instead of manually interacting with the ZAP desktop client, you can programmatically control scans from your own application. This was perfect for me because I wanted OpenEye to feel like a standalone platform.&lt;/p&gt;

&lt;p&gt;At a high level, here's what I needed to do:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Trigger a spider scan&lt;/strong&gt; – to crawl the target application and discover URLs/endpoints&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Run an active scan&lt;/strong&gt; – to test those discovered endpoints for vulnerabilities&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fetch alerts&lt;/strong&gt; – to retrieve all the issues ZAP found so I could parse, rank, and display them in OpenEye's dashboard&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;ZAP's REST API makes these tasks surprisingly straightforward. For example, here's a simplified version of the wrapper functions I built:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;start_spider_scan&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;target_url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;_make_request&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;/JSON/spider/action/scan/&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;url&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;target_url&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;scan&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;''&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_alerts&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;target_url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;Dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Any&lt;/span&gt;&lt;span class="p"&gt;]:&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;_make_request&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;/JSON/core/view/alerts/&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;baseurl&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;target_url&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;All I'm really doing here is sending HTTP requests to ZAP's REST API endpoints:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;/JSON/spider/action/scan/&lt;/code&gt; → tells ZAP to start crawling a target&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;/JSON/core/view/alerts/&lt;/code&gt; → retrieves all alerts (vulnerabilities, misconfigurations, etc.) for that target&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The &lt;code&gt;_make_request()&lt;/code&gt; helper I wrote under the hood is just an HTTP client method that talks to ZAP running inside its Docker container. So instead of re-inventing the wheel and writing my own vulnerability scanner from scratch, I leverage ZAP's proven scanning logic but wrap it in my own backend API layer.&lt;/p&gt;
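
&lt;p&gt;For the curious, a minimal sketch of what such a &lt;code&gt;_make_request()&lt;/code&gt; helper might look like. The base URL default and the &lt;code&gt;apikey&lt;/code&gt; parameter are assumptions (ZAP requires an API key unless it's explicitly disabled); this isn't OpenEye's exact code:&lt;/p&gt;

```python
import json
import urllib.request
from urllib.parse import urlencode

class ZAPClient:
    """Thin HTTP client for ZAP's JSON API (sketch, not OpenEye's exact code)."""

    def __init__(self, base_url="http://localhost:8080", api_key=None):
        self.base_url = base_url.rstrip("/")
        self.api_key = api_key

    def _build_url(self, endpoint, params=None):
        query = dict(params or {})
        if self.api_key:
            # ZAP rejects API calls without a key unless key checking is disabled
            query["apikey"] = self.api_key
        return self.base_url + endpoint + ("?" + urlencode(query) if query else "")

    def _make_request(self, endpoint, params=None):
        # GET the endpoint and decode ZAP's JSON response
        with urllib.request.urlopen(self._build_url(endpoint, params), timeout=60) as resp:
            return json.loads(resp.read().decode("utf-8"))
```

&lt;p&gt;The wrapper functions shown earlier then reduce to one-liners like &lt;code&gt;self._make_request("/JSON/spider/action/scan/", {'url': target_url})&lt;/code&gt;.&lt;/p&gt;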

&lt;p&gt;This approach gave me two big advantages:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Abstraction &amp;amp; Control&lt;/strong&gt; – My backend only needs to call simple Python functions like &lt;code&gt;start_spider_scan()&lt;/code&gt; or &lt;code&gt;get_alerts()&lt;/code&gt;. I don't have to expose raw ZAP API calls directly to the frontend.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Custom Processing&lt;/strong&gt; – Once I had the JSON responses, I could parse them, group issues by severity, and feed them into my own database and dashboard instead of relying on ZAP's default reporting.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;So when a user runs a scan in OpenEye, they're really triggering these wrappers in my backend, which in turn communicate with ZAP's REST API inside Docker. Also, by default, ZAP listens on port 8080 inside the container.&lt;/p&gt;

&lt;p&gt;For example, if I start ZAP in Docker like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker run &lt;span class="nt"&gt;-u&lt;/span&gt; zap &lt;span class="nt"&gt;-p&lt;/span&gt; 8080:8080 &lt;span class="nt"&gt;-i&lt;/span&gt; ghcr.io/zaproxy/zaproxy:stable zap.sh &lt;span class="nt"&gt;-daemon&lt;/span&gt; &lt;span class="nt"&gt;-host&lt;/span&gt; 0.0.0.0 &lt;span class="nt"&gt;-port&lt;/span&gt; 8080
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Where:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;-daemon&lt;/code&gt; runs ZAP headlessly (no GUI)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-host 0.0.0.0&lt;/code&gt; makes it listen on all interfaces&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-port 8080&lt;/code&gt; exposes the API at &lt;a href="http://localhost:8080" rel="noopener noreferrer"&gt;http://localhost:8080&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-p 8080:8080&lt;/code&gt; maps the container port to the host, so my backend can reach it&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once ZAP is running, the API is always accessible at endpoints like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;http://localhost:8080/JSON/spider/action/scan/&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;http://localhost:8080/JSON/core/view/alerts/&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;
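
&lt;p&gt;One detail worth knowing: both the spider and the active scan run asynchronously, so the backend has to poll ZAP's status views until they report 100%. A rough sketch, using the &lt;code&gt;_make_request()&lt;/code&gt; helper mentioned earlier (the endpoint and parameter names follow ZAP's JSON API; the polling strategy is my own assumption):&lt;/p&gt;

```python
import time

def wait_for_spider(zap, scan_id, poll_interval=2.0, max_polls=150):
    """Poll ZAP's spider status until the crawl reports 100% complete.

    `zap` is any object exposing the _make_request() wrapper described above.
    Returns False if the scan never finishes within max_polls attempts.
    """
    for _ in range(max_polls):
        status = zap._make_request("/JSON/spider/view/status/", {"scanId": scan_id})
        if status.get("status") == "100":   # ZAP reports progress as a string percentage
            return True
        time.sleep(poll_interval)
    return False  # gave up: treat as a timed-out scan
```

&lt;p&gt;The active scan exposes an analogous &lt;code&gt;/JSON/ascan/view/status/&lt;/code&gt; view, so the same pattern works for both phases.&lt;/p&gt;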

&lt;h2&gt;
  
  
  Authentication with Cognito
&lt;/h2&gt;

&lt;p&gt;Managing authentication securely is one of those things that looks simple on the surface ("just add login/signup") but in reality is full of pitfalls: password storage, token lifetimes, OAuth2 flows, etc. And I had a bit of a headache setting up the auth callback (side-eye to AWS for this).&lt;/p&gt;

&lt;p&gt;Either way, Cognito gave me:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Secure defaults&lt;/strong&gt; — password policies, account recovery, MFA&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OAuth 2.0 compliance&lt;/strong&gt; — standard flows (authorization code, implicit, etc.)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalability&lt;/strong&gt; — I don't need to worry about user pools or scaling login endpoints&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here's what happens when a user logs in to OpenEye:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Redirect to Cognito
&lt;/h3&gt;

&lt;p&gt;When the user clicks "Login," they're redirected to my Cognito hosted login page. Cognito provides a default UI, which saved me time.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Authorization Code Returned
&lt;/h3&gt;

&lt;p&gt;After the user enters their credentials, Cognito redirects them back to my Django app with an authorization code in the URL query string.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Backend Exchanges Code for Tokens
&lt;/h3&gt;

&lt;p&gt;My Django app then takes that code and makes a server-to-server POST request to Cognito's &lt;code&gt;/oauth2/token&lt;/code&gt; endpoint. This returns:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An &lt;strong&gt;ID Token&lt;/strong&gt;: basically a JWT containing user identity claims like email, sub, etc.&lt;/li&gt;
&lt;li&gt;An &lt;strong&gt;Access Token&lt;/strong&gt; and refresh token&lt;/li&gt;
&lt;/ul&gt;
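
&lt;p&gt;The code-for-tokens exchange boils down to assembling one POST request against the Cognito domain's &lt;code&gt;/oauth2/token&lt;/code&gt; endpoint. The domain, client ID/secret, and redirect URI below are placeholders, not my real configuration:&lt;/p&gt;

```python
import base64
from urllib.parse import urlencode

def build_token_request(domain, client_id, client_secret, code, redirect_uri):
    """Assemble the server-to-server POST that swaps an authorization code for tokens."""
    url = "https://" + domain + "/oauth2/token"
    # Cognito expects form-encoded parameters in the body...
    body = urlencode({
        "grant_type": "authorization_code",
        "client_id": client_id,
        "code": code,
        "redirect_uri": redirect_uri,
    })
    # ...and HTTP Basic auth with the app client's credentials
    credentials = base64.b64encode((client_id + ":" + client_secret).encode()).decode()
    headers = {
        "Content-Type": "application/x-www-form-urlencoded",
        "Authorization": "Basic " + credentials,
    }
    return url, headers, body
```

&lt;p&gt;Posting that body with those headers returns the JSON payload containing the ID, access, and refresh tokens.&lt;/p&gt;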

&lt;h3&gt;
  
  
  4. User Session Established
&lt;/h3&gt;

&lt;p&gt;Django decodes the ID token, extracts the user info, and establishes a session. From the app's perspective, the user is now authenticated.&lt;/p&gt;

&lt;p&gt;One thing Cognito enforces, for good reason, is that OAuth2 redirects must happen over HTTPS (except for localhost during development). This meant that when I hosted my app on an EC2 instance, I couldn't just serve HTTP to Cognito; I had to set up SSL.&lt;/p&gt;

&lt;p&gt;First, after setting the app up on an AWS EC2 instance, I created a free subdomain for it using FreeDNS (worth checking out), then ran this command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo &lt;/span&gt;certbot &lt;span class="nt"&gt;--nginx&lt;/span&gt; &lt;span class="nt"&gt;-d&lt;/span&gt; openeye.chickenkiller.com
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Certbot automatically provisioned and installed free SSL certificates. Nginx handled HTTPS termination and forwarded requests to Django running on Gunicorn.&lt;/p&gt;

&lt;p&gt;Then finally, the entire OAuth2 flow worked securely end-to-end...&lt;br&gt;
Relax and keep reading; I'll elaborate more on this below.&lt;/p&gt;

&lt;h2&gt;
  
  
  AWS EC2 Deployment
&lt;/h2&gt;

&lt;p&gt;A note on ordering: this wasn't entirely chronological. I had actually set up the app on the EC2 instance before creating a subdomain and registering it with SSL, with a fair bit of back and forth in between. To make things easier to follow, I covered SSL in the earlier auth section, so please don't get confused.&lt;/p&gt;

&lt;p&gt;Once I had the core pieces (ZAP API, Django backend, Cognito authentication), I needed a place to run everything in the cloud. For this, I chose an AWS EC2 t2.micro instance (Ubuntu 22.04): small, cheap, and well within the AWS Free Tier.&lt;/p&gt;

&lt;p&gt;The very first thing I did after launching the instance was update packages and install the essentials: Python, Docker, Nginx, etc.&lt;/p&gt;

&lt;p&gt;At this point, I had a clean Ubuntu server with the tools needed to run both my app and ZAP.&lt;/p&gt;

&lt;p&gt;Django doesn't serve production traffic directly, so I used &lt;strong&gt;Gunicorn&lt;/strong&gt; as the WSGI HTTP server. Gunicorn is lightweight and designed specifically for running Python web apps in production; it handles the load and spins up more worker processes for my app when necessary.&lt;/p&gt;

&lt;p&gt;Then I put &lt;strong&gt;Nginx&lt;/strong&gt; in front of Gunicorn for two reasons:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Static files&lt;/strong&gt;: Nginx serves static assets (CSS, JS, images) much faster than Gunicorn&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reverse proxy&lt;/strong&gt;: Nginx terminates HTTPS and forwards requests to Gunicorn on port 8000&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;So when clients hit &lt;code&gt;https://openeye.chickenkiller.com&lt;/code&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Nginx terminates SSL, then proxies the request to Gunicorn running Django on port 8000&lt;/li&gt;
&lt;li&gt;Static files are served directly by Nginx for speed&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For the scanning engine, I didn't want to install ZAP directly on the host. Running it in Docker gave me isolation, portability, and easy lifecycle management (start, stop, update). So I just downloaded the image and ran it detached (&lt;code&gt;-d&lt;/code&gt; means it stays up as a background service).&lt;/p&gt;

&lt;p&gt;Now, my Django backend can talk to the ZAP API via &lt;code&gt;http://localhost:8080&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;So that's it! Not as intimidating as you thought, huh? Oh yeah, and why the name OpenEye, you ask? Well, why not? Don't you think it's the perfect name for a vulnerability scanner? OpenEye: it's literally a wide-open eye for finding vulnerabilities (whatever that means). Feel free to replicate this and add your own spice!&lt;/p&gt;

</description>
      <category>vulnerabilityscanner</category>
      <category>owaspzap</category>
      <category>cybersecurity</category>
      <category>aws</category>
    </item>
    <item>
      <title>Building a Secure Serverless Portfolio Generator Using AWS</title>
      <dc:creator>Sharon</dc:creator>
      <pubDate>Fri, 27 Jun 2025 19:34:04 +0000</pubDate>
      <link>https://dev.to/sharon_enam/building-a-secure-serverless-portfolio-generator-using-aws-23i5</link>
      <guid>https://dev.to/sharon_enam/building-a-secure-serverless-portfolio-generator-using-aws-23i5</guid>
      <description>&lt;p&gt;Creating a portfolio as a developer is a flex, we want to showcase our work, but building the actual portfolio site takes time and effort. So I decided to solve that problem in the most cloud-native, cost-efficient, and secure way possible by building a personalized portfolio generator powered entirely by AWS services.&lt;br&gt;
This blog covers the architecture, implementation, security considerations, and deployment of the project.&lt;/p&gt;

&lt;h2&gt;
  
  
  Concept: A Portfolio From Just Your GitHub
&lt;/h2&gt;

&lt;p&gt;The idea was simple:&lt;br&gt;
What if you could instantly generate a sleek portfolio just by logging in with GitHub, no need to manually input anything?&lt;/p&gt;

&lt;p&gt;Users visit the site, click "Generate Portfolio", and authenticate with GitHub. The system then extracts their GitHub data, generates a polished summary using AI, and presents them with a live portfolio site they can share on resumes and CVs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs08fgwd2frlv30epz7kt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs08fgwd2frlv30epz7kt.png" alt="A graphical view of the flow of how the app works" width="800" height="198"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Frontend: React-Based Dynamic Template
&lt;/h2&gt;

&lt;p&gt;I first created a reusable React portfolio template that dynamically populates user information from GitHub. It’s structured, responsive, and designed to look professional with customizable sections like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Bio&lt;/li&gt;
&lt;li&gt;Skills (based on languages used)&lt;/li&gt;
&lt;li&gt;Pinned Projects&lt;/li&gt;
&lt;li&gt;GitHub profile metrics&lt;/li&gt;
&lt;li&gt;Years of experience (based on when the account was created)&lt;/li&gt;
&lt;li&gt;Location&lt;/li&gt;
&lt;li&gt;Phone number&lt;/li&gt;
&lt;li&gt;Number of followers&lt;/li&gt;
&lt;li&gt;Number of public repos&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Why GitHub Authentication Was Crucial
&lt;/h2&gt;

&lt;p&gt;Initially, my prototype allowed portfolio generation for any GitHub username; just type it in. But I quickly realized this could be abused: someone could generate portfolios for others, impersonate developers, or misuse public data at scale.&lt;/p&gt;

&lt;p&gt;To fix this, I implemented OAuth authentication with GitHub. Now users must sign in with GitHub, and I extract their authenticated username automatically. This enforces proper authorization and follows the principle of least privilege.&lt;/p&gt;

&lt;h2&gt;
  
  
  Architecture Overview: AWS Services Used
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Lambda Functions: Core of the Backend Logic
&lt;/h3&gt;

&lt;p&gt;The GitHub data extraction Lambda function is the heart of the portfolio generation workflow. It’s triggered when the frontend sends a POST request to an API Gateway endpoint, with the authenticated GitHub username in the request body.&lt;/p&gt;

&lt;h4&gt;
  
  
  Here’s a breakdown of what it does:
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Receives the GitHub username as payload in a POST request from the frontend after the user authenticates&lt;/li&gt;
&lt;li&gt;Uses PyGitHub to extract user details&lt;/li&gt;
&lt;li&gt;Fetches pinned repos via GitHub GraphQL API&lt;/li&gt;
&lt;li&gt;Computes top 5 languages from public repos&lt;/li&gt;
&lt;li&gt;Generates a summary using AWS Bedrock&lt;/li&gt;
&lt;li&gt;Stores all data in DynamoDB&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Snippet:
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;github&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Github&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Auth&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;boto3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;

&lt;span class="n"&gt;auth&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Auth&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Token&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;environ&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Github_token&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="n"&gt;g&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Github&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;auth&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;auth&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;user&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;g&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_user&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;username&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
 &lt;span class="c1"&gt;#Fetch profile, repos, languages...
&lt;/span&gt; &lt;span class="c1"&gt;#Fetch pinned repos using GraphQL
&lt;/span&gt; &lt;span class="c1"&gt;#Generate summary with Bedrock
&lt;/span&gt; &lt;span class="c1"&gt;#Save to DynamoDB_
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
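
&lt;p&gt;The "top 5 languages" step from the list above boils down to summing byte counts across repos. Here's a small, self-contained sketch: the per-repo dicts mirror what PyGitHub's &lt;code&gt;get_languages()&lt;/code&gt; returns, though the real function fetches them over the API.&lt;/p&gt;

```python
from collections import Counter

def top_languages(repo_language_dicts, n=5):
    """Return the n most-used languages across public repos, weighted by bytes of code."""
    totals = Counter()
    for langs in repo_language_dicts:
        totals.update(langs)  # each dict maps language name to byte count
    return [lang for lang, _ in totals.most_common(n)]
```

&lt;p&gt;In the Lambda itself you would feed it something like &lt;code&gt;[repo.get_languages() for repo in user.get_repos()]&lt;/code&gt;.&lt;/p&gt;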



&lt;p&gt;&lt;strong&gt;Packaging Note&lt;/strong&gt;: When deploying this function, I zipped my Python dependencies (like PyGitHub) with my code and uploaded them as a deployment package. AWS Lambda requires all libraries to be bundled if they’re not part of the standard runtime.&lt;/p&gt;

&lt;p&gt;To trigger the function, I set up an API Gateway POST endpoint that receives the username from the authenticated session. This design ensures that users don’t have to manually enter their username. Instead, the frontend (after successful GitHub OAuth login) automatically sends the correct username to the backend for processing.&lt;/p&gt;
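
&lt;p&gt;On the Lambda side, reading that username out of the API Gateway event looks roughly like this; the &lt;code&gt;username&lt;/code&gt; field name is an assumption about the request body, not necessarily my exact payload shape:&lt;/p&gt;

```python
import json

def extract_username(event):
    """Pull the authenticated GitHub username from an API Gateway proxy event."""
    body = event.get("body") or "{}"
    if isinstance(body, str):
        body = json.loads(body)  # API Gateway delivers the POST body as a JSON string
    return body.get("username")
```

&lt;p&gt;Validating this value server-side (rather than trusting whatever the browser sends) is what keeps the OAuth enforcement meaningful.&lt;/p&gt;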

&lt;h2&gt;
  
  
  DynamoDB Data Retrieval Function
&lt;/h2&gt;

&lt;p&gt;This Lambda function handles the display side of the portfolio generator. Once a user's data is already extracted and stored in DynamoDB, this function retrieves it and serves it to the frontend.&lt;/p&gt;

&lt;h4&gt;
  
  
  How it works:
&lt;/h4&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;It's triggered when the frontend makes a GET request to the API Gateway endpoint.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The frontend appends the GitHub username as a query parameter (e.g., &lt;code&gt;...?username=sharon-dev&lt;/code&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Lambda function reads the username from &lt;code&gt;event['queryStringParameters']&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;It fetches the corresponding user record from the Portfolio_Data DynamoDB table. The data is returned as a JSON response, ready for the frontend to display&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Code snippet
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;lambda_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;username&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;queryStringParameters&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{}).&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;username&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;table&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_item&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Username&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;username&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;statusCode&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;body&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Item&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;default&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Frontend Integration Flow:
&lt;/h2&gt;

&lt;p&gt;On the frontend:&lt;br&gt;
After portfolio generation, a user is redirected to a unique URL like &lt;code&gt;main.d1ljzwcnoo4d.amplifyapp.com/?username=me-dev&lt;/code&gt;&lt;br&gt;
The React app extracts the username from the URL and makes a GET request to the API endpoint:&lt;/p&gt;

&lt;h4&gt;
  
  
  Code snippet
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`https://api-id.execute-api.region.amazonaws.com/prod/user-data?username=&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;username&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
  &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Dynamically inject data into the portfolio template&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The frontend then replaces placeholders in the portfolio template with the fetched JSON including bio, skills, project list, and summary.&lt;/p&gt;

&lt;h2&gt;
  
  
  Additional Implementation Details:
&lt;/h2&gt;

&lt;p&gt;I enabled CORS (Cross-Origin Resource Sharing) on the API Gateway endpoint to ensure the frontend hosted on Amplify could securely interact with this Lambda function.&lt;/p&gt;

&lt;p&gt;This allows public access to view generated portfolios via shareable links without requiring re-authentication.&lt;/p&gt;

&lt;p&gt;This design enables links like &lt;code&gt;main.d1ljzwr7cnoo4d.amplifyapp.com/?username=dev-dev&lt;/code&gt;&lt;br&gt;
to display a fully personalized, server-rendered portfolio anywhere, anytime.&lt;/p&gt;

&lt;h2&gt;
  
  
  Hosting on AWS Amplify: Fast, Scalable, and Developer-Friendly
&lt;/h2&gt;

&lt;p&gt;To complete the stack, I chose AWS Amplify to host the frontend React application, and it turned out to be the perfect fit for a serverless, cost-optimized solution like this one.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Amplify?
&lt;/h2&gt;

&lt;p&gt;Amplify offers several advantages that made it an ideal choice:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Amplify connects directly to my GitHub repository. Every time I push a change to the main branch, it automatically builds and deploys the updated frontend, with no manual steps required. This CI/CD setup ensures fast iteration and continuous delivery.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;It abstracts away all the infrastructure management. There’s no need to configure EC2, NGINX, or S3 buckets manually — it handles everything from provisioning build environments to deploying across CDN edge locations.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Amplify automatically provisions an SSL certificate and serves content over HTTPS, which is essential for OAuth authentication and user trust.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;I was able to configure build-time environment variables directly in the Amplify console for keys like the API base URL — keeping secrets out of the frontend code and simplifying deployment across environments.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;All of this was achieved while staying entirely within the AWS Free Tier, making Amplify a cost-effective option for solo developers and students.&lt;/p&gt;

&lt;p&gt;Beyond building a functional app, I focused heavily on security best practices, clean code, and environment management to make the project production-ready.&lt;/p&gt;

&lt;h2&gt;
  
  
  Environment Variables and Secrets Management
&lt;/h2&gt;

&lt;p&gt;To prevent hardcoding sensitive information like API keys and tokens:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;I stored all sensitive values (e.g., GitHub OAuth token, API endpoints) in a .env file for local development.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On AWS Amplify, I used the Amplify Console’s environment variables settings to securely inject these at build time, keeping secrets out of the codebase and version control.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This approach ensured clean separation between code and configuration, making the project safer and easier to maintain across environments.&lt;/p&gt;
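&lt;p&gt;Inside the Lambda functions, that configuration is read back from the environment at runtime. A minimal sketch, with hypothetical variable names rather than my real keys:&lt;/p&gt;

```python
import os

# Illustrative names: the real values live in .env locally and in the
# Amplify / Lambda console settings in the cloud, never in source control.
API_BASE_URL = os.environ.get("API_BASE_URL", "http://localhost:8000")
GITHUB_TOKEN = os.environ.get("GITHUB_OAUTH_TOKEN")  # None if not configured

def require_env(name):
    """Fail fast with a clear error when a required setting is missing."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value
```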

&lt;h2&gt;
  
  
  Security Best Practices Followed
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Least Privilege OAuth: The GitHub OAuth implementation only requests access to public data; no write or private scopes are used, reducing risk in case of token exposure.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;No Persistent Credentials: User tokens and sensitive metadata are never stored in any database. Only public GitHub profile information and project metadata are saved.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;CORS Headers and API Gateway Hardening: I configured CORS policies carefully to ensure only authorized frontend origins (i.e., the Amplify app) could access the backend APIs.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Validation and Error Handling: Both Lambda functions perform input validation (checking for missing or malformed usernames) and provide structured error responses for better frontend debugging.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Clean Deployment Packages: For AWS Lambda, I ensured all dependencies (like PyGitHub, requests, and boto3) were packaged cleanly and zipped with only the necessary files, minimizing cold start times and reducing package size.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
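&lt;p&gt;Point 4 looks roughly like this in Python; the helper names and the exact username rule are illustrative, not lifted from my code:&lt;/p&gt;

```python
import json
import re

# GitHub usernames: alphanumerics and hyphens, up to 39 characters.
USERNAME_RE = re.compile(r"^[A-Za-z0-9](?:[A-Za-z0-9-]{0,38})?$")

def error(status_code, message):
    """Structured error body so the frontend can surface a clear message."""
    return {"statusCode": status_code,
            "body": json.dumps({"error": message})}

def validate_username(event):
    """Return (username, None) on success or (None, error_response)."""
    params = event.get("queryStringParameters") or {}
    username = (params.get("username") or "").strip()
    if not username:
        return None, error(400, "Missing 'username' query parameter")
    if not USERNAME_RE.match(username):
        return None, error(400, f"Malformed username: {username!r}")
    return username, None
```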

&lt;h2&gt;
  
  
  How I Kept Costs at $0.00 in Development
&lt;/h2&gt;

&lt;p&gt;One of my goals for this project was to explore real-world cloud development without spending a dime, and I’m happy to report that I succeeded. Here's how I kept my entire serverless portfolio generator within the AWS Free Tier.&lt;br&gt;
My strategy: build smart, stay serverless.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Tricks That Helped
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Efficient Lambda Invocations: My Lambda functions are lightweight and short-lived, designed to run only when triggered and exit quickly to avoid compute costs.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Minimal Bedrock Usage: I limited Bedrock inference to one-time portfolio generation per user. By using Titan Nova Lite, I avoided higher-cost models like Claude or Jurassic and stayed well under usage thresholds.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;No Data Egress: Since everything happens within AWS (including Bedrock, DynamoDB, and Lambda), I avoided outbound data transfer charges.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;CI/CD Only When Needed: Amplify only runs a build when I push to GitHub, and I kept the frontend optimized to avoid long build times.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Try It Out
&lt;/h2&gt;

&lt;p&gt;Want to see how it works?&lt;br&gt;
&lt;a href="https://main.d1ljzwr7cnoo4d.amplifyapp.com/" rel="noopener noreferrer"&gt;👉 Generate your own portfolio here&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>beginners</category>
      <category>portfolio</category>
      <category>serverless</category>
    </item>
    <item>
      <title>Buckets? No, S3 buckets</title>
      <dc:creator>Sharon</dc:creator>
      <pubDate>Thu, 02 Jan 2025 19:43:46 +0000</pubDate>
      <link>https://dev.to/sharon_enam/buckets-no-s3-buckets-4aci</link>
      <guid>https://dev.to/sharon_enam/buckets-no-s3-buckets-4aci</guid>
      <description>&lt;p&gt;Today, let’s talk about S3 buckets—you know, those virtual buckets where you store your data. Just because it’s called a “bucket” doesn’t mean it deserves any less attention. In fact, it’s quite the opposite! AWS calls it Simple Storage Service, or S3 for short, but don’t let the name fool you—it’s a powerhouse for storing all kinds of stuff. Whether it’s documents, images, backups, or anything else you can think of, your S3 buckets are versatile.&lt;/p&gt;

&lt;p&gt;But here’s the deal: while they’re great for storage, we need to make sure that the data inside stays secure. A leaky bucket in the real world is messy. A leaky S3 bucket? That’s a disaster. Let’s make sure your buckets stay airtight! &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo6ua53a9gp3tc0dbd2d5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo6ua53a9gp3tc0dbd2d5.png" alt="Description of securing your S3 bucket" width="800" height="576"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Steps to Keep Your S3 Bucket Safe&lt;/strong&gt;&lt;br&gt;
Alright, let’s get down to it—how do we make sure your S3 bucket is as secure as it can be? The good news is AWS gives you all the tools you need; you just have to use them the right way. Here are some steps to keep your data locked up tight:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Turn Off Public Access by Default&lt;br&gt;
Public access is the enemy of a secure S3 bucket. AWS has a feature called Block Public Access—and trust me, it’s your best friend. Unless you’re hosting a website or sharing files intentionally, make sure public access is turned off for your buckets. Leaving it open is like leaving your front door wide open with a “help yourself” sign.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Tighten Permissions with IAM&lt;br&gt;
The key to a secure bucket is knowing exactly who can access it and what they can do with it. Use AWS Identity and Access Management (IAM) to create precise permission rules. Think of it like a VIP list—only the people (or systems) on the list can get in. And always follow the principle of least privilege: give people access to only what they need and nothing more.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Encrypt Your Data&lt;br&gt;
Data should always be protected, whether it’s sitting idle in your bucket or traveling across the internet. AWS gives you a few ways to encrypt your data:&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Server-Side Encryption (SSE): Let AWS handle the encryption for you.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;AWS Key Management Service (KMS): Take control and manage your encryption keys.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Client-Side Encryption: Encrypt your files before they even reach AWS.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This step isn’t optional—encryption is your safety net.&lt;/p&gt;
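&lt;p&gt;With boto3, forcing server-side encryption on an upload is one extra parameter. The helper below only builds the &lt;code&gt;put_object&lt;/code&gt; arguments (bucket and key names are placeholders), so you can see both the SSE-S3 and SSE-KMS variants:&lt;/p&gt;

```python
def sse_put_params(bucket, key, body, kms_key_id=None):
    """Build put_object kwargs that request server-side encryption.

    Without a KMS key, SSE-S3 (AES256) is used; with one, SSE-KMS.
    """
    params = {"Bucket": bucket, "Key": key, "Body": body}
    if kms_key_id:
        params["ServerSideEncryption"] = "aws:kms"
        params["SSEKMSKeyId"] = kms_key_id
    else:
        params["ServerSideEncryption"] = "AES256"
    return params

# Usage (requires boto3 and AWS credentials):
# import boto3
# s3 = boto3.client("s3")
# s3.put_object(**sse_put_params("my-bucket", "report.pdf", b"data"))
```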

&lt;p&gt;4. Log Everything&lt;br&gt;
You can’t fix what you don’t see, so turn on logging and monitoring.&lt;br&gt;
S3 Access Logs let you track who’s accessing your bucket and what they’re doing.&lt;br&gt;
AWS CloudTrail gives you detailed logs of all API activity, so you can catch unauthorized actions.&lt;br&gt;
Logs might not sound glamorous, but they’re invaluable when something goes wrong.&lt;/p&gt;
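&lt;p&gt;Turning on S3 access logs is a single API call. A sketch with boto3 (the bucket names are placeholders; the helper only builds the request payload):&lt;/p&gt;

```python
def access_logging_config(target_bucket, prefix="logs/"):
    """Build the BucketLoggingStatus payload for put_bucket_logging."""
    return {"LoggingEnabled": {"TargetBucket": target_bucket,
                               "TargetPrefix": prefix}}

# Usage (requires boto3 and AWS credentials):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_logging(
#     Bucket="my-bucket",
#     BucketLoggingStatus=access_logging_config("my-log-bucket"))
```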

&lt;p&gt;5. Audit Your Buckets Regularly&lt;br&gt;
Things change—people move roles, permissions get tweaked, and before you know it, your bucket could be vulnerable. Use tools like AWS Trusted Advisor or third-party solutions to regularly audit your buckets. Better safe than sorry, right?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Simple Mistakes, Huge Consequences&lt;/strong&gt;&lt;br&gt;
Let’s not sugarcoat it: S3 misconfigurations can lead to disaster. Here’s why:&lt;/p&gt;

&lt;p&gt;Real Data Leaks: Companies have accidentally exposed sensitive customer data because they left their buckets open to the world. It happens more often than you’d think.&lt;/p&gt;

&lt;p&gt;Hackers Love Misconfigured Buckets: There are automated tools scanning the internet for open S3 buckets 24/7. If your bucket is one of them, it’s game over.&lt;br&gt;
The good news? These mistakes are 100% preventable. A few extra minutes spent setting things up correctly can save you from a lifetime of regret.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Advanced Safety Measures&lt;/strong&gt;&lt;br&gt;
If you want to go above and beyond, here are a few advanced tips:&lt;/p&gt;

&lt;p&gt;Versioning: Keep track of every change to your files. If someone deletes or overwrites something important, you can roll it back.&lt;/p&gt;

&lt;p&gt;Object Lock: Protect critical data with a Write-Once-Read-Many (WORM) configuration, making it tamper-proof.&lt;/p&gt;

&lt;p&gt;Access Points: Manage access to large-scale buckets more efficiently with S3 Access Points.&lt;/p&gt;

&lt;p&gt;Amazon Macie: Use this tool to automatically detect and protect sensitive data stored in your buckets.&lt;/p&gt;
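&lt;p&gt;The first two of these take only a few lines of boto3 apiece. A sketch that just builds the request payloads (the retention values are illustrative, not recommendations):&lt;/p&gt;

```python
def versioning_config(enabled=True):
    """Payload for put_bucket_versioning; S3 suspends rather than disables."""
    return {"Status": "Enabled" if enabled else "Suspended"}

def object_lock_config(mode="COMPLIANCE", days=30):
    """Default WORM retention rule for put_object_lock_configuration."""
    return {
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": mode, "Days": days}},
    }

# Usage (requires boto3 and AWS credentials):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_versioning(Bucket="my-bucket",
#                          VersioningConfiguration=versioning_config())
```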

&lt;p&gt;&lt;strong&gt;Final Thoughts&lt;/strong&gt;&lt;br&gt;
Securing your S3 buckets isn’t rocket science, but it does take some effort. Whether you’re a seasoned pro or new to AWS, following these steps will ensure your buckets are airtight. Remember, a secure bucket isn’t just about protecting data—it’s about protecting your reputation, your business, and the trust of anyone whose information you’re storing.&lt;/p&gt;

&lt;p&gt;So, take the time to lock down your S3 buckets today. Your future self (and your customers) will thank you!&lt;/p&gt;

</description>
      <category>s3</category>
      <category>security</category>
      <category>beginners</category>
      <category>aws</category>
    </item>
    <item>
      <title>I am a wall - Call me a VPC</title>
      <dc:creator>Sharon</dc:creator>
      <pubDate>Tue, 31 Dec 2024 23:25:19 +0000</pubDate>
      <link>https://dev.to/sharon_enam/i-am-a-wall-call-me-a-vpc-3fn6</link>
      <guid>https://dev.to/sharon_enam/i-am-a-wall-call-me-a-vpc-3fn6</guid>
      <description>&lt;p&gt;The world is a good place, right? So why bother with fences or walls? Well, tell that to my neighbor whose home was broken into last night(Imagninary neighbour). Sometimes, just having a barrier does the trick. Something as simple as locking your door or building a fence can deter a lot of attackers.&lt;/p&gt;

&lt;p&gt;AWS takes the same principle and applies it to cloud computing. They call their walls Virtual Private Clouds (VPCs). Fancy name, right? In simple terms, a VPC is a barrier that protects your AWS resources. It makes them exist in their own little world, invisible and inaccessible to outsiders unless you allow it.&lt;/p&gt;

&lt;p&gt;Let’s dive into the technical side of VPCs and how they intersect with security.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fne75a5fc4bvyi7vlhx4p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fne75a5fc4bvyi7vlhx4p.png" alt="A graphical description of VPCs" width="800" height="346"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;What is a VPC?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A Virtual Private Cloud is a logically isolated section of the AWS Cloud where you can launch and run your resources, like EC2 instances or databases, securely. Think of it as your private AWS yard, complete with high walls and gates that only you control.&lt;/p&gt;

&lt;p&gt;The beauty of a VPC is that it lets you design your network the way you want—complete with public and private zones, firewalls, and gateways. It’s all about keeping your resources safe while giving you full control.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How Do VPCs Improve Security?&lt;/strong&gt;&lt;br&gt;
Here’s how VPCs work their magic in security:&lt;/p&gt;

&lt;p&gt;1️⃣ Network Isolation&lt;br&gt;
The first and most obvious benefit is isolation. Your VPC is your space. It’s separate from everyone else in the AWS Cloud. This means no one else can see or touch your resources unless you explicitly let them.&lt;/p&gt;

&lt;p&gt;2️⃣ Private Subnets&lt;br&gt;
VPCs let you divide your network into subnets, and you can decide which ones are private. For example, keep sensitive resources like databases or backend servers in private subnets, away from the public internet.&lt;/p&gt;

&lt;p&gt;3️⃣ Security Groups and NACLs&lt;br&gt;
AWS gives you two layers of protection to control traffic:&lt;br&gt;
Security Groups: These act like virtual firewalls for your instances. You can allow or block traffic based on IP address, port, and protocol.&lt;/p&gt;

&lt;p&gt;Network Access Control Lists (NACLs): These operate at the subnet level, providing an additional layer of traffic filtering.&lt;br&gt;
Think of them as guards stationed at the gates, deciding who gets in and who doesn’t.&lt;/p&gt;

&lt;p&gt;4️⃣ Invisible to the Outside World&lt;br&gt;
A properly configured VPC makes your resources invisible to the outside world. No one can even see they exist unless you allow it through specific settings like public IPs or load balancers.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How Are VPCs Deployed for Security?&lt;/strong&gt;&lt;br&gt;
1. Define Your Network Layout&lt;br&gt;
Start by designing your network. Decide how many subnets you need, and split them into public and private ones. For instance:&lt;/p&gt;

&lt;p&gt;Public Subnet: For resources like web servers that need internet access.&lt;br&gt;
Private Subnet: For databases or application servers that should stay hidden.&lt;/p&gt;

&lt;p&gt;2. Configure Route Tables&lt;br&gt;
Route tables determine how traffic flows in your VPC. Use these to control which subnets can talk to each other and which ones can access the internet.&lt;/p&gt;

&lt;p&gt;3. Add Firewalls&lt;br&gt;
Set up security groups for your resources and NACLs for your subnets. Think of these as rules that say:&lt;br&gt;
“Only allow traffic from this IP on port 80.”&lt;br&gt;
“Block all traffic from this region.”&lt;/p&gt;
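&lt;p&gt;A rule like the first one translates directly into an EC2 ingress permission. A sketch with boto3 (the CIDR range and security group ID below are placeholders; the helper only builds the rule):&lt;/p&gt;

```python
def http_ingress_rule(cidr):
    """One IpPermissions entry: allow TCP port 80 from the given CIDR only."""
    return {
        "IpProtocol": "tcp",
        "FromPort": 80,
        "ToPort": 80,
        "IpRanges": [{"CidrIp": cidr,
                      "Description": "HTTP from trusted range"}],
    }

# Usage (requires boto3 and AWS credentials):
# import boto3
# ec2 = boto3.client("ec2")
# ec2.authorize_security_group_ingress(
#     GroupId="sg-0123456789abcdef0",
#     IpPermissions=[http_ingress_rule("203.0.113.0/24")])
```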

&lt;p&gt;4. Use Gateways and Endpoints&lt;br&gt;
For internet access, you can use an Internet Gateway for public subnets. For private subnets, use NAT Gateways to let them access the internet without being exposed.&lt;/p&gt;

&lt;p&gt;If you need access to AWS services like S3, consider VPC Endpoints. These allow secure communication with AWS services without leaving the VPC.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pro Tips for VPC Security&lt;/strong&gt;&lt;br&gt;
Enable VPC Flow Logs: This is like your CCTV. It records traffic coming in and out of your VPC so you can monitor and troubleshoot.&lt;br&gt;
Use Multi-Account Setups: Isolate workloads by using separate AWS accounts with dedicated VPCs, and manage them with AWS Organizations.&lt;/p&gt;

&lt;p&gt;Apply the Principle of Least Privilege: Only give access where absolutely necessary—whether it’s security group rules or IAM permissions.&lt;/p&gt;

&lt;p&gt;Audit Regularly: Use tools like AWS Config or Trusted Advisor to spot overly permissive configurations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why VPCs Are Essential for Cloud Security&lt;/strong&gt;&lt;br&gt;
Without a VPC, your resources would be defenseless, vulnerable to any passerby. VPCs create a secure, isolated environment that makes unauthorized access nearly impossible—unless you leave the gate open.&lt;/p&gt;

&lt;p&gt;By giving you fine-grained control over traffic, resource visibility, and connectivity, VPCs are like having a smart security system for your AWS environment.&lt;/p&gt;

&lt;p&gt;So, whether you’re launching a single instance or building a massive multi-region architecture, a VPC is the backbone of your security strategy. After all, who wouldn’t want a high-tech wall protecting their cloud resources?&lt;/p&gt;

</description>
      <category>security</category>
      <category>vpc</category>
      <category>aws</category>
      <category>beginners</category>
    </item>
    <item>
      <title>IAM - What’s the Big Deal?</title>
      <dc:creator>Sharon</dc:creator>
      <pubDate>Mon, 30 Dec 2024 19:02:08 +0000</pubDate>
      <link>https://dev.to/sharon_enam/iam-whats-the-big-deal-3jf0</link>
      <guid>https://dev.to/sharon_enam/iam-whats-the-big-deal-3jf0</guid>
      <description>&lt;p&gt;As a security enthusiast one thing for sure is IAM will find you wherever you are. And it’s true—it’s that essential. Think about it: You wouldn’t want someone snooping through your love texts without permission, right? The same goes for your cloud services. IAM ensures that only the right people get access to the right resources, at the right time.&lt;/p&gt;

&lt;p&gt;Enough stories—let’s dive into the technicalities of AWS IAM and how it works.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0j4kqwmmxj4qmqz9kkp9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0j4kqwmmxj4qmqz9kkp9.png" alt="Graphical Description of IAM" width="800" height="440"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is AWS IAM?&lt;/strong&gt;&lt;br&gt;
AWS Identity and Access Management (IAM) is a service that helps you control who can access your AWS resources and what actions they can take. Think of it as the gatekeeper for your AWS account.&lt;/p&gt;

&lt;p&gt;At its core, IAM manages two key components:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Identity:&lt;/strong&gt; Refers to users, groups, and roles that need access.&lt;br&gt;
&lt;strong&gt;Access:&lt;/strong&gt; Determines what actions identities are allowed to perform.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How Does IAM Work?&lt;/strong&gt;&lt;br&gt;
IAM operates based on policies and permissions, which define and enforce access rules. Here’s how it breaks down:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Users and Groups&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Users are individual accounts created for people who need access to AWS.&lt;br&gt;
Groups are collections of users with similar access needs. Instead of assigning permissions one by one, you can apply them to a group, and all users inherit those permissions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Roles&lt;/strong&gt;&lt;br&gt;
Roles are used for temporary access. For example, when an application or service (like EC2) needs permissions to interact with another AWS service (like S3), you assign a role instead of using permanent credentials.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Policies&lt;/strong&gt;&lt;br&gt;
Policies are the backbone of IAM. They’re JSON documents that specify who can access what, under what conditions. AWS has two types of policies:&lt;/p&gt;

&lt;p&gt;AWS Managed Policies: Predefined by AWS for common use cases.&lt;br&gt;
Customer Managed Policies: Custom policies tailored to your specific needs.&lt;/p&gt;
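&lt;p&gt;Here’s what a small customer managed policy might look like, written as a Python dict and serialized to the JSON document IAM stores. The bucket name and Sid are made up; the point is the shape: read-only actions on exactly one resource.&lt;/p&gt;

```python
import json

# Hypothetical customer managed policy: read objects in one bucket,
# nothing else -- writes and deletes stay implicitly denied.
read_only_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowReadOnly",
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": ["arn:aws:s3:::example-bucket",
                     "arn:aws:s3:::example-bucket/*"],
    }],
}

policy_document = json.dumps(read_only_policy)  # what IAM actually stores
```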

&lt;p&gt;&lt;strong&gt;How is IAM Achieved and Deployed?&lt;/strong&gt;&lt;br&gt;
IAM is built into AWS, meaning there’s no separate infrastructure to set up. Here’s a typical process for deploying IAM:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Define Permissions:&lt;/strong&gt;&lt;br&gt;
Start by identifying what level of access each user or group needs. Follow the principle of least privilege, giving only the permissions required for the task.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create Users, Groups, and Roles:&lt;/strong&gt;&lt;br&gt;
Set up users for individual accounts.&lt;br&gt;
Organize users into groups to streamline permission management.&lt;br&gt;
Create roles for applications or services requiring temporary access.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Attach Policies:&lt;/strong&gt;&lt;br&gt;
Use policies to define the allowed actions and resources. For instance, you can allow a group to read S3 buckets but prevent them from deleting files.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Enable Multi-Factor Authentication (MFA):&lt;/strong&gt;&lt;br&gt;
Add an extra layer of security by requiring a one-time passcode for user logins.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Monitor and Audit:&lt;/strong&gt;&lt;br&gt;
Regularly review access permissions and use tools like AWS CloudTrail to track IAM activity and ensure compliance.&lt;/p&gt;

&lt;p&gt;So, the next time you think about cloud security, remember: If you wouldn’t share your love texts with the world, don’t leave your AWS services open to just anyone. Privacy matters everywhere—especially in the cloud.&lt;/p&gt;

</description>
      <category>iam</category>
      <category>security</category>
      <category>beginners</category>
      <category>aws</category>
    </item>
  </channel>
</rss>
