<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Muhammed Ashraf </title>
    <description>The latest articles on DEV Community by Muhammed Ashraf  (@muash10).</description>
    <link>https://dev.to/muash10</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1008152%2F3cfc0b0b-22b7-48db-8c02-bdbe227d8918.jpg</url>
      <title>DEV Community: Muhammed Ashraf </title>
      <link>https://dev.to/muash10</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/muash10"/>
    <language>en</language>
    <item>
      <title>Using AWS App Runner to build &amp; host my website</title>
      <dc:creator>Muhammed Ashraf </dc:creator>
      <pubDate>Thu, 08 Jan 2026 13:28:46 +0000</pubDate>
      <link>https://dev.to/muash10/using-aws-app-runner-for-hosting-my-website-1875</link>
      <guid>https://dev.to/muash10/using-aws-app-runner-for-hosting-my-website-1875</guid>
      <description>&lt;h2&gt;
  
  
  Overview
&lt;/h2&gt;

&lt;p&gt;AWS provides many different ways to deploy your application; whatever you are looking for, you will find a service for it. For example, if you want to deploy a classic application you have EC2, or, for a fully managed service, Elastic Beanstalk. If you are looking to host containers, you have Amazon ECS and EKS, or you can use a fast and simple service like AWS App Runner.&lt;/p&gt;

&lt;p&gt;As per the AWS &lt;a href="https://docs.aws.amazon.com/apprunner/latest/dg/what-is-apprunner.html" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;, AWS App Runner is a simple, fast service that helps you deploy your application from either source code (such as GitHub) or an image repository (such as ECR) into a scalable and cost-effective service.&lt;/p&gt;

&lt;p&gt;Regarding cost, the service is very cost-effective since you only pay for actual traffic: App Runner provisions resources based on your load (a lower number of requests = fewer provisioned resources).&lt;/p&gt;

&lt;p&gt;To explore AWS App Runner, I deployed a website on it together with other services such as S3, DynamoDB, ECR, and Amazon SES for sending emails.&lt;/p&gt;

&lt;h2&gt;
  
  
  Architecture
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsvoyjs6ji25b0p9uq1r9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsvoyjs6ji25b0p9uq1r9.png" alt=" " width="711" height="402"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Technology Stack
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Runtime: Node.js&lt;/li&gt;
&lt;li&gt;Framework: Express.js&lt;/li&gt;
&lt;li&gt;Templating: EJS (Embedded JavaScript)&lt;/li&gt;
&lt;li&gt;Frontend: Tailwind CSS (Styling), Alpine.js (Interactivity/State)&lt;/li&gt;
&lt;li&gt;Database: AWS DynamoDB (NoSQL)&lt;/li&gt;
&lt;li&gt;Storage: AWS S3 (Food Images)&lt;/li&gt;
&lt;li&gt;Email: AWS SES (Order Notifications)&lt;/li&gt;
&lt;li&gt;Hosting: AWS App Runner (Dockerized)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Directory Structure
&lt;/h3&gt;

&lt;p&gt;Core files:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;server.js: Entry Point. Configures Express, middleware and static files. Starts the server.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;lib/aws-client.js: AWS Utility. Centralizes initialization of DynamoDB Client, S3 Client, and SES Client.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Route Handlers:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;shop.js: Customer Facing. Handles public access to the menu, cart operations, and checkout.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;admin.js: Admin Management. Protected routes for managing items, authentication, and stats.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Views:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;index.ejs: Homepage. Renders the Menu (Hot/Frozen sections), Cart Drawer, and Hero section.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;admin.ejs: Dashboard. Main Admin interface for Adding/Editing/Deleting items.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;admin-stats.ejs: Analytics. Visual dashboard using Chart.js to show revenue and order trends.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;login.ejs: Authentication. Admin login form.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Partials:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;header.ejs: Navigation bar, Mobile Menu, Favicon, Libraries import (Tailwind, Alpine).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;footer.ejs: Page footer, Closing tags.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;product-card.ejs: Reusable component for rendering a single regular menu item.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;cart-drawer.ejs: The sliding cart sidebar content&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  AWS Components
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Compute: AWS App Runner&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Role: Fully managed container application service.&lt;br&gt;
Configuration: Autoscaling instances based on request load.&lt;br&gt;
Source: Deploys the Docker image directly from Amazon ECR.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Database: Amazon DynamoDB&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Role: Serverless NoSQL key-value database.&lt;br&gt;
Tables:&lt;br&gt;
MenuTable: Stores food items (itemId, name, price, description, category).&lt;br&gt;
OrdersTable: Stores customer orders (orderId, customerDetails, items, total, status).&lt;/p&gt;
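&lt;p&gt;As a rough sketch (not the actual application code), the records in the two tables can be shaped like this, using the attribute names listed above; the "PENDING" initial status is an assumption:&lt;/p&gt;

```python
# Hypothetical record shapes for MenuTable and OrdersTable,
# using the attribute names described above.

def menu_record(item_id, name, price, description, category):
    # One food item in MenuTable (partition key: itemId)
    return {
        "itemId": item_id,
        "name": name,
        "price": price,
        "description": description,
        "category": category,
    }

def order_record(order_id, customer_details, items, total):
    # One customer order in OrdersTable; "PENDING" is an assumed initial status
    return {
        "orderId": order_id,
        "customerDetails": customer_details,
        "items": items,
        "total": total,
        "status": "PENDING",
    }

# With boto3, a write would look roughly like:
#   boto3.resource("dynamodb").Table("OrdersTable").put_item(Item=order_record(...))
```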

&lt;p&gt;&lt;strong&gt;Storage: Amazon S3&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Role: Object storage for uploaded food images.&lt;br&gt;
Access: Images are uploaded via the Admin Panel. The application generates signed URLs or proxies them for secure display.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Container Registry: Amazon ECR&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Role: Securely stores the Docker container images.&lt;br&gt;
Workflow: docker push commands upload new versions of the app. App Runner detects these changes to update the live site.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Communications: Amazon SES&lt;/strong&gt;&lt;br&gt;
Role: Reliable email delivery service.&lt;/p&gt;

&lt;h2&gt;
  
  
  Snapshots from the website
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2jd691fmjrotyzeptoq9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2jd691fmjrotyzeptoq9.png" alt=" " width="725" height="516"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbk8emcispzbqs7g801qu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbk8emcispzbqs7g801qu.png" alt=" " width="800" height="332"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmpr0iv0hm68xni9iz84o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmpr0iv0hm68xni9iz84o.png" alt=" " width="800" height="319"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In short, AWS App Runner simplifies deployments. You don't have to care about the deployment steps; you can focus on improving your code and monitoring your website.&lt;/p&gt;

</description>
      <category>programming</category>
      <category>aws</category>
      <category>docker</category>
    </item>
    <item>
      <title>Using AWS CloudFront to enhance the performance, Security &amp; Availability of your application</title>
      <dc:creator>Muhammed Ashraf </dc:creator>
      <pubDate>Mon, 13 Oct 2025 11:21:21 +0000</pubDate>
      <link>https://dev.to/aws-builders/using-aws-cloudfront-to-enhance-the-performance-security-availability-of-your-application-3i26</link>
      <guid>https://dev.to/aws-builders/using-aws-cloudfront-to-enhance-the-performance-security-availability-of-your-application-3i26</guid>
      <description>&lt;p&gt;Hosting a website that serves a lot of customers around the world then AWS CloudFront should be considered by you since it distributes your content of your website and store them at the nearest edge location to your clients.&lt;/p&gt;

&lt;p&gt;This significantly improves performance and reduces loading times, which enhances the customer experience.&lt;/p&gt;

&lt;p&gt;In this article I will explain the CloudFront features that can be used to enhance the overall experience of your website.&lt;/p&gt;

&lt;p&gt;We are going to discuss the following features:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Origin Group and Multiple Origins &lt;/li&gt;
&lt;li&gt;CloudFront Functions &lt;/li&gt;
&lt;li&gt;Global Accelerator &lt;/li&gt;
&lt;li&gt;CloudFront Security&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But before we start, we will explain how CloudFront works.&lt;/p&gt;

&lt;h1&gt;
  
  
  Overview
&lt;/h1&gt;

&lt;p&gt;The key architectural components of CloudFront are distributions, edge locations (Points of Presence), Regional Edge Caches, origins &amp;amp; cache behaviors.&lt;/p&gt;

&lt;p&gt;Let's walk through them one by one.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Distribution: this is the primary resource you create; it contains the configuration, including origins, cache behaviors, and security settings.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Edge Locations: these can be considered data centers where content is cached.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Regional Edge Cache: this is a larger caching layer located between the edge locations &amp;amp; the origin; it stores less popular content for longer periods than the smaller edge locations.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Origin: this is the source of your content. It can be S3, an ALB, an NLB, EC2, or an on-premises server; you can find more information under &lt;a href="https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/DownloadDistS3AndCustomOrigins.html" rel="noopener noreferrer"&gt;Origin Types&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Cache Behavior: a set of configuration rules you apply to specific URL patterns, covering, for example, routing, TTLs, or redirection.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Referring to the &lt;a href="https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/HowCloudFrontWorks.html" rel="noopener noreferrer"&gt;AWS official documents&lt;/a&gt;, this is what the architecture looks like:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F22vss04sy6k7shu2b2a9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F22vss04sy6k7shu2b2a9.png" alt=" " width="800" height="487"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;CloudFront has a lot of features that can take you further if you use them correctly. Let's discuss them one by one.&lt;/p&gt;

&lt;h1&gt;
  
  
  Features
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Origin Groups &amp;amp; Multiple Origins
&lt;/h2&gt;

&lt;p&gt;As mentioned earlier, an origin is the source of your data. The Origin Group feature allows you to add multiple origins to the same group (a primary origin &amp;amp; a secondary origin) along with failover criteria you define. This means that if a request sent to your primary origin gets back an error status code, the request is redirected to the secondary origin.&lt;/p&gt;

&lt;p&gt;You can also use multiple origins to serve content based on its type.&lt;/p&gt;

&lt;p&gt;For example, you can route static content to Origin A and dynamic content to Origin B.&lt;/p&gt;

&lt;h2&gt;
  
  
  Global Accelerator
&lt;/h2&gt;

&lt;p&gt;AWS Global Accelerator achieves low latency and high performance by utilizing the AWS global network instead of the public internet.&lt;/p&gt;

&lt;p&gt;It works by providing static IP addresses for your application and routing traffic along the optimal route to a healthy endpoint.&lt;/p&gt;

&lt;h1&gt;
  
  
  Use Cases
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Routing to Multiple Origins
&lt;/h2&gt;

&lt;p&gt;CloudFront Cache Behavior can be used to route traffic based on the path pattern as below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fryyrklavs4u80188rszg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fryyrklavs4u80188rszg.png" alt=" " width="761" height="461"&gt;&lt;/a&gt;&lt;br&gt;
This allows you to serve static &amp;amp; dynamic content from one distribution instead of maintaining a different architecture for each content type.&lt;/p&gt;

&lt;h2&gt;
  
  
  Origin Failover Through Origin Groups
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F312uv87o3l18al5ahtw8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F312uv87o3l18al5ahtw8.png" alt=" " width="671" height="481"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This feature helps you to achieve high availability by forwarding failed requests to another Origin.&lt;/p&gt;

&lt;h2&gt;
  
  
  Restrict Access Through Custom Headers
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fubjrzur9639t2r30wh94.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fubjrzur9639t2r30wh94.png" alt=" " width="791" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By adding a custom header on CloudFront, which is attached to every request forwarded to the origin, we can define a rule on the ALB that denies any request that does not contain the custom header. Any direct access that bypasses CloudFront is therefore blocked by the rules defined on the ALB.&lt;/p&gt;
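&lt;p&gt;As a sketch, the ALB side of this pattern can be expressed as a listener-rule condition that matches the shared-secret header; the header name and secret value here are hypothetical:&lt;/p&gt;

```python
# Hypothetical listener-rule condition for the ALB: only requests that
# carry the shared-secret header (added by CloudFront) will match.

def origin_verify_condition(header_name, secret_value):
    # Shape expected by the elbv2 create_rule call for an http-header condition
    return {
        "Field": "http-header",
        "HttpHeaderConfig": {
            "HttpHeaderName": header_name,
            "Values": [secret_value],
        },
    }

# With boto3, roughly:
#   elbv2 = boto3.client("elbv2")
#   elbv2.create_rule(
#       ListenerArn=listener_arn, Priority=1,
#       Conditions=[origin_verify_condition("X-Origin-Verify", secret)],
#       Actions=[{"Type": "forward", "TargetGroupArn": target_group_arn}],
#   )
# The listener's default action then returns a fixed 403 response, so any
# request that bypasses CloudFront is denied.
```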

&lt;h1&gt;
  
  
  Closing Words
&lt;/h1&gt;

&lt;p&gt;CloudFront is a powerful service if you are looking to distribute your application or website globally; it has many features that will help you achieve high availability and security and reduce latency for the clients reaching your application.&lt;/p&gt;

&lt;h1&gt;
  
  
  References
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://aws.amazon.com/cloudfront/features/" rel="noopener noreferrer"&gt;CloudFront Official Documents&lt;/a&gt;&lt;br&gt;
&lt;a href="https://aws.amazon.com/blogs/architecture/how-unidays-achieved-aws-region-expansion-in-3-weeks/" rel="noopener noreferrer"&gt;ow UNiDAYS achieved AWS Region expansion in 3 weeks&lt;/a&gt;&lt;/p&gt;

</description>
      <category>cloudcomputing</category>
      <category>aws</category>
      <category>security</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Using Amazon Textract to analyze and extract text from Documents Part 1</title>
      <dc:creator>Muhammed Ashraf </dc:creator>
      <pubDate>Sun, 14 Sep 2025 17:37:26 +0000</pubDate>
      <link>https://dev.to/aws-builders/using-amazon-textract-to-extract-text-from-pdfs-part-1-40od</link>
      <guid>https://dev.to/aws-builders/using-amazon-textract-to-extract-text-from-pdfs-part-1-40od</guid>
      <description>&lt;p&gt;Amazon Textract is very powerful machine learning &lt;br&gt;
service that used to analyze do documents and extract either text or handwriting from scanned documents.&lt;/p&gt;

&lt;p&gt;It can be used to build different solutions for different use cases such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Financial Services &lt;/li&gt;
&lt;li&gt;Health Care&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this article we walk through how to build a solution that extracts text from PDFs, analyzes it, and stores it in DynamoDB for further analysis.&lt;/p&gt;

&lt;p&gt;We will use a mix of AWS services to build our solution; below is a breakdown of these services and their roles.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;S3 Buckets: our storage for the raw files &amp;amp; the extracted JSON files&lt;/li&gt;
&lt;li&gt;Lambda: invokes Amazon Textract through the StartDocumentAnalysis &amp;amp; GetDocumentAnalysis APIs and stores the JSON results in S3&lt;/li&gt;
&lt;li&gt;SNS: used for asynchronous communication, invoking the Lambda that gets the results and stores them in the final bucket&lt;/li&gt;
&lt;li&gt;EventBridge: used to trigger the Lambda function once a file is uploaded to S3&lt;/li&gt;
&lt;li&gt;AWS Glue: used as a batch job that iterates over the S3 bucket, converts the files, and ingests them into DynamoDB&lt;/li&gt;
&lt;li&gt;DynamoDB: our storage for the extracted data&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Sequence Flow
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuli6jdus0b1hm7qhmjym.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuli6jdus0b1hm7qhmjym.png" alt=" " width="800" height="298"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Functional Requirements
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;User should be able to upload PDF files to S3 bucket&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Non-Functional Requirements
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Solution must be highly available&lt;/li&gt;
&lt;li&gt;Solution should be reliable&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Block Diagram
&lt;/h2&gt;

&lt;p&gt;Our block diagram shows the components that will be used to build our solution &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsh84otdkr1nh2mssetwn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsh84otdkr1nh2mssetwn.png" alt=" " width="800" height="154"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  High Level Design
&lt;/h2&gt;

&lt;p&gt;The high-level design shows the services used to build our solution, focusing on ingesting, analyzing &amp;amp; storing the results in DynamoDB.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwdciaa9dkb2mh00szf35.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwdciaa9dkb2mh00szf35.png" alt=" " width="800" height="250"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will break down our solution into different aspects:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;High availability:

&lt;ul&gt;
&lt;li&gt;Storage: our storage services, S3 &amp;amp; DynamoDB, offer high availability; you can find the details for each service here: &lt;a href="https://docs.aws.amazon.com/AmazonS3/latest/userguide/disaster-recovery-resiliency.html" rel="noopener noreferrer"&gt;S3&lt;/a&gt; &lt;a href="https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/disaster-recovery-resiliency.html" rel="noopener noreferrer"&gt;DynamoDB&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Lambda Function: &lt;a href="https://docs.aws.amazon.com/lambda/latest/dg/security-resilience.html" rel="noopener noreferrer"&gt;Resilience in Lambda &lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;SNS: &lt;a href="https://docs.aws.amazon.com/sns/latest/dg/sns-resilience.html" rel="noopener noreferrer"&gt;Resilience in SNS&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Amazon Textract: &lt;a href="https://docs.aws.amazon.com/textract/latest/dg/disaster-recovery-resiliency.html" rel="noopener noreferrer"&gt;Resilience in Amazon Textract&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h2&gt;
  
  
  Amazon Textract APIs
&lt;/h2&gt;

&lt;p&gt;We will utilize &lt;a href="https://docs.aws.amazon.com/textract/latest/dg/API_StartDocumentAnalysis.html" rel="noopener noreferrer"&gt;StartDocumentAnalysis API&lt;/a&gt; and &lt;a href="https://docs.aws.amazon.com/textract/latest/dg/API_GetDocumentAnalysis.html" rel="noopener noreferrer"&gt;GetDocumentAnalysis API&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The StartDocumentAnalysis API supports different feature types (Tables, Forms, Queries, Signatures, and Layout); we will use Queries to extract specific data from the statements, such as the card number, client name, new charges, etc.&lt;/p&gt;

&lt;p&gt;The GetDocumentAnalysis API receives the results of StartDocumentAnalysis asynchronously; we will utilize SNS to decouple our Lambda functions.&lt;/p&gt;
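&lt;p&gt;A sketch of the StartDocumentAnalysis request with the Queries feature; the bucket, ARNs, and query wording are illustrative, following the examples above:&lt;/p&gt;

```python
# Hypothetical request parameters for StartDocumentAnalysis using the
# QUERIES feature type; bucket, ARNs, and query wording are illustrative.

def analysis_request(bucket, key, sns_topic_arn, role_arn):
    return {
        "DocumentLocation": {"S3Object": {"Bucket": bucket, "Name": key}},
        "FeatureTypes": ["QUERIES"],
        "QueriesConfig": {"Queries": [
            {"Text": "What is the card number?"},
            {"Text": "What is the client name?"},
            {"Text": "What are the new charges?"},
        ]},
        # Textract publishes the job-completion message to this topic,
        # which is what triggers the second Lambda via SNS.
        "NotificationChannel": {"SNSTopicArn": sns_topic_arn, "RoleArn": role_arn},
    }

# With boto3, roughly:
#   textract = boto3.client("textract")
#   job_id = textract.start_document_analysis(**analysis_request(...))["JobId"]
#   ...later, the second Lambda calls textract.get_document_analysis(JobId=job_id)
```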

&lt;p&gt;Below you can find screenshots of the setup.&lt;/p&gt;

&lt;p&gt;We have two Lambda functions:&lt;/p&gt;

&lt;p&gt;trigger_lambda_put is triggered once a file is uploaded to S3 and calls the Amazon Textract API,&lt;br&gt;
and the other function gets the results and filters out the required parameters.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyov7rfb498dnyws08jbc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyov7rfb498dnyws08jbc.png" alt=" " width="800" height="134"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We also have two S3 buckets for the input and output files.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0a8e3l53qqkihs2ixz6n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0a8e3l53qqkihs2ixz6n.png" alt=" " width="800" height="48"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;SNS&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn4vvt2ic719ydfbk2izf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn4vvt2ic719ydfbk2izf.png" alt=" " width="695" height="159"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The output will be as below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiupfrirj0fbwy74emqvt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiupfrirj0fbwy74emqvt.png" alt=" " width="800" height="201"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The file name is the Textract job ID with a .json extension; this can be modified through the Lambda function code.&lt;/p&gt;

&lt;p&gt;The final output should be as below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhrvwrlk5fonv3rd0ebmr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhrvwrlk5fonv3rd0ebmr.png" alt=" " width="670" height="110"&gt;&lt;/a&gt;&lt;br&gt;
since I have defined the cardholder name as a filter parameter in the Lambda function code.&lt;/p&gt;

&lt;p&gt;In part two, we will discuss the AWS Glue job that processes multiple files and stores them in DynamoDB, and we will cover the cost of each component.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>aws</category>
    </item>
    <item>
      <title>Using AWS Comprehend to analyze customers' feedback</title>
      <dc:creator>Muhammed Ashraf </dc:creator>
      <pubDate>Fri, 15 Aug 2025 16:11:17 +0000</pubDate>
      <link>https://dev.to/aws-builders/using-aws-comprehend-to-analyze-customers-feedback-3og6</link>
      <guid>https://dev.to/aws-builders/using-aws-comprehend-to-analyze-customers-feedback-3og6</guid>
      <description>&lt;p&gt;Analyzing customer feedback for your product is recommended to enhance and fill the gaps for any business want to enhance their products and customer services&lt;/p&gt;

&lt;p&gt;AWS provides several services that can be used to achieve this; one of them is Amazon Comprehend.&lt;/p&gt;

&lt;p&gt;According to the AWS documentation: "Amazon Comprehend uses natural language processing (NLP) to extract insights about the content of documents. It develops insights by recognizing the entities, key phrases, language, sentiments, and other common elements in a document. Use Amazon Comprehend to create new products based on understanding the structure of documents."&lt;/p&gt;

&lt;p&gt;So, in this blog we will discuss how to use Amazon Comprehend to analyze customer reviews.&lt;/p&gt;

&lt;p&gt;We will start with the functional &amp;amp; non-functional requirements, then move to the core components and the setup of the solution.&lt;/p&gt;

&lt;p&gt;We assume that the application is already deployed on an EC2 instance and that users are able to write their reviews; the application is integrated with S3 and generates a text file containing each review.&lt;/p&gt;

&lt;h2&gt;
  
  
  Functional Requirements
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Analyze the text documents generated by the application&lt;/li&gt;
&lt;li&gt;Extract the reviews and categorize them as POSITIVE, NEGATIVE, or MIXED based on the sentiment score&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Non-Functional requirements
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Solution should be reliable &lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  HLD
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffbaz4tstdp8bfpp9gt28.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffbaz4tstdp8bfpp9gt28.png" alt=" " width="741" height="342"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Core Components
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;EC2: It will host the application accessed by users.&lt;/li&gt;
&lt;li&gt;S3: hosts the review files generated by the application, which contain the users' reviews&lt;/li&gt;
&lt;li&gt;Lambda Function: triggered by an S3 event notification, it invokes Amazon Comprehend using the DetectSentiment API and saves the generated results to DynamoDB&lt;/li&gt;
&lt;li&gt;DynamoDB: stores the final result for each review, whether POSITIVE, NEGATIVE, or MIXED&lt;/li&gt;
&lt;/ul&gt;
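&lt;p&gt;The Lambda step above can be sketched as follows; attribute names other than the FeedbackID partition key are assumptions, not the actual code:&lt;/p&gt;

```python
# Hypothetical mapping from a DetectSentiment response onto the
# CustomerFeedbackAnalysis table (partition key: FeedbackID); the
# ReviewText attribute name is an assumption.

def feedback_record(feedback_id, review_text, sentiment_response):
    return {
        "FeedbackID": feedback_id,
        "ReviewText": review_text,
        # DetectSentiment returns POSITIVE, NEGATIVE, NEUTRAL, or MIXED
        "Sentiment": sentiment_response["Sentiment"],
    }

# Inside the Lambda handler, roughly:
#   comprehend = boto3.client("comprehend")
#   resp = comprehend.detect_sentiment(Text=review_text, LanguageCode="en")
#   boto3.resource("dynamodb").Table("CustomerFeedbackAnalysis").put_item(
#       Item=feedback_record(key, review_text, resp)
#   )
```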

&lt;h2&gt;
  
  
  Solution Setup
&lt;/h2&gt;

&lt;p&gt;1- We created an S3 bucket with an uploads directory that will host the generated review files.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmu9z523rg80ql6532e17.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmu9z523rg80ql6532e17.png" alt=" " width="800" height="274"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs3iqliauqjy0zh5c5gdo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs3iqliauqjy0zh5c5gdo.png" alt=" " width="800" height="140"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The bucket was created with the default configuration; for the event notification, the event types to select are (POST, PUT).&lt;/p&gt;

&lt;p&gt;2- We have created a table on DynamoDB with the below configuration:&lt;br&gt;
Table Name: CustomerFeedbackAnalysis&lt;br&gt;
Partition Key: FeedbackID&lt;/p&gt;

&lt;p&gt;We kept the rest of the configuration as default, but you should of course configure it based on your requirements.&lt;br&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftu8aj246f1nnzv77tqo2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftu8aj246f1nnzv77tqo2.png" alt=" " width="800" height="162"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3- We created a Lambda function with the default configuration, except for the timeout (the default is only 3 seconds) and the IAM role, which should have the below permissions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWSLambdaBasicExecutionRole (basic execution role for Lambda)&lt;/li&gt;
&lt;li&gt;AmazonS3ReadOnlyAccess (to read files from S3)&lt;/li&gt;
&lt;li&gt;AmazonDynamoDBFullAccess (Write results to DynamoDB)&lt;/li&gt;
&lt;li&gt;ComprehendReadOnly (invoke Comprehend's DetectSentiment API)&lt;/li&gt;
&lt;/ul&gt;
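&lt;p&gt;To make the flow concrete, here is a rough sketch of what the Lambda handler could look like. The table name matches the one created above; the item attributes and the input truncation are illustrative assumptions.&lt;/p&gt;

```python
import urllib.parse
import uuid


def build_item(key, sentiment):
    """Flatten a DetectSentiment response into a DynamoDB item (illustrative schema)."""
    return {
        "FeedbackID": str(uuid.uuid4()),
        "SourceFile": key,
        "Sentiment": sentiment["Sentiment"],
        "PositiveScore": str(sentiment["SentimentScore"]["Positive"]),
        "NegativeScore": str(sentiment["SentimentScore"]["Negative"]),
    }


def lambda_handler(event, context):
    import boto3  # bundled in the Lambda runtime
    s3 = boto3.client("s3")
    comprehend = boto3.client("comprehend")
    table = boto3.resource("dynamodb").Table("CustomerFeedbackAnalysis")

    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["object"]["key"])

    # Read the uploaded review and truncate to stay within Comprehend's input size limit.
    text = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    result = comprehend.detect_sentiment(Text=text[:5000], LanguageCode="en")
    table.put_item(Item=build_item(key, result))
```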

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi4u1fy85tz8o4hmmulca.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi4u1fy85tz8o4hmmulca.png" alt=" " width="800" height="262"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Sequence Diagram
&lt;/h2&gt;

&lt;p&gt;The flow is as follows:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvcospkqwf554mbfmn62u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvcospkqwf554mbfmn62u.png" alt=" " width="800" height="463"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The results
&lt;/h2&gt;

&lt;p&gt;I generated some random reviews and uploaded them to S3; the final results are stored in the DynamoDB table as shown below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fll29vegfczxxxc18xhi2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fll29vegfczxxxc18xhi2.png" alt=" " width="800" height="269"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://aws.amazon.com/blogs/machine-learning/analyze-content-with-amazon-comprehend-and-amazon-sagemaker-notebooks/" rel="noopener noreferrer"&gt;Analyze content with Amazon Comprehend and Amazon SageMaker notebooks&lt;/a&gt;&lt;br&gt;
&lt;a href="https://docs.aws.amazon.com/comprehend/latest/dg/what-is.html" rel="noopener noreferrer"&gt;Amazon Comprehend&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>genai</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Content Moderation Using AWS Rekognition</title>
      <dc:creator>Muhammed Ashraf </dc:creator>
      <pubDate>Thu, 17 Jul 2025 10:57:02 +0000</pubDate>
      <link>https://dev.to/muash10/content-moderation-using-aws-rekognition-15m1</link>
      <guid>https://dev.to/muash10/content-moderation-using-aws-rekognition-15m1</guid>
      <description>&lt;h2&gt;
  
  
  Overview
&lt;/h2&gt;

&lt;p&gt;Content moderation is very important for organizations, especially in social media, advertising, and education, where uploaded media needs careful analysis.&lt;/p&gt;

&lt;p&gt;The basic definition of content moderation is reviewing media and content to ensure it complies with the standards and guidelines set by the organization.&lt;/p&gt;

&lt;p&gt;Achieving this manually is very difficult, since every piece of content uploaded by a user must be reviewed, and it becomes impossible at the scale of a large user base.&lt;/p&gt;

&lt;p&gt;In this article, we will discuss deploying a content moderation solution on AWS and how to utilize different AWS services to achieve this.&lt;/p&gt;

&lt;p&gt;First, we will break down our core components.&lt;/p&gt;

&lt;h2&gt;
  
  
  Core Components
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;User&lt;/li&gt;
&lt;li&gt;Content&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Then we will have our high-level design. Keep in mind that different architectures can achieve the same goal; you are not restricted to this one.&lt;/p&gt;

&lt;h2&gt;
  
  
  Architecture
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F32vsr2s51et4uywuf1n1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F32vsr2s51et4uywuf1n1.png" alt=" " width="800" height="307"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Core Services
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;S3 Bucket --&amp;gt; Our storage service, used to store the uploaded content.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;S3 has a feature called S3 Event Notifications, which allows S3 to emit events on uploads, deletions, or replication. You can find the list of event types and the destinations that can receive these notifications &lt;a href="https://docs.aws.amazon.com/AmazonS3/latest/userguide/EventNotifications.html" rel="noopener noreferrer"&gt;here&lt;/a&gt;. &lt;/p&gt;
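&lt;p&gt;As a sketch, the notification can also be wired up with boto3. The function ARN is a placeholder, and only object-created events are subscribed here:&lt;/p&gt;

```python
def object_created_config(function_arn):
    """NotificationConfiguration payload: invoke a Lambda on object-created events."""
    return {
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": function_arn,
                "Events": ["s3:ObjectCreated:Put", "s3:ObjectCreated:Post"],
            }
        ]
    }


def enable_upload_notifications(bucket, function_arn):
    import boto3  # assumes AWS credentials and a region are configured
    boto3.client("s3").put_bucket_notification_configuration(
        Bucket=bucket,
        NotificationConfiguration=object_created_config(function_arn),
    )
```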

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Lambda Function --&amp;gt; One function puts a message containing the object details onto SQS whenever an object is uploaded to S3, and another pulls SQS messages and calls the Rekognition API to start the content moderation process.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;SQS --&amp;gt; Decouples the upload and moderation processes from each other to absorb bursts and spikes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Rekognition --&amp;gt; The actual moderation happens here. AWS Rekognition is an AI service for image/video analysis; the results are stored in another bucket.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Sequence Diagram
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fit8n6qd1uqx1g8dqwn04.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fit8n6qd1uqx1g8dqwn04.png" alt=" " width="800" height="256"&gt;&lt;/a&gt;&lt;br&gt;
The sequence below shows how the flow works:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;The user will upload content to an S3 bucket. In the real world, the S3 bucket is the storage for the front-end layer, which can be a mobile or web app. I just used S3 directly for simplicity and explanation.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;An S3 event notification will be triggered once an object is uploaded, and the destination will be a Lambda function.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Lambda function will send a message containing details about the uploaded object to the SQS for decoupling the architecture.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A moderator Lambda function will pull the message from SQS and trigger the Rekognition API (DetectModerationLabels). More info &lt;a href="https://docs.aws.amazon.com/rekognition/latest/dg/moderation.html" rel="noopener noreferrer"&gt;here&lt;/a&gt; related to label categories.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The results will be stored into another S3 bucket for better isolation.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
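&lt;p&gt;Steps 4 and 5 could be sketched roughly as below. The SQS message schema (a JSON body with "bucket" and "key" fields) and the results bucket name are assumptions for illustration:&lt;/p&gt;

```python
import json


def summarize_labels(response, min_confidence=60.0):
    """Keep only moderation labels at or above the confidence threshold."""
    return [
        {"Name": label["Name"], "Confidence": round(label["Confidence"], 2)}
        for label in response.get("ModerationLabels", [])
        if label["Confidence"] >= min_confidence
    ]


def lambda_handler(event, context):
    import boto3  # bundled in the Lambda runtime
    rekognition = boto3.client("rekognition")
    s3 = boto3.client("s3")

    for record in event["Records"]:  # one record per SQS message
        msg = json.loads(record["body"])  # assumed schema: {"bucket": "b", "key": "k"}
        response = rekognition.detect_moderation_labels(
            Image={"S3Object": {"Bucket": msg["bucket"], "Name": msg["key"]}},
            MinConfidence=60,
        )
        labels = summarize_labels(response)
        # Store the verdict in a separate results bucket for isolation (name is hypothetical).
        s3.put_object(
            Bucket="content-moderation-results",
            Key=msg["key"] + ".json",
            Body=json.dumps({"flagged": bool(labels), "labels": labels}),
        )
```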

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Content moderation is used across nearly all social media and advertising organizations, and it has become much easier with the help of AWS's AI services. You don't need AI or machine learning experts to deploy your own content moderator; just work through the AWS documentation.&lt;/p&gt;

&lt;p&gt;Hope this article helps you, and please let me know if you have any comments.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>aws</category>
    </item>
    <item>
      <title>AWS Migration Services Part1</title>
      <dc:creator>Muhammed Ashraf </dc:creator>
      <pubDate>Tue, 17 Jun 2025 12:00:42 +0000</pubDate>
      <link>https://dev.to/muash10/aws-migration-services-part1-ho9</link>
      <guid>https://dev.to/muash10/aws-migration-services-part1-ho9</guid>
      <description>&lt;h2&gt;
  
  
  Overview
&lt;/h2&gt;

&lt;p&gt;In this article we are going to discuss the different services related to discovering and migrating data, either from AWS to AWS or from on-premises to AWS, along with the features and use cases of each one.&lt;/p&gt;

&lt;p&gt;There are plenty of services that can be used to migrate applications or databases, such as the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS Application Migration Services (AWS MGN)&lt;/li&gt;
&lt;li&gt;AWS Database Migration Service (AWS DMS)&lt;/li&gt;
&lt;li&gt;AWS DataSync&lt;/li&gt;
&lt;li&gt;AWS Migration Hub&lt;/li&gt;
&lt;li&gt;AWS Transfer Family &lt;/li&gt;
&lt;li&gt;AWS Application Discovery&lt;/li&gt;
&lt;li&gt;AWS Modernize Mainframe Application&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgav7v3x5uow76ydxwaxt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgav7v3x5uow76ydxwaxt.png" alt="Image description" width="371" height="411"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's start with the application &amp;amp; databases then move to the data migration services:&lt;/p&gt;

&lt;h2&gt;
  
  
  Application &amp;amp; Database Migration Services
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;AWS MGN&lt;/strong&gt;: a service that helps companies lift and shift their physical, virtual, or cloud servers without facing compatibility or performance issues. It simplifies moving large numbers of servers to AWS.&lt;/p&gt;

&lt;p&gt;It eliminates compatibility issues through the following features:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Block-level replication: By replicating at the block level, MGN is largely agnostic to the applications, databases, and file systems running on the server. It is copying the underlying disk data, not interpreting the files.&lt;/li&gt;
&lt;li&gt;Automated OS conversion: When a test or cutover instance is launched, MGN automatically handles the necessary conversions to make the server boot and run natively on AWS infrastructure.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;While performance issues are handled by:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Minimal Source Impact: The replication agent is lightweight and operates in the background with throttled resource consumption. &lt;/li&gt;
&lt;li&gt;Choosing the right instance size: In the MGN console, you can define Launch Settings for each source server. This allows you to choose the appropriate EC2 instance type&lt;/li&gt;
&lt;li&gt;Optimized data transfer: While the data transfer happens over your network, the continuous replication model avoids the need for a massive, single data transfer that could saturate your connection during business hours.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Choose AWS MGN when you are looking for a rehosting migration and need to move existing applications quickly.&lt;/p&gt;

&lt;p&gt;You can find more technical details about how it works in the &lt;a href="https://docs.aws.amazon.com/mgn/latest/ug/General-Questions-FAQ.html" rel="noopener noreferrer"&gt;AWS MGN FAQs&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS DMS&lt;/strong&gt;: a widely used service that supports many source and target databases and is used to migrate databases to or from AWS. &lt;/p&gt;

&lt;p&gt;DMS offers many features, such as: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Support of homogeneous &amp;amp; heterogeneous migrations: DMS supports migrating between the same database engines and between different database engines.&lt;/li&gt;
&lt;li&gt;CDC: DMS can capture changes from the source database and apply them to the target in near real-time, minimizing downtime during the migration.&lt;/li&gt;
&lt;li&gt;Schema Conversion:  DMS works in conjunction with the AWS Schema Conversion Tool to convert the source database schema and code to a format compatible with the target database.&lt;/li&gt;
&lt;li&gt;Serverless Option: A serverless feature automatically provisions and scales the migration resources, simplifying the process further.&lt;/li&gt;
&lt;/ul&gt;
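&lt;p&gt;For a feel of the API, the sketch below starts a full-load-plus-CDC task with boto3. All identifiers and ARNs are placeholders, and the table mapping is a minimal selection rule:&lt;/p&gt;

```python
import json


def table_mapping(schema, table="%"):
    """Minimal DMS table-mapping document selecting all tables in a schema."""
    return json.dumps({
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-schema",
            "object-locator": {"schema-name": schema, "table-name": table},
            "rule-action": "include",
        }]
    })


def start_full_load_with_cdc(task_id, source_arn, target_arn, instance_arn, schema):
    import boto3  # assumes AWS credentials and a region are configured
    return boto3.client("dms").create_replication_task(
        ReplicationTaskIdentifier=task_id,
        SourceEndpointArn=source_arn,
        TargetEndpointArn=target_arn,
        ReplicationInstanceArn=instance_arn,
        MigrationType="full-load-and-cdc",  # initial copy plus ongoing change capture
        TableMappings=table_mapping(schema),
    )
```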

&lt;p&gt;The major use case for this service is migrating databases between AWS and external environments, in either direction.&lt;/p&gt;

&lt;p&gt;For more technical information about how it works and the supported sources and targets, see the &lt;a href="https://aws.amazon.com/dms/faqs/" rel="noopener noreferrer"&gt;AWS DMS FAQs&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS DataSync&lt;/strong&gt;: a service used to transfer and accelerate the movement of large datasets between on-premises storage systems and AWS storage services.&lt;/p&gt;

&lt;p&gt;As defined above, the main focus of this service is moving data between storage systems, and it has many features, such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Accelerated data transfer: a purpose-built protocol transfers data up to 10 times faster than open-source tools.&lt;/li&gt;
&lt;li&gt;Automated and Managed: It handles many of the tasks involved in data transfer, including scripting copy jobs, scheduling and monitoring transfers, validating data integrity, and optimizing network utilization.&lt;/li&gt;
&lt;li&gt;Secure Transfers: Data is encrypted in transit and at rest, and the service integrates with AWS security features like IAM roles and VPC endpoints.&lt;/li&gt;
&lt;li&gt;Broad Storage Support: It supports a wide range of storage systems, including Network File System (NFS), Server Message Block (SMB), Amazon S3, Amazon EFS, and Amazon FSx.&lt;/li&gt;
&lt;/ul&gt;
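&lt;p&gt;A minimal boto3 sketch of creating a transfer task is shown below. The source and destination location ARNs are placeholders that you would create beforehand (for example, an NFS location and an S3 location):&lt;/p&gt;

```python
def transfer_options(verify=True):
    """Task options: verify data integrity after the transfer completes."""
    return {"VerifyMode": "POINT_IN_TIME_CONSISTENT" if verify else "NONE"}


def create_transfer_task(source_location_arn, dest_location_arn, name):
    import boto3  # assumes AWS credentials and a region are configured
    return boto3.client("datasync").create_task(
        SourceLocationArn=source_location_arn,
        DestinationLocationArn=dest_location_arn,
        Name=name,
        Options=transfer_options(),
    )
```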

&lt;p&gt;You can use this service for migrating large datasets to AWS or archiving cold data from on-premises to cost-effective AWS storage like S3 Glacier. It can also be used for data replication to achieve business continuity.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS Migration Hub&lt;/strong&gt;: a central location where you can manage and track the progress of application migrations across multiple AWS accounts and partner solutions. &lt;/p&gt;

&lt;p&gt;It has many features like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Centralized Tracking: Monitor the status of migrations from various tools like AWS Application Migration Service, AWS Database Migration Service, and partner migration tools.&lt;/li&gt;
&lt;li&gt;Application Discovery: Integrates with AWS Application Discovery Service to automatically gather information about your on-premises servers, including specifications, performance data, and network dependencies.&lt;/li&gt;
&lt;li&gt;Strategy Recommendations: Provides recommendations on the best migration and modernization strategy for your applications.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Picking the right service for your needs is essential. We can summarize the use case for each service as follows: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Looking for a lift-and-shift solution? Go with AWS MGN.&lt;/li&gt;
&lt;li&gt;Database migration and schema conversion? Use AWS DMS.&lt;/li&gt;
&lt;li&gt;Replicating and moving data from on-premises to AWS? DataSync is the one.&lt;/li&gt;
&lt;li&gt;Centrally managing and tracking migration progress? Use Migration Hub.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Based on your migration strategy, you can pick the right choice for you.&lt;/p&gt;

&lt;p&gt;We will discuss the remaining services in another article, but for now I will drop a reference below for each service if you want to dig deeper.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/datasync/latest/userguide/what-is-datasync.html" rel="noopener noreferrer"&gt;AWS DataSync&lt;/a&gt;&lt;br&gt;
&lt;a href="https://docs.aws.amazon.com/mgn/latest/ug/what-is-application-migration-service.html" rel="noopener noreferrer"&gt;AWS MGN&lt;/a&gt;&lt;br&gt;
&lt;a href="https://docs.aws.amazon.com/dms/latest/userguide/Welcome.html" rel="noopener noreferrer"&gt;AWS DMS&lt;/a&gt;&lt;br&gt;
&lt;a href="https://docs.aws.amazon.com/migrationhub/latest/ug/whatishub.html" rel="noopener noreferrer"&gt;AWS Migration Hub&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>[Boost]</title>
      <dc:creator>Muhammed Ashraf </dc:creator>
      <pubDate>Sun, 23 Feb 2025 10:27:45 +0000</pubDate>
      <link>https://dev.to/muash10/-27cl</link>
      <guid>https://dev.to/muash10/-27cl</guid>
      <description>&lt;div class="ltag__link"&gt;
  &lt;a href="/muash10" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1008152%2F3cfc0b0b-22b7-48db-8c02-bdbe227d8918.jpg" alt="muash10"&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="https://dev.to/muash10/how-i-built-a-simple-twitter-like-system-on-aws-with-the-help-of-grok-ai-20b3" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;How I built a simple Twitter-Like System on AWS with the help of Grok AI&lt;/h2&gt;
      &lt;h3&gt;Muhammed Ashraf  ・ Feb 22&lt;/h3&gt;
      &lt;div class="ltag__link__taglist"&gt;
        &lt;span class="ltag__link__tag"&gt;#ai&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#aws&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#cloud&lt;/span&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;


</description>
      <category>ai</category>
      <category>aws</category>
      <category>cloud</category>
    </item>
    <item>
      <title>How I built a simple Twitter-Like System on AWS with the help of Grok AI</title>
      <dc:creator>Muhammed Ashraf </dc:creator>
      <pubDate>Sat, 22 Feb 2025 23:34:49 +0000</pubDate>
      <link>https://dev.to/muash10/how-i-built-a-simple-twitter-like-system-on-aws-with-the-help-of-grok-ai-20b3</link>
      <guid>https://dev.to/muash10/how-i-built-a-simple-twitter-like-system-on-aws-with-the-help-of-grok-ai-20b3</guid>
      <description>&lt;p&gt;As the article title states, Grok AI wrote most of the code, as my expertise lies in solution architecture, So I'm writing this article to share my experience in how I used Grok AI to help me to apply my experience to build this system and enhance my hands-on experience.&lt;/p&gt;

&lt;p&gt;I am not an expert coder, but I understand how large systems, such as social media websites, function.&lt;/p&gt;

&lt;p&gt;Building an enterprise system requires experience in system integration and service selection within different architectures. AI can assist, but its effective use requires a strong understanding of how these systems work.&lt;/p&gt;

&lt;p&gt;But don't worry, this article is written by me, not AI 😎😁&lt;/p&gt;

&lt;p&gt;First, you need to list the functional requirements of your system.&lt;/p&gt;

&lt;p&gt;Functional requirements are the core things your system should do. If we think about what functions a system like Twitter should have,&lt;br&gt;
the first thing that comes to mind is that a user should be able to sign up for an account and log in using that account.&lt;/p&gt;

&lt;p&gt;Users should also be able to post and delete tweets, upload photos, love tweets, comment, and retweet.&lt;/p&gt;

&lt;p&gt;I tried to cover some core features to help you understand how we can make this happen; later we may build new features on top of these.&lt;/p&gt;

&lt;p&gt;Below are the functional requirements covered by this system:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Users can sign up and log in&lt;/li&gt;
&lt;li&gt;Users should be able to post tweets&lt;/li&gt;
&lt;li&gt;Users should be able to delete their tweets&lt;/li&gt;
&lt;li&gt;Users should be able to love and comment on tweets&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Non-functional requirements define how the system should behave.&lt;/p&gt;

&lt;p&gt;Examples include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Elasticity &lt;/li&gt;
&lt;li&gt;High availability &lt;/li&gt;
&lt;li&gt;Scalability &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These requirements should enhance the user experience &lt;/p&gt;

&lt;p&gt;The second thing you should do is your capacity estimation. This will help you pick the right resources for your system and avoid spikes and under- or over-utilization.&lt;/p&gt;

&lt;p&gt;We will not cover this here since it's a very simple system. You can search the internet; there are a lot of resources covering this. I will drop some links below 😁 ✌&lt;/p&gt;

&lt;h2&gt;
  
  
  High level Design &amp;amp; Components
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc9afrb0slm97c3z01iqc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc9afrb0slm97c3z01iqc.png" alt="Image description" width="800" height="562"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I picked AWS since this is my area of expertise, and I used common services to build the system.&lt;/p&gt;

&lt;p&gt;Below is a breakdown of the services I used:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;EC2 instance: To host our frontend code and act as a web server.&lt;/li&gt;
&lt;li&gt;API Gateway: Built APIs used for signup, login and authorization, posting tweets, deleting tweets, liking tweets, and commenting. Each function has its own URL and Lambda function.&lt;/li&gt;
&lt;li&gt;Lambda Functions: Contain the logic for the system functionality mentioned above.&lt;/li&gt;
&lt;li&gt;DynamoDB: Contains Users and Tweets tables that store the data.&lt;/li&gt;
&lt;/ul&gt;
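&lt;p&gt;To make the API Gateway / Lambda / DynamoDB part concrete, here is a rough sketch of what a post-tweet handler could look like. The Tweets table name matches the breakdown above, but the item attributes are assumptions for illustration; the actual code is in the GitHub repo linked later in the article.&lt;/p&gt;

```python
import json
import time
import uuid


def build_tweet(user_id, text):
    """Shape a tweet item for the Tweets table (hypothetical schema)."""
    return {
        "tweet_id": str(uuid.uuid4()),
        "user_id": user_id,
        "text": text[:280],  # enforce the classic length limit
        "created_at": int(time.time()),
        "likes": 0,
    }


def lambda_handler(event, context):
    import boto3  # bundled in the Lambda runtime
    table = boto3.resource("dynamodb").Table("Tweets")
    body = json.loads(event["body"])  # API Gateway proxy integration payload
    item = build_tweet(body["user_id"], body["text"])
    table.put_item(Item=item)
    return {"statusCode": 201, "body": json.dumps({"tweet_id": item["tweet_id"]})}
```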

&lt;h2&gt;
  
  
  Sequence Diagrams
&lt;/h2&gt;

&lt;p&gt;Signup Flow&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff57wxzc7k064j2uhdnr4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff57wxzc7k064j2uhdnr4.png" alt="Image description" width="800" height="265"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Login Flow&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmv3160gblqmugomv8yvo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmv3160gblqmugomv8yvo.png" alt="Image description" width="800" height="262"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Post Tweet Flow&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flnpgs93we7kixf86995q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flnpgs93we7kixf86995q.png" alt="Image description" width="800" height="309"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Love Tweet&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffi6wogut01qc5x0n4kpw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffi6wogut01qc5x0n4kpw.png" alt="Image description" width="800" height="270"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Love Tweet, Comment, and Delete Tweet all work the same way: get the tweet_id and perform the action.&lt;/p&gt;

&lt;p&gt;Some screenshots of the UI:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgch4iswolvbq95q6kxzg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgch4iswolvbq95q6kxzg.png" alt="Image description" width="800" height="267"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6qf8t6vwubkbao69xlpt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6qf8t6vwubkbao69xlpt.png" alt="Image description" width="800" height="284"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzkqh828k76tkj3g6c1y6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzkqh828k76tkj3g6c1y6.png" alt="Image description" width="800" height="274"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3o0lingpb660mk6wwmm9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3o0lingpb660mk6wwmm9.png" alt="Image description" width="800" height="336"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Code
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/Muhammedashraf10/twitter-like-app/tree/main" rel="noopener noreferrer"&gt;Twitter-Like-App&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And now for the interesting part: I uploaded the code to GitHub.&lt;br&gt;
Feel free to use it, and remember this is very basic code; further enhancements are coming 😁&lt;/p&gt;

&lt;h2&gt;
  
  
  Final Words
&lt;/h2&gt;

&lt;p&gt;I know best practices are not applied here and many features are missing, such as decoupling the components, a caching layer, and a follower/followee system, and the system as-is cannot handle heavy workloads 😢🤦‍♀️. Still, it should give you a vision of how larger systems work, and you can consider it a start.&lt;/p&gt;

&lt;p&gt;And you can make magic happen if you know how to interact with AI.&lt;/p&gt;

&lt;p&gt;I hope this article helped you understand how to make use of AI tools to build a system. I will keep working on this base version to add enhancements and features, and I may write another article covering them.&lt;/p&gt;

&lt;p&gt;Will be happy to see your comments and suggestions 😃&lt;/p&gt;

&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.geeksforgeeks.org/design-twitter-a-system-design-interview-question/" rel="noopener noreferrer"&gt;Twitter System design&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.youtube.com/watch?v=Nfa-uUHuFHg&amp;amp;t" rel="noopener noreferrer"&gt;Hello Interview Youtube Channel&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>aws</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Parsing &amp; Loading Data from S3 to DynamoDB with Lambda Function</title>
      <dc:creator>Muhammed Ashraf </dc:creator>
      <pubDate>Sun, 05 Jan 2025 14:40:57 +0000</pubDate>
      <link>https://dev.to/muash10/parsing-loading-data-from-s3-to-dynamodb-with-lambda-function-25ak</link>
      <guid>https://dev.to/muash10/parsing-loading-data-from-s3-to-dynamodb-with-lambda-function-25ak</guid>
      <description>&lt;p&gt;Many scenarios require you to work with data formatted as JSON, and you want to extract and process the data then save it into table for future use &lt;/p&gt;

&lt;p&gt;In this article we are going to discuss loading JSON-formatted data from an S3 bucket into a DynamoDB table using a Lambda function.&lt;/p&gt;

&lt;h1&gt;
  
  
  Prerequisites
&lt;/h1&gt;

&lt;ol&gt;
&lt;li&gt;IAM user with permissions to upload objects to S3 &lt;/li&gt;
&lt;li&gt;Lambda Execution role with permissions to S3 &amp;amp; DynamoDB&lt;/li&gt;
&lt;/ol&gt;

&lt;h1&gt;
  
  
  Architecture &amp;amp; Components
&lt;/h1&gt;

&lt;p&gt;The architecture below shows the 3 AWS services we are using:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;S3 bucket&lt;/li&gt;
&lt;li&gt;Lambda Function&lt;/li&gt;
&lt;li&gt;DynamoDB Table&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk52w8470ldmngsxp5wzj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk52w8470ldmngsxp5wzj.png" alt="Image description" width="681" height="451"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A brief description of services below as refreshment:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;S3 Bucket: A scalable, secure, high-performance object storage service; it will be used as our storage for the data.&lt;/li&gt;
&lt;li&gt;Lambda Function: A serverless compute service that lets you run code without worrying about the infrastructure; it is easy to set up and supports many programming languages. We will use it to run our processing logic.&lt;/li&gt;
&lt;li&gt;DynamoDB: A serverless NoSQL database that stores data in tables; we will use it to store the data processed by the Lambda function.&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Flow
&lt;/h1&gt;

&lt;ol&gt;
&lt;li&gt;The user uploads a JSON file to the S3 bucket through the console or CLI, which behind the scenes calls the PutObject API&lt;/li&gt;
&lt;li&gt;Once the object is uploaded successfully, an S3 event is triggered to invoke the Lambda function to load &amp;amp; process the file&lt;/li&gt;
&lt;li&gt;Lambda processes the data and loads it into the DynamoDB table&lt;/li&gt;
&lt;/ol&gt;

&lt;h1&gt;
  
  
  Implementation Steps
&lt;/h1&gt;

&lt;p&gt;We will walk through the steps &amp;amp; configuration for deploying the diagram above.&lt;br&gt;&lt;br&gt;
1- Create a Lambda function with the below configuration&lt;/p&gt;

&lt;p&gt;Author from Scratch&lt;br&gt;
Function Name: ParserDemo&lt;br&gt;
Runtime: Python 3.1x&lt;/p&gt;

&lt;p&gt;Leave the rest as default.&lt;br&gt;
After the Lambda function is created, modify the timeout configuration &amp;amp; execution role as below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fglo230xa6ht4602bovii.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fglo230xa6ht4602bovii.png" alt="Image description" width="800" height="139"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo1a5r8vya07jzwmek64u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo1a5r8vya07jzwmek64u.png" alt="Image description" width="800" height="268"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I wrote this Python code to perform the logic:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json
import boto3

s3_client = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')

def lambda_handler(event, context):



    bucket_name = event['Records'][0]['s3']['bucket']['name'] # Getting the bucket name from the event triggered by S3
    object_key = event['Records'][0]['s3']['object']['key'] # Getting the Key of the item when the data is uploaded to S3
    print(f"Bucket: {bucket_name}, Key: {object_key}")


    response = s3_client.get_object(
    Bucket=bucket_name,
    Key=object_key
)


    # We will convert the streamed data into bytes
    json_data = response['Body'].read()
    string_formatted = json_data.decode('UTF-8') #Converting data into string

    dict_format_data = json.loads(string_formatted) #Converting Data into Dictionary 


    # Inserting Data Into DynamoDB

    table = dynamodb.Table('DemoTable')
    if isinstance(dict_format_data, list): #check if the file contains single record
        for record in dict_format_data:
            table.put_item(Item=record)

    elif isinstance(dict_format_data, dict): # check if the file contains multiple records 
        table.put_item(Item=data)

    else:  
        raise ValueError("Not Supported Format") # Raise error if nothing matched

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
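&lt;p&gt;The list-vs-dict dispatch at the end of the handler can be exercised locally, without AWS, by stubbing the table object. A sketch (&lt;em&gt;FakeTable&lt;/em&gt; is a stand-in of my own, not part of boto3):&lt;/p&gt;

```python
import json

class FakeTable:
    """Minimal stand-in for a boto3 DynamoDB Table, capturing put_item calls."""
    def __init__(self):
        self.items = []
    def put_item(self, Item):
        self.items.append(Item)

def load_records(raw_json, table):
    data = json.loads(raw_json)
    if isinstance(data, list):      # multiple records
        for record in data:
            table.put_item(Item=record)
    elif isinstance(data, dict):    # a single record
        table.put_item(Item=data)
    else:
        raise ValueError("Not Supported Format")

table = FakeTable()
load_records('[{"UserId": "1"}, {"UserId": "2"}]', table)  # a list of records
load_records('{"UserId": "3"}', table)                     # a single record
print(len(table.items))  # 3
```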



&lt;p&gt;2- Create an S3 bucket&lt;/p&gt;

&lt;p&gt;BucketName: &lt;em&gt;use a unique name&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Leave the rest of the configuration as default.&lt;/p&gt;

&lt;p&gt;Add the created S3 bucket as a trigger for the Lambda function as below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuahbv45hdacatnatmq43.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuahbv45hdacatnatmq43.png" alt="Image description" width="800" height="333"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpcezmnrd3ksfh640bzz6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpcezmnrd3ksfh640bzz6.png" alt="Image description" width="778" height="513"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3- Create a DynamoDB table with the below configuration&lt;/p&gt;

&lt;p&gt;Table Name: DemoTable&lt;br&gt;
Partition Key: UserId&lt;br&gt;
Table Settings: Customized&lt;br&gt;
Capacity Mode: Provisioned &lt;/p&gt;

&lt;p&gt;To save costs, configure the provisioned read/write capacity with a low value (1 or 2 units).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx5vzff8u0j2u3tezd62c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx5vzff8u0j2u3tezd62c.png" alt="Image description" width="800" height="382"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk3acqa21sbec40oyrcsz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk3acqa21sbec40oyrcsz.png" alt="Image description" width="800" height="293"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now the setup is ready. You can test it by uploading a file to S3; you will then find items in the DynamoDB table for each record in the uploaded file.&lt;/p&gt;
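&lt;p&gt;To generate a test file, you can write a few records that carry the table's UserId partition key and upload the file through the console, the CLI, or boto3. A sketch (the record fields here are made up for illustration):&lt;/p&gt;

```python
import json

# Hypothetical sample records; each item must carry the table's partition key (UserId)
records = [
    {"UserId": "1", "Name": "Alice"},
    {"UserId": "2", "Name": "Bob"},
]

# Write the test file locally
with open("users.json", "w") as f:
    json.dump(records, f, indent=2)

# Upload it to the bucket (requires AWS credentials; replace YOUR-BUCKET):
#   aws s3 cp users.json s3://YOUR-BUCKET/users.json
```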

&lt;p&gt;CloudWatch Logs for Lambda Function&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftilv8unegl1t7zj7wu7q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftilv8unegl1t7zj7wu7q.png" alt="Image description" width="715" height="750"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;DynamoDB Items&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy1312jmeav4vb1c60d8x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy1312jmeav4vb1c60d8x.png" alt="Image description" width="800" height="287"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I hope you found this interesting. Please let me know if you have any comments.&lt;/p&gt;

&lt;h1&gt;
  
  
  References
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3/client/get_object.html" rel="noopener noreferrer"&gt;S3 API&lt;/a&gt;&lt;br&gt;
&lt;a href="https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb/client/put_item.html" rel="noopener noreferrer"&gt;DynamoDB API&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.udemy.com/course/aws-lambda-and-python-full-course-beginner-to-advanced/?srsltid=AfmBOortdTR_7mzPMIVAqN7FmqNUyqVQOaF5cZJNc2SBgMmWtLRQ6cVP" rel="noopener noreferrer"&gt;boto3 practice for AWS services&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>python</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Using Stable Diffusion in AWS Bedrock to Generate Images</title>
      <dc:creator>Muhammed Ashraf </dc:creator>
      <pubDate>Sun, 03 Nov 2024 14:50:22 +0000</pubDate>
      <link>https://dev.to/muash10/using-stable-diffusion-in-aws-bedrock-to-generate-images-1h70</link>
      <guid>https://dev.to/muash10/using-stable-diffusion-in-aws-bedrock-to-generate-images-1h70</guid>
      <description>&lt;p&gt;No doubt that Generative AI is now being used by nearly everyone in the technology industry, it helps a lot of people daily with their work, life and almost everything, &lt;/p&gt;

&lt;p&gt;Generative AI is type of AI that is used to generate contents such as texts, images and videos based on the data that the models trained on.&lt;/p&gt;

&lt;p&gt;there are a lot of foundations models which they are already trained on data, these models are very powerful and trained on very big datasets, and they are available to use such as: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;ChatGPT&lt;/li&gt;
&lt;li&gt;Amazon Titan&lt;/li&gt;
&lt;li&gt;Stable Diffusion&lt;/li&gt;
&lt;li&gt;Claude &lt;/li&gt;
&lt;li&gt;Llama&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Amazon Bedrock is a managed service that offers these foundation models for use; you can also customize them with your own data and combine them with other AWS services to build your application.&lt;/p&gt;

&lt;p&gt;In this article we will build an API that invokes AWS Bedrock to generate images using the SDXL 1.0 model.&lt;/p&gt;

&lt;p&gt;We will use the below services &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Lambda function, which contains the logic and integrates with AWS Bedrock&lt;/li&gt;
&lt;li&gt;API Gateway, which exposes the Lambda function&lt;/li&gt;
&lt;li&gt;AWS Bedrock with Stable Diffusion Model&lt;/li&gt;
&lt;li&gt;S3 bucket to store the images &lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  High Level Design
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs4orod9a81d2cia534kf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs4orod9a81d2cia534kf.png" alt="Image description" width="800" height="365"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Steps
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;We will first create an IAM role for our Lambda function with the managed policies (AmazonS3FullAccess, AmazonBedrockFullAccess)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Go to IAM --&amp;gt; Select Roles --&amp;gt; Create Role and Select AWS Service and Lambda to be the use case --&amp;gt; Add the previous policies --&amp;gt; Lambda-execution-role-demo will be the name --&amp;gt; &lt;br&gt;
Create&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;We will create a Lambda function with the Python runtime&lt;br&gt;
&lt;strong&gt;Go to Lambda --&amp;gt; Create Function&lt;/strong&gt; &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvvjaommk4opdia0jgdc9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvvjaommk4opdia0jgdc9.png" alt="Image description" width="800" height="281"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Make sure to modify the Lambda timeout since it will be 3 seconds by default&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create an S3 bucket with the default configuration&lt;br&gt;
&lt;strong&gt;Go to S3 Console --&amp;gt; Create Bucket --&amp;gt; Leave the default settings&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F273elrul1jecs37pcsmr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F273elrul1jecs37pcsmr.png" alt="Image description" width="800" height="459"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Make sure to enter a unique bucket name&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;This bucket will store the generated images, and we will generate pre-signed URLs to grant temporary access to them.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Subscribe to the Amazon Bedrock Stable Diffusion model
&lt;strong&gt;Go to Bedrock Console --&amp;gt; Base Models under Foundation Models --&amp;gt; Request model access&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmz9jn36upsq6422g0ciu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmz9jn36upsq6422g0ciu.png" alt="Image description" width="800" height="263"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Search for SDXL; you should be able to see it. Mark it, press Next, then Submit.&lt;/p&gt;

&lt;p&gt;I am already subscribed to this model, so the configuration screen is different &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F56h1r1817a6et0s8w1vf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F56h1r1817a6et0s8w1vf.png" alt="Image description" width="800" height="304"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You will have access after a few minutes; after that, you are ready to go. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Get the model ID, since it will be used later in the Lambda Python code
&lt;strong&gt;Go to Bedrock Console --&amp;gt; On the left select images under playground --&amp;gt; Select SDXL 1.0 and Press on the 3 dots&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F58jrewevrc45f49rj0i5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F58jrewevrc45f49rj0i5.png" alt="Image description" width="800" height="243"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select View API Request and you will get a sample request for interacting with this model, as below: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8tj1rmu8znhfu24lrnvg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8tj1rmu8znhfu24lrnvg.png" alt="Image description" width="800" height="390"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Note down the model ID: stability.stable-diffusion-xl-v1&lt;/p&gt;
&lt;h2&gt;
  
  
  Code Breakdown
&lt;/h2&gt;

&lt;p&gt;First you need to import the below packages&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json # for converting the payload into JSON
import boto3 # to interact with AWS services
import base64 # it will be used to encode the images 
import datetime # will be used to interact with date and time functions
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
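&lt;p&gt;Of these, base64 does the heavy lifting later on: the model returns the image as a Base64 string, which we decode back into raw bytes before storing it. A tiny round trip (the byte string is a stand-in for real image bytes):&lt;/p&gt;

```python
import base64

raw = b"fake-image-bytes"  # stand-in for real image bytes

encoded = base64.b64encode(raw).decode("ascii")  # the shape the model returns
decoded = base64.b64decode(encoded)              # the bytes we store in S3

print(decoded == raw)  # True
```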



&lt;p&gt;To interact with AWS services, you need to define a client for each service.&lt;/p&gt;

&lt;p&gt;Below we interact with Bedrock and S3, so we will have two clients:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;bedrock_client = boto3.client('bedrock-runtime')
s3_client = boto3.client('s3')
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then you will get the input prompt text from the API Gateway:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  input_text = event['prompt']
  print(input_text) # for troubleshooting
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now you will need to configure the parameter for invoking the Stable Diffusion model, You can find the full parameters &lt;a href="https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-diffusion-1-0-text-image.html" rel="noopener noreferrer"&gt;here&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;I used the below&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;payload = {
        "text_prompts": [ #(Required) – An array of text prompts to use for generation. 
            {
                "text": input_text # The prompt that you want to pass to the model.
            }
        ],
        "height": 1024, # (Optional) Height of the image to generate
        "width": 1024, # (Optional) Width of the image to generate
        "cfg_scale": 7, # (Optional) Determines how much the final image portrays the prompt.
        "steps": 50, # (Optional) Generation step determines how many times the image is sampled
        "style_preset": "photographic" # A style preset that guides the image model towards a particular style
    }

# Now we are invoking the model 
    response = bedrock_client.invoke_model( 
        body=json.dumps(payload),
        contentType='application/json', # The MIME type of the input data in the request.
        accept='application/json', #The desired MIME type of the inference body in the response.
        modelId='stability.stable-diffusion-xl-v1' #model ID
    )
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We will now extract the image from the response and use base64 to decode it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    response_byte=json.loads(response['body'].read()) # store the response bytes
    response_base64=response_byte['artifacts'][0]['base64'] #extract the artifacts into base 64
    resposne_image=base64.b64decode(response_base64) # decode the image
    image_name='images-'+ datetime.datetime.today().strftime('%Y-%M-%D-%H-%S') +'.jpg' # store the image name into date time format
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now we are going to put the generated image into S3 bucket and generate pre-signed URL&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; s3_response=s3_client.put_object(
        Bucket='', #replace it with your bucket name
        Body=resposne_image,
        Key=image_name
    )

    generate_url=s3_client.generate_presigned_url('get_object', Params={'Bucket':'&amp;lt;add your bucket here&amp;gt;','Key':image_name}, ExpiresIn=3600)

 return {
        'statusCode': 200,
        'body':generate_url
    }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
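&lt;p&gt;A side note on the timestamped key: strftime format codes are easy to mix up (%M is minutes, while %m is month). A minimal check that a key built with an unambiguous year-month-day-hour-minute-second format has the intended shape:&lt;/p&gt;

```python
import datetime
import re

# Build a timestamped image key; %m is month, %M is minutes
image_name = "images-" + datetime.datetime.today().strftime("%Y-%m-%d-%H-%M-%S") + ".jpg"
print(image_name)  # e.g. images-2024-11-03-14-50-22.jpg

# The key should match a fixed-width date-time pattern
assert re.fullmatch(r"images-\d{4}-\d{2}-\d{2}-\d{2}-\d{2}-\d{2}\.jpg", image_name)
```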



&lt;ul&gt;
&lt;li&gt;Now let's move to API Gateway, where we are going to create a REST API with any relevant name you want &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fls2fvw670h2lwb3ieixl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fls2fvw670h2lwb3ieixl.png" alt="Image description" width="800" height="554"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will create a resource with a GET method &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create Resource --&amp;gt; Create Method with GET&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6bsgpeay8c0n2npidr4o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6bsgpeay8c0n2npidr4o.png" alt="Image description" width="378" height="435"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select your Lambda Function and make sure that the API gateway has permission to invoke the Lambda function&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foqzzn24sgot612ufly3f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foqzzn24sgot612ufly3f.png" alt="Image description" width="800" height="768"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Make sure to modify the method request settings &amp;amp; URL query string parameters to be as below &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F08d6t5ajlgs06yopi07y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F08d6t5ajlgs06yopi07y.png" alt="Image description" width="800" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The value here should be similar to the one in the code &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx2dreo8rz37gsiqejavs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx2dreo8rz37gsiqejavs.png" alt="Image description" width="800" height="212"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Edit the mapping template in the integration request with the below values&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F34dyglno4913rskl98pt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F34dyglno4913rskl98pt.png" alt="Image description" width="800" height="398"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Now you are ready to Deploy your API and test it &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy9ytfmwmqedw6liotlbp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy9ytfmwmqedw6liotlbp.png" alt="Image description" width="662" height="637"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2us1c42wm8i51183oojw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2us1c42wm8i51183oojw.png" alt="Image description" width="800" height="191"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You should now receive a pre-signed URL returned to the API Gateway &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft86hdqsbkpi1z1g6l0kc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft86hdqsbkpi1z1g6l0kc.png" alt="Image description" width="800" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The generated image should be something like this &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnrdnhogvswyngl250kuw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnrdnhogvswyngl250kuw.png" alt="Image description" width="800" height="628"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Hope you liked the demo 😊. Leave a like to keep me motivated ❤&lt;/p&gt;

&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.udemy.com/course/amazon-bedrock-aws-generative-ai-beginner-to-advanced/?couponCode=LETSLEARNNOW" rel="noopener noreferrer"&gt;Rahul Trisal Udemy Course&lt;/a&gt; *&lt;em&gt;Recommended *&lt;/em&gt;&lt;br&gt;
&lt;a href="https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-diffusion-1-0-text-image.html" rel="noopener noreferrer"&gt;SDXL Parameters&lt;/a&gt;&lt;br&gt;
&lt;a href="https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/bedrock-runtime/client/invoke_model.html" rel="noopener noreferrer"&gt;Boto3 Invoke Model&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>aws</category>
      <category>cloud</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Integrating AWS Connect with Amazon Lex to understand voice input Part1</title>
      <dc:creator>Muhammed Ashraf </dc:creator>
      <pubDate>Wed, 16 Oct 2024 09:25:05 +0000</pubDate>
      <link>https://dev.to/muash10/integrating-aws-connect-with-amazon-lex-to-understand-voice-input-part1-2ip7</link>
      <guid>https://dev.to/muash10/integrating-aws-connect-with-amazon-lex-to-understand-voice-input-part1-2ip7</guid>
      <description>&lt;p&gt;In this series of articles, we are going to discuss the below: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How to create a flow on Amazon Connect.&lt;/li&gt;
&lt;li&gt;How to integrate it with Amazon Lex.&lt;/li&gt;
&lt;li&gt;Sending Contact Trace Records to S3 using Kinesis Firehose.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  HLD
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7cubxldiny284rzz6zq8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7cubxldiny284rzz6zq8.png" alt="Image description" width="800" height="622"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Overview
&lt;/h2&gt;

&lt;p&gt;Amazon Connect is a managed AWS service that helps you build your own contact center; it allows you to build flows to handle customers’ requests &amp;amp; issues without taking care of the infrastructure. In this setup we use Amazon Connect to provide a contact flow that handles two categories of customers (VIP &amp;amp; Golden), with a separate queue for each category; the flow handles new internet requests &amp;amp; internet issues.&lt;/p&gt;

&lt;p&gt;Amazon Connect is integrated with Amazon Lex, a service that allows you to create chatbots to handle customer requests and issues; the bot will be used later in the contact flow to recognize customer requests.&lt;/p&gt;

&lt;p&gt;Amazon Lex is also a managed service used to build conversational bots, which can be integrated with other services such as Lambda functions.&lt;/p&gt;

&lt;p&gt;Amazon Lex will be invoked in the contact flow using an Invoke Lex block, and the Lex chatbot is created to recognize two types of requests: new service requests &amp;amp; internet-down problems.&lt;/p&gt;

&lt;h2&gt;
  
  
  Design Components
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmwa71aefqy7c81gb6fzs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmwa71aefqy7c81gb6fzs.png" alt="Image description" width="800" height="676"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Flow
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv5ihhb7gwhurw239kw89.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv5ihhb7gwhurw239kw89.png" alt="Image description" width="800" height="171"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Implementation Steps
&lt;/h2&gt;

&lt;p&gt;Through AWS Console you will have to create an Amazon Connect Instance&lt;/p&gt;

&lt;p&gt;I have created the instance with the below configuration for simplicity &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fre6xvp8m69h5km8a9zhj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fre6xvp8m69h5km8a9zhj.png" alt="Image description" width="800" height="442"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: The access URL should be unique&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Remember the administrator credentials, since they will be used later &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fslup0xqa63g9043e3xtr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fslup0xqa63g9043e3xtr.png" alt="Image description" width="800" height="556"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Set your telephony settings to allow the instance to receive and initiate calls &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgugfbc0umzgmxnki62ob.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgugfbc0umzgmxnki62ob.png" alt="Image description" width="800" height="239"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then we will configure storage for call-related information (recordings &amp;amp; transcripts)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn5ifnleql7lsit56snxf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn5ifnleql7lsit56snxf.png" alt="Image description" width="800" height="695"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then we sign in to our instance through the access URL using the administrator credentials&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbu1wqq0zdqmdcqvg61w2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbu1wqq0zdqmdcqvg61w2.png" alt="Image description" width="692" height="753"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You will see the dashboard below&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi02s69id79hmu5ipc92h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi02s69id79hmu5ipc92h.png" alt="Image description" width="800" height="368"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will create a very simple flow in this article; we may explore more complex flows in the future&lt;/p&gt;

&lt;p&gt;Go to Flows in the left menu, as shown below &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiam3n1z2jx12alkdup8g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiam3n1z2jx12alkdup8g.png" alt="Image description" width="800" height="363"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then create a flow and name it "sample flow"&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fin0r2n93v53fz37d9btw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fin0r2n93v53fz37d9btw.png" alt="Image description" width="800" height="160"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the left we have the call flow blocks; they are the main components of our IVR flow. We will go into more detail about these blocks later.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv18nt5kve6hvq2qw8b2d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv18nt5kve6hvq2qw8b2d.png" alt="Image description" width="800" height="335"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This flow's type is inbound, which means it will receive incoming calls from customers&lt;/p&gt;

&lt;p&gt;We added two blocks related to logging:&lt;/p&gt;

&lt;p&gt;Set Logging Behavior: When logging is enabled, data for each block in your flow is sent to Amazon CloudWatch Logs.&lt;/p&gt;

&lt;p&gt;Set recording and analytics behavior: specifies the recording behavior and configures Contact Lens conversational analytics &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr0dcesiz6sklfbm81ds8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr0dcesiz6sklfbm81ds8.png" alt="Image description" width="800" height="338"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then we will add these two blocks:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff51g1gdeezz46cyr64ah.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff51g1gdeezz46cyr64ah.png" alt="Image description" width="800" height="463"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Play prompt block will be configured to welcome the customer &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fusrib5aanyuefcqz61wo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fusrib5aanyuefcqz61wo.png" alt="Image description" width="597" height="289"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The other block is Check hours of operation; this block checks the call against the working hours defined for your queue&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6egog301sw02ea2th2xs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6egog301sw02ea2th2xs.png" alt="Image description" width="599" height="318"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will then use the below three blocks &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj4ycyslu9024sbu7xlfq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj4ycyslu9024sbu7xlfq.png" alt="Image description" width="800" height="346"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Get customer input block: receives input from the caller through the dial pad; based on the value, the call is routed to the related queue&lt;/p&gt;

&lt;p&gt;Set working queue block: defines the current working queue to which the call will be transferred &lt;/p&gt;

&lt;p&gt;We will go into the details of creating queues later; for now, I selected the already-created Basic Queue &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk0d11jhnfr3wgkej7ir7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk0d11jhnfr3wgkej7ir7.png" alt="Image description" width="612" height="443"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The final block is the actual Transfer to queue, where the customer is connected to an agent. After the customer's request is fulfilled, the call should be terminated.&lt;/p&gt;
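
&lt;p&gt;Under the hood, a published flow is stored as JSON in the Amazon Connect flow language, which is also what the create_contact_flow API expects if you script this step. Below is a minimal sketch with just a welcome prompt and a disconnect; the identifiers and prompt text are my own placeholders (real flows use UUIDs generated by the editor):&lt;/p&gt;

```python
import json

# Minimal flow in the Amazon Connect flow language: play a prompt, then hang up.
sample_flow = {
    "Version": "2019-10-30",
    "StartAction": "welcome-prompt",
    "Actions": [
        {
            "Identifier": "welcome-prompt",
            "Type": "MessageParticipant",      # the Play prompt block
            "Parameters": {"Text": "Welcome, thank you for calling."},
            "Transitions": {"NextAction": "hang-up", "Errors": [], "Conditions": []},
        },
        {
            "Identifier": "hang-up",
            "Type": "DisconnectParticipant",   # terminates the call
            "Parameters": {},
            "Transitions": {},
        },
    ],
}

# The serialized form is what you would pass as the Content argument of
# connect.create_contact_flow(..., Type="CONTACT_FLOW", Content=flow_content)
flow_content = json.dumps(sample_flow)
```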

&lt;p&gt;After adding the terminate block, you are ready to claim a number for your flow, but before moving forward you must save and publish the flow&lt;/p&gt;

&lt;p&gt;For the error branch of each flow block, you should play a prompt such as "we are not able to process your request now" for a better user experience&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F54wx3sf96qx1p0ydtfhd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F54wx3sf96qx1p0ydtfhd.png" alt="Image description" width="361" height="214"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now go to the dashboard, then Phone numbers under Channels, to claim a number&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgooarpdbvnm466syswg1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgooarpdbvnm466syswg1.png" alt="Image description" width="435" height="139"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In brief, the difference between DID and toll-free numbers is that a DID number can be dialed from anywhere in the world, while a toll-free number can be dialed only from the country for which you claim it; more details &lt;a href="https://repost.aws/questions/QUqvpbqcdzSgaWUMGB0q5rwA/basic-difference-between-did-number-or-toll-free-number" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The channel will be Voice, and you can choose any country; just for testing purposes, I selected DID &amp;amp; the UK &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8tu7idbu69iid1vyv1wd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8tu7idbu69iid1vyv1wd.png" alt="Image description" width="427" height="140"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then select the flow to which you are going to link this number&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5fq78adc3r8mtxa6jnbv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5fq78adc3r8mtxa6jnbv.png" alt="Image description" width="287" height="166"&gt;&lt;/a&gt;&lt;/p&gt;
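
&lt;p&gt;If you prefer to script this step, claiming a number and linking it to a flow maps to two boto3 Connect calls. The ARNs, IDs, and the number below are placeholders for your own values, and the calls are commented out as a sketch:&lt;/p&gt;

```python
# Placeholder values; replace with your instance ARN and a number returned
# by search_available_phone_numbers for your country and number type.
claim_params = {
    "TargetArn": "arn:aws:connect:eu-west-2:123456789012:instance/INSTANCE_ID",
    "PhoneNumber": "+44XXXXXXXXXX",
}

# import boto3
# connect = boto3.client("connect")
# claimed = connect.claim_phone_number(**claim_params)
# connect.associate_phone_number_contact_flow(
#     PhoneNumberId=claimed["PhoneNumberId"],
#     InstanceId="INSTANCE_ID",
#     ContactFlowId="FLOW_ID",
# )
```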

&lt;p&gt;Now you are ready to test your flow. I used Skype to dial and test it. In part two, we will go into more detail regarding voice recognition with Amazon Lex &lt;/p&gt;

</description>
      <category>cloud</category>
      <category>ivr</category>
      <category>aws</category>
    </item>
    <item>
      <title>Automate Cleaning of Unused EIP Through Lambda Part1</title>
      <dc:creator>Muhammed Ashraf </dc:creator>
      <pubDate>Wed, 02 Oct 2024 14:45:37 +0000</pubDate>
      <link>https://dev.to/muash10/automate-cleaning-of-unused-eip-through-lambda-part1-2mnb</link>
      <guid>https://dev.to/muash10/automate-cleaning-of-unused-eip-through-lambda-part1-2mnb</guid>
      <description>&lt;h2&gt;
  
  
  Overview
&lt;/h2&gt;

&lt;p&gt;Cost is one of the most important things on AWS that everyone should take care of. It becomes a headache when you have many resources in your account and a lot of charges pile up.&lt;/p&gt;

&lt;p&gt;Often, charges are due to a forgotten running resource, like an EC2 instance, or to unused resources that were never deleted, such as EBS volumes or EIPs.&lt;/p&gt;

&lt;p&gt;EBS volumes and EIPs incur charges once they are provisioned in your account, whether they are used or not.&lt;/p&gt;

&lt;p&gt;Previously, in &lt;a href="https://dev.to/muash10/delete-unused-ebs-volumes-through-lambda-function-1amb"&gt;this article&lt;/a&gt;, we discussed how to delete unattached EBS volumes to save some cost. In this article, I will show you how to delete unused EIPs and how to automate this task to avoid charges for unnecessary EIPs.&lt;/p&gt;

&lt;h2&gt;
  
  
  High Level Design
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdmi9685kctno6894cv8w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdmi9685kctno6894cv8w.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Steps:
&lt;/h2&gt;

&lt;p&gt;We will utilize the EventBridge Scheduler feature to trigger our Lambda function once every day &lt;/p&gt;

&lt;p&gt;1- We will create a Lambda function with the default configuration as below, the Python 3.12 runtime, and an execution role with VPC full access&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4qnpz3mt5d5lew5tdpiv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4qnpz3mt5d5lew5tdpiv.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is the Python code I wrote to release unused EIPs&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json
import boto3

def lambda_handler(event, context):
    ec2_resource = boto3.resource('ec2')

    released_ips = []
    # Iterate over every Elastic IP (VPC address) in this region
    for elastic_ip in ec2_resource.vpc_addresses.all():
        try:
            # An EIP attached to nothing has no instance and no network interface
            if elastic_ip.instance_id is None and elastic_ip.network_interface_id is None:
                released_ips.append(elastic_ip.public_ip)
                print(f"Releasing unused address {elastic_ip.public_ip}")
                elastic_ip.release()
        except Exception as e:
            print(f"Error releasing IP address {elastic_ip.public_ip}: {str(e)}")

    return {
        'body': json.dumps("Released unused addresses: " + str(released_ips))
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
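
&lt;p&gt;If you want to verify what the function would release before letting it delete anything, you can separate the filtering logic from the release call. The helper below is my own sketch: it works on the response shape of the low-level describe_addresses API, where an address dict carries an AssociationId key only while the EIP is attached to something:&lt;/p&gt;

```python
def find_unused_allocation_ids(addresses):
    """Return the AllocationIds of Elastic IPs that have no association."""
    return [
        addr["AllocationId"]
        for addr in addresses
        if "AssociationId" not in addr
    ]

# Example with describe_addresses-shaped data (values are made up):
sample_addresses = [
    {"AllocationId": "eipalloc-aaa", "PublicIp": "3.3.3.1",
     "AssociationId": "eipassoc-111"},                        # attached, keep it
    {"AllocationId": "eipalloc-bbb", "PublicIp": "3.3.3.2"},  # unused
]
unused = find_unused_allocation_ids(sample_addresses)  # → ["eipalloc-bbb"]

# To release for real: boto3.client("ec2").release_address(AllocationId=alloc_id)
```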



&lt;p&gt;2- We will create an EventBridge schedule with the below configuration to trigger our Lambda function&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fky58alihne47glmxly7a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fky58alihne47glmxly7a.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fikthllk2nbsa7keb25ij.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fikthllk2nbsa7keb25ij.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1bxs1e6ad7mks2rs2fym.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1bxs1e6ad7mks2rs2fym.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw7zlh8088je8d6wjckot.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw7zlh8088je8d6wjckot.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx6u9dq91h0mk34mxqgda.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx6u9dq91h0mk34mxqgda.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9bo763xfa852oyz5ncgz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9bo763xfa852oyz5ncgz.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs9jnpa4no754jci9o0o6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs9jnpa4no754jci9o0o6.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
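
&lt;p&gt;The schedule shown in the screenshots can also be sketched with the EventBridge Scheduler API. The ARNs below are placeholders, and the execution role referenced must be allowed to invoke the Lambda function:&lt;/p&gt;

```python
# rate(1 day) matches the once-a-day trigger configured in the console;
# FlexibleTimeWindow OFF means the schedule fires at the exact time.
schedule_params = {
    "Name": "release-unused-eips-daily",
    "ScheduleExpression": "rate(1 day)",
    "FlexibleTimeWindow": {"Mode": "OFF"},
    "Target": {
        "Arn": "arn:aws:lambda:us-east-1:123456789012:function:release-unused-eips",
        "RoleArn": "arn:aws:iam::123456789012:role/scheduler-invoke-lambda",
    },
}

# import boto3
# boto3.client("scheduler").create_schedule(**schedule_params)
```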

&lt;p&gt;After the EventBridge schedule is configured, it will invoke our Lambda function at the configured rate. The Lambda function will list the EIPs, check for unassociated ones, and release them. In the next article, we will configure our Lambda function to send an email with the released EIPs through AWS SNS.&lt;/p&gt;

&lt;h2&gt;
  
  
  Closing Words
&lt;/h2&gt;

&lt;p&gt;Monitoring your AWS cost is crucial to avoid unnecessary charges, so regularly check your resources to see which ones are unused and delete them. It's better to automate this task through a combination of EventBridge schedules and Lambda functions.&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>aws</category>
      <category>lambda</category>
      <category>automation</category>
    </item>
  </channel>
</rss>
