<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Rahul Kumar Sharma</title>
    <description>The latest articles on DEV Community by Rahul Kumar Sharma (@rahulkspace).</description>
    <link>https://dev.to/rahulkspace</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F506371%2Fe2b18ebb-a83d-4423-af38-69067b798724.jpg</url>
      <title>DEV Community: Rahul Kumar Sharma</title>
      <link>https://dev.to/rahulkspace</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/rahulkspace"/>
    <language>en</language>
    <item>
      <title>How To Set Up Multi-Factor Authentication for SSH on Ubuntu?</title>
      <dc:creator>Rahul Kumar Sharma</dc:creator>
      <pubDate>Fri, 15 Aug 2025 16:15:08 +0000</pubDate>
      <link>https://dev.to/rahulkspace/how-to-set-up-multi-factor-authentication-for-ssh-on-ubuntu-1hnb</link>
      <guid>https://dev.to/rahulkspace/how-to-set-up-multi-factor-authentication-for-ssh-on-ubuntu-1hnb</guid>
      <description>&lt;p&gt;Securing SSH access is a top priority for any Linux server, and relying on passwords alone is no longer sufficient against modern attacks like credential stuffing and brute force attempts. Adding multi‑factor authentication (MFA) to SSH hardens access by requiring something known (key or password) plus something possessed (a one‑time code or hardware key). &lt;br&gt;
Ubuntu supports common MFA methods through PAM (Pluggable Authentication Modules), including app‑based TOTP(Time-based One-time) codes (Google Authenticator/Authy/FreeOTP) and FIDO/U2F security keys (YubiKey, Nitrokey).&lt;/p&gt;

&lt;p&gt;Why MFA for SSH?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Reduces risk from password reuse and brute‑force attacks by requiring a second factor even if a credential is compromised.&lt;/li&gt;
&lt;li&gt;Integrates cleanly with existing SSH setups using PAM without replacing SSH keys or existing workflows.&lt;/li&gt;
&lt;li&gt;Ubuntu includes enhanced support for FIDO/U2F, enabling hardware‑backed second factors for stronger, phishing‑resistant authentication.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Prerequisites:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Ubuntu 20.04 server with SSH access and sudo privileges.&lt;/li&gt;
&lt;li&gt;An authenticator app (such as Google Authenticator) on a phone if using TOTP.&lt;/li&gt;
&lt;li&gt;A backup console or out‑of‑band access in case of misconfiguration, to avoid lockout.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fapnqm1ibfs2kwtjas1y2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fapnqm1ibfs2kwtjas1y2.png" alt="Setup Flow" width="800" height="1200"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Steps:&lt;br&gt;
Step 1 — Install the Google Authenticator PAM module&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Update and install the PAM module:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;sudo apt update &amp;amp;&amp;amp; sudo apt install -y libpam-google-authenticator&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Step 2 — Enroll each user with google-authenticator&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;For every account that will use SSH with MFA, run: &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;google-authenticator&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Recommended prompts:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;time‑based tokens -&amp;gt; (y)&lt;/code&gt;&lt;br&gt;
&lt;code&gt;update the ~/.google_authenticator file -&amp;gt; (y)&lt;/code&gt;&lt;br&gt;
&lt;code&gt;disallow reuse -&amp;gt; (y)&lt;/code&gt; &lt;br&gt;
&lt;code&gt;keep default window -&amp;gt; (n)&lt;/code&gt;&lt;br&gt;
&lt;code&gt;enable rate limiting -&amp;gt; (y)&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Scan the displayed QR code in the authenticator app and store the emergency scratch codes securely.&lt;/li&gt;
&lt;/ul&gt;
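&lt;p&gt;The codes themselves are ordinary RFC 6238 TOTP values derived from the shared secret. As an illustration (not part of the setup), here is a minimal Python sketch of what the app computes:&lt;/p&gt;

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Compute an RFC 6238 TOTP code from a base32-encoded shared secret."""
    if for_time is None:
        for_time = time.time()
    key = base64.b32decode(secret_b32.upper())
    counter = int(for_time) // step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] % 16                       # dynamic truncation (RFC 4226)
    chunk = struct.unpack(">I", digest[offset:offset + 4])[0] % (2 ** 31)
    return str(chunk % (10 ** digits)).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at T=59.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59))  # 287082
```

&lt;p&gt;Because both sides compute the same function of the secret and the current 30-second window, the server can verify the code offline with no network calls.&lt;/p&gt;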

&lt;p&gt;Step 3 — Enable PAM integration for SSH&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Edit PAM config:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;sudo vi /etc/pam.d/sshd&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add the Google Authenticator line near the top so it’s evaluated early: &lt;code&gt;auth required pam_google_authenticator.so&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;If using SSH keys with MFA rather than passwords, comment out the Unix password include to avoid password prompts:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;#@include common-auth&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Note: appending &lt;code&gt;nullok&lt;/code&gt; to the &lt;code&gt;pam_google_authenticator.so&lt;/code&gt; line makes the second factor optional for users who haven’t enrolled; remove it later to enforce MFA globally.&lt;/p&gt;
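&lt;p&gt;As a sketch, the edited /etc/pam.d/sshd might include (the nullok suffix shown is the optional enrolment grace form):&lt;/p&gt;

```
# /etc/pam.d/sshd (excerpt)
# Require a verification code; "nullok" skips it for users not yet enrolled.
auth required pam_google_authenticator.so nullok

# Commented out for key + TOTP setups so no password prompt appears:
#@include common-auth
```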

&lt;p&gt;Step 4 — Configure sshd to prompt for the second factor&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Back up and edit SSH server config:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;sudo cp /etc/ssh/sshd_config /etc/ssh/sshd_config.bak &amp;amp;&amp;amp; sudo vi /etc/ssh/sshd_config&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Ensure these are set:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;UsePAM yes&lt;/code&gt;&lt;br&gt;
&lt;code&gt;ChallengeResponseAuthentication yes&lt;/code&gt; (on newer Ubuntu releases this directive is named &lt;code&gt;KbdInteractiveAuthentication&lt;/code&gt;)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If using SSH keys + MFA, explicitly require both:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;AuthenticationMethods publickey,keyboard-interactive&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Save and restart SSH:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;sudo systemctl restart sshd.service&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Step 5 — Test login in a second terminal&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Connect and verify that after key authentication (or in addition to password if configured), SSH prompts for a “Verification code”.&lt;/li&gt;
&lt;li&gt;Use -v for verbose SSH output to confirm publickey then keyboard‑interactive flow. &lt;/li&gt;
&lt;/ul&gt;
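&lt;p&gt;For example (hypothetical host and user):&lt;/p&gt;

```
# Keep the current session open in a separate terminal in case of lockout.
# -v shows the auth flow; look for "publickey" then "keyboard-interactive".
ssh -v ubuntu@203.0.113.10
```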

&lt;p&gt;Best Practices and Recovery:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Keep a break‑glass account or console access before enforcing MFA server‑wide to prevent lockouts during testing.&lt;/li&gt;
&lt;li&gt;Use “nullok” initially if migrating gradually, then remove it to mandate MFA for all users once enrolled.&lt;/li&gt;
&lt;li&gt;Store emergency scratch codes securely in a password manager or vault.&lt;/li&gt;
&lt;li&gt;For key+MFA deployments, use AuthenticationMethods publickey,keyboard-interactive to cryptographically require both factors.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
Adding MFA to SSH on Ubuntu significantly elevates security by combining something you know (a password) or have (an SSH key) with an independent second factor (the TOTP code). The TOTP route via libpam‑google‑authenticator is quick and widely compatible. With PAM and a few sshd settings, Ubuntu delivers a robust MFA setup that meaningfully reduces the risk of unauthorized SSH access.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://rahulkspace.netlify.app/" rel="noopener noreferrer"&gt;Let's Connect!&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ubuntu</category>
      <category>mfa</category>
      <category>security</category>
      <category>linux</category>
    </item>
    <item>
      <title>VPC Security: Building Fortress-Like Network Architecture</title>
      <dc:creator>Rahul Kumar Sharma</dc:creator>
      <pubDate>Sat, 19 Jul 2025 15:22:03 +0000</pubDate>
      <link>https://dev.to/rahulkspace/vpc-security-building-fortress-like-network-architecture-166k</link>
      <guid>https://dev.to/rahulkspace/vpc-security-building-fortress-like-network-architecture-166k</guid>
      <description>&lt;p&gt;In the ever-evolving landscape of cloud security, our Amazon Virtual Private Cloud (VPC) serves as the foundation of our network defence strategy. I've learned that VPC security isn't just about checking boxes—it's about building a digital fortress that actually works under pressure. &lt;/p&gt;

&lt;p&gt;This comprehensive guide will walk you through advanced VPC security configurations that transform your cloud infrastructure into an impenetrable network architecture.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why Your Current VPC Security Probably Isn't Enough&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Most teams make the same mistake: they set up basic security groups, maybe throw in a NACL or two, and call it secure. Then reality hits. A misconfigured application suddenly needs database access. A new microservice requires communication with three other services. Before you know it, security rules become a tangled mess of "temporary" fixes that somehow became permanent.&lt;/p&gt;

&lt;p&gt;A well-architected VPC security strategy operates on multiple layers of defence, each serving a specific purpose in your overall security posture. Unlike traditional on-premises networks, AWS VPCs offer unprecedented granular control over network traffic, but this power comes with the responsibility of proper configuration.&lt;br&gt;
The core principle of VPC security revolves around the concept of "default deny"—nothing should be allowed unless explicitly permitted. This approach ensures that even if one security control fails, multiple layers of protection remain in place to safeguard your resources.&lt;/p&gt;

&lt;h2&gt;
  
  
  Network Segmentation: The Foundation of Fortress Architecture
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Multi-Tier Subnet Strategy
&lt;/h3&gt;

&lt;p&gt;The cornerstone of robust VPC security lies in strategic subnet segmentation. Design your network with clear boundaries between different application tiers and security zones.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Public Subnets&lt;/strong&gt; should house only resources that genuinely need internet access, such as load balancers, NAT gateways, and bastion hosts. Keep these subnets minimal and heavily monitored.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Private Subnets&lt;/strong&gt; contain your application servers, databases, and internal services. These subnets should never have direct internet access, routing outbound traffic through NAT gateways or NAT instances in public subnets.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Isolated Subnets&lt;/strong&gt; provide the highest level of security for sensitive data stores and critical infrastructure components. These subnets have no route to the internet gateway, ensuring complete isolation from external networks.&lt;/p&gt;

&lt;h3&gt;
  
  
  Cross-VPC Communication Patterns
&lt;/h3&gt;

&lt;p&gt;For organizations managing multiple VPCs, implement secure communication patterns using VPC peering or Transit Gateway. Design these connections with specific route tables that limit cross-VPC traffic to only necessary communication paths.&lt;/p&gt;

&lt;p&gt;Consider implementing a hub-and-spoke model where a central security VPC manages shared services like DNS resolution, centralized logging, and security monitoring. This approach provides better visibility and control over inter-VPC traffic flows.&lt;/p&gt;

&lt;h2&gt;
  
  
  Security Groups: Your Application Firewall
&lt;/h2&gt;

&lt;p&gt;Security Groups function as stateful firewalls operating at the instance level. Their stateful nature means that return traffic for allowed inbound connections is automatically permitted, simplifying rule management while maintaining security.&lt;/p&gt;

&lt;h3&gt;
  
  
  Advanced Security Group Patterns
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Layered Security Groups&lt;/strong&gt; involve applying multiple security groups to a single instance, each serving a specific purpose. For example, one security group might handle SSH access for administrators, while another manages application-specific ports. This modular approach enhances maintainability and reduces the risk of overly permissive rules.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Reference-Based Rules&lt;/strong&gt; leverage security groups as sources and destinations rather than IP ranges. This approach creates dynamic relationships that automatically adapt as instances are launched or terminated within referenced security groups. For instance, allow database access only from instances in the "application-tier" security group.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Port Range Minimization&lt;/strong&gt; requires opening only the specific ports your applications need. Avoid common shortcuts like opening ranges 1-65535 or using 0.0.0.0/0 as source unless absolutely necessary for public-facing services.&lt;/p&gt;
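&lt;p&gt;A reference-based rule of this kind can be sketched with boto3 parameter shapes; the security group IDs below are hypothetical placeholders:&lt;/p&gt;

```python
def db_ingress_rule(db_sg_id, app_sg_id):
    """Build kwargs for EC2 authorize_security_group_ingress that admit
    MySQL traffic only from members of the application-tier group."""
    return {
        "GroupId": db_sg_id,
        "IpPermissions": [{
            "IpProtocol": "tcp",
            "FromPort": 3306,
            "ToPort": 3306,
            # A group reference instead of a CIDR: the rule automatically
            # tracks instances as they join or leave the referenced group.
            "UserIdGroupPairs": [{"GroupId": app_sg_id}],
        }],
    }

rule = db_ingress_rule("sg-0db0000000000000aa", "sg-0app000000000000bb")
# boto3.client("ec2").authorize_security_group_ingress(**rule)  # needs AWS creds
```

&lt;p&gt;Note how the rule opens a single port and names a group, not 0.0.0.0/0, keeping it aligned with the minimization guidance above.&lt;/p&gt;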

&lt;h3&gt;
  
  
  Security Group Governance
&lt;/h3&gt;

&lt;p&gt;Implement naming conventions that clearly indicate purpose and ownership. Use tags to categorize security groups by environment, application, and responsible team. Regular audits should identify unused or overly permissive security groups, with automated tools flagging rules that deviate from established baselines.&lt;/p&gt;

&lt;h2&gt;
  
  
  Network Access Control Lists: The Perimeter Defence
&lt;/h2&gt;

&lt;p&gt;NACLs operate at the subnet level, providing stateless filtering that evaluates both inbound and outbound traffic independently. This stateless nature requires explicit rules for both directions of communication, offering more granular control but requiring careful planning.&lt;/p&gt;

&lt;h3&gt;
  
  
  Strategic NACL Implementation
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Default Deny Approach&lt;/strong&gt; starts with NACLs that deny all traffic, then explicitly allows necessary communication. This approach ensures that forgotten or misconfigured rules don't create security gaps.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Ephemeral Port Management&lt;/strong&gt; requires understanding how your applications communicate. Since NACLs are stateless, you must account for return traffic on ephemeral ports (typically 1024-65535 for Linux, 49152-65535 for Windows). Consider using more restrictive ranges based on your operating system and application requirements.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;DDoS and Attack Mitigation&lt;/strong&gt; can be enhanced through NACL rules that block known malicious IP ranges, suspicious traffic patterns, or protocols not used by your applications. While not a complete DDoS solution, NACLs provide an additional layer of filtering.&lt;/p&gt;

&lt;h3&gt;
  
  
  NACL Rule Ordering and Optimization
&lt;/h3&gt;

&lt;p&gt;NACL rules are processed in numerical order, with lower numbers taking precedence. Design your rule numbering system with gaps (100, 200, 300) to allow future rule insertion without renumbering. Place more specific rules before general ones to ensure proper traffic handling.&lt;/p&gt;
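&lt;p&gt;The first-match semantics can be sketched in a few lines of Python (the rule set is illustrative, not an AWS API call):&lt;/p&gt;

```python
def evaluate_nacl(rules, port):
    """NACLs are stateless: rules are checked in ascending rule-number
    order, the first match wins, and unmatched traffic falls through to
    the implicit final '*' rule, which denies."""
    for num in sorted(rules):
        action, ports = rules[num]
        if port in ports:
            return action
    return "DENY"  # the implicit catch-all rule

# Numbering with gaps (100, 200, 300) leaves room to insert rules later
# without renumbering; hypothetical inbound rule set for a public subnet:
inbound = {
    100: ("ALLOW", range(443, 444)),     # HTTPS
    200: ("DENY", range(22, 23)),        # block SSH at the subnet edge
    300: ("ALLOW", range(1024, 65536)),  # ephemeral ports for return traffic
}
```

&lt;p&gt;With this rule set, port 443 is allowed, port 22 is denied by the specific rule 200, and anything unmatched (say, port 80) hits the implicit deny.&lt;/p&gt;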

&lt;h2&gt;
  
  
  Advanced Network Isolation Strategies
&lt;/h2&gt;

&lt;h3&gt;
  
  
  VPC Endpoints for Service Communication
&lt;/h3&gt;

&lt;p&gt;VPC Endpoints enable private communication with AWS services without traversing the public internet. Gateway endpoints for S3 and DynamoDB provide secure, cost-effective access to these services. Interface endpoints, powered by AWS PrivateLink, support numerous AWS services and third-party applications.&lt;/p&gt;

&lt;p&gt;Implement endpoint policies to control which resources can access specific services through the endpoint. These policies add another layer of access control beyond IAM permissions, ensuring that network-level restrictions complement identity-based controls.&lt;/p&gt;
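&lt;p&gt;As a sketch, a restrictive Gateway-endpoint policy for S3 might be built like this (the bucket name is a hypothetical placeholder):&lt;/p&gt;

```python
def s3_endpoint_policy(bucket):
    """An endpoint policy allowing only object reads/writes to one bucket
    through the endpoint, regardless of the caller's IAM permissions."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": ["arn:aws:s3:::" + bucket + "/*"],
        }],
    }

policy = s3_endpoint_policy("my-app-data")  # hypothetical bucket name
```

&lt;p&gt;Attached to the endpoint, a policy like this means even a principal with broad S3 permissions cannot reach other buckets through that network path.&lt;/p&gt;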

&lt;h3&gt;
  
  
  DNS Security and Resolution
&lt;/h3&gt;

&lt;p&gt;Configure VPC DNS settings to prevent DNS-based data exfiltration and ensure proper name resolution. Enable DNS resolution and DNS hostnames for your VPC to support proper functionality of AWS services and applications.&lt;/p&gt;

&lt;p&gt;Consider implementing Route 53 Resolver for hybrid environments, allowing secure DNS resolution between on-premises networks and your VPCs. Create resolver rules that direct specific domains to appropriate DNS servers while maintaining security controls.&lt;/p&gt;

&lt;h3&gt;
  
  
  Flow Logs for Network Visibility
&lt;/h3&gt;

&lt;p&gt;VPC Flow Logs capture network traffic metadata, providing crucial visibility into communication patterns and potential security issues. Configure flow logs at the VPC, subnet, and network interface levels to capture comprehensive traffic information.&lt;/p&gt;

&lt;p&gt;Analyze flow logs to identify unusual traffic patterns, unauthorized communication attempts, and optimization opportunities. Integrate flow log data with security information and event management (SIEM) systems for real-time monitoring and alerting.&lt;/p&gt;
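&lt;p&gt;A quick way to get familiar with the default (version 2) record layout is to parse one line yourself; the sample record below is fabricated for illustration:&lt;/p&gt;

```python
# Field names of the default VPC Flow Log format (version 2).
FIELDS = ["version", "account_id", "interface_id", "srcaddr", "dstaddr",
          "srcport", "dstport", "protocol", "packets", "bytes",
          "start", "end", "action", "log_status"]

def parse_flow_record(line):
    """Split one space-separated flow-log line into a field dict."""
    return dict(zip(FIELDS, line.split()))

# Fabricated sample: a rejected inbound SSH attempt (protocol 6 = TCP).
sample = ("2 123456789012 eni-0abc1234 198.51.100.7 10.0.1.25 "
          "54321 22 6 4 240 1620000000 1620000060 REJECT OK")
rec = parse_flow_record(sample)
```

&lt;p&gt;Filtering for action REJECT is a simple first pass when hunting blocked or suspicious traffic, before graduating to Athena queries or a SIEM.&lt;/p&gt;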

&lt;h2&gt;
  
  
  Monitoring and Threat Detection
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Network-Level Security Monitoring
&lt;/h3&gt;

&lt;p&gt;Deploy AWS GuardDuty to leverage machine learning for threat detection across your VPC infrastructure. GuardDuty analyzes VPC Flow Logs, DNS logs, and CloudTrail events to identify malicious activity, compromised instances, and reconnaissance attempts.&lt;/p&gt;

&lt;p&gt;Implement custom CloudWatch metrics and alarms for network-related security events. Monitor metrics such as rejected connections, unusual traffic volumes, and connections from unexpected geographic locations.&lt;/p&gt;

&lt;h3&gt;
  
  
  Automated Response and Remediation
&lt;/h3&gt;

&lt;p&gt;Create automated response workflows using AWS Lambda and CloudWatch Events to respond to security incidents. For example, automatically isolate compromised instances by modifying their security group rules or launching replacement instances in clean subnets.&lt;/p&gt;

&lt;p&gt;Develop runbooks for common network security scenarios, including DDoS response, compromise containment, and emergency access procedures. Test these procedures regularly to ensure effectiveness during actual incidents.&lt;/p&gt;

&lt;h2&gt;
  
  
  Implementation Best Practices and Common Pitfalls
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Planning and Design Phase
&lt;/h3&gt;

&lt;p&gt;Begin with a comprehensive network design that accounts for current needs and future growth. Map out traffic flows between application tiers, external dependencies, and administrative access requirements. This upfront planning prevents security gaps and performance issues later.&lt;/p&gt;

&lt;p&gt;Document your security group and NACL strategy, including naming conventions, rule approval processes, and regular review schedules. Maintain an inventory of network resources with clear ownership and purpose documentation.&lt;/p&gt;

&lt;h3&gt;
  
  
  Common Configuration Mistakes
&lt;/h3&gt;

&lt;p&gt;Avoid overly permissive rules that grant broader access than necessary. Regularly audit rules that use 0.0.0.0/0 as a source or destination, ensuring they're justified and properly secured through other means.&lt;/p&gt;

&lt;p&gt;Don't rely solely on security groups while ignoring NACLs. While security groups provide excellent instance-level protection, NACLs offer additional subnet-level controls that can prevent lateral movement and provide defence in depth.&lt;/p&gt;

&lt;h3&gt;
  
  
  Continuous Improvement and Maintenance
&lt;/h3&gt;

&lt;p&gt;Establish regular review cycles for all network security configurations. As applications evolve and new threats emerge, your network security posture must adapt accordingly. Use infrastructure as code tools like CloudFormation or Terraform to maintain consistent, version-controlled network configurations.&lt;/p&gt;

&lt;p&gt;Implement security scanning and compliance checking tools that continuously monitor your VPC configuration against security best practices and regulatory requirements. Address identified issues promptly and update your baseline configurations to prevent similar problems in the future.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion: Building Your Network Fortress
&lt;/h2&gt;

&lt;p&gt;Creating a fortress-like VPC architecture requires careful planning, layered security controls, and ongoing vigilance. By implementing comprehensive network segmentation, properly configured security groups and NACLs, advanced isolation strategies, and robust monitoring, you create a network infrastructure that can withstand sophisticated attacks while supporting your business objectives.&lt;/p&gt;

&lt;p&gt;Remember that network security is not a one-time configuration but an ongoing process of monitoring, analysis, and improvement. The techniques and strategies outlined in this guide provide the foundation for a secure VPC architecture, but they must be adapted to your specific requirements and regularly updated as threats evolve.&lt;/p&gt;

&lt;p&gt;Your network is only as strong as its weakest component. By applying these advanced VPC security practices consistently across your entire infrastructure, you build not just a fortress, but an intelligent, adaptive defence system capable of protecting your most valuable digital assets in the cloud.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://rahulkspace.netlify.app/" rel="noopener noreferrer"&gt;Let's Connect!&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>vpc</category>
      <category>security</category>
      <category>network</category>
    </item>
    <item>
      <title>Reliable S3 Data Replication: Automatically Mirror New Files Without Worrying About Deletions</title>
      <dc:creator>Rahul Kumar Sharma</dc:creator>
      <pubDate>Sat, 17 Aug 2024 07:58:16 +0000</pubDate>
      <link>https://dev.to/rahulkspace/reliable-s3-data-replication-automatically-mirror-new-files-without-worrying-about-deletions-4mjf</link>
      <guid>https://dev.to/rahulkspace/reliable-s3-data-replication-automatically-mirror-new-files-without-worrying-about-deletions-4mjf</guid>
      <description>&lt;p&gt;Ensuring the security and redundancy of your data is crucial in today's data-driven environment. Duplicating your data over several storage places is a good approach to protect it. We'll discuss how to automatically copy data from a primary S3 bucket to a backup bucket in this blog article. This will make sure that your important files are always safe and backed up, even in the event that the primary bucket is filled with garbage.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;A primary S3 bucket, event notifications, an AWS Lambda function, and a backup S3 bucket are the main elements of the architecture. These elements cooperate in the following ways to provide smooth data replication:&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Primary Bucket: This is where you upload your files. An event notification fires whenever a new file is added to this bucket. Although it is only one part of the solution, this bucket serves as the source of truth for your data.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Event Notification: AWS S3 can send notifications for particular events, such as the upload of a new file. Here, the event notification is configured to fire each time a file is added to the primary bucket. The notification doesn't carry out any action on its own; it simply initiates the next stage in the process.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;AWS Lambda Function: When the event notification is received, an AWS Lambda function is invoked automatically. A Lambda function is a small serverless piece of code that runs in response to events. In our setup, this function copies the freshly uploaded file from the primary bucket to the backup bucket, keeping the backup almost instantly up to date.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Backup Bucket: The replicated files are kept in the backup S3 bucket. Unlike the primary bucket, the backup bucket keeps files even when they are removed from the primary. This means the backup copy stays secure and intact even if a file is deleted from the primary bucket, whether accidentally or deliberately.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;How Does S3 Replication Work?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk0suoevjusrlodl9lnul.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk0suoevjusrlodl9lnul.png" alt="Source: AWS" width="800" height="342"&gt;&lt;/a&gt;&lt;br&gt;
Source: AWS&lt;/p&gt;

&lt;p&gt;Why do we need this?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Data Redundancy: Every file uploaded to the primary bucket is immediately replicated to a backup site thanks to this architecture. This redundancy, which offers a backup copy of your data that you can rely on in the event that the primary bucket's data is lost, is essential for disaster recovery and data security.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Protection Against Deletion: The backup bucket does not synchronize deletions, which is one of this setup's most notable characteristics. A file stays in the backup bucket even after it is removed from the primary bucket. Because you can always restore the file from the backup bucket, this is especially helpful in preventing inadvertent data loss.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Steps:&lt;br&gt;
Step 1: Create Two S3 Buckets: primary-bucket and backup-bucket.&lt;br&gt;
Step 2: Create an IAM Role for the Lambda Function&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to the IAM service in the AWS Management Console.&lt;/li&gt;
&lt;li&gt;Click on "Roles" in the left-hand menu, then "Create role".&lt;/li&gt;
&lt;li&gt;Select "AWS service" and choose "Lambda".&lt;/li&gt;
&lt;li&gt;Click "Next: Permissions".&lt;/li&gt;
&lt;li&gt;Click "Create policy" and go to the JSON tab.
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::primary-bucket-7",
        "arn:aws:s3:::primary-bucket-7/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::backup-bucket-7/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "sns:Publish"
      ],
      "Resource": "arn:aws:sns:ap-south-1:965519929135:s3DataBackUpSNS"
    }
  ]
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Attach the policy to the Lambda role.&lt;/p&gt;

&lt;p&gt;Step 3: Create an SNS Topic and Subscribe Your Email.&lt;br&gt;
Step 4: Create a Lambda function.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import boto3
import urllib.parse

s3 = boto3.client('s3')
sns = boto3.client('sns')

def lambda_handler(event, context):
    for record in event['Records']:
        source_bucket = record['s3']['bucket']['name']
        source_key = urllib.parse.unquote_plus(record['s3']['object']['key'])
        destination_bucket = 'backup-bucket-7'
        copy_source = {'Bucket': source_bucket, 'Key': source_key}

        try:
            # Copy the object to the backup bucket
            s3.copy_object(CopySource=copy_source, Bucket=destination_bucket, Key=source_key)
            print(f'Successfully copied {source_key} from {source_bucket} to {destination_bucket}')

            # Send SNS notification
            sns.publish(
                TopicArn='arn:aws:sns:ap-south-1:965519929135:s3DataBackUpSNS',
                Subject='File Uploaded to Backup Bucket',
                Message=f'The file {source_key} has been successfully uploaded to {destination_bucket}.'
            )
            print(f'Successfully sent SNS notification for {source_key}')
        except Exception as e:
            print(f'Error copying {source_key} from {source_bucket} to {destination_bucket}: {str(e)}')
            raise e

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;The source bucket in the Lambda function is dynamically determined based on the event that triggers the function. This means that we don't need to hard-code the source bucket name in the Lambda function code. Instead, the source bucket is extracted from the event record whenever a new object is uploaded to the primary-bucket.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffwsgxm9328v48uiyqnd7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffwsgxm9328v48uiyqnd7.png" alt="S3-Lambda" width="800" height="279"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Step 5: Configure S3 Event Notifications for the Primary Bucket&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to the S3 service in the AWS Management Console.&lt;/li&gt;
&lt;li&gt;Select the primary-bucket.&lt;/li&gt;
&lt;li&gt;Go to the "Properties" tab.&lt;/li&gt;
&lt;li&gt;Scroll down to "Event notifications" and click "Create event notification".&lt;/li&gt;
&lt;li&gt;Configure the event to trigger on s3:ObjectCreated:* and select the Lambda function S3ReplicationFunction.&lt;/li&gt;
&lt;li&gt;Save the event notification.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Step 6: Configure S3 Event Notifications for the Backup Bucket&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to the S3 service in the AWS Management Console.&lt;/li&gt;
&lt;li&gt;Select the backup-bucket.&lt;/li&gt;
&lt;li&gt;Go to the "Properties" tab.&lt;/li&gt;
&lt;li&gt;Scroll down to "Event notifications" and click "Create event notification".&lt;/li&gt;
&lt;li&gt;Configure the event to trigger on s3:ObjectCreated:* and select the same Lambda function S3ReplicationFunction. Because this function writes into the backup bucket, make sure it does not re-copy objects when the triggering bucket is already the backup bucket (for example, by returning early when the source bucket matches the destination); otherwise its own upload can invoke the function again recursively.&lt;/li&gt;
&lt;li&gt;Save the event notification.&lt;/li&gt;
&lt;/ul&gt;
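&lt;p&gt;If you prefer to script the event wiring from Steps 5 and 6 instead of using the console, the same configuration can be expressed as a NotificationConfiguration document (the Lambda ARN below is a placeholder):&lt;/p&gt;

```python
def notification_config(lambda_arn):
    """Build the S3 NotificationConfiguration that invokes the
    replication Lambda on every object creation."""
    return {
        "LambdaFunctionConfigurations": [{
            "Id": "replicate-on-create",
            "LambdaFunctionArn": lambda_arn,
            "Events": ["s3:ObjectCreated:*"],
        }]
    }

cfg = notification_config(
    "arn:aws:lambda:ap-south-1:111122223333:function:S3ReplicationFunction")
# With AWS credentials in place this would be applied as:
# boto3.client("s3").put_bucket_notification_configuration(
#     Bucket="primary-bucket-7", NotificationConfiguration=cfg)
```

&lt;p&gt;S3 must also be granted permission to invoke the function (the console does this automatically; via the API it is an add-permission call on the Lambda).&lt;/p&gt;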

&lt;p&gt;Happy Blogging!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://beacons.ai/rahulsharma.rks" rel="noopener noreferrer"&gt;Let's Connect&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>s3</category>
      <category>lambda</category>
      <category>automation</category>
    </item>
    <item>
      <title>Simplify EC2-S3 File Access with Instance Roles</title>
      <dc:creator>Rahul Kumar Sharma</dc:creator>
      <pubDate>Sun, 07 Jul 2024 15:47:38 +0000</pubDate>
      <link>https://dev.to/rahulkspace/simplify-ec2-s3-file-access-with-instance-roles-4ljp</link>
      <guid>https://dev.to/rahulkspace/simplify-ec2-s3-file-access-with-instance-roles-4ljp</guid>
      <description>&lt;h1&gt;
  
  
  Access all the buckets:
&lt;/h1&gt;

&lt;p&gt;Create an IAM Role for the EC2 Instance:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to the IAM console in AWS and create a role.&lt;/li&gt;
&lt;li&gt;Select "AWS service" as the trusted entity and choose "EC2." Click "Next: Permissions."&lt;/li&gt;
&lt;li&gt;Attach the policy “AmazonS3ReadOnlyAccess” to grant read-only access to all S3 buckets.&lt;/li&gt;
&lt;li&gt;Click "Next: Tags" (optional) and then "Next: Review."&lt;/li&gt;
&lt;li&gt;Give the role a name and click "Create role."&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8vcx9pluh98rkmrdekri.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8vcx9pluh98rkmrdekri.png" alt="Trusted Entity Type" width="800" height="466"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpa2wj2n1kpwfirg304hn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpa2wj2n1kpwfirg304hn.png" alt="Permission" width="800" height="411"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Attach the IAM Role to the EC2 Instance:&lt;br&gt;
Go to the EC2 console.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select the instance to which you want to grant S3 access.&lt;/li&gt;
&lt;li&gt;Click on the "Actions" button, navigate to "Security" and then "Modify IAM Role."&lt;/li&gt;
&lt;li&gt;Choose the IAM role you created in the previous step and click "Update IAM role."&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Testing: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;SSH into the instance to verify access. &lt;/li&gt;
&lt;li&gt;Install the AWS CLI on the instance:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"  
unzip awscliv2.zip  
sudo ./aws/install
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgdyouyp5vrpg8r8yggac.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgdyouyp5vrpg8r8yggac.png" alt="Output1" width="800" height="268"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;Access a specific S3 bucket:&lt;/h1&gt;

&lt;p&gt;Create a Custom Policy for S3 Access:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click "Create policy" to define a custom policy that grants list access to all S3 buckets and read access to a specific S3 bucket.&lt;/li&gt;
&lt;li&gt;Click "JSON" and paste the following policy:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::bucket-name",
                "arn:aws:s3:::bucket-name/*"
            ]
        }
    ]
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
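&lt;p&gt;If you script your IAM setup, the same policy document can be generated for any bucket. A minimal sketch (the helper name &lt;code&gt;build_s3_read_policy&lt;/code&gt; is hypothetical, and "bucket-name" is the same placeholder used above; feed the resulting JSON to the console, the AWS CLI, or boto3 yourself):&lt;/p&gt;

```python
import json

def build_s3_read_policy(bucket_name):
    """Build the read-only S3 policy shown above for a given bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                # List the names of all buckets in the account.
                "Effect": "Allow",
                "Action": "s3:ListAllMyBuckets",
                "Resource": "*",
            },
            {
                # Read objects and list contents of the one named bucket.
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",
                    f"arn:aws:s3:::{bucket_name}/*",
                ],
            },
        ],
    }

print(json.dumps(build_s3_read_policy("bucket-name"), indent=4))
```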



&lt;p&gt;Create a New Role:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click on "Roles" in the left sidebar, then click "Create role."&lt;/li&gt;
&lt;li&gt;Select "AWS service" as the trusted entity type.&lt;/li&gt;
&lt;li&gt;Choose "EC2" under the "Use case" section, then click "Next” and attach the policy which you created. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Attach the IAM Role to the EC2 Instance:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to the EC2 console.&lt;/li&gt;
&lt;li&gt;Select the instance to which you want to grant S3 access.&lt;/li&gt;
&lt;li&gt;Click on the "Actions" button, navigate to "Security" and then "Modify IAM Role."&lt;/li&gt;
&lt;li&gt;Choose the IAM role you created in the previous step and click "Update IAM role."&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Testing:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0pm3xfcl3d9snydm6gcc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0pm3xfcl3d9snydm6gcc.png" alt="Output2" width="800" height="340"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://beacons.ai/rahulsharma.rks" rel="noopener noreferrer"&gt;Let's Connect!&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ec2</category>
      <category>aws</category>
      <category>s3</category>
      <category>iam</category>
    </item>
    <item>
      <title>Automated AWS Bill Reminder System using Serverless Architecture with AWS Lambda</title>
      <dc:creator>Rahul Kumar Sharma</dc:creator>
      <pubDate>Sat, 23 Mar 2024 20:52:08 +0000</pubDate>
      <link>https://dev.to/rahulkspace/automated-aws-bill-reminder-system-using-serverless-architecture-with-aws-lambda-58o3</link>
      <guid>https://dev.to/rahulkspace/automated-aws-bill-reminder-system-using-serverless-architecture-with-aws-lambda-58o3</guid>
      <description>&lt;p&gt;The aim is to develop a serverless system using AWS Lambda to notify users when their AWS bill is pending, to ensure timely payment and avoid disruptions to AWS services.&lt;/p&gt;

&lt;p&gt;Components:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS Lambda Functions: Use Lambda functions to periodically check the status of AWS billing and send notifications to users when their bills are pending.&lt;/li&gt;
&lt;li&gt;AWS Billing and Cost Management: Utilize AWS Billing and Cost Management services to retrieve billing information and check the status of pending bills.&lt;/li&gt;
&lt;li&gt;Amazon SNS: Use Amazon Simple Notification Service (SNS) to send email notifications to users when their AWS bills are pending.&lt;/li&gt;
&lt;li&gt;AWS CloudWatch Events: Set up CloudWatch Events to trigger Lambda functions at regular intervals for checking the billing status.&lt;/li&gt;
&lt;li&gt;AWS SDKs: Leverage AWS SDKs (e.g., Boto3 for Python) to interact with AWS services programmatically and automate billing notifications.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Steps to Implement:&lt;/p&gt;

&lt;p&gt;Set up IAM Roles:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;To send notifications via Amazon SNS and use AWS Billing and Cost Management services, create an IAM role with the necessary permissions.&lt;/li&gt;
&lt;/ul&gt;
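&lt;p&gt;As a sketch, the role's permissions policy might look like the following. The specific actions are assumptions about what the function below needs (Cost Explorer reads plus publishing to the SNS topic); tighten the SNS resource to your own topic ARN, and also attach the AWSLambdaBasicExecutionRole managed policy so the function can write CloudWatch Logs:&lt;/p&gt;

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["ce:GetCostAndUsage"],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": ["sns:Publish"],
            "Resource": "arn:aws:sns:*:*:awsBillNotification"
        }
    ]
}
```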

&lt;p&gt;Write Lambda Function:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Develop a Lambda function that retrieves billing information using the AWS SDK. &lt;/li&gt;
&lt;li&gt;Implement logic to check if the bill is pending and trigger a notification if necessary.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Configure CloudWatch Event Rule:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Set up a CloudWatch Event rule to trigger the Lambda function at regular intervals (e.g., daily, weekly) to check the billing status.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Implement Notification Logic:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use the SNS service to send email notifications to users when their AWS bills are pending.&lt;/li&gt;
&lt;li&gt;Customize the notification message with relevant billing details and instructions for payment.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Handle Error Cases:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Incorporate error handling into the Lambda function to handle exceptions, including failed service requests and incorrectly retrieved billing data.&lt;/li&gt;
&lt;li&gt;Configure appropriate logging and monitoring using CloudWatch to track function executions and errors.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Testing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Test the Lambda function and CloudWatch Event rule to ensure they trigger notifications correctly based on the billing status.&lt;/li&gt;
&lt;li&gt;Verify that users receive notifications as expected and that the notification content is accurate.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Deployment:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Deploy the Lambda function and CloudWatch Event rule in the AWS account where billing notifications are required.&lt;/li&gt;
&lt;li&gt;Configure necessary permissions and IAM roles for the Lambda function to access AWS Billing and SNS services.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Monitor and Maintain:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Monitor the execution of Lambda functions and CloudWatch Events to ensure they run as scheduled and handle any failures or exceptions promptly.&lt;/li&gt;
&lt;li&gt;Periodically review and update the system to accommodate changes in AWS billing policies or user requirements.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Lambda Function:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json
import boto3

def lambda_handler(event, context):
    # Initialize AWS services
    sns_client = boto3.client('sns')
    billing_client = boto3.client('ce')  # AWS Cost Explorer

    # Retrieve billing information
    response = billing_client.get_cost_and_usage(
        TimePeriod={
            'Start': '2024-01-01',
            'End': '2024-03-01'  # Modify the date range as needed
        },
        Granularity='MONTHLY',
        Metrics=['UnblendedCost']  # You can customize the metrics as needed
    )

    # Extract the total cost for the current billing period
    total_cost = float(response['ResultsByTime'][0]['Total']['UnblendedCost']['Amount'])

    # Check if the bill is pending (total cost &amp;gt; 0)
    if total_cost &amp;gt; 0:
        # Send notification
        topic_arn = 'arn:aws:sns:ap-south-1:965519929135:awsBillNotification'  # Replace with your SNS topic ARN
        message = f'Your AWS bill for the current month is pending. Total amount due: ${total_cost:.2f}'
        subject = 'Action Required: Your AWS Bill is Pending'

        sns_client.publish(
            TopicArn=topic_arn,
            Message=message,
            Subject=subject
        )

        print('Billing notification sent successfully')
    else:
        print('No pending bills found')

    return {
        'statusCode': 200,
        'body': 'Billing notification process completed'
    }


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
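&lt;p&gt;The date range in the function above is hard-coded. If you schedule the function, you will likely want the current billing month computed at run time; here is a hedged sketch of a drop-in replacement for the &lt;code&gt;TimePeriod&lt;/code&gt; argument (the helper name is hypothetical; Cost Explorer treats &lt;code&gt;End&lt;/code&gt; as exclusive, so the first day of the next month is used):&lt;/p&gt;

```python
from datetime import date

def current_month_period(today=None):
    """Return a Cost Explorer TimePeriod dict covering the current month.

    'End' is exclusive, so it is set to the first day of the next month.
    """
    today = today or date.today()
    start = today.replace(day=1)
    if start.month == 12:
        end = start.replace(year=start.year + 1, month=1)
    else:
        end = start.replace(month=start.month + 1)
    return {"Start": start.isoformat(), "End": end.isoformat()}
```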



&lt;p&gt;Testing:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj1dgcnox6h61v44hhimn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj1dgcnox6h61v44hhimn.png" alt="Lambda Test" width="800" height="263"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Output:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgsnwja9ji9dyutw0ei73.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgsnwja9ji9dyutw0ei73.png" alt="Email Output" width="800" height="269"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://beacons.ai/rahulsharma.rks" rel="noopener noreferrer"&gt;Find me Online!&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>lambda</category>
      <category>automation</category>
      <category>bill</category>
    </item>
    <item>
      <title>Exploring ChatGPT's Magic in the Linux Command Line</title>
      <dc:creator>Rahul Kumar Sharma</dc:creator>
      <pubDate>Thu, 07 Sep 2023 14:52:28 +0000</pubDate>
      <link>https://dev.to/rahulkspace/exploring-chatgpts-magic-in-the-linux-command-line-4lj3</link>
      <guid>https://dev.to/rahulkspace/exploring-chatgpts-magic-in-the-linux-command-line-4lj3</guid>
      <description>&lt;p&gt;Integrating ChatGPT into the Linux terminal opens a world of possibilities for natural language interactions and automation. By combining the power of OpenAI's ChatGPT with the flexibility of the Linux terminal, users can harness the capabilities of this language model for a wide range of tasks and applications.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;ShellGPT:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Users can communicate with the AI chatbot on their Linux terminal using ShellGPT, a command line interface for ChatGPT. It is based on the GPT large language model from OpenAI.&lt;/p&gt;

&lt;p&gt;Based on your text input, ShellGPT can provide intelligent suggestions and recommendations, and even run shell commands. It can also carry on named conversations, reusing earlier exchanges as context. With ChatGPT built into the command line, users no longer need to type lengthy commands or memorize difficult Linux terminal commands; letting ShellGPT handle this tedious work reduces errors and saves valuable time.&lt;/p&gt;

&lt;p&gt;Prerequisites to install ShellGPT on Linux:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;An AWS account&lt;/li&gt;
&lt;li&gt;An EC2 instance (Ubuntu)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Steps:&lt;/p&gt;

&lt;p&gt;a. Create an EC2 instance and SSH into it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqzty3r3oej1vcgnb9z9y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqzty3r3oej1vcgnb9z9y.png" alt="EC2Instance" width="800" height="155"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;b. Check whether Python is installed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8pf6f7anatcv79zl58v4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8pf6f7anatcv79zl58v4.png" alt="check python version" width="695" height="146"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;c. Install PIP and check the version.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F67kb5rc1h90q6ml2flpx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F67kb5rc1h90q6ml2flpx.png" alt="Install PIP" width="750" height="78"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi03hovj7dvrgwyj4csph.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi03hovj7dvrgwyj4csph.png" alt="PIP Version" width="750" height="101"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;d. Install venv.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuw03fuvzxvrj0ac6ld68.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuw03fuvzxvrj0ac6ld68.png" alt="Install venv" width="750" height="120"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;e. Set up ShellGPT to use ChatGPT.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Set up the environment.&lt;/li&gt;
&lt;li&gt;Create a directory.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqfz01ii3236teb1mwp7w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqfz01ii3236teb1mwp7w.png" alt="mkdir" width="594" height="113"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create a virtual environment.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo7oqghcnf5l0xm26mrl8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo7oqghcnf5l0xm26mrl8.png" alt="create virtual env" width="750" height="98"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Activate the environment.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsg3z09gcy0mbqp45qhw2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsg3z09gcy0mbqp45qhw2.png" alt="activate env" width="750" height="78"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;f. To get an OpenAI API key, navigate to OpenAI’s website.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkh6pk5xbwba3h6h5t185.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkh6pk5xbwba3h6h5t185.png" alt="OpenAi" width="753" height="248"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;g. Click on profile and select View API Keys.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6hee6pr51x3k2a26lp8b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6hee6pr51x3k2a26lp8b.png" alt="profile" width="753" height="398"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;h. Click on Create new secret key.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftnzux0p16cef2zs8v0sk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftnzux0p16cef2zs8v0sk.png" alt="create key" width="753" height="294"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;i. Copy the key and save it somewhere safe; it is shown only once.&lt;/p&gt;

&lt;p&gt;j. Now, create an environment variable for this API key with the command below. In Linux, you can create an environment variable using the “export” command.&lt;/p&gt;

&lt;p&gt;Replace the placeholder with the actual API key you generated to use ChatGPT in the Linux terminal.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;export OPENAI_API_KEY=sk-YHhk0****************RMVLVZdqQjiNZ&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fof45wvol6jwi39lvug2q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fof45wvol6jwi39lvug2q.png" alt="export" width="800" height="80"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;k. To verify that the export was successful, use the &lt;code&gt;env&lt;/code&gt; command.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fji45g485e2djx0ol34h9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fji45g485e2djx0ol34h9.png" alt="verifyexport" width="753" height="211"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;l. To store the variable permanently, open the &lt;code&gt;.bashrc&lt;/code&gt; file and add the same export line.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg0wofp4hkwbgdfnnznfp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg0wofp4hkwbgdfnnznfp.png" alt="bashrc" width="753" height="107"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;m. To apply the change in the current shell, run &lt;code&gt;source .bashrc&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8p6o6tseqvmxgxqtheqo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8p6o6tseqvmxgxqtheqo.png" alt="commandeffect" width="753" height="87"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;n. Install ShellGPT.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvjy4yg9pkgy9s3098a3o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvjy4yg9pkgy9s3098a3o.png" alt="InstallShellGPT" width="753" height="164"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;o. Now that we have installed ShellGPT, we can use it.&lt;/p&gt;

&lt;p&gt;Syntax:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feq9ci483658b9mn72ott.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feq9ci483658b9mn72ott.png" alt="chat Syntax" width="453" height="112"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Options&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;--temperature&lt;/td&gt;
&lt;td&gt;Changes the randomness of the output&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;--top-probability&lt;/td&gt;
&lt;td&gt;Limits to only the highest probable tokens or words&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;--chat&lt;/td&gt;
&lt;td&gt;Used to have a conversation with a unique name&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;--shell&lt;/td&gt;
&lt;td&gt;Used to get shell commands as output&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;--execute&lt;/td&gt;
&lt;td&gt;Executes the commands received as output from --shell option&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;--code&lt;/td&gt;
&lt;td&gt;Used to get code as output&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;In Action:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flpfxu3g9k0v710c48uaw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flpfxu3g9k0v710c48uaw.png" alt="OP1" width="800" height="203"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuysqwjndm6cfqtfrkygy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuysqwjndm6cfqtfrkygy.png" alt="OP2" width="800" height="136"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's &lt;a href="https://beacons.ai/rahulsharma.rks" rel="noopener noreferrer"&gt;Connect!&lt;/a&gt; &lt;/p&gt;

</description>
      <category>chatgpt</category>
      <category>openai</category>
      <category>linux</category>
      <category>automation</category>
    </item>
    <item>
      <title>Unlocking Real-Time Insights: Harnessing Lambda and SES to Supercharge S3 Bucket Alerts</title>
      <dc:creator>Rahul Kumar Sharma</dc:creator>
      <pubDate>Sat, 12 Aug 2023 20:21:30 +0000</pubDate>
      <link>https://dev.to/rahulkspace/unlocking-real-time-insights-harnessing-lambda-and-ses-to-supercharge-s3-bucket-alerts-4cka</link>
      <guid>https://dev.to/rahulkspace/unlocking-real-time-insights-harnessing-lambda-and-ses-to-supercharge-s3-bucket-alerts-4cka</guid>
      <description>&lt;p&gt;In the age of data-driven decision-making, real-time insights play a pivotal role in ensuring the smooth operation of business processes. Amazon Simple Storage Service (S3) offers a robust storage solution, while AWS Lambda provides the power of event-driven computing. When combined with Amazon Simple Email Service (SES), this trio becomes a formidable toolset for creating a highly efficient alert mechanism. In this blog post, we will delve into the process of leveraging Lambda and SES to supercharge S3 bucket alerts, enabling you to gain real-time insights and proactively respond to critical events.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon Simple Storage Service (Amazon S3)&lt;/strong&gt; is a versatile and highly scalable cloud storage solution provided by Amazon Web Services (AWS). Designed to handle large amounts of data and seamlessly integrate with various AWS services, S3 has become a foundational component for countless applications, from simple file storage to complex data analysis and backup solutions.&lt;/p&gt;

&lt;p&gt;Features and Capabilities of Amazon S3:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Scalability: S3 scales effortlessly to accommodate your growing storage needs. You can start small and expand to petabytes of data without any disruption.&lt;/li&gt;
&lt;li&gt;Data Security: S3 offers various security mechanisms, including access control policies, identity and access management (IAM), and encryption options (both in transit and at rest) to keep your data safe.&lt;/li&gt;
&lt;li&gt;Data Management: With S3's lifecycle policies, you can automatically transition objects between storage classes (Standard, Intelligent-Tiering, Glacier, etc.) based on access patterns and cost considerations.&lt;/li&gt;
&lt;li&gt;Versioning: S3 enables you to maintain multiple versions of an object, allowing you to recover from accidental deletions or changes.&lt;/li&gt;
&lt;li&gt;Event Notifications: S3 supports event triggers that can invoke AWS Lambda functions, enabling real-time data processing and event-driven architecture.&lt;/li&gt;
&lt;li&gt;Cross-Region Replication: You can replicate objects across different regions for disaster recovery or to reduce latency for global users.&lt;/li&gt;
&lt;li&gt;Data Analytics: S3 integrates seamlessly with AWS analytics services like Amazon Athena, Amazon Redshift Spectrum, and Amazon EMR for data processing and analysis.&lt;/li&gt;
&lt;li&gt;Static Website Hosting: S3 can host static websites, allowing you to serve web content directly from your bucket.&lt;/li&gt;
&lt;li&gt;Multipart Upload: For large files, S3 supports multipart uploads, making it efficient and resilient even in less reliable network conditions.&lt;/li&gt;
&lt;li&gt;Data Access Control: S3 offers fine-grained control over access permissions, allowing you to specify who can access your objects and how.&lt;/li&gt;
&lt;/ol&gt;
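&lt;p&gt;Point 5 above (event notifications) is the hook this post builds on: S3 hands the Lambda function an event whose records name the bucket and the object key. A minimal sketch of pulling those fields out (the helper name is hypothetical, and the sample payload is illustrative, trimmed to the fields used):&lt;/p&gt;

```python
def extract_s3_objects(event):
    """Return (bucket, key) pairs from an S3 event notification payload."""
    return [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in event.get("Records", [])
    ]

# Illustrative payload in the shape S3 delivers to Lambda.
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "my-bucket"},
                "object": {"key": "uploads/report.csv"}}}
    ]
}
print(extract_s3_objects(sample_event))
```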

&lt;p&gt;&lt;strong&gt;AWS Lambda&lt;/strong&gt; is a serverless compute service provided by Amazon Web Services (AWS) that enables you to run code without provisioning or managing servers. It's designed to help you build and deploy applications quickly and efficiently by allowing you to focus solely on your code and its functionality, while AWS handles the underlying infrastructure, scaling, and maintenance.&lt;/p&gt;

&lt;p&gt;Features of AWS Lambda include:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Event-Driven Computing: Lambda functions are triggered by various events within AWS services, such as changes in data in Amazon S3 buckets, updates in DynamoDB tables, HTTP requests via Amazon API Gateway, or even custom events. This event-driven model promotes a reactive and highly scalable architecture.&lt;/li&gt;
&lt;li&gt;Scalability: Lambda functions scale automatically in response to the volume of incoming events. This means your code can handle a few requests or millions, without manual intervention, ensuring optimal performance.&lt;/li&gt;
&lt;li&gt;Pay-as-You-Go Pricing: With Lambda, you only pay for the compute time your code consumes. You're billed based on the number of requests and the time it takes for your code to execute. This cost-effective model eliminates the need to pay for idle server time.&lt;/li&gt;
&lt;li&gt;Language Flexibility: AWS Lambda supports a variety of programming languages, including Python, Node.js, Java, C#, and more, allowing developers to use their preferred language for building functions.&lt;/li&gt;
&lt;li&gt;Stateless Execution: Lambda functions are stateless, meaning they don't retain data between executions. This encourages the use of external data stores (like databases or Amazon S3) for persisting data.&lt;/li&gt;
&lt;li&gt;Integration with AWS Services: Lambda can easily integrate with other AWS services, enabling you to create powerful workflows and automation. For example, you can process data, generate reports, or trigger actions across various services.&lt;/li&gt;
&lt;li&gt;Customizable Configuration: You can configure various parameters for your Lambda functions, including memory allocation, timeout duration, and environment variables. This allows you to optimize performance and resource usage.&lt;/li&gt;
&lt;li&gt;Versioning and Aliases: Lambda supports versioning and aliases, allowing you to manage and deploy different versions of your code and direct traffic to specific versions.&lt;/li&gt;
&lt;li&gt;Testing and Debugging: AWS provides tools and resources for testing and debugging Lambda functions locally before deploying them to the cloud.&lt;/li&gt;
&lt;li&gt;Security and Access Control: Lambda integrates with AWS Identity and Access Management (IAM) for fine-grained control over who can invoke your functions and what resources they can access.&lt;/li&gt;
&lt;/ol&gt;
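&lt;p&gt;The event-driven, stateless model described above boils down to a single handler function: Lambda passes the triggering event in as a plain dict and the function returns a response, keeping no state between invocations. A minimal local sketch (the event shape here is a made-up custom event, not a real AWS payload):&lt;/p&gt;

```python
import json

def lambda_handler(event, context):
    # Lambda passes the triggering event as a plain dict; nothing is kept
    # between invocations, so any state must come from the event itself
    # or from an external store such as S3 or DynamoDB.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps(f"Hello, {name}!")
    }

# Invoke locally with a custom test event (the context object is unused here)
print(lambda_handler({"name": "Rahul"}, None))
```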

&lt;p&gt;&lt;strong&gt;Amazon Simple Email Service (Amazon SES)&lt;/strong&gt; is a cloud-based email-sending service provided by Amazon Web Services (AWS). It offers a reliable, scalable, and cost-effective solution for sending transactional, marketing, and notification emails, allowing businesses to communicate effectively with their customers and users.&lt;/p&gt;

&lt;p&gt;Features of Amazon SES include:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Email Sending: Amazon SES enables you to send a wide range of email types, including one-to-one transactional emails, marketing campaigns, and automated notifications. It supports both HTML and plain text emails.&lt;/li&gt;
&lt;li&gt;High Deliverability: Amazon SES leverages Amazon's proven email infrastructure to help ensure that your emails are delivered to the recipient's inbox. It provides tools for monitoring and improving email deliverability.&lt;/li&gt;
&lt;li&gt;Scaling and Resilience: SES automatically scales to handle large volumes of email traffic, making it suitable for businesses of all sizes. It also offers built-in redundancy and fault tolerance.&lt;/li&gt;
&lt;li&gt;Flexible Integration: Amazon SES integrates seamlessly with other AWS services, making it easy to incorporate email functionality into your applications. It can be used in conjunction with AWS Lambda for event-driven email sending.&lt;/li&gt;
&lt;li&gt;Customizable Templates: You can create and manage email templates in Amazon SES, allowing you to maintain consistent branding and design across your email communications.&lt;/li&gt;
&lt;li&gt;Bounce and Complaint Handling: SES provides feedback loops to help you manage bounces and complaints, allowing you to maintain a clean and engaged recipient list.&lt;/li&gt;
&lt;li&gt;Content Filtering: SES includes options for content filtering, allowing you to add custom headers or apply rules to filter out certain types of content.&lt;/li&gt;
&lt;li&gt;Sending Statistics: SES provides detailed sending statistics, including delivery rates, bounce rates, and complaint rates, which can help you monitor and improve your email campaigns.&lt;/li&gt;
&lt;li&gt;Cost-Efficiency: Amazon SES offers pay-as-you-go pricing, meaning you only pay for the emails you send. It provides a cost-effective alternative to building and maintaining your email infrastructure.&lt;/li&gt;
&lt;li&gt;Data Security and Compliance: SES offers encryption options for data in transit and at rest. It also provides mechanisms to comply with email-related regulations, such as CAN-SPAM and GDPR.&lt;/li&gt;
&lt;/ol&gt;
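&lt;p&gt;The sending statistics in point 9 are exposed through the SES get_send_statistics API, whose response contains datapoints with DeliveryAttempts, Bounces, Complaints, and Rejects counts. A small sketch of turning that response shape into a bounce rate (the numbers below are illustrative, not real data):&lt;/p&gt;

```python
# Sample data in the shape returned by ses_client.get_send_statistics()
# (the counts here are made up for illustration)
stats = {
    "SendDataPoints": [
        {"DeliveryAttempts": 200, "Bounces": 4, "Complaints": 1, "Rejects": 0},
        {"DeliveryAttempts": 300, "Bounces": 6, "Complaints": 0, "Rejects": 2},
    ]
}

def bounce_rate(points):
    # Aggregate across datapoints, then express bounces as a fraction
    attempts = sum(p["DeliveryAttempts"] for p in points)
    bounces = sum(p["Bounces"] for p in points)
    return bounces / attempts if attempts else 0.0

print(bounce_rate(stats["SendDataPoints"]))  # 10 / 500 = 0.02
```

&lt;p&gt;Keeping this ratio low matters: SES can pause sending for accounts whose bounce or complaint rates stay high.&lt;/p&gt;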

&lt;p&gt;Architecture Diagram&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnxhsxxfq9gqdg3reydmx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnxhsxxfq9gqdg3reydmx.png" alt="Image description" width="800" height="357"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Steps to implement this:&lt;br&gt;
Open the AWS Console in your browser and log in.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create an S3 bucket.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Farkxke6ijlcts37sbqsj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Farkxke6ijlcts37sbqsj.png" alt="Image description" width="800" height="94"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Create an IAM role that grants the Lambda function permission to send email through Amazon SES (and write logs to CloudWatch).&lt;/li&gt;
&lt;li&gt;Set up an SES domain or email address and verify the identities before moving to the next step.&lt;/li&gt;
&lt;li&gt;Create a Lambda function and choose a runtime; I will be using Python 3.7.&lt;/li&gt;
&lt;li&gt;Set up an S3 bucket trigger for the Lambda function. This trigger executes the function whenever an object is uploaded to the S3 bucket.&lt;/li&gt;
&lt;/ol&gt;
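&lt;p&gt;The S3 trigger in the last step above can also be configured programmatically. Behind the scenes, the console writes a bucket notification configuration; a hedged sketch of that structure (the bucket name and function ARN are placeholders):&lt;/p&gt;

```python
# Notification configuration in the shape expected by boto3's
# s3_client.put_bucket_notification_configuration (values are placeholders).
# When you add the trigger in the console, it also grants S3 permission to
# invoke the function (the equivalent of lambda add-permission).
notification_config = {
    "LambdaFunctionConfigurations": [
        {
            "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:s3-mail",
            "Events": ["s3:ObjectCreated:*"],
        }
    ]
}

# With boto3 this would be applied as:
#   boto3.client("s3").put_bucket_notification_configuration(
#       Bucket="my-upload-bucket",
#       NotificationConfiguration=notification_config,
#   )
print(notification_config["LambdaFunctionConfigurations"][0]["Events"])
```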

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzzg5wsgnbzou7lc0o0pk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzzg5wsgnbzou7lc0o0pk.png" alt="Image description" width="800" height="383"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="6"&gt;
&lt;li&gt;Add the code below to the Lambda function; it is Python that builds an HTML/CSS email body and sends it through SES.&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json
import boto3
import os

ses_client = boto3.client('ses')

def send_email(bucket, key, recipient, event_type, sender_name, sender_email):
    sender = f'{sender_name} &amp;lt;{sender_email}&amp;gt;'
    subject = f'S3 Event Notification - {event_type}'
    body = f"""
        &amp;lt;html&amp;gt;
        &amp;lt;head&amp;gt;
        &amp;lt;/head&amp;gt;
        &amp;lt;body style="text-align: left; font-family: Arial, sans-serif; color: black; background-color: #e3f2fd; padding: 5%"&amp;gt;
            &amp;lt;h1 style="color: red; text-align: center"&amp;gt;S3 Alert!&amp;lt;/h1&amp;gt;
            &amp;lt;p&amp;gt;Hello {recipient},&amp;lt;/p&amp;gt;
            &amp;lt;p&amp;gt;A {event_type} event occurred in the bucket {bucket}.&amp;lt;/p&amp;gt;
            &amp;lt;p&amp;gt;File key: {key}&amp;lt;/p&amp;gt;
        &amp;lt;/body&amp;gt;
        &amp;lt;/html&amp;gt;
    """
    response = ses_client.send_email(
        Source=sender,
        Destination={
            'ToAddresses': [recipient],
        },
        Message={
            'Subject': {
                'Data': subject,
            },
            'Body': {
                'Html': {
                    'Data': body,
                },
            },
        }
    )
    print('Email sent to:', recipient)

def lambda_handler(event, context):
    recipients = ['user1@gmail.com', 'user2@gmail.com']
    sender_name = os.environ['SENDER_NAME']
    sender_email =  os.environ['SENDER_EMAIL']
    try:
        for record in event['Records']:
            bucket = record['s3']['bucket']['name']
            key = record['s3']['object']['key']
            event_type = record['eventName']

            for recipient in recipients:
                send_email(bucket, key, recipient, event_type, sender_name, sender_email)
        return {
            'statusCode': 200,
            'body': json.dumps('Emails sent successfully')
        }

    except Exception as e:
        print('Error sending emails:', e)
        return {
            'statusCode': 500,

            'body': json.dumps('Error sending emails')

        }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="7"&gt;
&lt;li&gt;Open the Configuration tab of the Lambda function and add the sender's name and email as the environment variables SENDER_NAME and SENDER_EMAIL, which the code reads.&lt;/li&gt;
&lt;/ol&gt;
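&lt;p&gt;The handler reads these values with os.environ, which raises a KeyError if a key is missing, so the variable names must match exactly. A quick local sketch of how they are picked up (the values below are placeholders):&lt;/p&gt;

```python
import os

# Simulate the variables added in the Lambda Configuration tab
os.environ["SENDER_NAME"] = "S3 Notifier"
os.environ["SENDER_EMAIL"] = "alerts@example.com"

# os.environ.get() with a default makes a misconfiguration visible in logs
# instead of crashing the invocation with a KeyError
sender_name = os.environ.get("SENDER_NAME", "MISSING_SENDER_NAME")
sender_email = os.environ.get("SENDER_EMAIL", "MISSING_SENDER_EMAIL")
print(sender_name, sender_email)
```

&lt;p&gt;Note that the SENDER_EMAIL address must be one of the identities verified in SES earlier, or send_email will be rejected.&lt;/p&gt;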

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fko2f6ssslf1uz8h5k8qg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fko2f6ssslf1uz8h5k8qg.png" alt="Image description" width="800" height="259"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="8"&gt;
&lt;li&gt;To test it, upload a file to the bucket; you should receive an email.&lt;/li&gt;
&lt;/ol&gt;
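&lt;p&gt;Before uploading a real file, you can also exercise the handler locally with a fabricated S3 put event. The structure below mirrors only the fields the code actually reads (bucket name, object key, and eventName); the values are placeholders:&lt;/p&gt;

```python
# Minimal fabricated S3 event containing just the fields the handler reads
sample_event = {
    "Records": [
        {
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "my-upload-bucket"},
                "object": {"key": "reports/2024/summary.pdf"},
            },
        }
    ]
}

# The same extraction lambda_handler performs for each record
record = sample_event["Records"][0]
bucket = record["s3"]["bucket"]["name"]
key = record["s3"]["object"]["key"]
event_type = record["eventName"]
print(bucket, key, event_type)
```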

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftli7q6gnm2qccgmenjxi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftli7q6gnm2qccgmenjxi.png" alt="Image description" width="800" height="342"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can &lt;a href="https://beacons.ai/rahulsharma.rks" rel="noopener noreferrer"&gt;connect&lt;/a&gt; with me.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>s3</category>
      <category>lambda</category>
      <category>ses</category>
    </item>
  </channel>
</rss>
