<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Niran</title>
    <description>The latest articles on DEV Community by Niran (@niranyadav).</description>
    <link>https://dev.to/niranyadav</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1493080%2F676700ee-4072-40d9-88db-7bfcee5341b2.jpg</url>
      <title>DEV Community: Niran</title>
      <link>https://dev.to/niranyadav</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/niranyadav"/>
    <language>en</language>
    <item>
      <title>Your Current Backup Automation Is Missing the Key: Ansible, AWS S3, and Cron 🔑</title>
      <dc:creator>Niran</dc:creator>
      <pubDate>Sun, 20 Oct 2024 06:50:41 +0000</pubDate>
      <link>https://dev.to/niranyadav/your-current-backup-automation-is-missing-the-key-ansible-aws-s3-and-cron-2mb8</link>
      <guid>https://dev.to/niranyadav/your-current-backup-automation-is-missing-the-key-ansible-aws-s3-and-cron-2mb8</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fty8c6ejqwjc20x7h7i79.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fty8c6ejqwjc20x7h7i79.jpg" alt="Image description" width="617" height="336"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Data backups are a crucial part of any system or application, ensuring that you don't lose important files when things go wrong. While you may already have an automation process in place, manual backups are still time-consuming and prone to human error.&lt;/p&gt;

&lt;p&gt;That's where enhancing your automation comes in! In this blog, I'll show you how to supercharge your existing backup process by automating the backup of critical files and application data to AWS S3 using Ansible. We'll also set up Cron Jobs to handle the scheduling, ensuring your backups run smoothly and on time. Additionally, we'll cover how to restore those backups in case you ever need to retrieve them.&lt;/p&gt;

&lt;p&gt;To set up the automated backup and restore process using Ansible, AWS S3, and Cron Jobs, you need to install and configure the following components:&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;AWS Account:&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create an AWS account if you don't already have one.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;AWS CLI:&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Install the AWS Command Line Interface (CLI) to manage your AWS services from the command line.&lt;/li&gt;
&lt;li&gt;After installing, configure the AWS CLI with your AWS credentials.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws configure
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;You’ll need your &lt;strong&gt;AWS Access Key ID&lt;/strong&gt;, &lt;strong&gt;Secret Access Key&lt;/strong&gt;, &lt;strong&gt;default region&lt;/strong&gt;, and &lt;strong&gt;output format&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Ansible:&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://docs.ansible.com/ansible/latest/installation_guide/installation_distros.html" rel="noopener noreferrer"&gt;Install Ansible&lt;/a&gt; on your Linux server.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Python and Boto3:&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Ansible requires Python to run, so ensure it's installed.&lt;/li&gt;
&lt;li&gt;Install Boto3 (the AWS SDK for Python), which Ansible uses for AWS-related tasks.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip3 install boto3
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you encounter the "externally-managed-environment" error when trying to install boto3 using pip3, follow these steps to resolve the issue:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Install the Virtual Environment Package:&lt;/strong&gt; First, ensure you have the necessary package to create virtual environments. Run the following command:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt install python3-venv
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Create a Virtual Environment:&lt;/strong&gt; Use the following command to create a new virtual environment. You can name the folder anything you like; in this example, we'll use myenv:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python3 -m venv myenv
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Activate the Virtual Environment:&lt;/strong&gt; Activate the virtual environment with this command:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;source myenv/bin/activate
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Your terminal prompt will change to indicate that you are now working inside the virtual environment.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Install boto3 Using pip3:&lt;/strong&gt; Now, you can install boto3 without encountering the previous error. Run:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip3 install boto3
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Verify the Installation:&lt;/strong&gt; After installation, you can check if boto3 was installed correctly by running:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip3 show boto3
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Deactivate the Virtual Environment:&lt;/strong&gt; Once you're done working, you can exit the virtual environment by running:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;deactivate
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;em&gt;Ansible Collections:&lt;/em&gt;&lt;/strong&gt; &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Install the necessary Ansible collections for AWS to ensure smooth integration with your cloud environment:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ansible-galaxy collection install amazon.aws
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;em&gt;Cron:&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Cron is typically pre-installed on Linux systems. You can check if it's running with:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;systemctl status cron
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;If it's not installed, you can install it:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt install cron -y
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;em&gt;Additional Tools (Optional):&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You might consider installing tools like gzip or tar if they're not already installed, as these are often used for compressing files.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With all the necessary components installed and configured, here’s your next step:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Create S3 Bucket via AWS CLI:&lt;/em&gt;&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws s3 mb s3://my-app-backups --region us-east-1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;S3 Bucket Permissions:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The IAM user or role used in your Ansible playbooks must be able to list the bucket, read objects from it (for restores), and write objects to it (for backups). This is typically granted through an IAM policy. For example:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": "arn:aws:s3:::my-app-backups"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::my-app-backups/*"
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace my-app-backups with your own bucket name in both the command and the policy ARNs, and adjust the region if needed.&lt;/p&gt;

&lt;p&gt;Once the bucket is created, you can begin uploading files automatically using Ansible. You’ll need two playbooks:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;One for backing up files to AWS S3.&lt;/li&gt;
&lt;li&gt;One for restoring the files from AWS S3.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Example: Backup Playbook (backup.yml)&lt;br&gt;
This playbook compresses a directory and uploads the tarball to your S3 bucket.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;---
# Playbook to back up the /etc/myapp directory to AWS S3.

- name: Backup /etc/myapp to AWS S3
  # Defines the play name, indicating its purpose.
  hosts: localhost
  # Specifies that the play will run on the local machine (localhost).
  tasks:
    # The list of tasks to be executed.

    - name: Compress /etc/myapp directory
      # Task to compress the directory into a tarball.
      archive:
        path: /etc/myapp
        # Source directory that will be compressed.
        dest: /tmp/myapp_backup.tar.gz
        # Destination path for the compressed tarball in the /tmp directory.

    - name: Upload tarball to AWS S3
      # Task to upload the tarball to an S3 bucket.
      amazon.aws.s3_object:
        bucket: my-app-backups
        # Name of the S3 bucket where the backup will be stored.
        object: backups/myapp_backup_{{ ansible_date_time.iso8601 }}.tar.gz
        # Specifies the object key (file path) in the S3 bucket. It includes the current date and time in ISO 8601 format to make the filename unique.
        src: /tmp/myapp_backup.tar.gz
        # The source file (the tarball) to be uploaded.
        mode: put
        # Specifies that this operation is a 'put' (upload) to the S3 bucket.
      delegate_to: localhost
      # Ensures the upload is performed from the localhost.

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
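&lt;p&gt;The object key above embeds Ansible's ansible_date_time.iso8601 fact so every upload gets a unique name. As a quick illustration of what that key ends up looking like, here is a plain-Python equivalent (the exact timestamp Ansible produces may be formatted slightly differently):&lt;/p&gt;

```python
from datetime import datetime, timezone

# Build an S3 object key with an ISO-8601 UTC timestamp, mirroring the
# backups/myapp_backup_{{ ansible_date_time.iso8601 }}.tar.gz pattern.
stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
key = "backups/myapp_backup_" + stamp + ".tar.gz"
print(key)
```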



&lt;p&gt;Example: Restore Playbook (restore.yml)&lt;br&gt;
This playbook downloads the latest backup and restores it to the original location.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;---
- name: Restore files from S3 bucket to local
  hosts: localhost
  gather_facts: no

  tasks:
    - name: Ensure the restore directory exists
      file:
        path: /home/niran/restore  
        # Specify the path for the restore directory
        state: directory  
        # Ensure the directory is created if it doesn't exist

    - name: List objects in the S3 bucket
      amazon.aws.s3_object:
        bucket: my-app-backups  
        # Specify the S3 bucket to list objects from
        mode: list  
        # Set the mode to list objects
      register: s3_objects  
      # Register the output to use in later tasks

    - name: Debug full S3 objects output
      debug:
        var: s3_objects  
        # Display the full output of the S3 objects for debugging purposes

    - name: Download files from S3 bucket to local
      amazon.aws.s3_object:
        bucket: my-app-backups  
        # Specify the S3 bucket to download files from
        object: "{{ item }}"  
        # Use the item from the loop to specify the object to download
        dest: "/home/niran/restore/{{ item | basename }}"  
        # Set the destination for downloaded files
        mode: get  
        # Set the mode to download files
      loop: "{{ s3_objects.s3_keys }}"  
      # Loop through the keys of the S3 objects
      when: s3_objects.s3_keys is defined and s3_objects.s3_keys | length &amp;gt; 0  
      # Only run if there are keys available
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
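&lt;p&gt;The "{{ item | basename }}" filter in the download task strips the backups/ prefix from each S3 key so that only the filename is used locally. Python's os.path.basename does the same thing (the key below is a hypothetical example):&lt;/p&gt;

```python
import os

# An S3 key includes its "folder" prefix; basename keeps only the final component.
key = "backups/myapp_backup_example.tar.gz"
print(os.path.basename(key))  # myapp_backup_example.tar.gz
```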



&lt;p&gt;Before setting up the Cron job, manually run the playbooks to ensure they work as expected:&lt;br&gt;
&lt;strong&gt;Backup Test&lt;/strong&gt;&lt;br&gt;
Run the backup playbook:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ansible-playbook /path/to/backup.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Verify that the compressed file is created and uploaded to S3.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fafs7kjb4g1y8x3gqxv2i.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fafs7kjb4g1y8x3gqxv2i.jpg" alt="Image description" width="800" height="376"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Restore Test&lt;/strong&gt;&lt;br&gt;
Run the restore playbook:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ansible-playbook /path/to/restore.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Verify that the restored files have been downloaded to the restore directory.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvq7qit0oebuctsp3w2go.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvq7qit0oebuctsp3w2go.jpg" alt="Image description" width="721" height="192"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; Ensure that the necessary permissions are set for both the S3 bucket and the local file system to avoid any access issues during the restore process.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Set Up the Cron Job:&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open the Crontab file for editing:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;crontab -e
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Add the Cron job line at the end of the file. For example, to run the backup every day at 11:30 AM:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;30 11 * * * ansible-playbook /path/to/backup.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Save and exit the editor.&lt;/li&gt;
&lt;li&gt;You can verify that your Cron job has been set up correctly by listing the current Cron jobs:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;crontab -l
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; Make sure to replace /path/to/backup.yml with the actual path to your Ansible backup playbook, and ensure that the user running the Cron job has the permissions needed to run Ansible and access the required files and directories.&lt;/p&gt;

&lt;p&gt;By automating backup and restore with Ansible, AWS S3, and Cron jobs, we’ve ensured reliable and consistent management of critical data. This approach simplifies complex tasks and provides an efficient, scalable, and repeatable solution that can be adapted to many use cases. With daily backups scheduled via Cron and restores handled through Ansible playbooks, your data management can be fully automated, saving time and reducing the risk of human error.&lt;/p&gt;

&lt;p&gt;For those looking to implement this solution, you can find the full project repository on GitHub &lt;a href="https://github.com/niranyadav03/ansible-aws-s3-cron-backup-automation.git" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Happy automating!&lt;/p&gt;

</description>
      <category>ansible</category>
      <category>aws</category>
      <category>automation</category>
      <category>backup</category>
    </item>
    <item>
      <title>What Really Happens When You Type "amazon.com" in Your Browser? 🤔</title>
      <dc:creator>Niran</dc:creator>
      <pubDate>Mon, 30 Sep 2024 14:05:26 +0000</pubDate>
      <link>https://dev.to/niranyadav/what-really-happens-when-you-type-amazoncom-in-your-browser-3a3j</link>
      <guid>https://dev.to/niranyadav/what-really-happens-when-you-type-amazoncom-in-your-browser-3a3j</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3xalrh4g8kpfeh39g0em.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3xalrh4g8kpfeh39g0em.jpg" alt="Image description" width="800" height="542"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Ever clicked on a website and wondered what goes on in the background? Take "&lt;strong&gt;amazon.com&lt;/strong&gt;," for instance. The truth is, no matter which website you visit, the magic behind it works in pretty much the same way. Think about it: from locating the right server to loading all the images and content on your screen, there’s a lot happening in the blink of an eye! Let's dive in and explore what really happens, step by step.&lt;/p&gt;

&lt;p&gt;While we're using Amazon as an example, these steps apply to any website you visit!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1&lt;/strong&gt; 🚦 : When you type a URL like "&lt;strong&gt;amazon.com&lt;/strong&gt;" into your browser, the first thing it needs to do is figure out where to send your request. Think of websites as having addresses, much like houses; they don’t just exist by name on the internet. Each website has a unique &lt;strong&gt;IP address&lt;/strong&gt; that points to its location.&lt;/p&gt;

&lt;p&gt;To find this address, your browser reaches out to a &lt;strong&gt;DNS server (Domain Name System)&lt;/strong&gt;, which acts like the internet’s phonebook. It looks up "amazon.com" and finds the corresponding IP address. If your browser can’t find the address stored locally or in your Internet Service Provider's cache, it will ask other DNS servers out there to help resolve the address. This DNS lookup is the crucial first step in loading any webpage, whether it's Amazon or any other site you visit.&lt;/p&gt;
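&lt;p&gt;You can watch this lookup happen yourself. Here's a small sketch using Python's standard library resolver, which consults the same DNS machinery your browser relies on:&lt;/p&gt;

```python
import socket

def resolve(hostname):
    # Ask the system resolver (and, behind it, DNS) for an IPv4 address.
    return socket.gethostbyname(hostname)

# resolve("amazon.com") would return one of Amazon's public IP addresses;
# the answer can vary between lookups, since large sites use many servers.
print(resolve("localhost"))
```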

&lt;p&gt;Now that your browser knows the IP address, it needs to connect to Amazon's server (or any website you’re trying to visit). This is done using something called &lt;strong&gt;TCP (Transmission Control Protocol)&lt;/strong&gt;, which is basically a reliable way for computers to chat with each other.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2&lt;/strong&gt; 🤝 : To establish this connection, your device and Amazon’s server go through a process called a &lt;strong&gt;three-way handshake&lt;/strong&gt;. It’s like a quick greeting where they exchange three signals back and forth to make sure they’re on the same page and ready to communicate. This handshake is really important because it sets up a reliable connection before any actual data starts flowing—no matter what website you’re accessing.&lt;/p&gt;
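&lt;p&gt;In code, the whole handshake hides inside a single connect() call. This self-contained sketch opens a TCP connection to a throwaway local server; the SYN / SYN-ACK / ACK exchange completes before connect() returns:&lt;/p&gt;

```python
import socket

# A tiny local server so the example needs no network access.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))        # port 0 = let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))  # the three-way handshake happens here
conn, addr = server.accept()
print("handshake complete on port", port)

client.close()
conn.close()
server.close()
```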

&lt;p&gt;&lt;strong&gt;Step 3&lt;/strong&gt; 🔐 : Since websites like Amazon use &lt;strong&gt;HTTPS&lt;/strong&gt; (which is the secure version of &lt;strong&gt;HTTP&lt;/strong&gt;), your browser has to set up something called an &lt;strong&gt;SSL/TLS handshake&lt;/strong&gt;. This process makes sure that the connection between your device and the website is &lt;strong&gt;encrypted and safe&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;First, your browser checks Amazon’s security certificate to confirm it’s legitimate and not a scam. Once everything checks out, an encrypted tunnel is created. This means that any data you share—like passwords or credit card information—is protected from anyone trying to intercept it.&lt;/p&gt;
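&lt;p&gt;Python's standard library makes both halves of this visible: a default SSL context loads the trusted certificate authorities and insists on verifying the server's certificate and hostname before any data flows. The wrap_socket call, shown commented out, is what would actually perform the TLS handshake over a live connection:&lt;/p&gt;

```python
import ssl

# A default context verifies the server certificate against trusted CAs.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # certificate verification is on
print(ctx.check_hostname)                    # hostname checking is on

# Over a real connection you would then do:
# secure_sock = ctx.wrap_socket(plain_sock, server_hostname="amazon.com")
```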

&lt;p&gt;These days, most websites use HTTPS, so this secure handshake happens almost every time you visit a site, especially if it’s an online store.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4&lt;/strong&gt; 📩 : Once your connection is all secure, your browser is set to send an &lt;strong&gt;HTTP request&lt;/strong&gt;! This is basically how your browser tells Amazon’s server (or any website’s server) what you want, whether it’s loading the homepage or searching for products.&lt;/p&gt;

&lt;p&gt;The browser sends what’s called a &lt;strong&gt;GET request&lt;/strong&gt;, and it includes some extra details like &lt;strong&gt;cookies&lt;/strong&gt;, &lt;strong&gt;user-agent info&lt;/strong&gt;, and even &lt;strong&gt;your location&lt;/strong&gt;. This helps the server personalize your experience, making it feel just right for you!&lt;/p&gt;

&lt;p&gt;Think of it like placing an order for a webpage—you ask for what you want, and the server does its thing and sends it right back to you!&lt;/p&gt;
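&lt;p&gt;Stripped of the transport details, the "order" your browser places is just text. Here is a sketch of a bare-bones GET request; the header values are illustrative, not what a real browser would actually send:&lt;/p&gt;

```python
# The first line names the method, path, and protocol version;
# the headers that follow carry cookies, user-agent info, and more.
request = (
    "GET / HTTP/1.1\r\n"
    "Host: amazon.com\r\n"
    "User-Agent: ExampleBrowser/1.0\r\n"
    "Cookie: session-id=example\r\n"
    "\r\n"  # a blank line ends the header section
)
print(request.splitlines()[0])
```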

&lt;p&gt;&lt;strong&gt;Step 5&lt;/strong&gt; 💻 : When the server gets your request, it jumps into action to process it! For instance, Amazon’s servers gather everything they need to deliver the page you asked for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;HTML&lt;/strong&gt;: This is the backbone, the basic structure of the page.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;CSS&lt;/strong&gt;: Here’s where the styling comes in—think colors, fonts, and layouts that make everything look good.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;JavaScript&lt;/strong&gt;: This is what brings the page to life with interactive features like drop-down menus and search bars.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Images and multimedia&lt;/strong&gt;: And let’s not forget about the visuals—product photos, videos, and all the other media that make the site engaging!&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This whole process happens for any website you visit, whether it’s Amazon, Google, or any other place on the web!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 6&lt;/strong&gt; 📦 : Once the server has processed your request, it sends an &lt;strong&gt;HTTP response&lt;/strong&gt; back to your browser, packed with all the data needed to display the page you asked for!&lt;/p&gt;

&lt;p&gt;This response includes everything: the HTML for structure, CSS for styling, JavaScript for interactivity, and any media files like images and videos.&lt;/p&gt;

&lt;p&gt;Then, your browser gets to work assembling the page. It starts by interpreting the HTML and applying the CSS to make everything look nice. This whole process is pretty much the same for every website you visit!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 7&lt;/strong&gt; 🏗️ : Now it’s time for your browser to really bring the webpage to life! It does this by building the &lt;strong&gt;DOM (Document Object Model)&lt;/strong&gt;, which organizes all that HTML into a structured format.&lt;/p&gt;

&lt;p&gt;Next, the browser applies the CSS to make everything look visually appealing—this is where the colors, fonts, and layouts come into play! It also runs JavaScript to add those cool interactive features, like search functions, animations, and personalized product suggestions that make your browsing experience so much better.&lt;/p&gt;

&lt;p&gt;While all this is happening, your browser might send out a few more requests to load images, fonts, or other external scripts. Whether you’re on Amazon or any other website, this rendering step is where everything really comes together and you see the final product!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 8&lt;/strong&gt; 🌐 : Once the rendering is all done, your browser shows you the fully-loaded webpage! Now you can dive into Amazon’s homepage, search for products, or click through different categories—everything you need is right at your fingertips!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 9&lt;/strong&gt; ⚡️ : Your browser usually keeps some data, like images and scripts, saved in its &lt;strong&gt;cache&lt;/strong&gt;. This means that the next time you visit amazon.com (or any other site), it can load certain resources much faster!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Final Thoughts&lt;/strong&gt; ✨ : Next time you hop onto a website—whether it’s Amazon, Google, or anywhere else—you’ll have a better idea of the complex yet super-fast process that makes it all happen. In just a matter of milliseconds, your browser handles all the heavy lifting, from locating the website’s server to showing you the fully rendered page. Pretty awesome, right? 🔥&lt;/p&gt;


&lt;p&gt;Happy browsing!🚀💖&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>networking</category>
      <category>beginners</category>
      <category>discuss</category>
    </item>
    <item>
      <title>Unlocking Serverless: Build Your First Python AWS Lambda Function</title>
      <dc:creator>Niran</dc:creator>
      <pubDate>Fri, 20 Sep 2024 13:32:10 +0000</pubDate>
      <link>https://dev.to/niranyadav/unlocking-serverless-build-your-first-python-aws-lambda-function-3ga4</link>
      <guid>https://dev.to/niranyadav/unlocking-serverless-build-your-first-python-aws-lambda-function-3ga4</guid>
      <description>&lt;p&gt;&lt;strong&gt;Setting Up Your AWS Lambda Function 🚀&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Let’s build something awesome!&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Log in to AWS Management Console.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Just type &lt;strong&gt;“Lambda”&lt;/strong&gt; in the AWS search bar and click on the Lambda service.&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqisbsusqfbsgthkmmov0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqisbsusqfbsgthkmmov0.png" alt="Image description" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the AWS Lambda dashboard, click &lt;strong&gt;“Create a function”&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Choose the &lt;strong&gt;“Author from scratch”&lt;/strong&gt; option.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5p8h2slipqp3u6u1u00v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5p8h2slipqp3u6u1u00v.png" alt="Image description" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why Choose “Author from Scratch” for Your AWS Lambda Function? 🤔&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Build it Your Way!:&lt;/strong&gt; Authoring from scratch gives you complete control over your Lambda function’s code, settings, and environment. Customize everything to fit your exact needs, from the code itself to how it works with other AWS services and handles different inputs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pick Your Language:&lt;/strong&gt; When you author from scratch, you can choose the runtime that best suits your programming language preference, such as Python, Node.js, or Java. This gives you the power to write code exactly how you like it!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Full Control Over Permissions:&lt;/strong&gt; Starting from scratch allows you to define or attach IAM roles and policies according to the function’s needs. AWS provides basic Lambda execution roles by default, but you can customize them for more specific permissions as your function evolves.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;No Unnecessary Overhead:&lt;/strong&gt; Unlike pre-built templates or blueprints, authoring from scratch ensures that no unnecessary code, libraries, or configurations are included in your function. This helps keep the function lightweight and optimized for your specific task.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Deep Dive into Serverless:&lt;/strong&gt; Authoring from scratch is an excellent way to deepen your understanding of AWS Lambda and serverless architecture. By setting up everything manually, you gain insights into how different components interact and how to optimize them for performance and cost.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Give your function a &lt;strong&gt;name&lt;/strong&gt; (e.g., ImageMetadataExtractor, as this is the function we are creating).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Python 3.9&lt;/strong&gt; (or the latest supported version) as the runtime.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select the architecture for your Lambda function. Choose &lt;strong&gt;x86_64&lt;/strong&gt; (widely compatible with existing libraries and binaries).&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fskefsqm0mo97ztbkfuvg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fskefsqm0mo97ztbkfuvg.png" alt="Image description" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Locate the Execution Role and choose &lt;strong&gt;“Create a new role with basic Lambda permissions”&lt;/strong&gt; option.&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpdj7y3wq0jdp1g9mk330.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpdj7y3wq0jdp1g9mk330.png" alt="Image description" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Why Choose “Create a new role with basic Lambda permissions”? 🤔&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Choosing the &lt;strong&gt;“Create a new role with basic Lambda permissions”&lt;/strong&gt; option simplifies the setup process for your AWS Lambda function by automatically granting the essential permissions needed for effective operation. This option is ideal for beginners or for those who need a quick setup without complex permission configurations. It allows your Lambda function to log execution results to Amazon CloudWatch, facilitating easy monitoring and troubleshooting. This approach minimizes overhead and lets developers focus on coding, with the flexibility to update permissions later as the application’s requirements evolve.&lt;/p&gt;
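&lt;p&gt;For reference, the role the console creates attaches the AWSLambdaBasicExecutionRole managed policy, which grants only these CloudWatch Logs permissions (the console-generated role typically scopes Resource down to your function’s own log group rather than "*"):&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "*"
    }
  ]
}
```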

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Advanced Settings: Optional Tweaks for Your Function&lt;/strong&gt;&lt;br&gt;
You’ll see the &lt;strong&gt;Advanced settings&lt;/strong&gt; option when creating your Lambda function. These settings provide valuable features, but none are mandatory for basic operation; enable them based on your specific needs:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Enable Code Signing:&lt;/strong&gt; Optional, but recommended for enhancing security and code integrity.&lt;br&gt;
&lt;strong&gt;Enable Function URL:&lt;/strong&gt; Useful if you want to expose your function as an HTTP endpoint, but not required if external access is unnecessary.&lt;br&gt;
&lt;strong&gt;Enable Tags:&lt;/strong&gt; While not required, using tags is a best practice for resource management and cost tracking.&lt;br&gt;
&lt;strong&gt;Enable VPC:&lt;/strong&gt; Necessary only if your function needs to access resources within a VPC; otherwise, it can be skipped.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fko8mag179yquyvx6qk8s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fko8mag179yquyvx6qk8s.png" alt="Image description" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click &lt;strong&gt;“Create Function”&lt;/strong&gt;. AWS will create the Lambda function and take you to the function’s configuration page.
&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F673zrj4ohv5w1fgwx9cb.png" alt="Image description" width="" height=""&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Time to Code! 💻&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Writing the Python Code&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Scroll Down to the Code Source Section:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You’ll see a basic code editor where you can modify the Lambda function code.&lt;/p&gt;

&lt;p&gt;AWS Lambda doesn’t come pre-equipped with every library you might need. It’s like building a house without a toolbox: you have to bring your own tools. For the function we’re creating, that means bundling the Pillow library with your deployment package.&lt;/p&gt;

&lt;p&gt;On your local computer, create a new folder to hold your Lambda function code and dependencies. This will be your project directory. For example, you can create a folder named “lambda_function”.&lt;/p&gt;

&lt;p&gt;Open your terminal or command prompt and navigate to your project directory, then install the Pillow library into it with the command below. Because Pillow ships compiled binaries, build the package on Linux, or pass pip’s --platform manylinux2014_x86_64 and --only-binary=:all: options, so the bundled wheel matches the Lambda runtime:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install Pillow -t .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create a new Python file named lambda_function.py in your project directory. Then, in the Lambda function editor, replace the default code with the following metadata extraction function (lambda_function.py):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from PIL import Image
import json
import io
import base64

def lambda_handler(event, context):
    try:
        # Decode the base64 image data from the event
        image_data = base64.b64decode(event['image_data'])
        image = Image.open(io.BytesIO(image_data))

        # Extract image metadata
        metadata = {
            "format": image.format,
            "size": image.size,
            "mode": image.mode,
            "info": image.info
        }

        return {
            'statusCode': 200,
            'body': json.dumps(metadata)
        }
    except Exception as e:
        return {
            'statusCode': 500,
            'body': str(e)
        }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This function decodes the base64-encoded image data from the event, opens it with Pillow, extracts metadata such as the format, size, and mode, and returns the result as a JSON response.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff2efwvm20rs128d6qt65.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff2efwvm20rs128d6qt65.png" alt="Image description" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After editing the code, click the Deploy button to save and deploy the function.&lt;/p&gt;

&lt;p&gt;Ensure your working directory has these essential files and folders:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PIL/
lambda_function.py
pillow-10.4.0.dist-info/
pillow.libs/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Navigate to the directory containing these files (the Pillow version in the directory names may differ on your machine). Run the following command to create a deployment package:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;zip -r9 lambda_function.zip PIL lambda_function.py pillow-10.4.0.dist-info pillow.libs

(This will create a lambda_function.zip file that includes all necessary files and directories.)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the Code source section, open the &lt;strong&gt;Upload from&lt;/strong&gt; menu and choose &lt;strong&gt;.zip file&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Browse to and select the lambda_function.zip file you created earlier.&lt;/p&gt;

&lt;p&gt;Finally, click Save to update your Lambda function with the new deployment package.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv5ooff0jshbzsy32yux0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv5ooff0jshbzsy32yux0.png" alt="Image description" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configuring and Testing a Lambda Function 🚀&lt;/strong&gt;&lt;br&gt;
Let’s get your Lambda function up and running!&lt;/p&gt;

&lt;p&gt;Verify the Runtime Settings.&lt;/p&gt;

&lt;p&gt;Check the Handler setting. It should be set to:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;lambda_function.lambda_handler
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If it’s not set correctly, click &lt;strong&gt;Edit&lt;/strong&gt; (if available), enter lambda_function.lambda_handler, and then click &lt;strong&gt;Save&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9j9n5t2q5e5jgrnd29b5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9j9n5t2q5e5jgrnd29b5.png" alt="Image description" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click the Test button in the function editor.&lt;/p&gt;

&lt;p&gt;Create a new test event with a name (e.g., TestImageMetadata).&lt;/p&gt;

&lt;p&gt;Select Private to keep the test event available only to you.&lt;/p&gt;

&lt;p&gt;Use this sample JSON data to simulate an image, replacing the placeholder with a real base64 string:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "image_data": "base64-encoded-image-string"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
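&lt;p&gt;The placeholder above needs a real base64 string. You can generate one locally with a short standard-library script (the embedded 1x1 PNG is just an illustration; in practice you would read your own image file):&lt;/p&gt;

```python
import base64
import json

# A 1x1 transparent PNG embedded as base64 so this example is self-contained.
# In practice, read your own file instead: open("photo.png", "rb").read()
tiny_png = base64.b64decode(
    "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJ"
    "AAAADUlEQVR42mP8z8BQDwAEhQGAhKmMIQAAAABJRU5ErkJggg=="
)

# Build the test event in exactly the shape lambda_handler expects
event = {"image_data": base64.b64encode(tiny_png).decode("utf-8")}
print(json.dumps(event))
```

&lt;p&gt;Paste the printed JSON directly into the test event editor.&lt;/p&gt;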



&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxv98v2pisjzhn7xbefqm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxv98v2pisjzhn7xbefqm.png" alt="Image description" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click Save changes to save the test event.&lt;/p&gt;

&lt;p&gt;Click Test to execute the Lambda function with the test event.&lt;/p&gt;

&lt;p&gt;Review the execution results.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwwh213so8nh78sdpmssh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwwh213so8nh78sdpmssh.png" alt="Image description" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Manage Environment Variables ⚙️&lt;/strong&gt;&lt;br&gt;
Scroll down to the Environment variables section on the Lambda function configuration page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsde6o82ab1y0jefh368w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsde6o82ab1y0jefh368w.png" alt="Image description" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click on Edit to modify the environment variables.&lt;/p&gt;

&lt;p&gt;Add a new environment variable:&lt;/p&gt;

&lt;p&gt;Key: DEFAULT_MODE&lt;br&gt;
Value: RGB&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxdejfdgvblzx7jupo3xq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxdejfdgvblzx7jupo3xq.png" alt="Image description" width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click Save to apply the changes.&lt;/p&gt;

&lt;p&gt;Modify your lambda_function.py code to use the environment variable for the default mode if not specified.&lt;/p&gt;

&lt;p&gt;Ensure your code includes error handling with a try-except block:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import os
from PIL import Image
import json
import io
import base64

def lambda_handler(event, context):
    try:
        # Decode the base64 image data from the event
        image_data = base64.b64decode(event['image_data'])
        image = Image.open(io.BytesIO(image_data))

        # Use environment variable for default mode if not specified
        mode = os.getenv('DEFAULT_MODE', image.mode)

        # Extract image metadata
        metadata = {
            "format": image.format,
            "size": image.size,
            "mode": mode,
            "info": image.info
        }

        return {
            'statusCode': 200,
            'body': json.dumps(metadata)
        }
    except Exception as e:
        return {
            'statusCode': 500,
            'body': str(e)
        }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Because os.getenv gives the environment variable priority, this code reports the value of DEFAULT_MODE whenever the variable is set and falls back to the image’s detected mode only when it is unset.&lt;/p&gt;
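&lt;p&gt;Note the precedence in os.getenv(name, default): it returns the environment variable whenever it is set and only falls back to the second argument (here, the image’s detected mode) when it is not. A quick standard-library check:&lt;/p&gt;

```python
import os

# Simulate the environment variable configured in the Lambda console
os.environ["DEFAULT_MODE"] = "RGB"

# When the variable is set, os.getenv returns it and ignores the fallback
assert os.getenv("DEFAULT_MODE", "L") == "RGB"

# When it is unset, the fallback (the image's detected mode) is returned
del os.environ["DEFAULT_MODE"]
assert os.getenv("DEFAULT_MODE", "L") == "L"
```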

&lt;p&gt;Go to the Test tab of the Lambda function.&lt;/p&gt;

&lt;p&gt;Click Test to invoke your Lambda function with the new code.&lt;/p&gt;

&lt;p&gt;Ensure that the test event is set up correctly and reflects the changes made.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;View Logs in CloudWatch 👀&lt;/strong&gt;&lt;br&gt;
Click on the &lt;strong&gt;Monitor&lt;/strong&gt; tab. Click on &lt;strong&gt;View logs in CloudWatch&lt;/strong&gt; to see the execution logs. Review the logs for any errors or successful executions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; I’ve used an image metadata extractor as the example, but you can easily replace it with any Python code you need. Whether you’re building AI models, automating tasks, or creating custom tools, this setup with AWS Lambda and S3 is incredibly flexible.&lt;/p&gt;

&lt;p&gt;Enjoy the process! 😊&lt;/p&gt;

</description>
      <category>awslambda</category>
      <category>serverless</category>
      <category>python</category>
      <category>s3</category>
    </item>
    <item>
      <title>Easy Web App Deployment: Python Flask, MongoDB, and Nginx with Docker Compose 🚀🐍</title>
      <dc:creator>Niran</dc:creator>
      <pubDate>Sun, 08 Sep 2024 05:29:57 +0000</pubDate>
      <link>https://dev.to/niranyadav/easy-web-app-deployment-python-flask-mongodb-and-nginx-with-docker-compose-20oi</link>
      <guid>https://dev.to/niranyadav/easy-web-app-deployment-python-flask-mongodb-and-nginx-with-docker-compose-20oi</guid>
      <description>&lt;p&gt;Welcome to this blog!💻🚀 The primary goal here isn’t to delve deeply into the code but to demonstrate how to efficiently use &lt;strong&gt;Docker Compose🐳&lt;/strong&gt; to deploy a web application, including the deployment process, using &lt;strong&gt;Flask🐍&lt;/strong&gt;, &lt;strong&gt;MongoDB🗄️&lt;/strong&gt;, and &lt;strong&gt;Nginx🌐&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkaall0p1pv0bx9umsoyj.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkaall0p1pv0bx9umsoyj.jpg" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why use Docker Compose?🐳&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Docker Compose is a game-changer for managing complex Docker applications with multiple containers🛠️. Instead of juggling multiple Docker commands, you can define and configure all your services in a single &lt;strong&gt;docker-compose.yml&lt;/strong&gt; file📝. This makes it easy to create, start, stop, and scale your entire application with just one command💥.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Takeaways from this Blog📝&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Simplified management&lt;/strong&gt;: Docker Compose makes it easy to manage multiple containers as a single unit⚙️&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Consistent environments&lt;/strong&gt;: Ensure your application runs consistently across different environments📦&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Efficient deployment&lt;/strong&gt;: Deploy your full-stack application with a few simple commands🚀&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Isolated components&lt;/strong&gt;: Docker containers provide isolation, preventing conflicts between services🔒&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Declarative configuration&lt;/strong&gt;: Define your application’s desired state in a YAML file📝&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalability&lt;/strong&gt;: Easily scale up or down individual services based on demand📈&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites🛠️&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before you start, make sure you have &lt;strong&gt;Docker&lt;/strong&gt; and &lt;strong&gt;Docker Compose&lt;/strong&gt; installed on your system. These tools are essential for building and running the containers for your application.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/engine/install/" rel="noopener noreferrer"&gt;Install Docker&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/compose/install/linux/#install-the-plugin-manually" rel="noopener noreferrer"&gt;Install Docker Compose&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Dive In!🚀&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The application is all set up and ready to go. Just clone the repository, build the Docker images, and start the containers. You’re minutes away from getting everything up and running with Docker Compose!⚙️🐳&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Clone the repository to your machine⬇️:&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

git clone https://github.com/niranyadav03/dockerized-python-application.git


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Project Structure 📂:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The project structure should look like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

dockerized-python-application/
├── app
│   ├── app.py
│   ├── Dockerfile
│   ├── requirements.txt
│   └── templates
│       └── index.html
├── docker-compose.yml
└── nginx
    ├── default.conf
    └── Dockerfile


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
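&lt;p&gt;The repository’s docker-compose.yml is what ties these three services together. As a rough sketch of how such a file is typically wired (the service names, ports, and MONGO_URI variable below are illustrative, not taken from the repo):&lt;/p&gt;

```yaml
version: "3.8"

services:
  app:                    # Flask application (illustrative service name)
    build: ./app
    environment:
      - MONGO_URI=mongodb://mongo:27017/todos   # hypothetical variable name
    depends_on:
      - mongo

  mongo:                  # MongoDB, reachable from other services as "mongo"
    image: mongo:7
    volumes:
      - mongo-data:/data/db

  nginx:                  # Nginx reverse proxy in front of Flask
    build: ./nginx
    ports:
      - "80:80"           # http://localhost maps here
    depends_on:
      - app

volumes:
  mongo-data:
```

&lt;p&gt;With a layout like this, containers reach MongoDB by its service name (mongo), and only Nginx publishes a port to the host.&lt;/p&gt;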

&lt;p&gt;&lt;strong&gt;Navigate to the Project Directory📂:&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

cd dockerized-python-application


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Build and Run the Containers🚢:&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

docker-compose up --build


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Note⚠️:&lt;/strong&gt; Make sure to run the build commands from the &lt;strong&gt;dockerized-python-application&lt;/strong&gt; directory📂. This is important because the Docker Compose file (&lt;strong&gt;docker-compose.yml&lt;/strong&gt;) and other essential files are located there🔍. Running the build from this directory ensures that Docker Compose can access everything it needs to set up the services properly🛠️.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Access the Application🌐:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once the containers are up and running, you can access the application.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Web UI🌐&lt;/strong&gt;: Open &lt;a href="http://localhost" rel="noopener noreferrer"&gt;http://localhost&lt;/a&gt; in your browser to view the To-Do List app.&lt;/p&gt;

&lt;p&gt;After using the To-Do List application through the UI, check the MongoDB database to confirm that your data is being stored correctly😊&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://youtu.be/DM65_JyGxCo?si=zzcqPhaq2K0q92Bg" rel="noopener noreferrer"&gt;Understand Docker Compose🐳&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Enjoy exploring your new setup with Docker Compose! 🚀&lt;/p&gt;

&lt;p&gt;Happy Deploying! 🎉&lt;/p&gt;

</description>
      <category>docker</category>
      <category>python</category>
      <category>nginx</category>
      <category>mongodb</category>
    </item>
  </channel>
</rss>
