<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Osama Shamout</title>
    <description>The latest articles on DEV Community by Osama Shamout (@oshamout).</description>
    <link>https://dev.to/oshamout</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1199129%2F2c76fdb3-cd8e-44c8-9571-f834800f49ad.jpg</url>
      <title>DEV Community: Osama Shamout</title>
      <link>https://dev.to/oshamout</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/oshamout"/>
    <language>en</language>
    <item>
      <title>Creating Interactive Globe Visualizations with Globe.gl: A Step-by-Step Guide using Amazon S3 and CloudFront</title>
      <dc:creator>Osama Shamout</dc:creator>
      <pubDate>Wed, 20 Mar 2024 09:23:06 +0000</pubDate>
      <link>https://dev.to/oshamout/creating-interactive-globe-visualizations-with-a-step-by-step-guide-using-amazon-s3-and-cloudfront-26g2</link>
      <guid>https://dev.to/oshamout/creating-interactive-globe-visualizations-with-a-step-by-step-guide-using-amazon-s3-and-cloudfront-26g2</guid>
      <description>&lt;p&gt;In today's data-driven landscape, effective visualization plays a crucial role in making complex data accessible and informative. In this blog post, we will explore how to harness the power of &lt;a href="https://Globe.gl" rel="noopener noreferrer"&gt;Globe.gl&lt;/a&gt; , coupled with some fundamental Three.js knowledge, to transform raw AWS User Groups data into an engaging and interactive globe-spanning visualization. we'll explore how AWS services can significantly expedite the deployment of our global visualization, making it accessible to audiences worldwide within minutes. Throughout this blog post, our goal is to provide you with a clear, step-by-step guide to help you effortlessly navigate this process. Below is a quick outline:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Unveiling the Data: Understanding AWS User Groups:&lt;/strong&gt;&lt;br&gt;
We start by exploring the AWS User Groups dataset. This dataset represents a global community of AWS cloud enthusiasts. Understanding its format is foundational to our visualization endeavor. We will also modify the JSON file to include latitude, longitude, description, and images for each User Group (where available) using Python and Google Maps API.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Setting the Stage: Preparing for the Journey&lt;/strong&gt;&lt;br&gt;
In this section, we prepare our workspace by integrating essential tools such as Vite, &lt;a href="http://Globe.gl" rel="noopener noreferrer"&gt;Globe.gl&lt;/a&gt;, and Three.js. These foundational elements empower us to craft a visually compelling and technically robust globe visualization.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Building the Globe: A Comprehensive Guide&lt;/strong&gt;&lt;br&gt;
The core of this blog post, this section includes a step-by-step construction of our globe visualization. Starting from the basic HTML and CSS components and culminating in the JavaScript functions, we will build an interactive globe that showcases the global presence of AWS User Groups.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Going Global in Minutes: Leveraging AWS Services&lt;/strong&gt;&lt;br&gt;
This section showcases the strategic utilization of AWS services. With Amazon S3, Lightsail, ACM, and CloudFront at our disposal, we expedite the deployment of our globe visualization to a global audience. Furthermore, we explore the integration of this visualization as a subdomain within an existing website, ensuring accessibility to a worldwide audience.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Without further ado, let’s get started!&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 1 - Unveiling the Data: Understanding AWS User Groups
&lt;/h2&gt;

&lt;p&gt;AWS User Groups stand as vibrant local communities, bringing together AWS enthusiasts and developers from around the globe. These groups hold a wealth of information, organized within a JSON file, providing a comprehensive repository of AWS User Group details worldwide. Within this JSON file, each entry boasts essential information, including ugName, ugLocation, ugURL, and ugCategory, organized to represent the essence of each User Group. Let's take a closer look at the structure with a couple of examples:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[
    {
        "ugName": " AWS  User Group Timisoara",

        "ugLocation": "Timisoara, Romania",

        "ugURL": "https://www.meetup.com/aws-timisoara ",

        "ugCategory": "General AWS User Groups"
    },
    {
        "ugName": "AWS Abidjan User Group",

        "ugLocation": "Abidjan, Ivory Coast",

        "ugURL": "https://www.meetup.com/groupe-meetup-abidjan-amazon-web-services/ ",

        "ugCategory": "General AWS User Groups"
    },
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Yet, we need to go a step further by adding more information such as the latitude and longitude information for each User Group. This data is crucial for plotting locations on our globe visualization. To achieve this, we've created a Python function that extracts, processes, and enhances the dataset.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Imports &amp;amp; Initialization&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The journey begins by ensuring that User Group names are clean and uniform. We remove any unnecessary white spaces, ensuring consistency in our data. Do not forget to import the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import requests
from bs4 import BeautifulSoup
import json
import os
import re
from geopy.geocoders import GoogleV3
import time

  extracted_detailed_data = []
    for item in extracted_data:
      url = item['ugURL']
      ugName = item['ugName']
      ugLocation = item['ugLocation']
      ugCategory = item['ugCategory']

      #Clean the UG Name
      ugName = re.sub("^\s+", "", ugName)
      ugName = re.sub("\s{2,}", " ", ugName)
      ugNameClean = ugName.replace(" ", "_")
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
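&lt;p&gt;To see what the cleaning step does in isolation, here is a small, self-contained sketch (the function name &lt;code&gt;clean_ug_name&lt;/code&gt; is ours, not part of the original script):&lt;/p&gt;

```python
import re

def clean_ug_name(name: str) -> str:
    """Strip leading whitespace and collapse repeated spaces (hypothetical helper)."""
    name = re.sub(r"^\s+", "", name)      # drop leading whitespace
    name = re.sub(r"\s{2,}", " ", name)   # collapse runs of two or more spaces
    return name

print(clean_ug_name(" AWS  User Group Timisoara"))  # AWS User Group Timisoara
```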



&lt;p&gt;&lt;strong&gt;Geocode User Group Locations&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Next, we initialize variables for latitude and longitude, initially set to "NA" (Not Available). These variables will soon be populated with geographical coordinates. To determine the latitude and longitude of each User Group's location, we leverage the power of geocoding. Using the Google Maps Geocoding API, we convert User Group locations into precise coordinates.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Initialize Latitude and Longitude Variables
      ugLatitude = "NA"
      ugLongitude = "NA"

      api_key = "YOUR_GOOGLE_API_KEY_HERE"
      geolocator = GoogleV3(api_key=api_key)
      location = geolocator.geocode(ugLocation)
      if location:
      ugLatitude = location.latitude
      ugLongitude = location.longitude
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Note: you can obtain Google API keys for free within a certain usage limit. Other geocoding libraries are also available, which you can explore.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Retrieve Additional Information&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Our function doesn't stop at geographical data: we also scrape additional details from the User Group's Meetup page to retrieve:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Member Count: The number of members in the User Group.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Description: A brief description of the User Group.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Since the majority of AWS User Groups are hosted on Meetup.com, our function is finely tuned to collect information from this platform. For reliable data retrieval, we identified "member-count-link" as the unique identifier within the HTML structure of the requested webpage (you can refer to the BeautifulSoup documentation if this concept is new to you). &lt;/p&gt;

&lt;p&gt;To ensure robust data retrieval and exception handling, our function is designed to handle unexpected obstacles that may arise during data extraction. In such situations, we gracefully fill in these details with "NA" (Not Available), ensuring a continuous and uninterrupted data enrichment process. This comprehensive approach guarantees that you receive all the available data and can interpret it effectively, even when faced with challenges presented by the webpage.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;        try:
            response = requests.get(url)
            if response.status_code == 200:
                time.sleep(1)
                soup = BeautifulSoup(response.text, "html.parser")

                #Get UG Members Number
                try:
                    member_count_element = soup.find('a', id='member-count-link')
                    member_count_text = member_count_element.get_text(strip=True)
                    ugMembersNumber = member_count_text.split()[0]
                except:
                    ugMembersNumber = "NA"

                #Get UG Description
                try:
                    description_element = soup.find('p', class_='mb-4')
                    ugDescription = description_element.get_text(strip=True)
                except:
                    ugDescription = "NA"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Retrieve User Group Images and Save Them&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Visual aesthetics play a pivotal role in our project, and these images may find valuable applications in our future expansions. Similar to our previous data extraction processes, our function adeptly retrieves these images from meetup.com and handles exceptions gracefully. After successfully locating and preparing the image for download, we establish a dedicated directory named "UGImages" and organize the images within it. To maintain consistency and clarity, we utilize the cleaned User Group name and the appropriate file extension as the new name for each image.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  # Find the image element by its class attribute with error handling
                try:
                    img = soup.find("img", class_="md:rounded-lg object-cover object-center aspect-video w-full")
                    #Extract the image source URL
                    img_src = img.get("src")

                    #Check if the image source URL is valid
                    if img_src and img_src.startswith("http"):
                        #Extract the file extension from the URL
                        file_extension = os.path.splitext(img_src)[1]

                        #Construct the filename using ugNameClean
                        filename = f"{ugNameClean}{file_extension}"

                        #Download the image
                        img_response = requests.get(img_src)

                        #Define the folder name
                        folder_name = "UGImages"

                        #Make sure the folder exists, create it if not
                        if not os.path.exists(folder_name):
                            os.makedirs(folder_name)

                        #Check if the image download was successful
                        if img_response.status_code == 200:
                            # Specify the full path to the file
                            file_path = os.path.join(folder_name, filename)

                            # Open and write the image content to the specified file
                            with open(file_path, "wb") as f:
                                f.write(img_response.content)

                        #Set ugImage to ugNameClean when an image is available
                        ugImage = ugNameClean
                    else:
                        # If no image found, set img_src and ugImage to "NA"
                        img_src = "NA"
                        ugImage = "NA"
                except:
                    #If an exception occurs while finding the image, set img_src and ugImage to "NA"
                    img_src = "NA"
                    ugImage = "NA"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
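&lt;p&gt;One subtlety in the snippet above: applying &lt;code&gt;os.path.splitext&lt;/code&gt; directly to a URL picks up any query string along with the extension. A small sketch (with hypothetical URLs) that parses the path first:&lt;/p&gt;

```python
import os
from urllib.parse import urlparse

def extension_from_url(img_src: str) -> str:
    # Parse the URL first so query strings ("?w=800") don't end up in the extension
    path = urlparse(img_src).path
    return os.path.splitext(path)[1]

print(extension_from_url("https://example.com/photos/group.jpeg"))        # .jpeg
print(extension_from_url("https://example.com/photos/group.jpeg?w=800"))  # .jpeg
```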



&lt;p&gt;&lt;strong&gt;Data Structuring and Function Return&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Finally, the extracted data is structured into a dictionary for each User Group, containing essential information like name, location (with coordinates), URL, category, member count, description, and an image (if available). These structured data entries are then appended to a list, creating a comprehensive dataset. Once everything is set, the function returns this enriched list, which we will write out as a new JSON file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  #Store extracted data in a dictionary
            data = {
                'ugName': ugName,
                'ugLocation': {
                    'ugLocation': ugLocation,
                    'ugLatitude': ugLatitude,
                    'ugLongitude': ugLongitude
                },
                'ugURL': url,
                'ugCategory': ugCategory,
                'ugMembersNumber': ugMembersNumber,
                'ugDescription': ugDescription,
                'ugImage': ugImage
            }
            extracted_detailed_data.append(data)
        except:
            data = {
                'ugName': ugName,
                'ugLocation': {
                    'ugLocation': ugLocation,
                    'ugLatitude': ugLatitude,
                    'ugLongitude': ugLongitude
                },
                'ugURL': url,
                'ugCategory': ugCategory,
                'ugMembersNumber': "NA",
                'ugDescription': "NA",
                'ugImage': "NA"
            }
            extracted_detailed_data.append(data)

    return extracted_detailed_data
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;File Creation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once the dataset is ready, we store it in a JSON file named 'detail_UGInfo.json,' ensuring that the data is well-preserved for our visualization project.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;detail_UGInfo = extract_UGInfo(extracted_data)


if detail_UGInfo:
    with open('detail_UGInfo.json', 'w', encoding='utf-8') as json_file:
        json.dump(detail_UGInfo, json_file, ensure_ascii=False, indent=4)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
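&lt;p&gt;Since geocoding can fail, some entries may carry "NA" coordinates that cannot be plotted on the globe. Before writing the file, you may want to filter those out; below is a minimal sketch (the helper &lt;code&gt;drop_unlocated&lt;/code&gt; is our addition, not part of the original script):&lt;/p&gt;

```python
def drop_unlocated(groups):
    """Keep only User Groups whose geocoding produced real coordinates."""
    return [
        g for g in groups
        if g["ugLocation"]["ugLatitude"] != "NA"
        and g["ugLocation"]["ugLongitude"] != "NA"
    ]

# Two sample entries following the structure built above (coordinates illustrative)
sample = [
    {"ugName": "AWS Abidjan User Group",
     "ugLocation": {"ugLocation": "Abidjan, Ivory Coast",
                    "ugLatitude": 5.3599517, "ugLongitude": -4.0082563}},
    {"ugName": "Some Unlocated UG",
     "ugLocation": {"ugLocation": "NA", "ugLatitude": "NA", "ugLongitude": "NA"}},
]
print(len(drop_unlocated(sample)))  # 1
```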



&lt;h2&gt;
  
  
  Step 2 - Setting the Stage: Preparing for the Journey
&lt;/h2&gt;

&lt;p&gt;With our data ready, it's time to lay the groundwork for our project. Begin by ensuring you have the necessary tools at your disposal: npm and npx. These tools will facilitate the installation and execution of Vite, our chosen local development tool for this venture.&lt;/p&gt;

&lt;p&gt;To get started, open your terminal and initiate the following steps:&lt;/p&gt;

&lt;p&gt;Create a directory named "AWSUserGroups-Globe."&lt;/p&gt;

&lt;p&gt;Within this newly created directory, generate three subfolders: "css" with a “style.css” file, "js" with a “script.js” file, and "images" with your choice of a 2D earth map and sky background picture*. &lt;/p&gt;

&lt;p&gt;Crucially, craft an "index.html" file.&lt;/p&gt;

&lt;p&gt;Equally important, place the "detail_UGInfo.json" file—created earlier—within the same directory as the "index.html" file.&lt;/p&gt;

&lt;p&gt;The structure for your folder should look something like this:&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;AWSUserGroups-Globe
├── css
│   └── style.css
├── js
│   └── script.js
├── images
│   ├── earth-night.jpg (ex)
│   └── sky.jpg (ex)
├── index.html
└── detail_UGInfo.json
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;*Note that we will include the Digico Solutions branding and logos within our project; please make sure you modify your code accordingly.&lt;/p&gt;
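&lt;p&gt;If you prefer to script the scaffold rather than create it by hand, the steps above can be sketched in Python (folder and file names follow the structure shown; the script itself is our addition):&lt;/p&gt;

```python
import os

def scaffold(root="AWSUserGroups-Globe"):
    """Create the project tree described above: css/, js/, images/, index.html."""
    for sub in ("css", "js", "images"):
        os.makedirs(os.path.join(root, sub), exist_ok=True)
    for rel in ("css/style.css", "js/script.js", "index.html"):
        # Create empty placeholder files to fill in later
        open(os.path.join(root, rel), "a").close()

scaffold()
```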

&lt;p&gt;To get started, you have two options for acquiring the necessary libraries. The first involves importing "Globe.GL" with the command &lt;code&gt;import Globe from 'globe.gl';&lt;/code&gt; and "three.js" with &lt;code&gt;import * as THREE from 'three';&lt;/code&gt;, and subsequently downloading their package dependencies. Alternatively, you can opt for the second, which requests these libraries directly within your HTML using script tags. For our tutorial, we will adopt the latter approach.&lt;/p&gt;

&lt;p&gt;Here's the desired folder structure to ensure your project is properly organized:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmhfqe86u3eelphrgdd07.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmhfqe86u3eelphrgdd07.png" alt="Image description" width="800" height="518"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Importing Three.js is optional when you don't require additional components to customize your Globe.GL visualization.&lt;/em&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Step 3 - Building the Globe: A Comprehensive Guide
&lt;/h2&gt;

&lt;p&gt;Three.js: Three.js is a powerful JavaScript library that simplifies 3D graphics rendering in web applications. It provides a versatile and intuitive platform for creating interactive 3D scenes, making it an ideal choice for developing immersive web-based visualizations.&lt;/p&gt;

&lt;p&gt;Globe.GL: Globe.GL, built on top of Three.js, specializes in simplifying the creation of globe visualizations. It streamlines the process of rendering Earth-like globes with geographic data and offers numerous features and customization options. By integrating Globe.GL into our project, we leverage its capabilities to craft an engaging and interactive globe-spanning visualization.&lt;/p&gt;

&lt;p&gt;Initialization:&lt;/p&gt;

&lt;p&gt;After creating the project tree, navigate to it in the terminal and run the command npx vite, which may prompt you to install the required packages; enter y to continue. If everything works correctly, you should be able to open localhost:5173 in your browser.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftluq1kpfoi9jm4texeaw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftluq1kpfoi9jm4texeaw.png" alt="Image description" width="800" height="566"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In our index.html file we will set up the document as such:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;!DOCTYPE html&amp;gt;
&amp;lt;html lang="en"&amp;gt;

&amp;lt;head&amp;gt;
    &amp;lt;meta charset="UTF-8"&amp;gt;
    &amp;lt;meta name="viewport" content="width=device-width, initial-scale=1.0"&amp;gt;
    &amp;lt;link rel="preconnect" href="https://fonts.gstatic.com"&amp;gt;
    &amp;lt;link href="https://fonts.googleapis.com/css2?family=Cairo&amp;amp;display=swap" rel="stylesheet"&amp;gt;
    &amp;lt;link rel="icon" href="./images/digico-icon.ico" type="image/x-icon"&amp;gt;
    &amp;lt;link rel="stylesheet" type="text/css" href="./css/style.css"&amp;gt;
    &amp;lt;script src="https://code.jquery.com/jquery-3.6.0.min.js"&amp;gt;&amp;lt;/script&amp;gt;
    &amp;lt;title&amp;gt;AWS User Groups&amp;lt;/title&amp;gt;
&amp;lt;/head&amp;gt;

&amp;lt;body&amp;gt;
    &amp;lt;div class="loader-wrapper"&amp;gt;
        &amp;lt;img class="loader-img" src="./images/digico-loader-icon.png" alt="Loading..."&amp;gt;
    &amp;lt;/div&amp;gt;
    &amp;lt;div id="globeContainer"&amp;gt;
        &amp;lt;div id="globeViz"&amp;gt;
        &amp;lt;/div&amp;gt;
    &amp;lt;script src="//unpkg.com/globe.gl"&amp;gt;&amp;lt;/script&amp;gt;
    &amp;lt;script src="./js/script.js"&amp;gt;&amp;lt;/script&amp;gt;
&amp;lt;/body&amp;gt;

&amp;lt;/html&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the &lt;code&gt;&amp;lt;head&amp;gt;&lt;/code&gt; section, we've configured the website for enhanced usability and aesthetics:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Responsive Design: We've included the &lt;code&gt;&amp;lt;meta name="viewport"&amp;gt;&lt;/code&gt; tag to ensure that the web page seamlessly adapts to various screen sizes. This is crucial for creating a mobile-friendly browsing experience.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Page Title: The &lt;code&gt;&amp;lt;title&amp;gt;AWS User Groups&amp;lt;/title&amp;gt;&lt;/code&gt; tag sets the title of your web page, which appears in the browser tab. It serves as an identifier and provides users with context about the content they're viewing.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Favicon: The website icon, stored in our "images" folder (noted as ./images/), is set up here. It's the small icon you see in your browser tab, making your site easily recognizable.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Font Selection: We've opted for the Cairo font and imported it from Google Fonts using the appropriate link. This choice enhances the visual appeal of our web content.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;CSS Styling: The page's style and layout are defined by an external CSS stylesheet linked with &lt;code&gt;&amp;lt;link rel="stylesheet" type="text/css" href="./css/style.css"&amp;gt;&lt;/code&gt;. This separation of styles from content ensures clean and maintainable code.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;JavaScript Library: To enhance interactivity and create a loading screen that improves user experience during the initial loading process, we've imported the jQuery library with &lt;code&gt;&amp;lt;script src="https://code.jquery.com/jquery-3.6.0.min.js"&amp;gt;&amp;lt;/script&amp;gt;&lt;/code&gt;. This library streamlines DOM manipulation and facilitates smooth loading of other resources like libraries and images.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In the &lt;code&gt;&amp;lt;body&amp;gt;&lt;/code&gt; section:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Loading Screen: We've incorporated a loading screen within a &lt;code&gt;&amp;lt;div&amp;gt;&lt;/code&gt; element with the class "loader-wrapper." Inside, you'll find an image tagged with &lt;code&gt;&amp;lt;img class="loader-img" src="./images/digico-loader-icon.png" alt="Loading..."&amp;gt;&lt;/code&gt;. This loading screen element serves an important purpose: it provides visual feedback to users while the web page is fetching necessary resources. This helps maintain a smooth and responsive user experience, especially when loading libraries and images. You can modify the image by replacing the path found above.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Main Content Area: The primary section where our project unfolds is encapsulated within &lt;code&gt;&amp;lt;div id="globeContainer"&amp;gt;&lt;/code&gt;. This designated space is crucial because it's where our interactive globe visualization takes center stage. The nested &lt;code&gt;&amp;lt;div id="globeViz"&amp;gt;&lt;/code&gt; is where the Globe.GL library will render the 3D globe that showcases AWS User Groups around the world. Think of this as the canvas upon which our data comes to life.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Library Inclusion: Right within this section, we import an essential library with &lt;code&gt;&amp;lt;script src="//unpkg.com/globe.gl"&amp;gt;&amp;lt;/script&amp;gt;&lt;/code&gt;. This library is responsible for rendering and animating the Globe.GL visualization on our web page. It essentially provides the necessary scripts that run our interactive globe.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Script Integration: To complete the setup, we load our custom JavaScript script from &lt;code&gt;./js/script.js&lt;/code&gt;. This script acts as the engine behind our project, orchestrating data handling, animation, and user interactions. It integrates with the Globe.GL library to create the immersive AWS User Groups visualization you'll explore.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To verify that your script is functioning as expected within the HTML, you can include a simple test message like console.log("Hello World!"); in your script.js file. Then, open your browser's developer tools (usually accessible via Command ⌘ + Option ⌥ + I on macOS) and navigate to the Console tab.&lt;/p&gt;

&lt;p&gt;If your script is running correctly, you should see a message logged in the console, similar to this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fit1vr56kocqd7f3jkemf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fit1vr56kocqd7f3jkemf.png" alt="Image description" width="800" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Functionality 1 - Loader Screen:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the script.js file, we establish a loading screen effect with the following code:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$(window).on("load", function () {
    $(".loader-wrapper").fadeOut("slow");
});
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;This code ensures that when the web page loads, the loading screen (defined by the .loader-wrapper class in our HTML) smoothly fades out.&lt;/p&gt;

&lt;p&gt;To style the loading screen as a full-screen element with a rotating animation, we've applied CSS rules in the style.css file. These rules are designed to create an engaging loading experience:&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.loader-wrapper {
    width: 100%;
    height: 100%;
    position: absolute;
    top: 0;
    left: 0;
    background-color: #000000;
    display: flex;
    justify-content: center;
    align-items: center;
}

.loader-img {
    display: inline-block;
    width: auto;
    height: auto;
    position: relative;
    animation: rotateImage 3s infinite ease-out;
}
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;You should see something similar to this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fppj1int95pc8eisjdxew.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fppj1int95pc8eisjdxew.png" alt="Image description" width="800" height="515"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the page loads, this screen should disappear.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Functionality 2 - Creating the Globe:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The heart of our project is the globe visualization created using the globe.gl library. This dynamic globe serves as a canvas for displaying AWS User Groups worldwide. We configure various aspects of the globe, such as point colors, labels, and animations. Additionally, we load custom globe and background images to enhance the visual experience.&lt;/p&gt;

&lt;p&gt;We want to first create the globe with the following code:&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let globe = Globe()
            .globeImageUrl('./images/earth-night.jpg')
            .backgroundImageUrl('./images/starry-sky.png')
            (document.getElementById('globeViz'));
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9urdnbmbiqxniwhw19rc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9urdnbmbiqxniwhw19rc.png" alt="Image description" width="800" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We kick off the process by fetching details about AWS User Groups from the 'detail_UGInfo.json' file using the 'fetch' function. This dataset contains essential information about each user group, such as their name, location, URL, category, membership count, and descriptions. This data serves as the foundation for creating our interactive globe visualization.&lt;/p&gt;

&lt;p&gt;To properly display these groups on the globe, we first create 'pinsData.' This configuration extracts latitude and longitude coordinates from the JSON dataset. Additionally, it assigns attributes such as size, color, and information about each group. These attributes determine how the pins appear and behave on the globe.&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const pinsData = jsonData.map(item =&amp;gt; ({
            lat: item.ugLocation.ugLatitude,
            lng: item.ugLocation.ugLongitude,
            size: 0.025,
            color: 'gold',
            info: item
        }));
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;Next, we instruct our globe visualization to utilize the 'pinsData' dataset. We configure the globe with specific attributes, defining its appearance and interactivity. These attributes include the globe image, background image, hover precision for lines, and the behavior of the pins based on their size and color.&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let p_data;
let globe;

fetch('detail_UGInfo.json')
    .then(response =&amp;gt; response.json())
    .then(jsonData =&amp;gt; {
        p_data = jsonData;
        //Pin config
        //Extract latitudes and longitudes from JSON data
        const pinsData = jsonData.map(item =&amp;gt; ({
            lat: item.ugLocation.ugLatitude,
            lng: item.ugLocation.ugLongitude,
            size: 0.025,
            color: 'gold',
            info: item
        }));

        globe = Globe()
            .globeImageUrl('./images/earth-night.jpg')
            .backgroundImageUrl('./images/starry-sky.png')
            .lineHoverPrecision(0)
            .pointsData(pinsData)
            .pointAltitude('size')
            .pointColor('color')
            (document.getElementById('globeViz'));
    })
    .catch(error =&amp;gt; {
        console.error('Error loading JSON:', error);
    });
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2x3zhv7ctatjocpzyd0v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2x3zhv7ctatjocpzyd0v.png" alt="Image description" width="800" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;But what if we want to enhance user interaction by allowing them to hover over a location marker and view detailed information about each AWS User Group? We can achieve this by customizing the globe to include a '.pointLabel' feature, which facilitates on-hover actions and data display using HTML. This feature draws upon the 'info' attribute derived from our data-fetching process.&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  .pointLabel(d =&amp;gt; `&amp;lt;div id="info-box"&amp;gt;&amp;lt;h3&amp;gt;${d.info.ugName}&amp;lt;/h3&amp;gt;
&amp;lt;p&amp;gt;&amp;lt;strong&amp;gt;Location:&amp;lt;/strong&amp;gt; ${d.info.ugLocation.ugLocation}&amp;lt;/p&amp;gt;
&amp;lt;p&amp;gt;&amp;lt;strong&amp;gt;Members:&amp;lt;/strong&amp;gt; ${d.info.ugMembersNumber}&amp;lt;/p&amp;gt;
&amp;lt;p&amp;gt;&amp;lt;strong&amp;gt;URL:&amp;lt;/strong&amp;gt; &amp;lt;a href="${d.info.ugURL}" target="_blank"&amp;gt;${d.info.ugURL}&amp;lt;/a&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/div&amp;gt;`)
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;Also, with the '.onPointClick' function, we can enable users to open the URL associated with an AWS User Group in a new browser tab when they click on a group's location marker.&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.onPointClick(d =&amp;gt; window.open(d.info.ugURL, '_blank'))
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;Now, the full code for our globe function looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let p_data;
let globe;

fetch('detail_UGInfo.json')
    .then(response =&amp;gt; response.json())
    .then(jsonData =&amp;gt; {
        p_data = jsonData;
        //Pin config
        //Extract latitudes and longitudes from JSON data
        const pinsData = jsonData.map(item =&amp;gt; ({
            lat: item.ugLocation.ugLatitude,
            lng: item.ugLocation.ugLongitude,
            size: 0.025,
            color: 'gold',
            info: item
        }));

        globe = Globe()
            .globeImageUrl('./images/earth-night.jpg')
            .backgroundImageUrl('./images/starry-sky.png')
            .lineHoverPrecision(0)
            .pointsData(pinsData)
            .pointAltitude('size')
            .pointColor('color')
            .pointLabel(d =&amp;gt; `&amp;lt;div id="info-box"&amp;gt;&amp;lt;h3&amp;gt;${d.info.ugName}&amp;lt;/h3&amp;gt;
&amp;lt;p&amp;gt;&amp;lt;strong&amp;gt;Location:&amp;lt;/strong&amp;gt; ${d.info.ugLocation.ugLocation}&amp;lt;/p&amp;gt;
&amp;lt;p&amp;gt;&amp;lt;strong&amp;gt;Members:&amp;lt;/strong&amp;gt; ${d.info.ugMembersNumber}&amp;lt;/p&amp;gt;
&amp;lt;p&amp;gt;&amp;lt;strong&amp;gt;URL:&amp;lt;/strong&amp;gt; &amp;lt;a href="${d.info.ugURL}" target="_blank"&amp;gt;${d.info.ugURL}&amp;lt;/a&amp;gt;&amp;lt;/p&amp;gt;&amp;lt;/div&amp;gt;`)
            .onPointClick(d =&amp;gt; window.open(d.info.ugURL, '_blank'))
            (document.getElementById('globeViz'));
    })
    .catch(error =&amp;gt; {
        console.error('Error loading JSON:', error);
    });
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;You might have noticed that the visualization is not quite as appealing as we'd like it to be. Don't worry; we can fix that! In our style.css file, we have the power to enhance the aesthetics to perfectly match our preferences.&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;body {
    margin: auto;
    background-color: #000000;
}
#info-box {
        padding-top: 5px;
        padding-bottom: 10px;
        padding-left: 20px;
        padding-right: 20px;
        background-color: rgba(0, 0, 0, 0.4);
        color: #ffffff;
}
h3 {
    font-size: 25px;
    margin-bottom: 10px;
}
p+p {
    margin-top: 5px;
}
a {
    color: #3498db;
    text-decoration: none;
}
a:hover {
    text-decoration: underline;
}

&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;&lt;em&gt;Notice: our #info-box is the hovering card in our globe visualization; you can further modify it as needed.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff1035mdmwuunxe29iaoy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff1035mdmwuunxe29iaoy.png" alt="Image description" width="800" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Functionality 3 - Arc Data Handling:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One of the nicest features we can add to our project is the set of dynamic arcs that connect different AWS User Groups on the globe. To achieve this, we have two functions: initArcDataSet and updateArcDataSet. The former lays the foundation by generating a set of random arcs based on our loaded data: for each arc, we pick random start and end points, ensure they fall within the data's geographical bounds, and avoid duplicates, giving us a diverse and engaging display of arcs. The latter builds upon the arcs created by initArcDataSet and keeps the visualization dynamic and fresh: after the initial arcs are loaded, we repeatedly replace a few of them with new random arc locations, sourced from the latitude and longitude data in our JSON file, creating a constantly evolving visual experience. In both functions, we rely on the numLocations variable to generate random indices within the length of our JSON data, ensuring no duplicate indices exist.&lt;/p&gt;

&lt;p&gt;For this we need to initialize new variables:&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let a_data; //Data to alternate the arcs
let numLocations; //JSON data length
const numArcs = 7; //Number of arcs to draw
let arcsData = []; //Array to hold the arcs data 
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;a. initArcDataSet&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;function initArcDataSet(jsonData = p_data) {
    //Arc config
    numLocations = jsonData.length;
    for (let i = 0; i &amp;lt; numArcs; i++) {
        //Generate random indices within the bounds of the data
        let randomStartIndex = Math.floor(Math.random() * numLocations);
        let randomEndIndex = Math.floor(Math.random() * numLocations);
        //Re-roll until randomEndIndex differs from randomStartIndex
        while (randomStartIndex === randomEndIndex &amp;amp;&amp;amp; numLocations &amp;gt; 1) {
            randomEndIndex = Math.floor(Math.random() * numLocations);
        }
        //Ensure that the generated indices are within bounds
        if (randomStartIndex &amp;gt;= 0 &amp;amp;&amp;amp; randomStartIndex &amp;lt; numLocations &amp;amp;&amp;amp;
            randomEndIndex &amp;gt;= 0 &amp;amp;&amp;amp; randomEndIndex &amp;lt; numLocations) {
            arcsData.push({
                startLat: jsonData[randomStartIndex].ugLocation.ugLatitude,
                startLng: jsonData[randomStartIndex].ugLocation.ugLongitude,
                endLat: jsonData[randomEndIndex].ugLocation.ugLatitude,
                endLng: jsonData[randomEndIndex].ugLocation.ugLongitude,
                color: ['gold', 'blue']
            });
        } else {
            console.warn('Generated indices out of bounds:', randomStartIndex, randomEndIndex);
        }
    }
    console.log("End first time: jsondata", arcsData)
    return arcsData;
}
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;b. updateArcDataSet&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let arc_change = 2;
function updateArcDataSet(jsonData = a_data) {
    //Arcs to modify
    for (let i = 0; i &amp;lt; arc_change; i++) {
        let k = Math.floor(Math.random() * numArcs);
        //Generate random indices within the bounds of the data
        let randomStartIndex = Math.floor(Math.random() * numLocations);
        let randomEndIndex = Math.floor(Math.random() * numLocations);

        //Re-roll until randomEndIndex differs from randomStartIndex
        while (randomStartIndex === randomEndIndex &amp;amp;&amp;amp; numLocations &amp;gt; 1) {
            randomEndIndex = Math.floor(Math.random() * numLocations);
        }
        //Ensure that the generated indices are within bounds
        if (randomStartIndex &amp;gt;= 0 &amp;amp;&amp;amp; randomStartIndex &amp;lt; numLocations &amp;amp;&amp;amp;
            randomEndIndex &amp;gt;= 0 &amp;amp;&amp;amp; randomEndIndex &amp;lt; numLocations) {
            arcsData[k] = {
                startLat: p_data[randomStartIndex].ugLocation.ugLatitude,
                startLng: p_data[randomStartIndex].ugLocation.ugLongitude,
                endLat: p_data[randomEndIndex].ugLocation.ugLatitude,
                endLng: p_data[randomEndIndex].ugLocation.ugLongitude,
                color: ['gold', 'blue']
            };
        } else {
            console.warn('Generated indices out of bounds:', randomStartIndex, randomEndIndex);
        }
    }
    a_data = arcsData;
    globe.arcsData(a_data);
}
&lt;/code&gt;&lt;/pre&gt;
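Both functions repeat the same "pick two distinct random indices" step, so it could be factored into a small helper, sketched below (randomPair is our own name, not part of the original script):

```javascript
// Sketch of a helper encapsulating the distinct-random-index logic
// shared by initArcDataSet and updateArcDataSet. randomPair is our
// own name; it is not part of the original code.
function randomPair(n) {
    const start = Math.floor(Math.random() * n);
    let end = Math.floor(Math.random() * n);
    // Re-roll until the end index differs (guard n > 1 to avoid looping forever)
    while (end === start && n > 1) {
        end = Math.floor(Math.random() * n);
    }
    return [start, end];
}
```

Each loop body could then begin with `const [randomStartIndex, randomEndIndex] = randomPair(numLocations);`.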



&lt;p&gt;&lt;em&gt;Tip: easily adjust the number of arcs to change by modifying the 'for' loop or by creating a variable for better control e.g. arc_change.&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Bonus challenge: consider implementing a mechanism to gracefully handle exceptions when dealing with 'Not Available' (NA) locations during arc regeneration.&lt;/li&gt;
&lt;/ul&gt;
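One way to approach the bonus challenge is to filter out entries without usable coordinates before generating any arcs. The sketch below assumes unavailable locations are marked with the string 'NA' in the JSON (an assumption about detail_UGInfo.json; adapt the check to match your file):

```javascript
// Sketch: keep only entries whose coordinates parse as numbers.
// Assumes unavailable locations carry the string 'NA' (an assumption
// about the data file; Number('NA') is NaN, so such entries are dropped).
function usableLocations(jsonData) {
    return jsonData.filter(item =>
        item.ugLocation &&
        !Number.isNaN(Number(item.ugLocation.ugLatitude)) &&
        !Number.isNaN(Number(item.ugLocation.ugLongitude))
    );
}
```

Passing usableLocations(jsonData) into initArcDataSet (and reusing it in updateArcDataSet) keeps 'NA' entries out of the random index pool.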

&lt;p&gt;c. Updating the Globe to Display Arcs&lt;/p&gt;

&lt;p&gt;To update the globe and display the arcs, add the following configurations to the 'Globe' variable:&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;            .arcsData(a_data)
            .arcColor('color')
            .arcDashLength(1)
            .arcDashGap(1)
            .arcDashInitialGap(1)
            .arcsTransitionDuration(0)
            .arcDashAnimateTime(3500)
            .arcLabel(d =&amp;gt; d.info?.ugLocation.ugLocation ?? '')
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;Additionally, set a window interval so that the arcs are continuously updated every 3500ms:&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;window.setInterval(updateArcDataSet, 3500);
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;&lt;strong&gt;Functionality 4 - Responsive Design:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Our project offers a responsive design that effortlessly adjusts to different devices. In our script.js, we utilize window.innerWidth to detect the user's device type and dynamically adjust the globe's interaction attributes, such as onPointClick. This means whether you're using a desktop, tablet, or mobile device, our globe provides an engaging and informative experience tailored to your screen.&lt;/p&gt;

&lt;p&gt;Complemented with media queries (@media) in our style.css, we ensure that the globe visualization and content are presented optimally on screens of varying sizes.&lt;/p&gt;
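The breakpoint logic in script.js can be summarized as a small classifier (a sketch; the thresholds mirror the media queries shown later, with widths up to 767px treated as mobile and up to 1024px as tablet):

```javascript
// Sketch: classify the viewport width using the same breakpoints as the
// CSS media queries (<= 767px mobile, <= 1024px tablet, else desktop).
function deviceClass(width) {
    if (width <= 767) return 'mobile';
    if (width <= 1024) return 'tablet';
    return 'desktop';
}
```

In the browser this would be called as `deviceClass(window.innerWidth)`.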

&lt;p&gt;For script.js, we employ device size detection to tailor the user experience. Specifically, we customize the pointLabel for tablets and mobile phones to include an 'Open URL' button. In contrast, for desktop users, we enable onPointClick to open URLs in new windows. To ensure optimal positioning of the globe on various screen sizes, we access the Three.js components of our Globe.GL globe and adjust the camera position. This is achieved through:&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const cam = globe.camera();
if (isMobile) {
    cam.position.z = 600; //Positioned for mobile devices
} else if (isTablet) {
    cam.position.z = 500; //Positioned for tablet devices
}
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;Here's the corresponding compatibility code:&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;        //Device detection
        const isMobile = window.innerWidth &amp;lt;= 767;
        const isTablet = window.innerWidth &amp;lt;= 1024;

        const cam = globe.camera();
        if (isMobile) {
            cam.position.z = 600;
            globe.pointLabel(d =&amp;gt; `&amp;lt;div id="info-box"&amp;gt;&amp;lt;h3&amp;gt;${d.info.ugName}&amp;lt;/h3&amp;gt;
            &amp;lt;p&amp;gt;&amp;lt;strong&amp;gt;Location:&amp;lt;/strong&amp;gt; ${d.info.ugLocation.ugLocation}&amp;lt;/p&amp;gt;
            &amp;lt;p&amp;gt;&amp;lt;strong&amp;gt;Members:&amp;lt;/strong&amp;gt; ${d.info.ugMembersNumber}&amp;lt;/p&amp;gt;
            &amp;lt;p&amp;gt;&amp;lt;strong&amp;gt;URL:&amp;lt;/strong&amp;gt; &amp;lt;a href="${d.info.ugURL}" target="_blank"&amp;gt;${d.info.ugURL}&amp;lt;/a&amp;gt;&amp;lt;/p&amp;gt; &amp;lt;button id="url-button" onclick="window.location.href='${d.info.ugURL}';"&amp;gt;Open URL&amp;lt;/button&amp;gt;&amp;lt;/div&amp;gt;`)
        }
        else if (isTablet) {
            cam.position.z = 500;
            globe.pointLabel(d =&amp;gt; `&amp;lt;div id="info-box"&amp;gt;&amp;lt;h3&amp;gt;${d.info.ugName}&amp;lt;/h3&amp;gt;
            &amp;lt;p&amp;gt;&amp;lt;strong&amp;gt;Location:&amp;lt;/strong&amp;gt; ${d.info.ugLocation.ugLocation}&amp;lt;/p&amp;gt;
            &amp;lt;p&amp;gt;&amp;lt;strong&amp;gt;Members:&amp;lt;/strong&amp;gt; ${d.info.ugMembersNumber}&amp;lt;/p&amp;gt;
            &amp;lt;p&amp;gt;&amp;lt;strong&amp;gt;URL:&amp;lt;/strong&amp;gt; &amp;lt;a href="${d.info.ugURL}" target="_blank"&amp;gt;${d.info.ugURL}&amp;lt;/a&amp;gt;&amp;lt;/p&amp;gt; &amp;lt;button id="url-button" onclick="window.location.href='${d.info.ugURL}';"&amp;gt;Open URL&amp;lt;/button&amp;gt;
            &amp;lt;/div&amp;gt;`)

        } else { //Desktop
            globe.onPointClick(d =&amp;gt; window.open(d.info.ugURL, '_blank'))
        }
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;Globe on mobile (e.g. iPhone 12 Pro):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuawl5n5zv2klr3jcwxfq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuawl5n5zv2klr3jcwxfq.png" alt="Image description" width="800" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For Tablets (e.g. iPad Air):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa3ht6ngibt37j89mgraw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa3ht6ngibt37j89mgraw.png" alt="Image description" width="800" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;More Elements (Text + Logo):&lt;/p&gt;

&lt;p&gt;To elevate the visual appeal of our project, we're introducing two key elements: a concise text, 'Find an AWS User Group,' and Digico's logo. On desktop and tablet devices, these elements appear on the bottom right; on mobile devices, they are centered.&lt;/p&gt;

&lt;p&gt;Adding Elements to index.html:&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;div id="text-overlay"&amp;gt;
    &amp;lt;p id="custom-inner-text"&amp;gt;&amp;lt;/p&amp;gt;
&amp;lt;/div&amp;gt;
&amp;lt;div id="image-container"&amp;gt;&amp;lt;a href="https://digico.solutions/" target="_blank"&amp;gt;
            &amp;lt;img id="logo" src="./images/digico-logo.png" alt="digicon"&amp;gt;
&amp;lt;/a&amp;gt;&amp;lt;/div&amp;gt;
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;Notice the 'id' attributes assigned to these elements, which we'll reference in script.js and style.css.&lt;/p&gt;

&lt;p&gt;Creating Responsive Designs in style.css:&lt;/p&gt;

&lt;p&gt;Desktop CSS (Screens ≥ 1024px):&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/* CSS for Desktop */
@media (min-width: 1024px) {
    #info-box {
        padding-top: 5px;
        padding-bottom: 10px;
        padding-left: 20px;
        padding-right: 20px;
        background-color: rgba(0, 0, 0, 0.4);
        color: #ffffff;
    }
    #text-overlay {
        position: absolute;
        top: 8%;
        left: 3%;
    }
    #custom-inner-text {
        width: 100%;
        /* padding: 10px; */
        color: white;
        font-size: 40px;
        font-family: 'Cairo', sans-serif;
        font-weight: 300;
    }
    #logo {
        width: 20%;
        padding-right: 35px;
        margin: auto;
        display: none;
        position: absolute;
        bottom: 0;
        right: 0;
    }
    p {
        font-size: 15px;
        line-height: 1.5;
        font-family: 'Cairo', sans-serif;
        margin: 0;
    }
}
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;Tablet CSS (Screens ≤ 1024px):&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/* CSS for screens with a maximum width of 1024px (tablets) */
@media (max-width: 1024px) {
    #text-overlay {
        position: absolute;
        top: 2%;
        left: 3%;
    }
    #custom-inner-text {
        width: 100%;
        margin: 0;
        color: white;
        font-size: 40px;
        font-family: 'Cairo', sans-serif;
        font-weight: 300;
    }
    #logo {
        width: 35%;
        margin: auto;
        display: none;
        position: absolute;
        bottom: 0;
        right: 0;
    }
}
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;Mobile CSS (Screens ≤ 767px):&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/* CSS for screens with a maximum width of 767px (mobile screens) */
@media (max-width: 767px) {
    #info-box {
        padding: 10px;
        background-color: rgba(0, 0, 0, 0.4);
        color: #ffffff;
    }
    #text-overlay {
        width: 100%;
        margin: 0;
    }
    #custom-inner-text {
        text-align: center;
        margin: auto;
        color: white;
        margin-left: -10px;
        font-size: 30px;
        font-family: 'Cairo', sans-serif;
        font-weight: 300;
    }
    #logo {
        width: 60%;
        margin-right: 9vh;
        display: none;
    }
    h3 {
        font-size: 18px;
        margin-bottom: 10px;
    }
    p {
        font-size: 14px;
        line-height: 1.5;
        font-family: 'Cairo', sans-serif;
        margin: 0;
    }
}
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;This code ensures that the text and logo are displayed optimally on various devices, enhancing the overall visual experience of our project.&lt;/p&gt;

&lt;p&gt;Adding Text After Loading: To ensure a seamless loading experience without prematurely displaying text, we've implemented the following code to manipulate the HTML elements. This enables us to insert the desired text while maintaining proper formatting. Additionally, we make the previously positioned logo visible by changing its display property to 'block'.&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      //Accessing the custom inner text element
      const customInnerText = document.getElementById('custom-inner-text');

      //Creating text nodes for the lines
      const line1 = document.createTextNode('Find an ');
      const lineBreak = document.createElement('br');
      const line2 = document.createTextNode('AWS User Group');

      //Appending the lines and line break
      customInnerText.appendChild(line1);
      customInnerText.appendChild(lineBreak);
      customInnerText.appendChild(line2);

      //Making the logo visible
      document.getElementById('logo').style.display = 'block';
&lt;/code&gt;&lt;/pre&gt;



&lt;p&gt;Final Visualization Images:&lt;/p&gt;

&lt;p&gt;Desktop (MacBook Pro M1)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftcx0gajoedk0x6e8d6ns.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftcx0gajoedk0x6e8d6ns.png" alt="Image description" width="800" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Tablet (iPad Air)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwszzrnlrc6vn43h6t5pu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwszzrnlrc6vn43h6t5pu.png" alt="Image description" width="800" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Mobile (iPhone 12)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F71elft5rxlbifxr43615.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F71elft5rxlbifxr43615.png" alt="Image description" width="800" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 4 - Going Global in Minutes: Leveraging AWS Services
&lt;/h2&gt;

&lt;p&gt;In this step, we'll dive into the world of AWS (Amazon Web Services) and explore how we can rapidly expand the project's global reach. AWS offers a plethora of services and tools that can help your project scale effortlessly, and we'll guide you through the key steps to harness this power effectively. Get ready to take your project to new heights with AWS!&lt;/p&gt;

&lt;p&gt;Before getting started, make sure you have an AWS account so you can follow along with the tutorial.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Deploying on S3&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this section, we'll set up a static website hosting environment using Amazon S3, a highly scalable and cost-effective AWS service.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Sign in to Your AWS Account&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Begin by signing in to your AWS (Amazon Web Services) account. If you don't have an AWS account, you can create one by visiting the AWS Console at &lt;a href="https://console.aws.amazon.com/" rel="noopener noreferrer"&gt;https://console.aws.amazon.com/&lt;/a&gt;. Ensure that you are logged in with the necessary permissions to create and manage S3 buckets.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu5fbphqilpl03gk1a9xs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu5fbphqilpl03gk1a9xs.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create an S3 Bucket&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the AWS Console, locate the search box at the top center and search for "S3."&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Click on "S3" to access the Amazon S3 service.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now, let's create a new S3 bucket to host your static website. Click on the "Create Bucket" button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fez6v5yey15t91xu2yieq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fez6v5yey15t91xu2yieq.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8ejcr3p91fwcanl37ei6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8ejcr3p91fwcanl37ei6.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Setting up the S3 Bucket&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Choosing the right name for your S3 bucket is crucial, especially if you plan to use a custom subdomain or domain for your website. Here are some key considerations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Bucket Name: This should match the subdomain or domain you intend to use. For example, if you want your website to be accessible at &lt;a href="http://www.example.com" rel="noopener noreferrer"&gt;http://www.example.com&lt;/a&gt;, your bucket name must be &lt;a href="http://www.example.com" rel="noopener noreferrer"&gt;www.example.com&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Bucket Naming Rules: Bucket names must be globally unique across all of AWS. So, make sure your chosen name is not already in use by another AWS user.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In our example, we'll name the bucket "usergroups.digico.solutions." Here, "usergroups" becomes a subdomain within our domain "digico.solutions."&lt;/p&gt;
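Since bucket names must follow S3's naming rules, a quick client-side sanity check can save a failed creation attempt. The sketch below covers only the basic rules (3-63 characters; lowercase letters, digits, dots, and hyphens; starting and ending with a letter or digit; no consecutive dots); AWS remains the authority at creation time:

```javascript
// Sketch: check the basic S3 bucket naming rules (3-63 chars; lowercase
// letters, digits, dots, hyphens; must start and end with a letter or
// digit; no consecutive dots). AWS enforces the full rules on creation.
function isValidBucketName(name) {
    return /^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$/.test(name) &&
        !name.includes('..');
}
```

For example, `isValidBucketName('usergroups.digico.solutions')` passes, while names with uppercase letters or underscores do not.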

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxsqsgjjgmvmv18utvhl4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxsqsgjjgmvmv18utvhl4.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: the bucket is currently named awsusergroups.digico.solutions.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Then, navigate down to untick the “Block all public access” option to allow public access to your website files. Acknowledge the implications by ticking the acknowledgment box.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhj23z3zb82wx6m50ajq8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhj23z3zb82wx6m50ajq8.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;While configuring your S3 bucket, you can adjust other settings as needed, including Object Ownership and Encryption; the defaults are satisfactory for our project. However, it's advisable to enable versioning for your bucket. Versioning keeps multiple versions of an object, allowing you to recover from accidental deletions or overwrites. Please note that each versioned object is treated as a separate object, which may incur additional storage costs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffzygi9d72p4khx93pk2n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffzygi9d72p4khx93pk2n.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgpukg4fjk4k7suh7da0q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgpukg4fjk4k7suh7da0q.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, click on "Create bucket" to save your bucket configuration. After successfully creating the bucket, you should see a confirmation message like "Successfully created bucket 'usergroups.digico.solutions'."&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh7llgoxxsgr5l0x1o3ne.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh7llgoxxsgr5l0x1o3ne.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffllfpiav99ym6ejxrc0a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffllfpiav99ym6ejxrc0a.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that your bucket is set up, you need to upload your project files into it before deploying it as a static website. Follow these steps:&lt;/p&gt;

&lt;p&gt;Navigate to your bucket by searching for it in the search box under the "Buckets" section.&lt;/p&gt;

&lt;p&gt;Click on the bucket name to open the bucket objects page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg7lmloi6wvftcmw1fnhs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg7lmloi6wvftcmw1fnhs.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Uploading Files and Folders:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Since your bucket is empty, you need to add the objects to it. Start by clicking the "Upload" button.&lt;/p&gt;

&lt;p&gt;On the upload page, use both the "Add Files" and "Add Folders" options. Note that you can't select multiple folders for batch upload; you need to add each file or folder iteratively.&lt;/p&gt;

&lt;p&gt;Upload all the folders and files from your local device into the S3 bucket. Pay attention to how the paths are formed when you upload a folder.&lt;/p&gt;

&lt;p&gt;Click "Upload" to transfer the items to the S3 bucket. Wait for a green banner to appear with the message "Upload Succeeded," then click "Close."&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F90neqxjguzq7m955uqjf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F90neqxjguzq7m955uqjf.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmq8fuivvmwe0yalxnyyy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmq8fuivvmwe0yalxnyyy.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs1ongsio60t66b7bxc1j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs1ongsio60t66b7bxc1j.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9vdvh9e5x7h5najw3gwy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9vdvh9e5x7h5najw3gwy.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3vgmbjvnlteneloil2l6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3vgmbjvnlteneloil2l6.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Enabling Static Website Hosting:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now that your project files are in the bucket, you need to enable static website hosting. Follow these steps:&lt;/p&gt;

&lt;p&gt;Go to the bucket's "Properties" section.&lt;/p&gt;

&lt;p&gt;Scroll down to the "Static website hosting" section and click "Edit."&lt;/p&gt;

&lt;p&gt;In the "Edit" page, enable static website hosting by ticking the "Enable" option.&lt;/p&gt;

&lt;p&gt;Ensure that the "Hosting type" is set to "Host a static website."&lt;/p&gt;

&lt;p&gt;Specify the name of your index document, which serves as the default page for your website. Typically, this is "index.html", as we set up previously.&lt;/p&gt;

&lt;p&gt;Scroll to the end of the page and click "Save changes."&lt;/p&gt;

&lt;p&gt;Once completed, you should see “Successfully edited static website hosting.”&lt;/p&gt;
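&lt;p&gt;The same configuration can also be applied from the AWS CLI; a minimal sketch using our bucket name:&lt;/p&gt;

```shell
# Enable static website hosting with index.html as the index document.
aws s3 website s3://usergroups.digico.solutions/ --index-document index.html
```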

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3bca3d6rvnyiabp6pdtk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3bca3d6rvnyiabp6pdtk.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbmfi6295m6228j8yv6zi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbmfi6295m6228j8yv6zi.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8utrugg877ln79lekzgr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8utrugg877ln79lekzgr.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvaoz17x5d0w3y6iw1a88.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvaoz17x5d0w3y6iw1a88.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq5vzignbp22k9svhymxu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq5vzignbp22k9svhymxu.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By this step, you should have configured your S3 bucket to host your static website, uploaded your project files, and enabled static website hosting. The website should now be accessible through the S3 bucket URL. In our case, the S3 website endpoint is listed at the bottom of the "Properties" page: &lt;a href="http://usergroups.digico.solutions.s3-website-eu-west-1.amazonaws.com" rel="noopener noreferrer"&gt;http://usergroups.digico.solutions.s3-website-eu-west-1.amazonaws.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvvrp7pizatguou5ttdw2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvvrp7pizatguou5ttdw2.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;However, if you attempt to access the URL to view the objects, you may encounter an "Access Denied" error. But don't worry; this is a common issue, and it happens because we haven't set up a Bucket Policy yet. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz8k6b5swwg4stsbastwr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz8k6b5swwg4stsbastwr.png" alt="Image description" width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Bucket policies are essential as they define the access rules for our S3 bucket. They determine who can access and interact with the objects stored within the bucket. To make our S3 bucket publicly accessible, we need to create a Bucket Policy that grants public read access to our objects. &lt;/p&gt;

&lt;p&gt;To customize the bucket's policy, follow these steps:&lt;/p&gt;

&lt;p&gt;Navigate to the "Permissions" tab within your previously named bucket. In our case, the "usergroups.digico.solutions" bucket.&lt;/p&gt;

&lt;p&gt;Scroll down to the "Bucket policy" section and click "Edit."&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4f7da8tggrh7yqx7zvcx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4f7da8tggrh7yqx7zvcx.png" alt="Image description" width="800" height="373"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3ws0sr56vdsbtt0ks1sj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3ws0sr56vdsbtt0ks1sj.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There are different ways to create policies. However, to simplify the policy creation process, we'll use the Policy generator. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhzrcqvehnyzbl1evyu3z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhzrcqvehnyzbl1evyu3z.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwizi8wd5n8g9tml96teh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwizi8wd5n8g9tml96teh.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On this page, we need to complete several steps to create a correct and sound policy:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Step 1: Select Policy Type&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Choose "S3 Bucket Policy."&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Step 2: Define Effect and Principal&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For "Effect," select "Allow" (this determines whether the action is allowed or denied).&lt;/p&gt;

&lt;p&gt;For "Principal," choose "*" (which means all users).&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Step 3: Set Service and Actions&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Service should be set to "Amazon S3."&lt;/p&gt;

&lt;p&gt;From the "Action" dropdown menu, select "GetObject" (allowing users to retrieve objects).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsnvvyyb4fvdelmtw3jxs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsnvvyyb4fvdelmtw3jxs.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F66zr5lw2fj6lat7ze1qr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F66zr5lw2fj6lat7ze1qr.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Step 4: Define Resource ARN&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To specify the ARN (Amazon Resource Name), go back to the "Edit bucket policy" page and copy the ARN displayed there.&lt;/p&gt;

&lt;p&gt;Paste this ARN into the "Generate Policy" page.&lt;/p&gt;

&lt;p&gt;Important: Append "/*" to the bucket ARN, making it take the format "arn:aws:s3:::YOUR_BUCKET_NAME/*". This ensures the policy applies to all objects within the bucket.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F62wuccs08d1xg4vacidv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F62wuccs08d1xg4vacidv.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff2hvo8ssqmn998fs2hsh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff2hvo8ssqmn998fs2hsh.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Step 5: Add and Generate Policy Statement&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Click "Add Statement" to include the policy statement you've designed.&lt;/p&gt;

&lt;p&gt;As this is the only statement needed, scroll down and click "Generate Policy."&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo5626sa627dvt2guj64n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo5626sa627dvt2guj64n.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Step 6: Apply the Generated Policy&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Copy the generated policy.&lt;/p&gt;

&lt;p&gt;Return to the "Edit bucket policy" page.&lt;/p&gt;

&lt;p&gt;Paste the generated policy into the designated field.&lt;/p&gt;

&lt;p&gt;Scroll down and click "Save changes."&lt;/p&gt;
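&lt;p&gt;For reference, the generated policy should look similar to the sketch below (the "Sid" value the generator produces will differ), and it can also be applied directly with the AWS CLI instead of the console:&lt;/p&gt;

```shell
# Grant public read access to all objects in the bucket (the same policy
# the Policy generator produces). Replace the bucket name with yours.
aws s3api put-bucket-policy \
  --bucket usergroups.digico.solutions \
  --policy '{
    "Version": "2012-10-17",
    "Statement": [{
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::usergroups.digico.solutions/*"
    }]
  }'
```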

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdp39kcx4bxmi97p79xcl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdp39kcx4bxmi97p79xcl.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuaefhray59tfzs9i6hzo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuaefhray59tfzs9i6hzo.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdoxx6urimys19ytiyqn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdoxx6urimys19ytiyqn.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5vueejpffebs23pcbg5c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5vueejpffebs23pcbg5c.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let’s now try our S3 endpoint again to see if our website displays correctly: &lt;a href="http://usergroups.digico.solutions.s3-website-eu-west-1.amazonaws.com" rel="noopener noreferrer"&gt;http://usergroups.digico.solutions.s3-website-eu-west-1.amazonaws.com&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;The result:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fekvw80bh12tpav4slfqx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fekvw80bh12tpav4slfqx.png" alt="Image description" width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Securing and Deploying the Globe on your Domain
&lt;/h3&gt;

&lt;p&gt;In this section, we'll guide you through the process of deploying your project while ensuring it follows best practices for security and accessibility. We'll start by addressing the need for secure deployment and discuss the steps involved.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Issuing an SSL Certificate&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;Our journey begins with securing your website through the issuance of an SSL (Secure Sockets Layer) certificate via the AWS Certificate Manager (ACM). This crucial certificate ensures the encryption and security of data in transit, safeguarding the integrity of your website.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Accessing AWS Certificate Manager (ACM)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Start by accessing the AWS Certificate Manager from the AWS Console. You can easily find it by typing "Certificate Manager" in the search box.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Selecting the Right Region&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For compatibility with CloudFront, it's essential to request or import the certificate in the US East (N. Virginia) Region (us-east-1). This region ensures integration with the AWS services we'll use later in the deployment process. In our case, we will proceed with requesting the certificate.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzadpvhocpplsdh97kz53.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzadpvhocpplsdh97kz53.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqge01k2dsvsrna8c3i7s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqge01k2dsvsrna8c3i7s.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Requesting Your SSL Certificate&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the AWS Certificate Manager, click on "Request" to initiate the certificate issuance process.&lt;/p&gt;

&lt;p&gt;Ensure that you select "Request a public certificate" under the Certificate Type and then click "Next."&lt;/p&gt;

&lt;p&gt;Specify the fully qualified domain name (FQDN) or the subdomain/domain URL you plan to use for your website. This is a crucial step, as it helps the certificate to be correctly associated with your domain.&lt;/p&gt;

&lt;p&gt;For validation, choose the DNS records option. This is the method we'll use to verify ownership of the domain.&lt;/p&gt;

&lt;p&gt;Scroll down and select "RSA 2048" as the key algorithm. While there are other options like "ECDSA P 256" and "ECDSA P 384," "RSA 2048" is a commonly chosen option due to its broad compatibility and robust encryption.&lt;/p&gt;

&lt;p&gt;Finally, click on "Request" located at the bottom left of the page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fih9ww4oatplvw821mavt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fih9ww4oatplvw821mavt.png" alt="Image description" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdh4sr2l224l0mx0h6yq2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdh4sr2l224l0mx0h6yq2.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6inakvodcevuq03rv8zp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6inakvodcevuq03rv8zp.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1lmsfryfwia5mwjs86un.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1lmsfryfwia5mwjs86un.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Viewing Certificate Request Status&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If your certificate request has been successfully submitted, you'll see a banner confirming it with a unique ID (XXXX...). Click "View certificate" from the banner.&lt;/p&gt;

&lt;p&gt;Scroll down to the "Domains" section of the certificate details. Here, you'll find the necessary values required for setting up DNS records. Make sure to copy both the "CNAME Name" and "CNAME Value" as we'll utilize them in the upcoming sections.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqr47r9x9toqkte8i5j4n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqr47r9x9toqkte8i5j4n.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F46avjojdmeo910nepabi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F46avjojdmeo910nepabi.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fahyhx4cxs3ix4nrhc43q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fahyhx4cxs3ix4nrhc43q.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, we'll proceed to set up DNS records to complete the verification process and enable HTTPS for the Globe website.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Managing DNS Records
&lt;/h3&gt;

&lt;p&gt;Accessing and managing DNS (Domain Name System) records for your domain is a crucial part of our deployment process. These records are the key to configuring your domain settings effectively. In our demonstration, we utilize Lightsail to handle DNS records. If you're new to this process, we highly recommend referring to the Lightsail DNS Entry Guide for detailed instructions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Accessing Lightsail Domains &amp;amp; DNS&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Begin by accessing Lightsail Domains &amp;amp; DNS and set up a DNS Zone if you haven't already. This zone will serve as the foundation for configuring your DNS records.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Adding a DNS Record&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Navigate to the DNS records within Lightsail and look for the option to "Add record." This action will allow you to create a new DNS record necessary for our setup.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjpasoxulraiir1mw1t5g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjpasoxulraiir1mw1t5g.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configuring the DNS Record&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the record type, select "CNAME." This type of record is suitable for our purposes and aligns with our certificate setup.&lt;/p&gt;

&lt;p&gt;Under "Record name," paste the CNAME name obtained from the AWS Certificate Manager. Please note that depending on your hosting service's DNS record input, your domain may already be present. If this is the case, ensure that the CNAME name matches the text from the AWS Certificate Manager.&lt;/p&gt;

&lt;p&gt;In the "Route traffic to" field, enter the CNAME value that you previously copied from the certificate details.&lt;/p&gt;

&lt;p&gt;Finally, click "Save" to confirm the addition of the CNAME record. It's essential to verify that the CNAME record has been successfully added under the CNAME Records section, taking into account your specific hosting service's requirements.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmflg2ywtqj8j8hl9v0jn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmflg2ywtqj8j8hl9v0jn.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flqcsk87xe8zdcmlf76uy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flqcsk87xe8zdcmlf76uy.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fced90i1ba64t2omdsakp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fced90i1ba64t2omdsakp.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqvttiy5ka8uq0wmd3t0u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqvttiy5ka8uq0wmd3t0u.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Typically, you can expect the SSL certificate to be issued by AWS Certificate Manager within approximately 30 minutes. To confirm, revisit the ACM page and click on 'List certificates'.&lt;/p&gt;
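&lt;p&gt;Rather than refreshing the console, you can also block until validation completes; a sketch with a placeholder certificate ARN:&lt;/p&gt;

```shell
# Wait until ACM reports the certificate as issued (polls periodically,
# failing after a bounded number of attempts).
aws acm wait certificate-validated \
  --certificate-arn arn:aws:acm:us-east-1:123456789012:certificate/XXXX \
  --region us-east-1
```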

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2vbfl1vwcjncioelt9c0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2vbfl1vwcjncioelt9c0.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Configuring CloudFront
&lt;/h3&gt;

&lt;p&gt;Now that we have secured our SSL certificate and managed DNS records, let's move on to configuring Amazon CloudFront, AWS's content delivery network service. CloudFront will serve as the entry point for our website, ensuring faster content delivery to users. In this step, we will set up CloudFront to use the S3 bucket as its origin, associate it with the SSL certificate we obtained earlier, and implement a crucial HTTP to HTTPS redirect for added security.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Access CloudFront: Begin by accessing CloudFront through the AWS Console. If you're starting fresh, select "Create a CloudFront distribution." If you've previously set up a distribution, choose the "Create" option.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqzlviwrbu6t70t6chcsp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqzlviwrbu6t70t6chcsp.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9m3n1wxrjm4zz7vj19yi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9m3n1wxrjm4zz7vj19yi.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select Origin: Choose an appropriate name for your origin. From the dropdown menu, select the AWS origin, which corresponds to the S3 bucket we've created earlier. Opt for "Use website endpoint" for consistency, as it simplifies routing.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fredb3ojjna3avmkkhgeh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fredb3ojjna3avmkkhgeh.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmuasjg4lczp15yt7x5e1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmuasjg4lczp15yt7x5e1.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Configure Default Cache Behavior and WAF: Scroll down, retaining most of the default configuration. However, under "Default cache behavior," go to "Viewer" and then "Viewer protocol policy." Opt for "Redirect HTTP to HTTPS" to ensure all traffic is secured via HTTPS; this step is crucial for data protection. Also, under Web Application Firewall (WAF), choose "Do not enable security protections," since this simple project does not require WAF rules.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbrbnl6smnjd1460pxho0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbrbnl6smnjd1460pxho0.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz5oqesbrwgbq93p8s3tg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz5oqesbrwgbq93p8s3tg.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Fine-Tune Settings: Continue scrolling down until you reach the "Settings" section. Here, select "Use all edge locations (best performance)" under "Pricing class." This choice optimizes the delivery of our globe visualization project.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Configure Alternate Domain (CNAME): In this step, add the same fully qualified domain name for which we issued the SSL certificate under the "Alternate domain name (CNAME)" section. In our case, this is "usergroups.digico.solutions." This step is crucial for correct configuration to avoid potential ERROR 403 issues.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Custom SSL Certificate: In the section near the bottom, choose the previously issued SSL certificate from the dropdown menu. This certificate is essential for securing your custom domain.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd3qqfyggvbfqqy15388w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd3qqfyggvbfqqy15388w.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdrjodg1a5tggre6andka.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdrjodg1a5tggre6andka.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create Distribution: Finally, scroll to the bottom and click "Create distribution" to complete the CloudFront setup.&lt;/li&gt;
&lt;/ul&gt;
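&lt;p&gt;The console choices above can also be expressed as a single API payload. As a hedged sketch, here is the same configuration shaped as the DistributionConfig that CloudFront's create_distribution API accepts; the website endpoint, CNAME, certificate ARN, and caller reference are placeholders for illustration:&lt;/p&gt;

```python
# Sketch: building a CloudFront DistributionConfig matching the walkthrough.
# Field names follow the create_distribution API; values are placeholders.
def build_distribution_config(website_endpoint, cname, cert_arn, caller_ref):
    return {
        "CallerReference": caller_ref,
        "Aliases": {"Quantity": 1, "Items": [cname]},
        "Origins": {
            "Quantity": 1,
            "Items": [{
                "Id": "s3-website-origin",
                "DomainName": website_endpoint,
                # S3 website endpoints speak plain HTTP, so the origin
                # protocol policy must be http-only.
                "CustomOriginConfig": {
                    "HTTPPort": 80,
                    "HTTPSPort": 443,
                    "OriginProtocolPolicy": "http-only",
                },
            }],
        },
        "DefaultCacheBehavior": {
            "TargetOriginId": "s3-website-origin",
            # "Redirect HTTP to HTTPS" in the console maps to:
            "ViewerProtocolPolicy": "redirect-to-https",
            "ForwardedValues": {
                "QueryString": False,
                "Cookies": {"Forward": "none"},
            },
            "MinTTL": 0,
        },
        # "Use all edge locations (best performance)":
        "PriceClass": "PriceClass_All",
        "ViewerCertificate": {
            "ACMCertificateArn": cert_arn,
            "SSLSupportMethod": "sni-only",
            "MinimumProtocolVersion": "TLSv1.2_2021",
        },
        "Comment": "Globe visualization distribution",
        "Enabled": True,
    }

config = build_distribution_config(
    "usergroups.digico.solutions.s3-website-us-east-1.amazonaws.com",
    "usergroups.digico.solutions",
    "arn:aws:acm:us-east-1:123456789012:certificate/example",
    "globe-distribution-1",
)
print(config["DefaultCacheBehavior"]["ViewerProtocolPolicy"])
```

&lt;p&gt;If you go the API route end to end, boto3's cloudfront client takes this dict via create_distribution(DistributionConfig=config), and its "distribution_deployed" waiter can poll until the status leaves "Deploying."&lt;/p&gt;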

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgn8i8zve6eoshedvy9qf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgn8i8zve6eoshedvy9qf.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Upon successful creation, you should see a green banner reading “Successfully created new distribution,” signaling that the process is underway. Keep in mind that the status under "Last modified" may initially show "Deploying"; this is normal, as CloudFront can take up to 25 minutes to fully deploy your configuration.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff8d9lqawycy6hrlximyu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff8d9lqawycy6hrlximyu.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4: Updating DNS Records for CloudFront
&lt;/h3&gt;

&lt;p&gt;Finally, we'll configure a CNAME record for the subdomain "usergroups.digico.solutions" to direct it to the CloudFront distribution. This step marks the culmination of the process, enabling secure access to the globe visualization project via HTTPS, using our designated subdomain. To accomplish this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Copy the Distribution Domain Name from your CloudFront setup.&lt;/li&gt;
&lt;li&gt;Navigate back to the DNS Zone of your hosting service, just like in the previous steps where we managed DNS records.&lt;/li&gt;
&lt;li&gt;Click on "Add record" to create a new DNS record.&lt;/li&gt;
&lt;li&gt;Choose "CNAME" as the Record type.&lt;/li&gt;
&lt;li&gt;In the "Record name" field, type the subdomain, which, in our case, is "usergroups."&lt;/li&gt;
&lt;li&gt;In the "Record value" field, paste the copied Distribution Domain Name from CloudFront, removing any "https://" prefix, as only the domain name is needed.&lt;/li&gt;
&lt;li&gt;Click "Save" to confirm the addition of the CNAME record.&lt;/li&gt;
&lt;li&gt;Finally, verify that the CNAME record has been added.&lt;/li&gt;
&lt;/ol&gt;
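&lt;p&gt;The record-building step above, including stripping the scheme from the distribution domain, can be sketched as a small helper; the function name and the sample CloudFront domain are illustrative, not part of any AWS SDK:&lt;/p&gt;

```python
# Sketch: assembling the CNAME record from the CloudFront distribution domain.
def build_cname_record(subdomain, distribution_domain):
    """Strip a scheme if present and return a DNS-zone-style CNAME record."""
    value = distribution_domain
    for scheme in ("https://", "http://"):
        if value.startswith(scheme):
            value = value[len(scheme):]
    # Drop any trailing slash as well; only the bare domain name is needed.
    value = value.rstrip("/")
    return {"name": subdomain, "type": "CNAME", "value": value}

record = build_cname_record("usergroups", "https://d1234abcd.cloudfront.net/")
print(record)
```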

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzzv96gtqgkbh1yd258yo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzzv96gtqgkbh1yd258yo.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk1gdj5dtsni6ow2mrlc9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk1gdj5dtsni6ow2mrlc9.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Figo9fziy4ik8ix90i52l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Figo9fziy4ik8ix90i52l.png" alt="Image description" width="800" height="395"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Optional: Use Only CloudFront to Display the Website
&lt;/h3&gt;

&lt;p&gt;Although our deployment is complete, we would still like to restrict direct access to our S3 bucket and keep our CloudFront distribution as the only public entry point for our Globe.&lt;/p&gt;

&lt;p&gt;To do this, we will take a simple alternative route when setting up both our S3 bucket and our CloudFront distribution. &lt;/p&gt;

&lt;p&gt;With this step performed, our deployment is complete and the URL usergroups.digico.solutions is live, built using several AWS services:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Amazon S3&lt;/li&gt;
&lt;li&gt;AWS Certificate Manager&lt;/li&gt;
&lt;li&gt;Amazon Lightsail&lt;/li&gt;
&lt;li&gt;Amazon CloudFront&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To try out the AWS User Group Globe, please visit awsusergroups.digico.solutions&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy7xy17ds66sbq6lx283q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy7xy17ds66sbq6lx283q.png" alt="Image description" width="800" height="456"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I hope you enjoyed the reading, see you in the next one!&lt;/p&gt;



</description>
      <category>aws</category>
      <category>cloudfront</category>
      <category>visualization</category>
      <category>s3</category>
    </item>
    <item>
      <title>Outgrowing Your File System? Meet AWS EFS</title>
      <dc:creator>Osama Shamout</dc:creator>
      <pubDate>Thu, 08 Feb 2024 11:05:29 +0000</pubDate>
      <link>https://dev.to/oshamout/outgrowing-your-file-system-meet-aws-efs-149b</link>
      <guid>https://dev.to/oshamout/outgrowing-your-file-system-meet-aws-efs-149b</guid>
      <description>&lt;p&gt;Today, I would like to introduce you to the capabilities of Amazon Elastic File System (EFS) and how it can propel your business forward. EFS not only facilitates effortless data storage scalability but also enhances performance and cost-effectiveness, providing the agility your business needs to thrive in today's competitive market.&lt;/p&gt;

&lt;h2&gt;
  
  
  EFS in a Brief
&lt;/h2&gt;

&lt;p&gt;Imagine a storage system that adapts to your needs, effortlessly growing to petabytes on demand. No more scrambling for extra space or waiting for slow backups – EFS takes care of it all. But EFS isn't just about size; it's about blazing-fast performance. Think lightning-quick throughput and near-instantaneous access for demanding workloads like streaming video or big data processing. Your applications will never be held back by sluggish storage again. And the best part? EFS is fully managed. That means no more headaches with provisioning, patching, or maintenance. EFS handles it all behind the scenes, freeing you to focus on what truly matters – your business.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F51nuba4vbxxogam6f924.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F51nuba4vbxxogam6f924.png" alt="AWS EFS Diagram" width="800" height="322"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  EFS Key Features
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Effortless Management
&lt;/h3&gt;

&lt;p&gt;Amazon EFS shines in its fully managed nature, supporting Network File System (NFS v4.0 and v4.1) and delivering shared file system storage for Linux workloads. Eliminating the need for managing file servers, hardware updates, or backups, Amazon EFS streamlines file system creation and configuration, allowing a focus on applications rather than infrastructure.&lt;/p&gt;
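&lt;p&gt;To make the NFS v4.1 support concrete, here is a minimal sketch that assembles the mount command AWS documents for EFS; the file system ID, Region, and mount point are placeholders, and in practice the amazon-efs-utils helper (mount -t efs) is an alternative that also handles TLS:&lt;/p&gt;

```python
# Sketch: the standard NFS v4.1 mount command for an EFS file system,
# using AWS's recommended mount options. Placeholder ID/region/path.
def efs_mount_command(file_system_id, region, mount_point):
    dns_name = f"{file_system_id}.efs.{region}.amazonaws.com"
    options = ("nfsvers=4.1,rsize=1048576,wsize=1048576,"
               "hard,timeo=600,retrans=2,noresvport")
    return f"sudo mount -t nfs4 -o {options} {dns_name}:/ {mount_point}"

cmd = efs_mount_command("fs-12345678", "us-east-1", "/mnt/efs")
print(cmd)
```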

&lt;p&gt;With elasticity taking center stage, Amazon EFS enables automatic scaling of file system storage and throughput capacity. This eliminates the need for manual provisioning, ensuring seamless scaling to petabytes of storage, hundreds of thousands of IOPS, and tens of gigabytes per second of throughput, aligning with evolving workloads.&lt;/p&gt;

&lt;h3&gt;
  
  
  Resilience and Availability
&lt;/h3&gt;

&lt;p&gt;Tailoring to different durability and availability needs, Amazon EFS offers Regional file systems, optimizing durability with data stored across multiple Availability Zones (AZs). Meanwhile, EFS One Zone file systems provide a cost-effective solution with slightly less durability. Boasting 99.999999999% (11 nines) durability over a year, Amazon EFS ensures data availability with AZ-specific EFS mount targets.&lt;/p&gt;

&lt;h3&gt;
  
  
  Versatile Storage Classes and Security Measures
&lt;/h3&gt;

&lt;p&gt;Amazon EFS simplifies storage management by introducing three distinct storage classes, each tailored to specific access patterns and cost considerations. EFS Standard, fueled by SSD, excels in delivering high performance for frequently accessed data. On the other hand, EFS Infrequent Access (IA) and EFS Archive provide cost-optimized solutions catering to infrequently accessed data, adding a layer of flexibility to your storage strategy.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Effortless Transition with Automated File Tiering:&lt;/strong&gt;&lt;br&gt;
AWS EFS streamlines storage efficiency by automatically tiering files between the storage classes based on access patterns. Through EFS Lifecycle Management and Intelligent-Tiering, this process is enhanced, ensuring seamless transitions between EFS Standard, EFS IA, and EFS Archive. Whether guided by custom or default policies, this automated approach facilitates efficient file management, with automatic transitions back to EFS Standard for faster access when needed.&lt;/p&gt;
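&lt;p&gt;As a hedged sketch, such a lifecycle policy can be expressed as the list that boto3's put_lifecycle_configuration accepts; the 30/90-day thresholds below are example choices, not mandated values, and the file system ID is a placeholder:&lt;/p&gt;

```python
# Sketch: an EFS lifecycle configuration, shaped like the LifecyclePolicies
# argument of boto3's put_lifecycle_configuration. Example thresholds only.
lifecycle_policies = [
    # Move files not accessed for 30 days to Infrequent Access...
    {"TransitionToIA": "AFTER_30_DAYS"},
    # ...files not accessed for 90 days to Archive...
    {"TransitionToArchive": "AFTER_90_DAYS"},
    # ...and bring a file back to Standard on its first access.
    {"TransitionToPrimaryStorageClass": "AFTER_1_ACCESS"},
]

# The real call would be:
#   import boto3
#   efs = boto3.client("efs")
#   efs.put_lifecycle_configuration(
#       FileSystemId="fs-12345678",
#       LifecyclePolicies=lifecycle_policies,
#   )
print(len(lifecycle_policies))
```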

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff4mdiwl0lecammw81had.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff4mdiwl0lecammw81had.png" alt="AWS EFS - Storage Tiering" width="800" height="303"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Enhanced Disaster Recovery with EFS Replication:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In scenarios requiring disaster recovery, EFS Replication seamlessly maintains an up-to-date copy of your file system in another AWS Region or Availability Zone, safeguarding data integrity. This feature ensures that your data remains secure and accessible, even in the face of unforeseen challenges.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Centralized Data Protection with AWS Backup:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To further fortify data protection within the Amazon EFS ecosystem, AWS Backup takes care of your backup needs. By centralizing and automating backup policies, AWS Backup simplifies data protection and recovery. This cohesive approach ensures that your data is consistently safeguarded, offering peace of mind in the event of any data-related challenges.&lt;/p&gt;

&lt;h2&gt;
  
  
  Amazon EFS in Real-World Scenarios
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Containerized Applications with Elastic Scalability:&lt;/strong&gt;&lt;br&gt;
Amazon EFS proves instrumental in supporting containerized applications, especially those deployed on Amazon Elastic Container Service (ECS) or Elastic Kubernetes Service (EKS). With the ability to scale file storage based on containerized workloads, EFS ensures that storage capacity adjusts dynamically to meet the demands of fluctuating container instances. This use case is ideal for businesses seeking an agile and scalable solution for containerized applications, where the dynamic nature of workloads requires a file storage system that can adapt in real-time.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;DevOps and Continuous Integration/Continuous Deployment (CI/CD):&lt;/strong&gt;&lt;br&gt;
For organizations embracing DevOps practices and CI/CD pipelines, Amazon EFS facilitates efficient and collaborative development workflows. Development and testing environments often require shared storage to ensure consistency and collaboration among team members. EFS offers a scalable and fully managed file system, eliminating the need for manual provisioning and maintenance. CI/CD pipelines can benefit from the elasticity of EFS, ensuring that storage scales automatically to accommodate the growing demands of automated testing and deployment processes. This use case is well-suited for organizations looking to streamline their development pipelines and enhance collaboration among development teams.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Big Data Analytics and Data Lakes:&lt;/strong&gt;&lt;br&gt;
Amazon EFS serves as a robust storage solution for big data analytics, providing the necessary throughput and scalability for processing large datasets. Data lakes, which often store diverse types of data for analytics purposes, can leverage EFS to store and share data across different analytics tools and compute instances. Whether it's running Apache Spark jobs, processing log files, or conducting machine learning analyses, Amazon EFS ensures that the underlying storage infrastructure scales seamlessly, allowing organizations to focus on deriving insights from their data without worrying about storage limitations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Media and Entertainment Workflows:&lt;/strong&gt;&lt;br&gt;
In the realm of media and entertainment, where large files and collaborative workflows are the norm, Amazon EFS stands out as a reliable solution. Content creators and editors can leverage shared file systems to access and collaborate on media assets in real-time. The high throughput and low-latency characteristics of EFS ensure smooth video editing, rendering, and content distribution. Whether it's a film production team collaborating across different locations or a media agency managing a vast repository of digital assets, Amazon EFS provides the performance and collaboration capabilities needed for seamless content creation.&lt;/p&gt;

&lt;h2&gt;
  
  
  Embrace the Future of Data Storage
&lt;/h2&gt;

&lt;p&gt;In conclusion, Amazon Elastic File System (EFS) emerges as a powerful solution, propelling businesses forward with its scalability, high performance, and cost-effectiveness. As businesses navigate the cloud landscape, strategically assessing where EFS fits best is crucial. Whether you're running containerized applications, handling big data analytics, or securing critical backups, EFS emerges as a versatile solution.&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>aws</category>
      <category>efs</category>
      <category>storage</category>
    </item>
    <item>
      <title>Leveraging AWS Redshift for Your Organization's Needs</title>
      <dc:creator>Osama Shamout</dc:creator>
      <pubDate>Fri, 02 Feb 2024 11:51:21 +0000</pubDate>
      <link>https://dev.to/oshamout/leveraging-aws-redshift-for-your-organizations-needs-o3l</link>
      <guid>https://dev.to/oshamout/leveraging-aws-redshift-for-your-organizations-needs-o3l</guid>
      <description>&lt;p&gt;In today's digital age, data is the new currency, and organizations that can effectively harness its power are poised for success. But with the sheer volume of data constantly growing, it's easy to feel overwhelmed. That's where Amazon Redshift comes in, a powerful data warehouse solution that helps you transform your data from a liability into an asset.&lt;/p&gt;

&lt;p&gt;Imagine having access to a tool that can effortlessly analyze petabytes of data, providing you with the insights you need to make informed decisions, optimize operations, and gain a competitive edge. That's the promise of Amazon Redshift, a fully managed cloud-based solution that delivers exceptional performance, durability, and scalability.&lt;/p&gt;

&lt;p&gt;This blog post will be your guide to navigating the world of Amazon Redshift. We'll uncover its hidden gems and how it can help you harness the power of data analytics to propel your organization forward.&lt;/p&gt;

&lt;p&gt;So, are you ready to transform data from a burden into a driving force for success? Join us as we dive into the world of Amazon Redshift and discover how it can revolutionize your data analytics journey.&lt;/p&gt;

&lt;h2&gt;
  
  
  Table of Contents
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
Overview of Amazon Redshift

&lt;ol&gt;
&lt;li&gt;Ideal usage patterns for Amazon Redshift&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;

&lt;li&gt;

Key Benefits of Amazon Redshift

&lt;ol&gt;
&lt;li&gt;Performance&lt;/li&gt;
&lt;li&gt;Durability and availability&lt;/li&gt;
&lt;li&gt;Cost Model&lt;/li&gt;
&lt;li&gt;Scalability and Elasticity&lt;/li&gt;
&lt;li&gt;Interfaces&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;

&lt;li&gt;

Data Ingestion and Loading

&lt;ol&gt;
&lt;li&gt;Data Ingestion Methods&lt;/li&gt;
&lt;li&gt;Data Loading Methods&lt;/li&gt;
&lt;li&gt;Data Ingestion Best Practices&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;

&lt;li&gt;Anti-patterns to Avoid When Using Amazon Redshift&lt;/li&gt;

&lt;li&gt;Conclusion&lt;/li&gt;

&lt;/ol&gt;

&lt;h2&gt;
  
  
  Overview of Amazon Redshift &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;Amazon Redshift is a fully managed, petabyte-scale data warehouse service in the cloud. It provides businesses with a powerful and cost-effective way to analyze large datasets using their existing business intelligence tools. Amazon Redshift is designed to handle the most demanding workloads, and it offers a variety of features to make it easy to use and manage. &lt;/p&gt;

&lt;p&gt;Under the hood, Amazon Redshift offers a range of node types, each tailored to specific workloads and performance requirements. RA3 nodes with managed storage provide independent scaling for compute and storage, making them ideal for data-intensive workloads with anticipated growth. DC2 nodes, with their local SSD storage, excel at compute-intensive tasks and datasets under 1 TB. Amazon Redshift Serverless further simplifies resource provisioning by automatically allocating resources based on workload demands. By carefully selecting the appropriate node type, users can optimize performance and cost-effectiveness for their unique data warehousing needs.&lt;/p&gt;

&lt;h3&gt;
  
  
  Ideal usage patterns for Amazon Redshift &lt;a&gt;&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Amazon Redshift has emerged as a powerful and versatile data warehousing solution, catering to a wide spectrum of use cases. Its ability to handle massive datasets with exceptional performance makes it an ideal choice for organizations seeking to extract meaningful insights from their data. Some key use cases where Amazon Redshift shines are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Analyzing large datasets:&lt;/strong&gt; Amazon Redshift can be used to analyze large datasets, such as sales data, customer data, and product data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Storing historical data:&lt;/strong&gt; Amazon Redshift can be used to store historical data, such as financial data, marketing data, and operational data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Running complex queries:&lt;/strong&gt; Amazon Redshift can be used to run complex queries, such as those that require joins, aggregations, and filtering.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Key Benefits of Amazon Redshift &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Performance &lt;a&gt;&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Amazon Redshift is designed to deliver fast query performance, even on large datasets. This is due to a number of factors, including its columnar storage format, its Massively Parallel Processing (MPP) architecture, and its use of local attached storage.&lt;/p&gt;

&lt;h3&gt;
  
  
  Durability and availability &lt;a&gt;&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Amazon Redshift is a fully managed service, so you don't have to worry about managing the infrastructure. Amazon Redshift also offers a variety of features to ensure that your data is always available, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Data replication:&lt;/strong&gt; Amazon Redshift automatically replicates your data to multiple nodes within your cluster. This ensures that your data is always available, even if one node fails.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Snapshots:&lt;/strong&gt; Amazon Redshift automatically takes snapshots of your data. Snapshots can be used to restore your data to a previous point in time.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Backups:&lt;/strong&gt; Amazon Redshift automatically backs up your data to Amazon S3. This ensures that your data is always protected, even in the event of a disaster.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Cost Model &lt;a&gt;&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Amazon Redshift is a pay-as-you-go service, so you only pay for what you use. This makes it a cost-effective solution for businesses of all sizes. The cost of Amazon Redshift is based on the following factors:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Data warehouse node hours:&lt;/strong&gt; The number of hours your data warehouse nodes are running.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Backup storage:&lt;/strong&gt; The amount of storage used to store backups of your data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data transfer:&lt;/strong&gt; The amount of data transferred to or from Amazon Redshift.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Scalability and Elasticity &lt;a&gt;&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Amazon Redshift's scalability and elasticity capabilities empower users to adapt their data warehouse infrastructure to meet the ever-changing demands of their workloads. Whether faced with unpredictable workloads or fluctuating query concurrency, Amazon Redshift provides seamless solutions to ensure consistently high performance and availability.&lt;/p&gt;

&lt;p&gt;For those seeking automated elasticity, Amazon Redshift Serverless takes the lead by intelligently adjusting data warehouse capacity to match demand. This dynamic elasticity eliminates the need for manual intervention, ensuring that users are always prepared to handle even the most demanding spikes in demand. Additionally, Concurrency Scaling enhances overall query concurrency by dynamically adding cluster resources, responding to increased demand for concurrent users and queries. Both Amazon Redshift Serverless and Concurrency Scaling ensure full availability for read and write operations during scaling.&lt;/p&gt;

&lt;p&gt;For those preferring granular control over their data warehouse capacity, Elastic Resize provides a straightforward solution. This method allows users to scale their clusters based on performance requirements, addressing performance bottlenecks related to CPU, memory, or I/O overutilization. However, with Elastic Resize, the cluster experiences a brief unavailability period lasting four to eight minutes. Changes take effect immediately, providing a swift response to evolving demands.&lt;/p&gt;

&lt;p&gt;In essence, Elastic Resize is focused on adding or removing nodes from a single Redshift cluster within minutes, optimizing query throughput for specific workloads, such as ETL tasks or month-end reporting. On the other hand, Concurrency Scaling augments overall query concurrency by adding additional cluster resources dynamically, responding to increased demand for concurrent users and queries.&lt;/p&gt;
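&lt;p&gt;As a hedged sketch, an elastic resize request maps onto a small parameter set in the Redshift API (boto3's resize_cluster call); the cluster identifier and node count below are placeholders:&lt;/p&gt;

```python
# Sketch: parameters for an elastic resize, shaped like boto3's
# resize_cluster arguments. Placeholder cluster name and node count.
def elastic_resize_params(cluster_id, number_of_nodes):
    return {
        "ClusterIdentifier": cluster_id,
        "NumberOfNodes": number_of_nodes,
        # Classic=False requests an elastic resize, which completes in
        # minutes with only a brief unavailability window.
        "Classic": False,
    }

params = elastic_resize_params("analytics-cluster", 4)

# The real call would be:
#   import boto3
#   redshift = boto3.client("redshift")
#   redshift.resize_cluster(**params)
print(params["NumberOfNodes"])
```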

&lt;h3&gt;
  
  
  Interfaces &lt;a&gt;&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Amazon Redshift provides a variety of interfaces that make it easy for developers to use and manage. These interfaces include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Amazon Redshift query editor v2:&lt;/strong&gt; A web-based query editor that you can use to run SQL queries against your data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon Redshift APIs:&lt;/strong&gt; A set of APIs that you can use to programmatically manage your Amazon Redshift cluster.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AWS Command Line Interface (CLI):&lt;/strong&gt; A command-line tool that you can use to manage your Amazon Redshift cluster.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon Redshift console:&lt;/strong&gt; A web-based console that you can use to manage your Amazon Redshift cluster.&lt;/li&gt;
&lt;/ul&gt;
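&lt;p&gt;For the programmatic route, queries can also be submitted over HTTPS through the Redshift Data API (the "redshift-data" client in boto3) without managing drivers or connections. The sketch below builds the request parameters; the cluster, database, user, and SQL are placeholders:&lt;/p&gt;

```python
# Sketch: parameters for the Redshift Data API's execute_statement call.
# Placeholder cluster/database/user/SQL values.
def execute_statement_params(cluster_id, database, db_user, sql):
    return {
        "ClusterIdentifier": cluster_id,
        "Database": database,
        "DbUser": db_user,
        "Sql": sql,
    }

params = execute_statement_params(
    "analytics-cluster", "dev", "awsuser",
    "SELECT COUNT(*) FROM sales;",
)

# The real call would be:
#   import boto3
#   client = boto3.client("redshift-data")
#   response = client.execute_statement(**params)
#   ...then poll describe_statement(Id=response["Id"]) until it finishes.
print(params["Sql"])
```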

&lt;h2&gt;
  
  
  Data Ingestion and Loading &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;Effectively ingesting and loading data into your Amazon Redshift data warehouse is crucial for performing accurate and timely analytics. Amazon Redshift offers a variety of data ingestion methods to accommodate different data sources and workload requirements.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn7v87swfe6h6hvm2kzx0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn7v87swfe6h6hvm2kzx0.png" alt="Amazon Redshift Product Diagram" width="800" height="288"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h5&gt;
  
  
  &lt;em&gt;Amazon Redshift Product Diagram, Amazon Redshift Product Website 2023.&lt;/em&gt;
&lt;/h5&gt;

&lt;h3&gt;
  
  
  Data Ingestion Methods &lt;a&gt;&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Managing the inflow of data into your Amazon Redshift data warehouse is pivotal for accurate and timely analytics. This involves understanding the diverse methods and best practices for data ingestion so that your organization can seamlessly integrate various data sources. To that end, Amazon Redshift supports several ingestion methods:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon S3:&lt;/strong&gt; Amazon S3 is the most common data source for Amazon Redshift ingestion. Data can be loaded from Amazon S3 using the COPY command, which efficiently copies data in parallel across all compute nodes in the cluster.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon DynamoDB:&lt;/strong&gt; Amazon Redshift can ingest data directly from Amazon DynamoDB tables using the COPY command. This method is particularly useful for loading real-time data from DynamoDB streams.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon EMR and AWS Glue:&lt;/strong&gt; Amazon EMR and AWS Glue can be used to process and transform data before loading it into Amazon Redshift. These services provide a range of data processing capabilities, including data cleansing, filtering, and transformation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS Data Pipeline:&lt;/strong&gt; AWS Data Pipeline is a data orchestration service that can be used to automate the process of ingesting data from various sources into Amazon Redshift. Data Pipeline can be used to schedule data ingestion jobs, track data lineage, and monitor data quality.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;SSH-enabled hosts:&lt;/strong&gt; Data can also be loaded into Amazon Redshift from SSH-enabled hosts, both on Amazon EC2 instances and on-premises servers. This method is useful for ingesting data from legacy systems or custom applications.&lt;/p&gt;
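&lt;p&gt;To make the S3 ingestion path concrete, here is a minimal sketch of composing the COPY statement that loads files from an S3 prefix. The table name, bucket, and IAM role ARN are hypothetical placeholders; in practice the statement would be run from a SQL client connected to the cluster.&lt;/p&gt;

```python
def copy_from_s3(table, s3_path, iam_role, fmt="CSV"):
    """Compose a Redshift COPY statement that loads files from S3 in
    parallel across the cluster's compute nodes."""
    return (
        f"COPY {table} "
        f"FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        f"FORMAT AS {fmt};"
    )

stmt = copy_from_s3(
    "sales",
    "s3://my-ingest-bucket/sales/2023/",                # hypothetical bucket
    "arn:aws:iam::123456789012:role/RedshiftCopyRole",  # hypothetical role
)
print(stmt)
```

Pointing COPY at a prefix rather than a single file lets Redshift split the load across slices, which is where the parallelism described above comes from.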

&lt;h3&gt;
  
  
  Data Loading Methods &lt;a&gt;&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Loading data into Amazon Redshift is a critical aspect of the data analytics journey. The COPY command, optimized for parallel processing, and the UNLOAD command, facilitating data export, are pivotal tools. Understanding these loading methods and incorporating best practices ensures efficient and reliable data loading, contributing to the overall success of your analytics endeavors.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;COPY command:&lt;/strong&gt; The COPY command is the most efficient way to load data into Amazon Redshift. It allows you to specify the data source, format, and destination table. The COPY command automatically optimizes data loading for parallel processing across multiple compute nodes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;UNLOAD command:&lt;/strong&gt; The UNLOAD command is used to unload data from Amazon Redshift into a variety of formats, including CSV, JSON, and Parquet. This method is useful for exporting data for further analysis or for creating data backups.&lt;/p&gt;
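&lt;p&gt;Mirroring the COPY example, the sketch below composes an UNLOAD statement that exports a query result to S3 as Parquet. The query, S3 prefix, and IAM role ARN are hypothetical placeholders.&lt;/p&gt;

```python
def unload_to_s3(query, s3_prefix, iam_role, fmt="PARQUET"):
    """Compose a Redshift UNLOAD statement that exports the result of
    a query to files under an S3 prefix in the requested format."""
    return (
        f"UNLOAD ('{query}') "
        f"TO '{s3_prefix}' "
        f"IAM_ROLE '{iam_role}' "
        f"FORMAT AS {fmt};"
    )

stmt = unload_to_s3(
    "SELECT * FROM sales",
    "s3://my-export-bucket/sales/",                       # hypothetical bucket
    "arn:aws:iam::123456789012:role/RedshiftUnloadRole",  # hypothetical role
)
print(stmt)
```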

&lt;h3&gt;
  
  
  Data Ingestion Best Practices &lt;a&gt;&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;To ensure efficient and reliable data ingestion, follow these best practices:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Choose an appropriate data ingestion method:&lt;/strong&gt; Select the data ingestion method that best suits your data source, workload requirements, and desired level of automation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Partition data:&lt;/strong&gt; Partitioning data based on frequently used query predicates can significantly improve query performance and reduce storage costs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Compress data:&lt;/strong&gt; Compressing data using appropriate compression algorithms can reduce storage requirements and improve data loading performance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Monitor data ingestion:&lt;/strong&gt; Regularly monitor data ingestion metrics to identify and address any performance bottlenecks or data quality issues.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Implement data governance:&lt;/strong&gt; Establish clear data governance policies to ensure data quality, consistency, and security.&lt;/p&gt;
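&lt;p&gt;In Redshift terms, the partitioning and compression advice above typically translates into distribution keys, sort keys, and column compression encodings chosen at table-design time. The hypothetical DDL below is one sketch of what that can look like; the table, columns, and encoding choices are illustrative, not a recommendation for any specific workload.&lt;/p&gt;

```python
# Hypothetical fact table showing DISTKEY, SORTKEY, and per-column
# ENCODE settings (az64 and lzo are real Redshift encodings).
DDL = """
CREATE TABLE sales_fact (
    sale_id    BIGINT        ENCODE az64,
    sold_at    DATE          ENCODE az64,
    store_id   INTEGER       ENCODE az64,
    amount     DECIMAL(12,2) ENCODE az64,
    promo_code VARCHAR(16)   ENCODE lzo
)
DISTKEY (store_id)
SORTKEY (sold_at);
"""
print(DDL.strip())
```

Sorting on a commonly filtered column such as a date lets Redshift skip blocks at query time, which is the performance effect the "partition data" practice is after.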

&lt;h2&gt;
  
  
  Anti-patterns to Avoid When Using Amazon Redshift &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;Amazon Redshift is a powerful and versatile data warehousing solution, but it is important to use it appropriately to avoid performance issues and unnecessary costs. Here are some anti-patterns to avoid when using Amazon Redshift:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Using Amazon Redshift for OLTP Workloads&lt;/strong&gt;&lt;br&gt;
Amazon Redshift is optimized for analytical workloads, such as data warehousing and business intelligence. It is not designed for Online Transaction Processing (OLTP) workloads, which involve frequent insertions, updates, and deletes. If you need to run OLTP workloads, you should use a traditional row-based database system, such as Amazon RDS, which is specifically designed for handling transactional data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Storing BLOB Data&lt;/strong&gt;&lt;br&gt;
Amazon Redshift is not designed to store Binary Large Objects (BLOBs), which are large unstructured data objects such as images, videos, and audio files. Storing BLOBs in Amazon Redshift can significantly impact performance and increase storage costs. If you need to store BLOBs, you should use Amazon S3, a highly scalable and cost-effective object storage service.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Using Amazon Redshift for Real-time Analytics&lt;/strong&gt;&lt;br&gt;
Amazon Redshift is designed for batch processing and analytical workloads, and it is not optimized for real-time analytics. If you need to perform real-time analytics on data streams, you should use a streaming data platform, such as Amazon Kinesis Data Streams or Amazon MSK, which are specifically designed for handling real-time data ingestion and analysis.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;Amazon Redshift stands as a beacon in the vast sea of data, offering organizations a transformative journey from data chaos to analytical clarity. As we've explored its features, benefits, and best practices, it's evident that Redshift isn't just a data warehouse; it's a catalyst for unlocking the true potential of your data.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>redshift</category>
      <category>bigdata</category>
      <category>cloud</category>
    </item>
    <item>
      <title>AWS Data Engineering Certification</title>
      <dc:creator>Osama Shamout</dc:creator>
      <pubDate>Wed, 01 Nov 2023 10:58:59 +0000</pubDate>
      <link>https://dev.to/oshamout/aws-data-engineering-certification-3iak</link>
      <guid>https://dev.to/oshamout/aws-data-engineering-certification-3iak</guid>
      <description>&lt;p&gt;Welcome to the AWS Data Engineering Associate Certification Study Guide! If you’re new to data engineering and want to become certified in AWS, you’re in the right place. This guide is designed for beginners with no prior knowledge and should get you started and ready to get certified with AWS!&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Data Engineering?
&lt;/h2&gt;

&lt;p&gt;Data engineering is a field within data management that focuses on the practical application of data collection and data processing. Data engineers are responsible for designing, building, and maintaining the architecture (often referred to as the data pipeline) that enables an organization to collect, store, and analyze data efficiently and effectively. In other words, data engineering is the powerhouse behind your favorite apps, services, and online experiences. It’s what ensures data flows seamlessly, enabling companies to make smart decisions, create personalized user experiences, and even predict your next online purchase. Let’s take a closer look at how data engineering works and why it’s essential, with real-world examples:&lt;/p&gt;

&lt;h3&gt;
  
  
  Data Collection - The Netflix Example
&lt;/h3&gt;

&lt;p&gt;Imagine you’re watching Netflix. While you’re binging on your favorite show, data engineers at Netflix are collecting valuable information about your viewing habits. They use tools and processes to capture what you watch, how long you watch it, and whether you binge-watch on weekends or savor an episode each day.&lt;/p&gt;

&lt;h3&gt;
  
  
  Data Transformation - Amazon's Product Recommendations
&lt;/h3&gt;

&lt;p&gt;Ever noticed how Amazon seems to know exactly what you want to buy next? Data engineers at Amazon work behind the scenes to transform and analyze massive amounts of data. They take your past purchases, your browsing history, and even reviews from other customers to create those uncannily accurate product recommendations.&lt;/p&gt;

&lt;h3&gt;
  
  
  Data Storage - Airbnb's User Profiles
&lt;/h3&gt;

&lt;p&gt;Airbnb, the global home-sharing platform, relies on data engineering to store and manage user profiles, property details, and booking information. Data engineers ensure this information is securely stored and can be quickly retrieved when you’re looking for that perfect getaway.&lt;/p&gt;

&lt;h3&gt;
  
  
  Data Processing - Uber's Real-Time Rides
&lt;/h3&gt;

&lt;p&gt;When you book an Uber ride, data engineering springs into action. It processes real-time data from both drivers and riders, calculating routes and pricing on the fly. This dynamic data processing ensures you get a ride quickly and at a fair price.&lt;/p&gt;

&lt;h3&gt;
  
  
  Data Integration - Spotify's Playlists
&lt;/h3&gt;

&lt;p&gt;Spotify is all about personalized music experiences. Data engineers at Spotify integrate data from various sources, like your listening history, your friends’ playlists, and music charts, to create customized playlists and music recommendations.&lt;/p&gt;

&lt;h3&gt;
  
  
  Data Quality - Banking and Fraud Detection
&lt;/h3&gt;

&lt;p&gt;In the world of finance, data engineers play a crucial role in maintaining data quality. They implement rigorous data validation and monitoring processes to detect unusual transactions that could signal fraud. Banks use these data engineering techniques to protect your hard-earned money.&lt;/p&gt;

&lt;h3&gt;
  
  
  Scalability and Performance - Twitter's Real-Time Feeds
&lt;/h3&gt;

&lt;p&gt;Twitter’s data engineering team ensures that tweets flow in real-time to millions of users. They’ve built a scalable infrastructure that handles the enormous volume of data generated by users worldwide, all while maintaining performance and reliability.&lt;/p&gt;

&lt;h3&gt;
  
  
  Data Governance and Compliance - Healthcare and HIPAA
&lt;/h3&gt;

&lt;p&gt;In the healthcare industry, data engineers adhere to strict data governance rules, such as HIPAA. They implement data security measures to protect sensitive patient information, ensuring that only authorized personnel can access it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why become a Data Engineer?
&lt;/h2&gt;

&lt;p&gt;Data Engineering forms the bedrock upon which Data Science is constructed. Think of it as the solid foundation that supports the entire data-driven ecosystem. Just as quality ingredients are vital to a great meal, reliable data is essential for meaningful insights. Without Data Engineering, there can be no Data Science. It’s the starting point for logging, storing, and analyzing data, paving the way for the exciting world of machine learning, AI, and deep learning. Furthermore, Data Engineering provides solid job security, boasting an expected growth rate of 17.6% and an average salary of $103,000 in the U.S., a staggering 42% higher than the U.S. average salary. So, why not start your journey with Data Engineering by taking the AWS Certification and build a secure and promising future in the data domain?&lt;/p&gt;

&lt;h2&gt;
  
  
  Pre-requisites for an AWS Data Engineer Associate Certification
&lt;/h2&gt;

&lt;p&gt;Before you dive into your journey to become an AWS Certified Data Engineer – Associate, there are a few things to consider. While there are no strict pre-requisites for this certification, it’s recommended that candidates have the equivalent of 2-3 years of experience in data engineering or data architecture, coupled with at least 1-2 years of hands-on experience with AWS services. This combination of knowledge and practical experience will help you navigate the certification process more effectively.&lt;/p&gt;

&lt;h2&gt;
  
  
  Exam Details
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Exam Format
&lt;/h3&gt;

&lt;p&gt;Understanding the exam format is crucial for effective preparation. The AWS Data Engineer Associate Certification exam consists of 85 questions in two primary formats: multiple choice and multiple response. It’s important to note that unanswered questions are scored as incorrect, but there’s no penalty for guessing. The exam duration is 170 minutes, with 70 questions contributing to your final score; the remaining 15 are unscored and are used to gather performance data for future exams.&lt;/p&gt;

&lt;h3&gt;
  
  
  Exam Content and Essential AWS Services
&lt;/h3&gt;

&lt;p&gt;To excel in the AWS Data Engineer Associate Certification, a solid foundation in data engineering principles and a deep knowledge of AWS services are indispensable. The exam content spans four crucial domains, each requiring proficiency in various aspects of data engineering. Let’s delve into these domains:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Data Ingestion and Transformation (34%):&lt;/strong&gt;&lt;br&gt;
This domain underlines the candidate’s capacity to proficiently ingest and transform data, orchestrate data pipelines, and apply programming concepts effectively. It encompasses a wide spectrum of data transformation techniques, making it a critical skill for a data engineer.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Data Store Management (26%):&lt;/strong&gt;&lt;br&gt;
Choosing optimal data stores, designing efficient data models, cataloging data schemas, and managing data lifecycles are the core elements of this domain. It demands a comprehensive understanding of how data should be structured and managed, aligning it with your organization’s requirements.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Data Operations and Support (22%):&lt;/strong&gt;&lt;br&gt;
Operationalizing, maintaining, and monitoring data pipelines, along with ensuring data quality and consistency, fall within this domain. An AWS Data Engineer must possess the skills to troubleshoot issues, optimize performance, and facilitate data operations seamlessly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Data Security and Governance (18%):&lt;/strong&gt;&lt;br&gt;
The final domain addresses crucial aspects like authentication, authorization, data encryption, privacy, and governance. It also focuses on enabling effective logging, ensuring data is handled securely and according to compliance standards.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fngqnaisk0ywji0l0lr6m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fngqnaisk0ywji0l0lr6m.png" alt="Content Distribution in the Data Engineering Certificate" width="800" height="514"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;While mastering these domains is essential, practical experience in previous data engineering frameworks is just the beginning. You must also harness the power of AWS services to apply these concepts in the AWS ecosystem effectively. Here are some of the key AWS services to get you started:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Amazon S3 (Simple Storage Service):&lt;/strong&gt; At the core of AWS data storage, Amazon S3 plays a pivotal role in data engineering. Candidates must understand how to use S3 for data storage, ingestion, and distribution. Proficiency in setting up S3 buckets, managing permissions, and configuring data lifecycle policies is vital.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;AWS Glue:&lt;/strong&gt; AWS Glue is a fully managed ETL service, simplifying the preparation and loading of data. Candidates should be skilled in using AWS Glue to create ETL jobs, data catalogs, and data transformation scripts. Automating data pipelines and orchestrating ETL processes is a must.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Amazon Redshift:&lt;/strong&gt; A robust data warehousing service for analytics, Amazon Redshift demands knowledge in data modeling, query optimization, and working with large datasets. Understanding schema design, query optimization, and data loading best practices is essential.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;AWS EMR (Elastic MapReduce):&lt;/strong&gt; As a cloud-native big data platform, AWS EMR is designed for processing vast data volumes. Candidates should be familiar with EMR clusters, Hadoop, Spark, and other big data technologies. The ability to create and manage EMR clusters, work with S3-stored data, and optimize EMR job performance is vital.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Amazon Kinesis:&lt;/strong&gt; Amazon Kinesis offers real-time data streaming and analytics services. A strong grasp of Kinesis Data Streams, Kinesis Data Firehose, and Kinesis Data Analytics is necessary. The knowledge to ingest, process, and analyze streaming data is crucial for data engineering.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;AWS Lambda:&lt;/strong&gt; AWS Lambda, a serverless compute service, is ideal for event-driven processing. Proficiency in using Lambda functions to automate data processing, trigger events, and perform serverless data transformations is vital. This includes configuring Lambda functions, working with event sources, and handling errors.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Amazon Athena:&lt;/strong&gt; Amazon Athena is an interactive query service for analyzing data in Amazon S3 using standard SQL. Candidates should be proficient in writing SQL queries for data analysis, creating views, and data catalogs within Athena. Understanding how to use Athena for ad-hoc queries and reporting is important.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;AWS Step Functions:&lt;/strong&gt; AWS Step Functions is a serverless orchestration service for building workflows. Candidates should understand how to create and manage serverless workflows to orchestrate data pipelines, manage dependencies, and ensure reliable data processing.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
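&lt;p&gt;As a taste of the Step Functions item above, here is a minimal sketch of an Amazon States Language definition that chains a Glue ETL job into a Lambda validation step. The job and function names are hypothetical placeholders; only the structure and the service-integration ARNs reflect the real States Language.&lt;/p&gt;

```python
import json

# Minimal Step Functions state machine (Amazon States Language) sketch:
# run a hypothetical Glue ETL job, then a hypothetical Lambda check.
state_machine = {
    "Comment": "Illustrative data pipeline - names are placeholders",
    "StartAt": "RunEtlJob",
    "States": {
        "RunEtlJob": {
            "Type": "Task",
            "Resource": "arn:aws:states:::glue:startJobRun.sync",
            "Parameters": {"JobName": "nightly-etl"},          # hypothetical job
            "Next": "ValidateOutput",
        },
        "ValidateOutput": {
            "Type": "Task",
            "Resource": "arn:aws:states:::lambda:invoke",
            "Parameters": {"FunctionName": "validate-output"}, # hypothetical fn
            "End": True,
        },
    },
}
print(json.dumps(state_machine, indent=2))
```

The `.sync` suffix on the Glue integration makes Step Functions wait for the job to finish before moving on, which is how dependencies between pipeline stages are expressed.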

&lt;p&gt;These services form the core toolkit for AWS Data Engineers. Mastering them is essential to excel in the AWS Data Engineer Associate Certification exam and to thrive in data engineering roles.&lt;/p&gt;

&lt;h2&gt;
  
  
  Exam Resources
&lt;/h2&gt;

&lt;p&gt;To embark on a successful journey towards AWS Data Engineer Associate certification, it’s crucial to have the right resources at your disposal. AWS offers a comprehensive selection of recommended materials to aid in your preparation, ensuring that you’re well-equipped to excel in the certification exam. Among the most important resources are:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://aws.amazon.com/certification/certified-data-engineer-associate/" rel="noopener noreferrer"&gt;AWS Certified Data Engineer Associate Exam Page:&lt;/a&gt; This official AWS certification exam page serves as your gateway to a wealth of information and relevant links. Here, you’ll find essential details about the exam, including registration, pricing, and any updates or announcements. It’s your starting point for accessing all other pertinent resources.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://d1.awsstatic.com/training-and-certification/docs-data-engineer-associate/AWS-Certified-Data-Engineer-Associate_Exam-Guide.pdf" rel="noopener noreferrer"&gt;AWS Certified Data Engineer – Associate (DEA-C01) Exam Guide by AWS:&lt;/a&gt; The official AWS exam guide is an invaluable resource. It outlines the service requirements, typical knowledge areas that need to be demonstrated, and the skills you must master to succeed in the exam. It provides a structured overview of the topics you’ll encounter, making it an essential reference during your preparation.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://explore.skillbuilder.aws/learn/course/internal/view/elearning/16985/aws-certified-data-engineer-associate-official-practice-question-set-dea-c01-english" rel="noopener noreferrer"&gt;Official AWS Practice Question Set for DEA-C01:&lt;/a&gt; This official practice question set is a valuable resource for self-assessment. It includes sample questions designed to test your knowledge and readiness for the certification exam. Practicing with these questions will help you gauge your understanding of the exam topics and identify areas that may need further study.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://www.udemy.com/course/aws-data-engineer/" rel="noopener noreferrer"&gt;Udemy AWS Data Engineer Course:&lt;/a&gt; If you’re looking for a comprehensive and hands-on preparation course, consider enrolling in the Udemy AWS Data Engineer course. Led by best-selling Udemy instructors Frank Kane and Stéphane Maarek, this course is a collaboration between two experts who have collectively taught over 2 million people worldwide. Frank Kane, with his extensive experience in wrangling massive data sets during his nine-year tenure at Amazon, brings a unique perspective to the course. It already boasts a remarkable 4.8-star review rating at the time of writing. This course combines Stéphane’s depth on AWS with Frank’s expertise, making it an excellent choice for in-depth certification preparation, and for me, Stéphane has helped me tremendously in passing all my AWS Certification Exams!&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;These resources collectively provide a well-rounded preparation package, ensuring you’re equipped with the knowledge and skills needed to excel in the AWS Data Engineer Associate certification exam. Utilize these materials to enhance your understanding of the exam topics and boost your confidence for the actual test.&lt;/p&gt;

&lt;h2&gt;
  
  
  Registration
&lt;/h2&gt;

&lt;p&gt;Registering for the AWS Data Engineering Associate Certification exam is a straightforward process. By following AWS’s guidelines, which include scheduling and fee information, you can take this significant step towards enhancing your career. To ensure a smooth and hassle-free registration, it’s advisable to plan your exam date and time well in advance.&lt;/p&gt;

&lt;p&gt;Here’s what you need to know when registering:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Exam Availability:&lt;/strong&gt; The registration period for the AWS Data Engineering Associate Certification exam commences on October 31, 2023. During this time frame, candidates can take the exam between November 27, 2023, and January 12, 2024. This flexibility in scheduling allows you to choose a convenient time to showcase your skills and knowledge. It’s important to note that during this period, the exam is in its Beta stage. AWS Certification employs beta exams to evaluate the performance of exam items before integrating them into live exams. If the beta proves successful, candidates who pass will be among the first to earn the new certification. Beta exam results will be available 90 days from the close of the beta exam. Afterward, the official exam will be administered starting from March 2024.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Exam Duration and Format:&lt;/strong&gt; The AWS Data Engineering Associate Certification exam is a comprehensive test, with a total duration of 170 minutes. It features 85 questions, which can be either multiple-choice or multiple-response. This format ensures a thorough assessment of your understanding of the material.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Cost:&lt;/strong&gt; Registering for the exam requires a fee of 75 USD*. For any additional cost-related information, it’s recommended to visit the “Exam pricing” section.&lt;/p&gt;

&lt;p&gt;*There may be other fees related to the testing center; please consult your nearby testing center for further information.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Testing Options:&lt;/strong&gt; As an aspiring candidate, you have the flexibility to select your preferred testing mode. You can either opt for an in-person exam conducted at a nearby Pearson VUE testing center or choose the convenience of an online proctored exam. This choice allows you to align your examination experience with your preferences.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Languages Offered:&lt;/strong&gt; The exam is available exclusively in English.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Your journey to becoming an AWS Data Engineer Associate is an exciting adventure into the world of data engineering. This certification not only validates your knowledge and skills but also opens doors to a realm of opportunities in the ever-expanding field of data management. Whether you’re new to data engineering or aiming to take your existing expertise to new heights, this certification sets you on a path to success.&lt;/p&gt;

&lt;p&gt;Best of luck in your studies and on your journey to become AWS certified!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>dataengineering</category>
      <category>cloud</category>
      <category>beginners</category>
    </item>
  </channel>
</rss>
