<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Ayinde Jamiu</title>
    <description>The latest articles on DEV Community by Ayinde Jamiu (@ayindejamiu).</description>
    <link>https://dev.to/ayindejamiu</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F789547%2F3d0f0b07-9b48-4278-8e7a-dd4778e5ba31.jpeg</url>
      <title>DEV Community: Ayinde Jamiu</title>
      <link>https://dev.to/ayindejamiu</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ayindejamiu"/>
    <language>en</language>
    <item>
      <title>Sports API System with Amazon ECS and API Gateway</title>
      <dc:creator>Ayinde Jamiu</dc:creator>
      <pubDate>Sun, 26 Jan 2025 08:21:09 +0000</pubDate>
      <link>https://dev.to/ayindejamiu/sports-api-system-with-amazon-ecs-and-api-gateway-1da1</link>
      <guid>https://dev.to/ayindejamiu/sports-api-system-with-amazon-ecs-and-api-gateway-1da1</guid>
      <description>&lt;h2&gt;Introduction&lt;/h2&gt;

&lt;p&gt;This project demonstrates the creation of a containerized API management system for querying real-time sports data. The system uses Amazon ECS (Fargate) for container orchestration, Amazon API Gateway for RESTful endpoints, and integrates seamlessly with an external Sports API. The implementation emphasizes best practices in cloud computing, including API management, container orchestration, and secure AWS service integration.&lt;/p&gt;

&lt;h2&gt;Prerequisites&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Sports API Key&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Register for a free account at serpapi.com and obtain your API key for accessing sports data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS Account&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Ensure you have an active AWS account. Familiarity with ECS, API Gateway, Docker, and Python is recommended.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS CLI&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Install and configure the AWS CLI to enable programmatic interaction with AWS services.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Docker CLI and Desktop&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Install the Docker CLI and Docker Desktop to build and push container images efficiently.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Project Structure&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh1kthn258fibbuonqwva.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh1kthn258fibbuonqwva.png" alt="Image description" width="374" height="155"&gt;&lt;/a&gt;&lt;/p&gt;
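&lt;p&gt;At the heart of the project is a small Python web service that the container runs. As a rough sketch (hypothetical; see app.py in the repository for the real implementation, and note that the route name and Flask itself are assumptions here), it exposes a /sports endpoint on port 8080, the container port configured later in the ECS task definition:&lt;/p&gt;

```python
# Hypothetical sketch of the containerized service; the real app.py lives in the repo.
# Exposes GET /sports on port 8080, matching the ECS task definition's container port.
import os

from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/sports")
def sports():
    # The real service would call the external Sports API using SERP_API_KEY;
    # this stub just shows the route shape and the environment variable lookup.
    api_key = os.environ.get("SERP_API_KEY", "")
    return jsonify({"message": "sports data placeholder", "api_key_set": bool(api_key)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```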

&lt;p&gt;Now, let us begin &lt;/p&gt;

&lt;p&gt;Create a new directory &lt;br&gt;
&lt;code&gt;mkdir containerized-sports-api&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Move into the directory &lt;br&gt;
&lt;code&gt;cd containerized-sports-api&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Clone my repository &lt;br&gt;
&lt;code&gt;git clone https://github.com/Ayindejamiu/sportapiwithecs.git&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Create an ECR repository &lt;br&gt;
&lt;code&gt;aws ecr create-repository --repository-name sports-api --region us-east-1&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Authenticate, then build and push the Docker image&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin YOUR_AWS_ACCOUNT_ID.dkr.ecr.us-east-1.amazonaws.com

docker build --platform linux/amd64 -t sports-api .
docker tag sports-api:latest YOUR_AWS_ACCOUNT_ID.dkr.ecr.us-east-1.amazonaws.com/sports-api:sports-api-latest
docker push YOUR_AWS_ACCOUNT_ID.dkr.ecr.us-east-1.amazonaws.com/sports-api:sports-api-latest
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Note: Replace YOUR_AWS_ACCOUNT_ID with your AWS account ID. &lt;/p&gt;
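&lt;p&gt;For reference, the image being built could follow a Dockerfile along these lines (a hypothetical sketch assuming a Python app served on port 8080; check the repository for the actual file):&lt;/p&gt;

```dockerfile
# Hypothetical Dockerfile sketch for the sports-api image
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8080
CMD ["python", "app.py"]
```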

&lt;p&gt;After pushing, your output should look like this &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0e7w2c4wl95p25ufmn7b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0e7w2c4wl95p25ufmn7b.png" alt="Image description" width="770" height="185"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Setup ECS Cluster with Fargate&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Create an ECS Cluster:&lt;/p&gt;

&lt;p&gt;Go to the ECS Console → Clusters → Create Cluster&lt;/p&gt;

&lt;p&gt;Name your Cluster (sports-api-cluster)&lt;/p&gt;

&lt;p&gt;For Infrastructure, select Fargate, then create Cluster&lt;/p&gt;

&lt;p&gt;Create a Task definition&lt;/p&gt;

&lt;p&gt;Go to Task Definitions → Create New Task Definition&lt;/p&gt;

&lt;p&gt;Name your task definition (sports-api-task)&lt;/p&gt;

&lt;p&gt;For Infrastructure, select Fargate&lt;/p&gt;

&lt;p&gt;Add the container:&lt;/p&gt;

&lt;p&gt;Name your container (sports-api-container)&lt;/p&gt;

&lt;p&gt;Image URI: open your private repository in the ECR console and copy the image URI from there.&lt;/p&gt;

&lt;p&gt;Container Port: 8080&lt;/p&gt;

&lt;p&gt;Protocol: TCP&lt;/p&gt;

&lt;p&gt;Port Name: Leave Blank&lt;/p&gt;

&lt;p&gt;App Protocol: HTTP&lt;/p&gt;

&lt;p&gt;Define Environment Variables:&lt;/p&gt;

&lt;p&gt;Key: SERP_API_KEY&lt;/p&gt;

&lt;p&gt;Value: your SerpAPI key &lt;/p&gt;
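&lt;p&gt;The container reads this key from the environment at runtime. A minimal sketch of how the service might build a SerpAPI request from it (the engine and query parameters here are illustrative assumptions, not the repository's exact values):&lt;/p&gt;

```python
# Sketch: build a SerpAPI request URL from the SERP_API_KEY environment variable.
# The engine/query parameters are illustrative assumptions.
import os
from urllib.parse import urlencode

def build_serpapi_url(query: str) -> str:
    api_key = os.environ.get("SERP_API_KEY")
    if not api_key:
        raise RuntimeError("SERP_API_KEY is not set")
    params = urlencode({"engine": "google", "q": query, "api_key": api_key})
    return "https://serpapi.com/search.json?" + params
```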

&lt;p&gt;&lt;strong&gt;Create task definition&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Run the Service with an ALB&lt;/p&gt;

&lt;p&gt;Go to Clusters → Select Cluster → Service → Create.&lt;/p&gt;

&lt;p&gt;Capacity provider: Fargate&lt;/p&gt;

&lt;p&gt;Select Deployment configuration family (sports-api-task)&lt;/p&gt;

&lt;p&gt;Name your service (sports-api-service)&lt;/p&gt;

&lt;p&gt;Desired tasks: 2&lt;/p&gt;

&lt;p&gt;Networking: Create new security group&lt;/p&gt;

&lt;p&gt;Networking Configuration:&lt;/p&gt;

&lt;p&gt;Type: All TCP&lt;/p&gt;

&lt;p&gt;Source: Anywhere&lt;/p&gt;

&lt;p&gt;Load Balancing: Select Application Load Balancer (ALB).&lt;/p&gt;

&lt;p&gt;ALB Configuration:&lt;/p&gt;

&lt;p&gt;Create a new ALB:&lt;/p&gt;

&lt;p&gt;Name: sports-api-alb&lt;/p&gt;

&lt;p&gt;Target Group health check path: "/sports"&lt;/p&gt;

&lt;p&gt;Create service&lt;/p&gt;

&lt;p&gt;Test the ALB&lt;/p&gt;

&lt;p&gt;After deploying the ECS service, note the DNS name of the ALB (e.g., sports-api-alb-XXXXXXXX.us-east-1.elb.amazonaws.com)&lt;/p&gt;

&lt;p&gt;Confirm the API is accessible by visiting the ALB DNS name in your browser with /sports appended (e.g., http://sports-api-alb-XXXXXXXX.us-east-1.elb.amazonaws.com/sports)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuewfwx3qwsg253jy4xun.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuewfwx3qwsg253jy4xun.png" alt="Image description" width="800" height="78"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configure API Gateway&lt;/strong&gt;&lt;br&gt;
Create a New REST API:&lt;br&gt;
Go to API Gateway Console → Create API → REST API&lt;br&gt;
Name the API (e.g., Sports API Gateway)&lt;br&gt;
Set Up Integration:&lt;br&gt;
Create a resource /sports&lt;br&gt;
Create a GET method&lt;br&gt;
Choose HTTP Proxy as the integration type&lt;br&gt;
Enter the full ALB URL including "/sports" (e.g., http://sports-api-alb-XXXXXXXX.us-east-1.elb.amazonaws.com/sports)&lt;br&gt;
Deploy the API:&lt;br&gt;
Deploy the API to a stage (e.g., dev)&lt;br&gt;
Note the endpoint URL&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F91dfls4eu469q69tr2z1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F91dfls4eu469q69tr2z1.png" alt="Image description" width="800" height="78"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>ANUG AWS re:Invent Recap: How It Went Down</title>
      <dc:creator>Ayinde Jamiu</dc:creator>
      <pubDate>Sat, 18 Jan 2025 21:10:19 +0000</pubDate>
      <link>https://dev.to/ayindejamiu/anug-aws-reinvent-recap-how-it-went-down-4ahb</link>
      <guid>https://dev.to/ayindejamiu/anug-aws-reinvent-recap-how-it-went-down-4ahb</guid>
      <description>&lt;p&gt;In just 12 days, we planned and executed an amazing AWS re:Invent Recap event that brought together cloud enthusiasts, experts, and community members to relive the highlights of AWS re:Invent 2024. Here’s a look at how it all unfolded:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Journey Begins&lt;/strong&gt;&lt;br&gt;
Our planning kicked off on January 6th, 2025, where we proposed the event date and began laying the groundwork for a successful program.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Securing Outstanding Speakers&lt;/strong&gt;&lt;br&gt;
We prioritized finding speakers who attended AWS re:Invent 2024 in person and sought out experts in the cloud space. We were thrilled to have:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Elizabeth Adegbaju&lt;/li&gt;
&lt;li&gt;Adebayo Balogun&lt;/li&gt;
&lt;li&gt;Victory Eze&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These incredible panelists accepted our invitation without hesitation and delivered an engaging and insightful discussion. Special shoutout to Ndimofor Ateh, who not only recapped the latest AWS announcements but also demonstrated how to implement them in real-world scenarios, leaving the audience inspired and empowered.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Power of Volunteers&lt;/strong&gt;&lt;br&gt;
As the saying goes, "If you want to go fast, go alone; if you want to go far, go together." Our volunteers exemplified this perfectly, exceeding all expectations with their dedication and results.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Ibraheem Ejibode: Our stellar Social Media Manager, keeping the buzz alive.&lt;/li&gt;
&lt;li&gt;Mary Ogunmola: A phenomenal Event Organizer, ensuring everything ran smoothly.&lt;/li&gt;
&lt;li&gt;Ezikel Kwerechukw and Ifeoluwa Michael: Our incredible hosts who brought energy and professionalism to the event.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;The Results&lt;/strong&gt;&lt;br&gt;
The event was a resounding success!&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;200+ attendees registered to learn, connect, and grow.&lt;/li&gt;
&lt;li&gt;Practical demos and hands-on sessions led by Ndimofor were a highlight.&lt;/li&gt;
&lt;li&gt;The panelists shared their unique re:Invent experiences, offering inspiration and valuable insights.&lt;/li&gt;
&lt;li&gt;AWS credits were awarded as swags to five lucky participants!&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;A Heartfelt Thanks&lt;/strong&gt;&lt;br&gt;
We couldn’t have achieved this without the unwavering support of our community and our backbone, Thembile, whose encouragement and guidance were invaluable throughout this journey. If you attended the event, share your experience with us. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;In Conclusion&lt;/strong&gt;&lt;br&gt;
The AWS re:Invent Recap was more than just an event—it was a celebration of learning, collaboration, and community spirit. We’re incredibly proud of what we’ve achieved and can’t wait to host more impactful events in the future.&lt;/p&gt;

&lt;p&gt;Here’s to many more milestones with the AWS Nigeria User Group!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Weather Data collection API using S3 and Python</title>
      <dc:creator>Ayinde Jamiu</dc:creator>
      <pubDate>Sat, 18 Jan 2025 21:07:43 +0000</pubDate>
      <link>https://dev.to/ayindejamiu/weather-data-collection-api-using-s3-and-python-3908</link>
      <guid>https://dev.to/ayindejamiu/weather-data-collection-api-using-s3-and-python-3908</guid>
      <description>&lt;p&gt;Embarking on the 30-Day DevOps Challenge 🚀&lt;/p&gt;

&lt;p&gt;I have officially kicked off a 30-Day DevOps Challenge! The mission? To build hands-on cloud skills by tackling real-world projects. My first venture: creating a serverless weather dashboard that fetches live weather data and stores it in AWS S3.&lt;/p&gt;

&lt;p&gt;At first glance, the project seemed straightforward:&lt;br&gt;
👉 Fetch weather data from an API&lt;br&gt;
👉 Store it in the cloud&lt;br&gt;
👉 Retrieve and display it later&lt;/p&gt;

&lt;p&gt;But as I quickly learned, real-world cloud projects often come with hidden complexities. That’s the fun of it, though! 😆&lt;/p&gt;

&lt;p&gt;Step 1: Fetching Weather Data Like a Pro 🌤️&lt;br&gt;
To get real-time weather updates, I chose the OpenWeather API—a user-friendly service that provides details like temperature, humidity, and conditions for any city.&lt;/p&gt;

&lt;p&gt;With my API key in hand, I made a request and… voilà, instant weather data! ✅&lt;/p&gt;

&lt;p&gt;This first step was a breeze, and I was feeling confident about what was next.&lt;/p&gt;
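&lt;p&gt;That request can be sketched in a few lines of Python (the city, units, and API key are placeholders; the endpoint is OpenWeather's current-weather API):&lt;/p&gt;

```python
# Sketch: fetch current weather from the OpenWeather API (key/city are placeholders).
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def build_weather_url(city: str, api_key: str) -> str:
    # Compose the current-weather endpoint URL with query parameters.
    params = urlencode({"q": city, "appid": api_key, "units": "metric"})
    return "https://api.openweathermap.org/data/2.5/weather?" + params

def fetch_weather(city: str, api_key: str) -> dict:
    # Perform the HTTP GET and decode the JSON response.
    with urlopen(build_weather_url(city, api_key)) as resp:
        return json.load(resp)
```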

&lt;p&gt;Step 2: Storing Data in the Cloud 🌐&lt;br&gt;
Rather than saving the data locally, I opted to store it in Amazon S3 (Simple Storage Service)—a scalable cloud-based storage solution. This approach not only allowed me to keep weather records over time but also access them from anywhere.&lt;/p&gt;

&lt;p&gt;Sounds simple, right? Not so fast. AWS had some surprises in store.&lt;/p&gt;

&lt;p&gt;🙃 Reality Check #1: Uploading data to S3 isn’t enough; configuring the bucket correctly is essential.&lt;br&gt;
🙃 Reality Check #2: AWS’s strict security settings are a double-edged sword—great for safety, challenging when you’re starting out.&lt;/p&gt;

&lt;p&gt;After a bit of troubleshooting (and several Google searches), I successfully configured my S3 bucket and uploaded my first weather data file. ✅&lt;/p&gt;
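&lt;p&gt;The upload step can be sketched as follows (a minimal sketch assuming boto3 and a timestamped key layout of my own choosing; the bucket name is a placeholder):&lt;/p&gt;

```python
# Sketch: store a weather payload in S3 under a timestamped key.
# The key layout and bucket name are illustrative assumptions.
import json
from datetime import datetime, timezone

def build_s3_key(city: str, when: datetime) -> str:
    # e.g. weather-data/lagos/20250118T210000.json
    return "weather-data/" + city.lower() + "/" + when.strftime("%Y%m%dT%H%M%S") + ".json"

def upload_weather(bucket: str, city: str, payload: dict) -> str:
    import boto3  # imported lazily; requires configured AWS credentials
    key = build_s3_key(city, datetime.now(timezone.utc))
    boto3.client("s3").put_object(
        Bucket=bucket,
        Key=key,
        Body=json.dumps(payload).encode("utf-8"),
        ContentType="application/json",
    )
    return key
```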

&lt;p&gt;Step 3: Cracking the S3 Permissions Puzzle 🔐&lt;br&gt;
Here’s where I hit my first real DevOps roadblock. Although my data was in S3, I kept encountering 403 Forbidden errors when trying to access it.&lt;/p&gt;

&lt;p&gt;As it turns out, S3 buckets are private by default (a smart security feature). To make the data accessible, I needed to adjust my bucket policy to allow public read access.&lt;/p&gt;

&lt;p&gt;Cue troubleshooting mode:&lt;br&gt;
✔️ Reviewed bucket policies&lt;br&gt;
✔️ Disabled “Block Public Access” settings (the sneaky culprit)&lt;br&gt;
✔️ Updated object-level permissions&lt;/p&gt;

&lt;p&gt;After some trial and error, I finally got it working. Lesson learned: Security is a cornerstone of cloud computing, and AWS’s default settings reflect that.&lt;/p&gt;

&lt;p&gt;Lessons Learned on Day 1 💡&lt;br&gt;
🎯 Cloud services are powerful, but security is always a top priority.&lt;br&gt;
🎯 Permissions and access control require patience and persistence.&lt;br&gt;
🎯 DevOps is all about learning through solving real problems.&lt;/p&gt;

&lt;p&gt;By the end of Day 1, I had a working setup: weather data pulled from an API and stored securely in AWS S3.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Seven ways to Secure Your EC2 Instance on AWS</title>
      <dc:creator>Ayinde Jamiu</dc:creator>
      <pubDate>Tue, 31 Dec 2024 06:48:52 +0000</pubDate>
      <link>https://dev.to/ayindejamiu/seven-ways-to-secure-your-ec2-instance-on-aws-3pjc</link>
      <guid>https://dev.to/ayindejamiu/seven-ways-to-secure-your-ec2-instance-on-aws-3pjc</guid>
      <description>&lt;p&gt;Amazon Elastic Compute Cloud (EC2) is a powerful and scalable virtual server by Amazon Web Service. EC2 offers Infrastructure as a Service and this comes with the responsibility of ensuring its security by the customer as well as the cloud service providers. When deploying applications to EC2 there are several security measures that should be taken. I have highlighted ten best practices that should be considered when using EC2 instances to deploy your application to minimize vulnerabilities, and protect your applications and data.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Start with a secure configuration and harden the OS: When configuring your virtual machine, it is best practice to start with a lightweight and secure operating system image;&lt;br&gt;
this limits the attack surface and gives you greater control. Use OS images like Amazon Linux 2 or Ubuntu Minimal, and harden the operating system by removing unused packages after you launch the instance. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use IAM Roles and least privilege principle: Use IAM roles to grant instances permissions to access AWS resources. Avoid using long-term credentials in your instance or creating a user with username and password for resources. Follow the Principle of Least Privilege by ensuring that roles have minimal permissions necessary to perform their tasks.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Secure SSH Access: It is best practice to use key pairs rather than passwords for SSH access. The use of security groups to allow SSH access only from trusted IP addresses or Virtual Private Networks is also important. You can also configure your EC2 instance to listen for SSH on a non-default port to make it less discoverable.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Configure Security Groups and NACL: Configure your instance security group to allow access on specific ports based on the application running on your instance and access needed. You can also restrict traffic from specific ports and IP addresses. The use of NACL as a compensating control is also important for the subnet-level traffic. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Update and Patch: Continuous updates of your operating system and patches are important to address known vulnerabilities.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enable Monitoring: Configure your instance to send logs to CloudWatch for centralized monitoring.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use Encryption: Activate encryption for Amazon Elastic Block Store (EBS) volumes to secure data at rest.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
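&lt;p&gt;As an illustration of point 2, a least-privilege policy attached to an instance role might grant read access to a single bucket and nothing else (the bucket name is a placeholder):&lt;/p&gt;

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::my-app-bucket/*"
        }
    ]
}
```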

&lt;p&gt;Conclusion&lt;br&gt;
Securing your EC2 instance is an ongoing process that requires attention to detail, regular updates, and a proactive approach. By following these best practices, you can significantly reduce your attack surface and protect your resources on AWS. Stay vigilant, and always look for ways to enhance your security posture.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Effective Strategies for Monitoring Running Services on AWS and Controlling Costs</title>
      <dc:creator>Ayinde Jamiu</dc:creator>
      <pubDate>Sun, 04 Feb 2024 02:41:26 +0000</pubDate>
      <link>https://dev.to/ayindejamiu/effective-strategies-for-monitoring-running-services-on-aws-and-controlling-costs-1n4n</link>
      <guid>https://dev.to/ayindejamiu/effective-strategies-for-monitoring-running-services-on-aws-and-controlling-costs-1n4n</guid>
      <description>&lt;p&gt;Ever found yourself in the predicament of unintentionally accumulating charges on AWS due to running instances? It's a common situation for new users in the cloud, and in this article, we'll explore different ways to keep tabs on running services in your AWS account.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgdvd2k6fa9h5id67itf9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgdvd2k6fa9h5id67itf9.png" alt="Image description" width="800" height="789"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By gaining visibility into your resources, you can proactively manage your bills and prevent unexpected expenses.&lt;/p&gt;

&lt;p&gt;Billing Dashboard&lt;/p&gt;

&lt;p&gt;The AWS Billing Dashboard is a crucial tool for understanding what's impacting your expenses. Simply log in as a root user or an IAM user with billing permissions, navigate to the Billing and Cost Management section, and select Bills. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fysnwqm9v7xuhf820298v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fysnwqm9v7xuhf820298v.png" alt="Image description" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From there, explore the Charges by Services section to see a breakdown of all running services and their associated costs. Clicking on the plus sign beside a service reveals the specific regions where it's active.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa3osvk11rnv5hop9wpyp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa3osvk11rnv5hop9wpyp.png" alt="Image description" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;Cost Explorer&lt;/h2&gt;

&lt;p&gt;Cost Explorer offers a comprehensive view of your AWS expenses. It lets you compare monthly costs and apply filters based on parameters like service, date, and region, providing valuable insight into your spending patterns.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fksf1fzh5cv82mgr7gnnf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fksf1fzh5cv82mgr7gnnf.png" alt="Image description" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;Resource Group&lt;/h2&gt;

&lt;p&gt;Utilize the Resource Groups and Tag Editor to examine running resources across all regions. Click on Tag Editor in the left panel. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnidlwdcgo7jd8nko526q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnidlwdcgo7jd8nko526q.png" alt="Image description" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By selecting "all regions" and "all supported resource types," you can search for and display all resources currently active. This allows you to pinpoint specific resources for further analysis and management.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fepum4n1npsx6ni3rto8o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fepum4n1npsx6ni3rto8o.png" alt="Image description" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;AWS CLI&lt;/h2&gt;

&lt;p&gt;For a more hands-on approach, the AWS Command Line Interface (CLI) offers a method to view running services. Use the command &lt;code&gt;aws resourcegroupstaggingapi get-resources --output table&lt;/code&gt; to display tagged services. While this method is reliant on tagging, it provides a quick overview of your running resources directly from the command line in a tabular form.&lt;/p&gt;
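&lt;p&gt;The same tagging API is available from Python via boto3. A small sketch that groups the returned ARNs by region (relying on the standard ARN layout of arn:partition:service:region:account:resource; the helper names are my own):&lt;/p&gt;

```python
# Sketch: count tagged resources per region via the Resource Groups Tagging API.
from collections import Counter

def region_of(arn: str) -> str:
    # ARNs look like arn:aws:service:region:account-id:resource;
    # global services (e.g. S3 buckets, IAM) leave the region field empty.
    parts = arn.split(":", 5)
    return parts[3] or "global"

def count_by_region(arns: list) -> Counter:
    return Counter(region_of(a) for a in arns)

def fetch_tagged_arns() -> list:
    import boto3  # lazy import; requires configured AWS credentials
    client = boto3.client("resourcegroupstaggingapi")
    arns = []
    for page in client.get_paginator("get_resources").paginate():
        arns.extend(r["ResourceARN"] for r in page["ResourceTagMappingList"])
    return arns
```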

&lt;p&gt;By leveraging these different methods, you can gain a comprehensive understanding of the services running in your AWS account. This knowledge empowers you to eliminate unnecessary expenses and ensure that your resources are both relevant and cost-effective. Whether you're a novice or an experienced AWS user, proactive monitoring is key to optimizing costs and maximizing the efficiency of your cloud infrastructure.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Host static website with AWS S3 pushing codes from Git</title>
      <dc:creator>Ayinde Jamiu</dc:creator>
      <pubDate>Tue, 01 Aug 2023 21:59:57 +0000</pubDate>
      <link>https://dev.to/ayindejamiu/host-static-website-with-aws-s3-pushing-codes-from-git-1ge5</link>
      <guid>https://dev.to/ayindejamiu/host-static-website-with-aws-s3-pushing-codes-from-git-1ge5</guid>
      <description>&lt;p&gt;&lt;em&gt;Introduction&lt;/em&gt; &lt;br&gt;
In this article, you will learn how to host your static website on s3 pushing codes from git. We will also use AWS code pipeline to deliver the codes to S3. This is a easy way to automate your website and host it in cloud. &lt;/p&gt;

&lt;p&gt;&lt;em&gt;Prerequisites&lt;/em&gt; &lt;br&gt;
An AWS account &lt;br&gt;
A GitHub account &lt;br&gt;
A domain name &lt;/p&gt;

&lt;p&gt;&lt;em&gt;Set up AWS S3&lt;/em&gt; &lt;br&gt;
Sign in to the AWS Management Console.&lt;br&gt;
Open the S3 service.&lt;br&gt;
Click on "Create bucket."&lt;br&gt;
Enter a unique bucket name; use the name of the website you wish to set up, e.g., "&lt;a href="http://www.prolific.com.ng"&gt;www.prolific.com.ng&lt;/a&gt;".&lt;br&gt;
Select the region closest to your target audience.&lt;br&gt;
Uncheck "Block all public access" (acknowledge that the bucket will be public)&lt;br&gt;
Click on "Create bucket."&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Configure the S3 bucket for website hosting&lt;/em&gt; &lt;br&gt;
Click on the bucket you created &lt;br&gt;
Navigate to Permissions &lt;br&gt;
Under Bucket policy, click Edit &lt;br&gt;
Use the policy below &lt;br&gt;
Get your resource ARN from the top left of the page and substitute it for the Resource value (replacing Bucket-Name) before saving the policy&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Statement1",
            "Principal": "*",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::Bucket-Name/*"
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then save the policy. &lt;/p&gt;

&lt;p&gt;Navigate to properties. Scroll to the page bottom.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mei3JRlC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/o9chjdkxs36tcy8alt19.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mei3JRlC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/o9chjdkxs36tcy8alt19.png" alt="Image description" width="800" height="377"&gt;&lt;/a&gt;&lt;br&gt;
Edit the Static website hosting section&lt;br&gt;
Set Static website hosting to &lt;code&gt;Enable&lt;/code&gt;&lt;br&gt;
For the Index document, enter:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;index.html
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can add error.html as the Error document&lt;br&gt;
Then save changes. &lt;/p&gt;

&lt;p&gt;Now your bucket is ready &lt;/p&gt;

&lt;p&gt;&lt;em&gt;Next, set up your GitHub&lt;/em&gt; &lt;br&gt;
Create a GitHub repository for your website, if you haven't already, and push your static website files to it.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Set up AWS CodePipeline&lt;/em&gt;&lt;br&gt;
Navigate to AWS CodePipeline&lt;br&gt;
Click on Create pipeline&lt;br&gt;
Give your pipeline a name &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0lZnrt5a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/uwoleoic59tcu45hezsk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0lZnrt5a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/uwoleoic59tcu45hezsk.png" alt="Image description" width="800" height="377"&gt;&lt;/a&gt;&lt;br&gt;
Click Next&lt;br&gt;
For the source provider, choose GitHub, since we are connecting from GitHub&lt;br&gt;
Click on &lt;code&gt;Connect to Github&lt;/code&gt;&lt;br&gt;
Sign in and authenticate your GitHub account&lt;br&gt;
Once connected, you will be able to see your repositories; choose the repo with your website &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bWLaQbNr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/v2av0hkyp4rlavgwte6y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bWLaQbNr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/v2av0hkyp4rlavgwte6y.png" alt="Image description" width="800" height="377"&gt;&lt;/a&gt;&lt;br&gt;
Write the name of the branch that should trigger the pipeline. This can be main, master or any other branch. &lt;/p&gt;

&lt;p&gt;Next is the build stage. You can create a simple buildspec.yaml file and use CodeBuild, but for a static website you can simply skip this stage.&lt;/p&gt;
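&lt;p&gt;If you do add a build stage, a minimal buildspec could look like this (a hypothetical sketch; a static site needs no real build commands):&lt;/p&gt;

```yaml
# Hypothetical minimal buildspec.yml for a static site
version: 0.2
phases:
  build:
    commands:
      - echo "Nothing to build for a static site"
artifacts:
  files:
    - '**/*'
```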

&lt;p&gt;&lt;em&gt;Code deploy&lt;/em&gt; &lt;br&gt;
When you click Next, you will choose where the code will be deployed; in this case, an S3 bucket. &lt;br&gt;
Choose the bucket name you created. &lt;br&gt;
Tick &lt;code&gt;Extract file before deploy&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Finally, review and click &lt;code&gt;Create pipeline&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5dr95fX2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6cqryk85pep0zgil0bjg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5dr95fX2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6cqryk85pep0zgil0bjg.png" alt="Image description" width="800" height="377"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Your pipeline will run and push the code in your GitHub repository to the S3 bucket &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vrjfAWj_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rl9ilo5l87pib9qszsez.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vrjfAWj_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rl9ilo5l87pib9qszsez.png" alt="Image description" width="800" height="391"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Navigate to S3 from the search bar and&lt;br&gt;
click on the bucket you created; you will see your repository files in the bucket.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jkQF4xyV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qxwd9huxo40t0nwe3t6v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jkQF4xyV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qxwd9huxo40t0nwe3t6v.png" alt="Image description" width="800" height="391"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, click on &lt;code&gt;Properties&lt;/code&gt;, copy the static website URL, and open it in your browser: your website is up and running.&lt;/p&gt;
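&lt;p&gt;If you prefer not to dig through the console, the website endpoint follows a predictable pattern, so you can construct it yourself. A minimal sketch, assuming the dash-style endpoint format used by most regions (some newer regions use a dot separator instead); the bucket name my-demo-site is hypothetical:&lt;/p&gt;

```python
# Hypothetical helper: construct the S3 static-website endpoint for a
# bucket, so you can open it in a browser without hunting through the
# console. Assumes the dash-style endpoint used by most regions; some
# newer regions use a dot separator instead.
def s3_website_url(bucket: str, region: str = "us-east-1") -> str:
    """Return the static-website endpoint URL for `bucket` in `region`."""
    return f"http://{bucket}.s3-website-{region}.amazonaws.com"

# "my-demo-site" is a made-up bucket name for illustration
print(s3_website_url("my-demo-site"))
# → http://my-demo-site.s3-website-us-east-1.amazonaws.com
```

&lt;p&gt;Note that S3 website endpoints serve over plain HTTP; if you need HTTPS, put CloudFront in front of the bucket.&lt;/p&gt;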

</description>
    </item>
    <item>
      <title>How to get AWS Official Practice Exam for Free</title>
      <dc:creator>Ayinde Jamiu</dc:creator>
      <pubDate>Sun, 08 May 2022 20:50:48 +0000</pubDate>
      <link>https://dev.to/aws-builders/how-to-get-aws-official-practice-exam-for-free-485g</link>
      <guid>https://dev.to/aws-builders/how-to-get-aws-official-practice-exam-for-free-485g</guid>
<description>&lt;p&gt;Getting certified in AWS requires preparation and practice. Taking a practice exam usually serves as a confidence test before sitting the actual exam. In early 2022, AWS announced the end of payment for its official practice exams, which previously cost 20 dollars each. &lt;/p&gt;

&lt;p&gt;To access the AWS official practice exams, follow the steps below.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1:&lt;/strong&gt;&lt;br&gt;
Login to &lt;a href="https://explore.skillbuilder.aws/" rel="noopener noreferrer"&gt;https://explore.skillbuilder.aws/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F08jhuy4fa6ish0gnlb90.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F08jhuy4fa6ish0gnlb90.png" alt="AWS Skill builder page "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Click on Sign in. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F13ccymkhzl0htrxj36zd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F13ccymkhzl0htrxj36zd.png" alt="Sign in page for AWS skill builder"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Most of us will want to use our AWS credentials, so choose Login with Amazon. &lt;/p&gt;

&lt;p&gt;Then, Sign in with your AWS account details using your email and password. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdrb0rf1u8egndrdooh5b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdrb0rf1u8egndrdooh5b.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3:&lt;/strong&gt;&lt;br&gt;
After a successful login, search for &lt;strong&gt;AWS Certification Official Practice Question Sets&lt;/strong&gt; in the search bar.&lt;/p&gt;

&lt;p&gt;Choose the language you want; AWS supported eleven languages the last time I checked. &lt;/p&gt;

&lt;p&gt;Click on the course, then &lt;strong&gt;enroll&lt;/strong&gt;. I chose the English version. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftfl3ox5ohplwejpd7t7y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftfl3ox5ohplwejpd7t7y.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4:&lt;/strong&gt;&lt;br&gt;
Click on AWS Certification Official Practice Question Sets (English)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3f6kf82nhrggl0ylkz15.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3f6kf82nhrggl0ylkz15.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then you will see your private code, which you will use to access the practice exam. &lt;/p&gt;

&lt;p&gt;Note that this gives you access to all the official practice question sets for AWS exams. &lt;/p&gt;

&lt;p&gt;I hope this helps someone. &lt;/p&gt;

</description>
    </item>
    <item>
      <title>Five Pathways to AWS Certified Cloud Practitioner</title>
      <dc:creator>Ayinde Jamiu</dc:creator>
      <pubDate>Mon, 10 Jan 2022 12:11:51 +0000</pubDate>
      <link>https://dev.to/aws-builders/five-pathways-to-aws-certified-cloud-practitioner-445p</link>
      <guid>https://dev.to/aws-builders/five-pathways-to-aws-certified-cloud-practitioner-445p</guid>
<description>&lt;p&gt;The AWS Certified Cloud Practitioner exam is the foundation of AWS certification and serves as the entry point into the cloud. If you are lost on how to get certified, you are not alone. This article exposes you to some of the pathways for preparing for the exam. Let us get started.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;AWS Re/start Program: AWS Re/start is a free, full-time, classroom-based skills development and training program, usually held on weekdays during work hours, that prepares you for a career in the cloud. The program is competitive and takes 12 weeks. At the end, you will be awarded a certificate from AWS Re/start and will be ready to take the AWS Certified Cloud Practitioner exam. The class usually covers other software development skills such as Python and Git. You can check here for a link to AWS partners near you.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Online Self-paced Courses: Learning in the 21st century has gone beyond physical contact. Most technology skills today are acquired through structured, self-paced online platforms. You can get a paid AWS Certified Cloud Practitioner course on Udemy, Udacity, Ustacky, Whizlab, among others. These structured online courses are a great way to understand what the exam expects and to get comprehensive information about it.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If you feel that paying for courses, especially at the initial stage, is too much, you can get started with YouTube videos: there are many great channels, like “AWS Demos” and “AWS Knowledge Center Videos”, you can subscribe to for a first-hand look at how to navigate the world of AWS.&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;&lt;p&gt;Online Training: If you prefer communicating with an instructor, there are also instructor-led online classes, offered by several training centers. Their advantage is that you can clarify issues with the instructor as you learn, though this kind of training is usually more expensive than a self-paced course. I advise you to pick a trainer in the same time zone as your location, to avoid attending midnight classes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;AWS Resources: Another way to prepare is to use the resources provided by AWS: the whitepapers and AWS's own online material, such as the AWS Cloud Practitioner Essentials course on AWS Skill Builder. If you use AWS resources exclusively, you get information directly from the source, but the learning path is more tedious.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Local Groups: Several AWS community volunteers are always excited to mentor others to become certified and grow along the cloud path. You may be lucky enough to find a mentor or a structured mentorship platform. On this pathway, you will be exposed to various materials, training, and other cloud-related information; your personal zeal is what will make you excel.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I have highlighted five different paths you can ride into the cloud. Remember to make use of the AWS Free Tier for hands-on practice; this will build your confidence. As your exam approaches, test your skills with practice questions so you know your weaknesses.&lt;/p&gt;

&lt;p&gt;The journey into the cloud is a rewarding one; when you get certified in AWS, many opportunities will open up for you. Keep moving, keep going.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DQ604l2e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ncp58h4os2hsykvs9lok.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DQ604l2e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ncp58h4os2hsykvs9lok.jpg" alt="Image description" width="880" height="496"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
