<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Cory M</title>
    <description>The latest articles on DEV Community by Cory M (@corymullins).</description>
    <link>https://dev.to/corymullins</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F863415%2Fa338cf03-b644-4809-a305-6f9b7901d8c3.png</url>
      <title>DEV Community: Cory M</title>
      <link>https://dev.to/corymullins</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/corymullins"/>
    <language>en</language>
    <item>
      <title>Cloud Resume Challenge Summary</title>
      <dc:creator>Cory M</dc:creator>
      <pubDate>Tue, 24 May 2022 22:11:14 +0000</pubDate>
      <link>https://dev.to/corymullins/cloud-resume-challenge-summary-47do</link>
      <guid>https://dev.to/corymullins/cloud-resume-challenge-summary-47do</guid>
      <description>&lt;h3&gt;
  
  
  About the challenge
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://www.corymullins.com"&gt;This&lt;/a&gt; project was completed in accordance with the &lt;a href="https://cloudresumechallenge.dev/docs/the-challenge/aws/"&gt;Cloud Resume Challenge&lt;/a&gt;, which was created by &lt;a class="mentioned-user" href="https://dev.to/forrestbrazeal"&gt;@forrestbrazeal&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Certification
&lt;/h3&gt;

&lt;p&gt;The first requirement of this challenge is to be AWS Cloud Practitioner certified, which I achieved in April 2022. After completing this certification, I began studying for the AWS Certified Solutions Architect exam, which led me to discover this challenge.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.credly.com/badges/e2af7cf0-ef0a-4a7f-a04e-9dc6c56e7ac8/public_url"&gt;AWS Cloud Practitioner&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Links to GitHub Repositories
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://github.com/corymullins/Cloud_Resume_Challenge_Backend"&gt;Backend&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/corymullins/Cloud_Resume_Challenge_Frontend"&gt;Frontend&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Contents:
&lt;/h3&gt;

&lt;p&gt;1 - Website Architecture&lt;br&gt;
2 - Why Use Serverless?&lt;br&gt;
3 - Infrastructure as Code&lt;br&gt;
4 - Source Control&lt;br&gt;
5 - Continuous Integration/Continuous Delivery&lt;br&gt;
6 - Lambda Function in Python&lt;br&gt;
7 - Building the Frontend&lt;br&gt;
8 - Creating a Test&lt;br&gt;
9 - Conclusion&lt;/p&gt;

&lt;h3&gt;
  
  
  1 - Website Architecture
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--siVlh0BO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gv1ci0japm62v8jglmjc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--siVlh0BO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gv1ci0japm62v8jglmjc.png" alt="Cloud Resume Challenge diagram" width="880" height="620"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The website architecture is designed as follows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A user requests the webpage via their browser.&lt;/li&gt;
&lt;li&gt;The browser queries Route 53, which resolves the domain name and directs the request to the nearest CloudFront edge location.&lt;/li&gt;
&lt;li&gt;CloudFront then forwards the request to the S3 bucket containing the website's static files and retrieves its contents.&lt;/li&gt;
&lt;li&gt;The S3 bucket containing the frontend code is protected with Origin Access Identity (OAI), preventing access not routed through CloudFront.&lt;/li&gt;
&lt;li&gt;The website runs JavaScript that sends a GET request to API Gateway to retrieve the current number of visitors stored in the DynamoDB database.&lt;/li&gt;
&lt;li&gt;API Gateway forwards this request to Lambda as JSON.&lt;/li&gt;
&lt;li&gt;Lambda identifies the request type, then performs a get/update operation on DynamoDB to retrieve and increment the stored visitor count.&lt;/li&gt;
&lt;li&gt;The returned visitor count is then processed and displayed on the website.&lt;/li&gt;
&lt;li&gt;GitHub repositories for both the frontend and backend code provide version control, and CI/CD through GitHub Actions.&lt;/li&gt;
&lt;li&gt;Terraform deploys the AWS infrastructure as detailed in the backend repository.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  2 - Why Use Serverless?
&lt;/h3&gt;

&lt;p&gt;Before I can explain why serverless was used, it is important to explain what serverless is. Serverless is not a new concept in computing; it is essentially how bygone mainframes and terminals worked. It's like that jacket you still have from high school: it's so old that it has come back into fashion. Serverless computing is a development model in which code is executed on servers that have been abstracted from the user, meaning that the management and operation of the servers is external to the developers and users. An analogy would be a copy center: you don't own the printers, copiers, or fax machines; you simply pay for what you use in order to finish your task.&lt;/p&gt;

&lt;p&gt;The reasons to use serverless are a combination of cost, scalability, availability, and performance. Because the resources to execute code are not provisioned prior to use, you pay only for the compute time that is actually consumed. Amazon manages the provisioning, operation, and scaling needed to execute workloads and meet demand. In my case, the custom domain is an annual cost of $12, Route 53 is $0.50 monthly, KMS is $0.29 monthly, and Lambda and DynamoDB are far below usage levels that would incur fees.&lt;/p&gt;

&lt;h3&gt;
  
  
  3 - Infrastructure as Code
&lt;/h3&gt;

&lt;p&gt;Infrastructure as Code is the use of definition files to provision and manage AWS infrastructure, instead of manually using the console or CLI to build and configure the needed infrastructure. I chose to use Terraform rather than the AWS SAM tool, as Terraform is cross-platform and allows me to integrate additional technologies such as Kubernetes and virtual machines.&lt;/p&gt;

&lt;p&gt;Before getting started it is important to take care of the following aspects:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Security: Terraform needs to have permission to deploy infrastructure on AWS. A user was created with access only to STS (Security Token Service). A role was created with the IAM permissions needed to perform actions on the required AWS services.&lt;/li&gt;
&lt;li&gt;Terraform State: A remote backend was configured using an S3 bucket to store the Terraform state file, and a DynamoDB table was used to store the state lock. This configuration ensures that the infrastructure declared in .tf files and the infrastructure actually deployed stay in sync.&lt;/li&gt;
&lt;/ol&gt;
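The remote backend described in point 2 can be sketched as a Terraform configuration fragment. The bucket and table names below are placeholders, not the project's actual names:

```hcl
terraform {
  backend "s3" {
    bucket         = "example-terraform-state"  # placeholder state bucket
    key            = "cloud-resume/terraform.tfstate"
    region         = "us-east-1"
    dynamodb_table = "example-terraform-locks"  # placeholder lock table
    encrypt        = true
  }
}
```

The DynamoDB lock table prevents two concurrent runs from modifying the same state.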

&lt;p&gt;The following AWS infrastructure was deployed via Terraform:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A private S3 bucket hosting the website frontend code.&lt;/li&gt;
&lt;li&gt;A new table was created in DynamoDB with on-demand capacity and a primary key ID. A default item was created to store a value for the number of website visitors.&lt;/li&gt;
&lt;li&gt;A Lambda function using Python runtime, the code of which was uploaded from GitHub.&lt;/li&gt;
&lt;li&gt;An API Gateway configured as a Lambda integration, which sends GET requests from a CORS-compliant source to Lambda as JSON, and to which Lambda responds with JSON.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  4 - Source Control
&lt;/h3&gt;

&lt;p&gt;I chose to use GitHub to maintain control of the frontend and backend repositories. GitHub allows all additions, revisions, and deployments to be tracked and reviewed.&lt;/p&gt;

&lt;h3&gt;
  
  
  5 - Continuous Integration/Continuous Delivery
&lt;/h3&gt;

&lt;p&gt;Continuous Integration/Continuous Delivery (CI/CD) is the practice of automating and continually monitoring the application lifecycle, from integration and testing to delivery. To achieve this, I used GitHub Actions, which defines the process with YAML files. Any commit pushed to the GitHub repositories triggers a workflow, which deploys or modifies the existing infrastructure.&lt;/p&gt;
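A minimal sketch of such a workflow, assuming AWS credentials are stored as repository secrets (this is illustrative, not the project's actual YAML):

```yaml
name: deploy-backend
on:
  push:
    branches: [main]          # run on every push to main
jobs:
  terraform:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: hashicorp/setup-terraform@v2
      - run: terraform init
      - run: terraform apply -auto-approve
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```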

&lt;h3&gt;
  
  
  6 - Lambda Function in Python
&lt;/h3&gt;

&lt;p&gt;The Lambda function was written in Python. The code uses the Boto3 SDK to interact with DynamoDB, specifically to read the visitor count table and to increment the count value. In order for Lambda to have access to DynamoDB, IAM role permissions needed to be assigned. To secure access to the Lambda function, Cross-Origin Resource Sharing (CORS) requires that the origin of the GET request be the website. While I had some initial difficulties getting the CORS Access-Control-Allow-Origin headers to function correctly, using a wildcard (*) was not an option, as it creates a large security risk.&lt;/p&gt;
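As a rough sketch of such a function (the table name, key, and attribute names are assumptions; the real code lives in the backend repository linked above), the count can be incremented atomically with an ADD update expression:

```python
import json

# Assumption: the allowed origin is the site itself, not a wildcard.
ALLOWED_ORIGIN = "https://www.corymullins.com"

def handler(event, context, table=None):
    """Increment and return the visitor count stored in DynamoDB."""
    if table is None:
        import boto3  # deferred import so the handler can be exercised with a stub
        table = boto3.resource("dynamodb").Table("visitor_count")
    # ADD updates the numeric attribute atomically, avoiding a read-modify-write race
    result = table.update_item(
        Key={"id": "site"},
        UpdateExpression="ADD visits :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(result["Attributes"]["visits"])
    return {
        "statusCode": 200,
        # Restrict CORS to the site origin rather than "*"
        "headers": {"Access-Control-Allow-Origin": ALLOWED_ORIGIN},
        "body": json.dumps({"visits": count}),
    }
```

Passing the table in as a parameter keeps the function testable without live AWS credentials.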

&lt;h3&gt;
  
  
  7 - Building the Frontend
&lt;/h3&gt;

&lt;p&gt;The frontend code was built using HTML, CSS, and JavaScript. The files are stored in an S3 bucket and were deployed from the GitHub repository.&lt;/p&gt;

&lt;h3&gt;
  
  
  8 - Creating a Test
&lt;/h3&gt;

&lt;p&gt;To test the functionality of the API, I wrote a Cypress test. This test sends the API a GET request, then verifies that the response has a 200 status code and that the body contains a number.&lt;/p&gt;
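The same assertions can be expressed outside Cypress; here is a hedged Python sketch of the check (the URL and the "visits" field name are placeholders, and check_response only mirrors what the Cypress test verifies):

```python
import json
import urllib.request

# Placeholder endpoint; the real API Gateway URL is not shown in the article.
API_URL = "https://example.execute-api.us-east-1.amazonaws.com/prod/count"

def check_response(status, body):
    """Return True when the reply has a 200 status and a numeric visit count."""
    try:
        count = json.loads(body).get("visits")
    except (ValueError, AttributeError):
        return False  # body was not a JSON object
    return status == 200 and isinstance(count, int)

def run_check(url=API_URL):
    # Issue the GET request and validate the response shape
    with urllib.request.urlopen(url) as resp:
        return check_response(resp.status, resp.read())
```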

&lt;h3&gt;
  
  
  9 - Conclusion
&lt;/h3&gt;

&lt;p&gt;I found this challenge to be a great experience. There were many opportunities to learn the interactions between different AWS services, how to describe my design in Terraform, and how to implement CORS. I am deeply thankful to &lt;a class="mentioned-user" href="https://dev.to/forrestbrazeal"&gt;@forrestbrazeal&lt;/a&gt; for creating this wonderful challenge. I believe that this challenge has definitely improved my understanding of a myriad of AWS services and given me direct experience using Terraform and CI/CD.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>terraform</category>
      <category>serverless</category>
    </item>
    <item>
      <title>Cloud Resume Challenge - Part 3</title>
      <dc:creator>Cory M</dc:creator>
      <pubDate>Tue, 24 May 2022 05:23:12 +0000</pubDate>
      <link>https://dev.to/corymullins/cloud-resume-challenge-part-3-l67</link>
      <guid>https://dev.to/corymullins/cloud-resume-challenge-part-3-l67</guid>
      <description>&lt;p&gt;I completed writing the backend as IaC in Terraform. As this was the first complicated project I have done in Terraform, I used an incremental approach to build each component and testing before expanding scope. While most of the essential parts, such as S3 buckets, IAM permissions, Route 53, and CloudFront were quite straightforward; there were a few issues that required a lot of research to fix, primarily due to cryptic error messages. For example, in the API Gateway Integration, I would receive an "BadRequestException: Enumeration value for HttpMethod must be non-empty" error. After digging through documentation and many searches on Stack Overflow, I discovered that this was due to not having used the integration_http_method argument, which the documentation identified as being optional, in addition to the required http_method argument.&lt;/p&gt;

&lt;p&gt;After the Terraform code was completed and tested, the next step was to learn how to migrate Terraform execution and deployment to GitHub Actions. This went fairly smoothly, as I had already learned the basics for the frontend code. I had a slight misstep in initializing tfvars, and in forgetting to remove references to AWS credentials that had been stored locally during testing.&lt;/p&gt;

&lt;p&gt;Once the backend had been proven functional from GitHub Actions, the last step was for me to write a test. I had seen multiple references to a testing framework called Cypress, which interested me as an end-to-end tool for testing web applications. I used Cypress to send a GET request to my API, then validate that the response has a 200 status and contains a number. This test was then integrated into my GitHub Actions workflow to be run from a Docker image after terraform apply has completed.&lt;/p&gt;

</description>
      <category>aws</category>
    </item>
    <item>
      <title>Cloud Resume Challenge - Part 2</title>
      <dc:creator>Cory M</dc:creator>
      <pubDate>Thu, 19 May 2022 23:19:47 +0000</pubDate>
      <link>https://dev.to/corymullins/cloud-resume-challenge-part-2-2cf1</link>
      <guid>https://dev.to/corymullins/cloud-resume-challenge-part-2-2cf1</guid>
      <description>&lt;p&gt;I solved the issue with the visitor counter not updating, it was an error with the CORS configuration. I had initially believed that my configuration of website calling to the CloudFront distribution would pass the correct headers; however I discovered that the API Gateway was not configured to correctly pass the header mappings through the Integration Response. Once that was corrected, the data was passed from the DynamoDB table without issue. Unfortunately, while I was receiving the data correctly, my JavaScript function would not result in any response other than null. After pouring through documentation and searching Stack Overflow without a solution that I could get working, I decided to abandon the use of fetch/response and instead implement XMLHttpRequest. I will need to learn more about JavaScript before I reiterate the function in the future.&lt;/p&gt;

&lt;p&gt;I committed the frontend code to GitHub and began learning how to configure Actions to automate the CI/CD of any future changes. This led to learning how to set up secrets and how to automate CloudFront invalidations through the GitHub Actions YAML configuration.&lt;/p&gt;

&lt;p&gt;Work continues on writing the backend infrastructure as Terraform code.&lt;/p&gt;

</description>
      <category>aws</category>
    </item>
    <item>
      <title>Cloud Resume Challenge</title>
      <dc:creator>Cory M</dc:creator>
      <pubDate>Tue, 17 May 2022 23:40:59 +0000</pubDate>
      <link>https://dev.to/corymullins/cloud-resume-challenge-27lg</link>
      <guid>https://dev.to/corymullins/cloud-resume-challenge-27lg</guid>
      <description>&lt;p&gt;I have been working in Operational Technology (industrial control systems in power plants and oil refineries) for about 15 years, doing everything from initial construction and commissioning, to maintenance and modernization design reviews. I recently decided to transition to information security, and through studying and implementing projects, I fell in love with cloud services. This discovery pushed me to begin studying AWS services, which led me to find &lt;a href="https://cloudresumechallenge.dev/docs/the-challenge/aws/"&gt;Cloud Resume Challenge&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Background
&lt;/h2&gt;

&lt;p&gt;The project is a challenge that was created by Forrest Brazeal, who works in DevOps and cloud engineering. The challenge is to build a simple, static resume website using a multitude of cloud services. As the challenge allows you to choose your preferred cloud provider, I decided to use AWS because I already had some experience with it and had recently completed the AWS Certified Cloud Practitioner exam.&lt;/p&gt;

&lt;p&gt;As I had chosen to use AWS, I began to lay out the infrastructure that would be needed to complete the project. S3 would host the static website, while Route 53 and CloudFront would manage DNS requirements and secure website distribution. Creating a visitor counter would involve JavaScript calling API Gateway, triggering a Python Lambda function, which would read and update a DynamoDB table; the necessary roles would be managed with IAM. Eventually, the infrastructure would need to be written as "Infrastructure as Code," or IaC.&lt;/p&gt;

&lt;h2&gt;
  
  
  Implementation
&lt;/h2&gt;

&lt;p&gt;In my experience doing AWS labs, I had used some individual parts of AWS, but not to the level that this project would demand. Because of this, I decided to start by manually setting up an S3 static website, Route 53, and a CloudFront distribution. While everything went fine, I learned that any code change necessitated invalidating the CloudFront cache to propagate the new code.&lt;/p&gt;

&lt;p&gt;I am currently working on an error where the visitor counter does not update correctly. I also have progress to make on the IaC front: I have been working in Terraform and have many bugs to work out. More to come soon.&lt;/p&gt;

</description>
      <category>aws</category>
    </item>
  </channel>
</rss>
