<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Andrew May</title>
    <description>The latest articles on DEV Community by Andrew May (@andrewdmay).</description>
    <link>https://dev.to/andrewdmay</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F336280%2Ff73187a9-b4b7-444e-8afc-096716cd8fe4.jpeg</url>
      <title>DEV Community: Andrew May</title>
      <link>https://dev.to/andrewdmay</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/andrewdmay"/>
    <language>en</language>
    <item>
      <title>Faster, Cheaper - AWS Graviton 2</title>
      <dc:creator>Andrew May</dc:creator>
      <pubDate>Sat, 20 Nov 2021 20:29:55 +0000</pubDate>
      <link>https://dev.to/leading-edje/faster-cheaper-aws-graviton-2-2b0c</link>
      <guid>https://dev.to/leading-edje/faster-cheaper-aws-graviton-2-2b0c</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Graviton: Hypothetical quantum of gravity, a Marvel character, and now an ARM based CPU from AWS&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;There have been multiple attempts to bring ARM based CPUs into the data center in recent years due to their perceived efficiency compared to x86_64 designs. Generally they have been seen as lower performance designs suitable for running webservers and other light workloads. &lt;/p&gt;

&lt;p&gt;In 2018, AWS released Graviton 1 CPUs that powered the A1 instance family. These were cheaper than Intel/AMD instance types, but also lower performing. I think I launched one to take a look, but never really considered using them again.&lt;/p&gt;

&lt;p&gt;In 2019, AWS released &lt;a href="https://aws.amazon.com/ec2/graviton/" rel="noopener noreferrer"&gt;Graviton 2&lt;/a&gt; CPUs, and these are a very different beast. AWS claims that they are up to &lt;strong&gt;40% faster&lt;/strong&gt; than the equivalent Intel instance types, and prices them &lt;strong&gt;20% less&lt;/strong&gt;. Instead of being limited to one instance family, they are available in a whole range (T4g, C6g, M6g, R6g, X2gd) and they're no longer limited to EC2 instances.&lt;/p&gt;

&lt;p&gt;In many ways you can think of what AWS is doing with Graviton to be similar to what Apple is doing with their M1 line of chips. Both are using custom ARM designs developed in house (by companies they bought) and are redefining what we expect from ARM based CPUs.&lt;/p&gt;

&lt;h2&gt;
  
  
  Supported Services
&lt;/h2&gt;

&lt;p&gt;Some of the services that support Graviton 2 (the list is growing over time, and with re:Invent on the horizon it could well expand in the next couple of weeks):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;EC2 - Run servers with a range of instance types&lt;/li&gt;
&lt;li&gt;Lambda - Run serverless applications in a range of languages&lt;/li&gt;
&lt;li&gt;ECS/EKS - ARM based EC2 AMIs are available to work with both container platforms in AWS, and it's hopefully just a matter of time before Fargate support appears&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;Update: 11/23 - &lt;a href="https://aws.amazon.com/blogs/aws/announcing-aws-graviton2-support-for-aws-fargate-get-up-to-40-better-price-performance-for-your-serverless-containers/" rel="noopener noreferrer"&gt;Graviton 2 support for AWS Fargate&lt;/a&gt; announced before re:Invent! Currently this does not include Fargate Spot.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;ul&gt;
&lt;li&gt;RDS - Supports MySQL, Postgres and Aurora&lt;/li&gt;
&lt;li&gt;ElastiCache - Redis and Memcached&lt;/li&gt;
&lt;li&gt;OpenSearch - AWS's fork of Elasticsearch&lt;/li&gt;
&lt;li&gt;EMR - Elastic Map Reduce&lt;/li&gt;
&lt;li&gt;CodeBuild - Build code to run on ARM on Graviton instances&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;With a lot of these services you will need to be on a recent version to be able to use Graviton instances. For example you need to use a recent version of MySQL 8 to select Graviton instance types in RDS, and the very latest version to use the burstable T4g instance family.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Where to start using Graviton 2?
&lt;/h2&gt;

&lt;p&gt;If you are starting a new project in AWS, and you're using some of the services listed above, I would recommend starting with Graviton 2 instances where possible. That will give you cost savings and performance benefits from day 1. In some cases AWS is already defaulting the instance type to a Graviton 2 processor in the AWS console.&lt;/p&gt;

&lt;p&gt;This does make some assumptions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You're using services/applications that can run on Linux&lt;/li&gt;
&lt;li&gt;The runtimes and tools you need are available for ARM (true of pretty much all major languages and Open Source tools)&lt;/li&gt;
&lt;li&gt;You're willing to accept there may be some extra work in compiling/packaging applications to run on a different architecture than you're developing on (unless you're on a new Mac)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  APIs are (generally) easy
&lt;/h3&gt;

&lt;p&gt;For services where you interact with an API and don't run your own code (RDS, ElastiCache, OpenSearch), consumers of the API don't care whether they're calling a service running on x86_64 or ARM, so these are often the easiest to switch.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If you're already running some of these services, you may find that you need to upgrade the version and this can be disruptive depending upon how far from the latest version you are and the type of service. But perhaps this is the time to do the upgrade you've been putting off for the last 6 months.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Open Source applications
&lt;/h3&gt;

&lt;p&gt;Linux has supported ARM for a long time (doesn't everyone have a stack of Raspberry Pis sitting around?), and if you can install a package on an x86_64 machine you can almost certainly install it on Graviton 2 instances.&lt;/p&gt;

&lt;h4&gt;
  
  
  Nginx example
&lt;/h4&gt;

&lt;p&gt;This snippet of CloudFormation launches an EC2 instance running Amazon Linux 2, installs Nginx then starts the service:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;  &lt;span class="na"&gt;NginxInstance&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::EC2::Instance&lt;/span&gt;
    &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;IamInstanceProfile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;InstanceProfile&lt;/span&gt;
      &lt;span class="na"&gt;ImageId&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;AmiParameter&lt;/span&gt;
      &lt;span class="na"&gt;InstanceType&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;InstanceType&lt;/span&gt;
      &lt;span class="na"&gt;SecurityGroups&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;SecurityGroup&lt;/span&gt;
      &lt;span class="na"&gt;Tags&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Key&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Name&lt;/span&gt;
          &lt;span class="na"&gt;Value&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;AWS::StackName'&lt;/span&gt;
      &lt;span class="na"&gt;UserData&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;Fn::Base64&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Sub&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
          &lt;span class="s"&gt;#!/bin/bash -xe&lt;/span&gt;
          &lt;span class="s"&gt;amazon-linux-extras install -y nginx1&lt;/span&gt;
          &lt;span class="s"&gt;systemctl start nginx&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This works identically whether the AMI/Instance Type are for x86_64 or for Graviton.&lt;/p&gt;
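&lt;p&gt;As a sketch (the SSM path and instance type below are illustrative, not part of the original stack), the Graviton variant only needs the AMI and instance type changed - for example by resolving the public ARM64 Amazon Linux 2 SSM parameter instead of passing an AMI parameter:&lt;/p&gt;

```yaml
  NginxInstanceArm:
    Type: AWS::EC2::Instance
    Properties:
      # Resolves to the latest Amazon Linux 2 ARM64 AMI in the current region
      ImageId: '{{resolve:ssm:/aws/service/ami-amazon-linux-latest/amzn2-ami-hvm-arm64-gp2}}'
      InstanceType: t4g.micro   # Graviton 2 burstable instance family
```

&lt;p&gt;Everything else in the template, including the UserData script, stays the same.&lt;/p&gt;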

&lt;h3&gt;
  
  
  Custom Applications
&lt;/h3&gt;

&lt;p&gt;For your own applications, whether running on EC2 or Lambda, the amount of work needed to run on Graviton 2 depends upon the language:&lt;/p&gt;

&lt;h4&gt;
  
  
  Languages: No changes necessary
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Java&lt;/li&gt;
&lt;li&gt;.NET Core, .NET 5 or 6&lt;/li&gt;
&lt;li&gt;NodeJS&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All of these run on top of a runtime, and your code is either not compiled (NodeJS) or compiled into an intermediate form that works across platforms.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Java in particular has always had the concept of write once, run anywhere, and it was over 20 years ago that I first compiled Java code on an Intel/Windows machine and ran it on a Sun/Solaris server.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h4&gt;
  
  
  Languages: Repackage (sometimes)
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Python&lt;/li&gt;
&lt;li&gt;Ruby&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These languages can run on multiple platforms, often without changing, but may make use of some packages that use native code. This means that you &lt;em&gt;may&lt;/em&gt; need to re-package your application for ARM, and some packages may need to be compiled if they're not available pre-compiled in the package manager.&lt;/p&gt;

&lt;p&gt;I'm not that familiar with PHP, but I think it may fall into this list as well. This &lt;a href="https://aws.amazon.com/blogs/compute/improving-performance-of-php-for-arm64-and-impact-on-amazon-ec2-m6g-instances/" rel="noopener noreferrer"&gt;blog post&lt;/a&gt; talks about some of the work AWS has done with the PHP community to improve the performance on ARM64 CPUs like Graviton 2.&lt;/p&gt;

&lt;h4&gt;
  
  
  Languages: Recompile
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Golang&lt;/li&gt;
&lt;li&gt;Rust&lt;/li&gt;
&lt;li&gt;C, C++&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These languages need to be recompiled to run on a Graviton instance, either by compiling on an ARM64 server (e.g. in CodeBuild) or by cross-compiling from another machine.&lt;/p&gt;
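&lt;p&gt;A minimal sketch of the CodeBuild option for Go (the buildspec and paths below are hypothetical examples) - because Go cross-compiles easily, the build doesn't even need to run on ARM hardware:&lt;/p&gt;

```yaml
# Hypothetical buildspec.yml cross-compiling a Go binary for Graviton
version: 0.2
phases:
  build:
    commands:
      # GOOS/GOARCH select the target platform; any build machine will do
      - GOOS=linux GOARCH=arm64 go build -o bin/app ./cmd/app
artifacts:
  files:
    - bin/app
```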

&lt;h3&gt;
  
  
  Containers
&lt;/h3&gt;

&lt;p&gt;Although containers are considered portable (the same container runs in many different places), they are architecture specific and you cannot run the same container on x86_64 and ARM architectures. Even if your application code is in a language like Java, it's going to rely on a JVM being installed in the container that uses native code.&lt;/p&gt;

&lt;p&gt;Containers can either be built on the target architecture, or there are options for building &lt;a href="https://docs.docker.com/desktop/multi-arch/" rel="noopener noreferrer"&gt;multiple architectures&lt;/a&gt; using emulation.&lt;/p&gt;

&lt;p&gt;AWS ECR &lt;a href="https://aws.amazon.com/blogs/containers/introducing-multi-architecture-container-images-for-amazon-ecr/" rel="noopener noreferrer"&gt;has added support&lt;/a&gt; for multi-architecture container images. This allows you to store different architectures in the same repository, and when you pull a multi-architecture tag it picks the correct image to download based upon the client architecture.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;At the moment the UI for ECR doesn't do a good job of displaying architectures, and I'm not sure how lifecycle policies are affected.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Lambda functions
&lt;/h3&gt;

&lt;p&gt;Depending upon the language you're using, it may be as easy as changing the Architecture value to switch from x86_64 to ARM64 (i.e. Graviton 2). As discussed above, if you're using native packages or code you will need to repackage or recompile to switch.&lt;/p&gt;
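&lt;p&gt;In CloudFormation this is a one-line change (a sketch with hypothetical names - the &lt;code&gt;Architectures&lt;/code&gt; property is the only Graviton-specific part):&lt;/p&gt;

```yaml
  MyFunction:
    Type: AWS::Lambda::Function
    Properties:
      Runtime: nodejs14.x
      Handler: index.handler
      Role: !GetAtt FunctionRole.Arn     # hypothetical role
      Code:
        S3Bucket: my-artifact-bucket     # hypothetical bucket
        S3Key: function.zip
      # Defaults to [x86_64]; switch to arm64 to run on Graviton 2
      Architectures:
        - arm64
```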

&lt;p&gt;One particularly nice feature is the ability to gradually shift traffic from one architecture to another by &lt;a href="https://aws.amazon.com/blogs/aws/aws-lambda-functions-powered-by-aws-graviton2-processor-run-your-functions-on-arm-and-get-up-to-34-better-price-performance/" rel="noopener noreferrer"&gt;using weights to distribute traffic between two versions of the function&lt;/a&gt;. This would allow you to send a small percentage of your traffic to the Graviton 2 version of the function until you are satisfied that there are no errors and the performance is satisfactory (hopefully better).&lt;/p&gt;
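&lt;p&gt;A weighted alias can express that gradual shift in CloudFormation as well (a sketch - the function name, version numbers, and weight are hypothetical):&lt;/p&gt;

```yaml
  LiveAlias:
    Type: AWS::Lambda::Alias
    Properties:
      Name: live
      FunctionName: !Ref MyFunction      # hypothetical function
      FunctionVersion: '5'               # stable x86_64 version
      RoutingConfig:
        AdditionalVersionWeights:
          # Send 10% of invocations to the arm64 version
          - FunctionVersion: '6'
            FunctionWeight: 0.1
```

&lt;p&gt;Raising the weight over time completes the migration without a big-bang cutover.&lt;/p&gt;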

&lt;h2&gt;
  
  
  Graviton Challenge
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F58y697z15e5gv2h2sikm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F58y697z15e5gv2h2sikm.png" alt="Graviton Challenge"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;AWS is running the &lt;a href="https://aws.amazon.com/ec2/graviton/challenge/" rel="noopener noreferrer"&gt;Graviton Challenge&lt;/a&gt; which walks you through migrating an application to Graviton 2 over 4 days. Some parts of the challenge have finished, but you can still run a single t4g.micro instance for free until 12/31.&lt;/p&gt;

&lt;h2&gt;
  
  
  Looking forward to re:Invent
&lt;/h2&gt;

&lt;p&gt;The re:Invent catalog contains a number of sessions related to Graviton and custom silicon:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;CMP301: The journey of silicon innovation at AWS&lt;/li&gt;
&lt;li&gt;CMP213: Lessons learned from customers who have adopted AWS Graviton&lt;/li&gt;
&lt;li&gt;AMZ301: How Amazon migrated a large ecommerce platform to AWS Graviton&lt;/li&gt;
&lt;li&gt;CMP214: Run containerized workloads on AWS Graviton2 for better price-performance&lt;/li&gt;
&lt;li&gt;DEM002-S: Workload analysis on Arm-based AWS Graviton2 processors&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I'd also be very surprised if there aren't some more announcements in this area.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/leading-edje"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkshqaumpndcmytd0dqly.png" alt="Leading EDJE Articles"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>arm</category>
      <category>cloud</category>
    </item>
    <item>
      <title>GCP Cloud Digital Leader, a certification in search of an identity</title>
      <dc:creator>Andrew May</dc:creator>
      <pubDate>Thu, 09 Sep 2021 00:03:36 +0000</pubDate>
      <link>https://dev.to/leading-edje/gcp-cloud-digital-leader-a-certification-in-search-of-an-identity-5bk1</link>
      <guid>https://dev.to/leading-edje/gcp-cloud-digital-leader-a-certification-in-search-of-an-identity-5bk1</guid>
      <description>&lt;p&gt;I recently took and passed the exam for the new &lt;a href="https://cloud.google.com/certification/cloud-digital-leader" rel="noopener noreferrer"&gt;GCP Cloud Digital Leader certification&lt;/a&gt;, and I thought I'd share a few thoughts.&lt;/p&gt;

&lt;p&gt;The certification is the first "Foundation" Level certification for GCP, but it seems to sit there uneasily.&lt;/p&gt;

&lt;h2&gt;
  
  
  Similar Certifications
&lt;/h2&gt;

&lt;p&gt;I've previously taken both the AWS Cloud Practitioner and Microsoft Azure Fundamentals certifications (along with a bunch of other AWS certs), and these both provide fairly easy introductions to cloud concepts and a high level overview of the core services in the platform.&lt;/p&gt;

&lt;h2&gt;
  
  
  Digital Leader?
&lt;/h2&gt;

&lt;p&gt;The name of the certification seems to indicate the aspirations Google had for it, and the training material they provide makes it seem squarely aimed at decision makers who are considering moving to the cloud but either need convincing, or need ammunition to convince others.&lt;/p&gt;

&lt;p&gt;The four training courses in their learning path are titled:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Introduction to Digital Transformation with Google Cloud&lt;/li&gt;
&lt;li&gt;Innovating with Data and Google Cloud&lt;/li&gt;
&lt;li&gt;Infrastructure and Application Modernization with Google Cloud &lt;/li&gt;
&lt;li&gt;Understanding Google Cloud Security and Operations&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The courses take about 8-10 hours to go through and have a number of simple tests at the end of sections. In the videos you'll hear a lot of reasons why you should move to the Cloud, and GCP in particular, but you'll learn practically nothing about the individual services available in GCP. Some of the services are mentioned by name, but with no detail - for example Cloud SQL gets a mention, but no details about which RDBMS engines it supports.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://docs.google.com/forms/d/e/1FAIpQLSfsSfkh9PE-HjdRRzJ24wPSjZrXF3gLxmncAYx31gyz2rLbtw/viewform" rel="noopener noreferrer"&gt;10 practice test questions&lt;/a&gt; they provide might lead you to think these videos have prepared you reasonably well for the exam, as they ask you to choose between adding resources in your datacenter or creating them in a "public cloud platform" (wink, wink, we mean GCP).&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Hint: the answer is always moving to the cloud!&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;There are actually a number of these types of questions in the exam, but it feels like they got part way and then realized they had nowhere near enough questions of that type to create a certification.&lt;/p&gt;

&lt;p&gt;Many of the other questions are of a more traditional foundation style, expecting you to select the right service to meet a particular use case. Unlike the AWS and Azure certs, they do seem to expect you to know &lt;strong&gt;every&lt;/strong&gt; service in GCP, but at least there are far fewer services to learn.&lt;/p&gt;

&lt;p&gt;Then there are questions that feel like they were pulled from an Associate Architect exam that doesn't exist, requiring a somewhat deeper understanding of how to select services, configure IAM, and so on. The addition of these questions makes the exam feel more difficult as a whole than the other foundation certs, and had me doubting myself a few times.&lt;/p&gt;

&lt;h2&gt;
  
  
  Preparation
&lt;/h2&gt;

&lt;p&gt;A group of us at &lt;a href="https://www.leadingedje.com" rel="noopener noreferrer"&gt;Leading EDJE&lt;/a&gt; all decided to get this certification, so we watched the provided videos together. Fortunately before diving headlong into the exam, we did some research and found some reviews of the beta exam that made us realize we were not yet prepared.&lt;/p&gt;

&lt;p&gt;There isn't a lot of third party training available yet because it's such a new certification. I ended up using the &lt;a href="https://www.exampro.co/gcp-cdl" rel="noopener noreferrer"&gt;ExamPro training course&lt;/a&gt; that includes 5.5 hours of videos and practice questions that are designed to be closer to the real exam.&lt;/p&gt;

&lt;p&gt;I also spent some time reading my &lt;a href="https://www.amazon.com/Google-Cloud-Certified-Associate-Engineer/dp/1119564417" rel="noopener noreferrer"&gt;Associate Cloud Engineer Study guide&lt;/a&gt; that I bought a while ago but never got around to using.&lt;/p&gt;

&lt;p&gt;After that I felt pretty prepared for the exam and went ahead and booked it to take a few hours after I finished going through the ExamPro course.&lt;/p&gt;

&lt;h2&gt;
  
  
  The exam
&lt;/h2&gt;

&lt;p&gt;GCP uses Webassessor/Kryterion for their exams, and while the process for a remotely proctored exam was similar to the companies used for AWS and Azure tests, the pre-exam screening was significantly more difficult because you can't use your phone to take pictures of the room and have to use your webcam. There was a lot of trying to scan the room (and under the table) using my laptop, and in a few cases (like verifying my ID) I had to take a photo with my phone, zoom in, and then hold it up to my webcam.&lt;/p&gt;

&lt;p&gt;The actual exam software and type of question were similar to the AWS Cloud Practitioner: multiple-choice questions, most with a single answer - none of the variety of question styles that the Azure Fundamentals exam uses. As with AWS, they ask you to take a questionnaire before telling you if you passed or failed.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Currently I just know I've passed; I'm still waiting on the official email from Google.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Some of the questions were very straightforward and I answered in seconds, others I wasn't entirely sure about. I answered all the questions in about 30 of the allotted 90 minutes.&lt;/p&gt;

&lt;h2&gt;
  
  
  Summing up
&lt;/h2&gt;

&lt;p&gt;If you are a decision maker trying to decide whether to move to the Cloud or not, the provided GCP training videos may be of use, but I'm not convinced studying for the certification exam itself will be a great deal of use.&lt;/p&gt;

&lt;p&gt;If you already know you're going to be using GCP, and want to learn more about the platform, then the provided training is of little use and you're probably better served by studying for the &lt;a href="https://cloud.google.com/certification/cloud-engineer" rel="noopener noreferrer"&gt;Associate Cloud Engineer certification&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/leading-edje"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Femoex3mjpg9liyomhtax.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>googlecloud</category>
      <category>certification</category>
    </item>
    <item>
      <title>Jamming with AWS</title>
      <dc:creator>Andrew May</dc:creator>
      <pubDate>Sun, 22 Aug 2021 16:20:45 +0000</pubDate>
      <link>https://dev.to/leading-edje/jamming-with-aws-50h3</link>
      <guid>https://dev.to/leading-edje/jamming-with-aws-50h3</guid>
      <description>&lt;p&gt;The upcoming &lt;a href="https://aws.amazon.com/events/summits/online/americas/"&gt;AWS Summit Online (Americas)&lt;/a&gt; next week (August 24th-26th) will have the usual Keynotes and Sessions, but the most interesting/fun part of the summit will be the &lt;a href="https://jam.awsevents.com/"&gt;AWS Jam&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;An AWS Jam is a "Capture the Flag" style event that runs over two days (on the 25th/26th August) where you complete challenges in AWS to earn points, either competing as an individual or as part of a team. The challenges vary in complexity and difficulty and can take anywhere from a few minutes to a couple of hours. If you're stuck you can get hints, but these decrease the number of points you get for completing the challenge.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Example challenges from the AWS Jam at the May 2021 summit:&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Mn_EIn7l--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/19s7qfnjfa8n6dqis77l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Mn_EIn7l--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/19s7qfnjfa8n6dqis77l.png" alt="AWS Jam Challenges"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At the last summit I wasn't entirely sure what an AWS Jam was, so I took part by myself and didn't have time to organize a team (I was also limited in how much time I could spend on it), but I still managed to reach 38th on the leaderboard by quickly completing a number of challenges at the last minute!&lt;/p&gt;

&lt;p&gt;I found it was a great opportunity to experiment with parts of AWS I'd not used before, in particular some of the Data Science and ML challenges where you're given a pre-configured Notebook and Dataset and they walk you through the steps of data analysis.&lt;/p&gt;

&lt;p&gt;This time I'm hoping to have a team of EDJErs to compete in the challenge and see how much further we can get up the leaderboard!&lt;/p&gt;

&lt;h2&gt;
  
  
  Joining the Jam
&lt;/h2&gt;

&lt;p&gt;In order to take part in the AWS Jam you'll need a Jam account (&lt;a href="https://jam.awsevents.com/"&gt;sign up here&lt;/a&gt;) and you'll also need a code for the Jam, which will be available once the AWS Summit starts (so you'll need at least one person on your team to register for the summit to get the code and create a team, and then you can invite people to the Jam and direct them to your team).&lt;/p&gt;

&lt;h2&gt;
  
  
  The rest of the summit
&lt;/h2&gt;

&lt;p&gt;There should be an interesting keynote from the VP of Machine Learning at AWS (Swami Sivasubramanian), and possibly a few new announcements, and then there's a selection of interesting sessions in all areas of AWS.&lt;/p&gt;

&lt;p&gt;Additionally there are two-hour workshops in a range of technologies (e.g. Build and deploy web applications with AWS App Runner) that you need to &lt;a href="https://aws-summit-online-americas-virtual-workshops.splashthat.com/"&gt;register for separately&lt;/a&gt;. They're great if that particular workshop covers something you want to learn about, but they won't give you experience with the range of AWS services that the Jam will.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/leading-edje"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Vl6ulxi5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ch3nypi7c02kagscohyu.png" alt="EDJE Articles"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Feature flags for Infrastructure as Code</title>
      <dc:creator>Andrew May</dc:creator>
      <pubDate>Sun, 23 May 2021 18:23:58 +0000</pubDate>
      <link>https://dev.to/leading-edje/feature-flags-for-infrastructure-as-code-4hi8</link>
      <guid>https://dev.to/leading-edje/feature-flags-for-infrastructure-as-code-4hi8</guid>
      <description>&lt;h2&gt;
  
  
  Feature what?
&lt;/h2&gt;

&lt;p&gt;Feature flags or &lt;a href="https://martinfowler.com/bliki/FeatureToggle.html" rel="noopener noreferrer"&gt;toggles&lt;/a&gt; are often* used in application code to allow for rapid integration of new changes without necessarily enabling them in production.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;*I have no real idea how commonly used they really are&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Feature flags can prevent the dreaded long running feature branch that keeps accumulating changes that you hold off from merging because you're not ready for it to go into production in the next release.&lt;/p&gt;

&lt;p&gt;These same challenges apply to your Infrastructure as Code (IaC), where changes to shared resources that you want to test in a lower environment but aren't ready to promote to production can be a real problem.&lt;/p&gt;

&lt;p&gt;Consider this scenario:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You're making changes to your network in a cloud environment.&lt;/li&gt;
&lt;li&gt;You've got your changes in a development environment and everything seems to be working, but you can't go to production until the next maintenance window because it's going to cause an outage.&lt;/li&gt;
&lt;li&gt;Then you suddenly have to make a different change to that same IaC template that has to go into production immediately, but you want to test it before you apply it to production.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Well, now you've got a problem, because development has your other changes in it. You're going to have to branch your template for production, but perhaps also update what's in development with the new change, or roll back your other changes to test. &lt;/p&gt;

&lt;p&gt;🤷 Either way it's not ideal.&lt;/p&gt;

&lt;h2&gt;
  
  
  Adding flags to your IaC
&lt;/h2&gt;

&lt;p&gt;IaC tools have conditional logic of varying degrees of sophistication. This might be &lt;a href="https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/conditions-section-structure.html" rel="noopener noreferrer"&gt;Conditions&lt;/a&gt; in CloudFormation, &lt;a href="https://www.terraform.io/docs/language/expressions/conditionals.html" rel="noopener noreferrer"&gt;Conditional Expressions&lt;/a&gt; in Terraform or the full power of a programming language using the &lt;a href="https://docs.aws.amazon.com/es_es/cdk/latest/guide/home.html" rel="noopener noreferrer"&gt;AWS Cloud Development Kit (CDK)&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Once you've got conditional logic, you can turn parts of your infrastructure on/off, or configure them one way or another, based upon some kind of input. The easiest option here is to have a parameter to your template act as your feature flag.&lt;/p&gt;
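&lt;p&gt;For example, in CloudFormation a parameter-backed Condition can act as the flag (a sketch with hypothetical resource names):&lt;/p&gt;

```yaml
Parameters:
  EnableNewNatGateway:              # the feature flag
    Type: String
    AllowedValues: ['true', 'false']
    Default: 'false'

Conditions:
  NewNatGatewayEnabled: !Equals [!Ref EnableNewNatGateway, 'true']

Resources:
  NewNatGateway:
    Type: AWS::EC2::NatGateway
    Condition: NewNatGatewayEnabled   # only created where the flag is on
    Properties:
      AllocationId: !GetAtt NatEip.AllocationId
      SubnetId: !Ref PublicSubnet
```

&lt;p&gt;Flipping the parameter per environment lets the same template sit in production with the change dormant.&lt;/p&gt;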

&lt;p&gt;Your goal with the feature flags is to be able to merge changes and then use the flags to enable the features progressively in environments while still allowing additional changes to be made and tested.&lt;/p&gt;

&lt;h3&gt;
  
  
  To create or not to create, that is the question
&lt;/h3&gt;

&lt;p&gt;With feature flags in application software, the new code that is not yet enabled is actually deployed to production, it's just not active. With IaC it's a little different, because your new change may be creating additional resources (or deleting old ones).&lt;/p&gt;

&lt;p&gt;There are a few factors to consider when you decide whether your flag should control the existence of resources or just whether they're configured to be used:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Are you adding new resources of significant cost without knowing when they're going to be used?&lt;/li&gt;
&lt;li&gt;Can the change exist in parallel with existing resources?&lt;/li&gt;
&lt;li&gt;Does pre-creating new resources make the eventual switchover faster and less risky?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In other words it's going to depend upon the situation. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If the resources are cheap (perhaps even free if not being used) and can co-exist then you probably don't need to bother with a flag at all and just go ahead and create the new resources.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Testability
&lt;/h2&gt;

&lt;p&gt;You can often test your changes by going ahead and deploying them in a non-production environment. Depending upon the change you may be able to deploy it in parallel to existing resources and not break anything if you get it wrong, but certain changes can have impacts (going back to our network changes), and some changes are only applied in production.&lt;/p&gt;

&lt;p&gt;Depending upon the IaC tool you are using there are different levels of testing you can perform &lt;strong&gt;without deploying anything&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Guidelines for &lt;a href="https://www.hashicorp.com/blog/testing-hashicorp-terraform" rel="noopener noreferrer"&gt;Testing HashiCorp Terraform&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/cdk/latest/guide/testing.html" rel="noopener noreferrer"&gt;Testing AWS CDK Constructs&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With CDK you can write regular unit tests that can easily be automated to ensure you create the correct infrastructure, and you can vary your inputs (your feature flags) to ensure they do the right thing when turned on or off. This can give you the confidence to push your new template to production and know that it won't change anything unexpectedly until you toggle your flag 🚩.&lt;/p&gt;
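
&lt;p&gt;A minimal sketch of that kind of test, using plain asserts against a hypothetical stand-in for synthesizing the stack (the real CDK assertions module offers much richer matchers):&lt;/p&gt;

```python
def synth(new_alarm_enabled: bool = False) -> dict:
    """Hypothetical stand-in for synthesizing a stack to a template."""
    resources = {"AppFunction": {"Type": "AWS::Lambda::Function"}}
    if new_alarm_enabled:
        resources["ErrorAlarm"] = {"Type": "AWS::CloudWatch::Alarm"}
    return resources

def test_flag_off_changes_nothing():
    # With the flag off, the template matches what is already deployed
    assert synth(new_alarm_enabled=False) == {
        "AppFunction": {"Type": "AWS::Lambda::Function"}
    }

def test_flag_on_adds_only_the_alarm():
    added = set(synth(True)) - set(synth(False))
    assert added == {"ErrorAlarm"}

test_flag_off_changes_nothing()
test_flag_on_adds_only_the_alarm()
```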

&lt;blockquote&gt;
&lt;p&gt;As I'm writing this, I'm doing a good job of convincing myself that I should look more seriously into using CDK for new projects.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Your level of confidence and familiarity with techniques like unit testing are going to depend upon your background, but robust testing for your Infrastructure code is just as important as your Application code.&lt;/p&gt;

&lt;h3&gt;
  
  
  Cleaning up
&lt;/h3&gt;

&lt;p&gt;Some conditional logic in your IaC will always be there to handle differences between environments.&lt;/p&gt;

&lt;p&gt;Feature flags are different: they should be transient, and cleaned up once the flag has been turned on in every environment.&lt;/p&gt;

&lt;p&gt;Once again unit testing is really useful here as you can ensure that removing the flag won't make unexpected changes.&lt;/p&gt;

&lt;h2&gt;
  
  
  Summing up
&lt;/h2&gt;

&lt;p&gt;Using feature flags in your Infrastructure as Code can allow you to merge changes and promote them to production without updating your infrastructure until you are ready. This can reduce the number of conflicting changes in your templates and avoid problems with the timing of when particular changes go live.&lt;/p&gt;

&lt;p&gt;I wrote this article because I've been struggling with exactly the problem I described: changes in development that aren't ready to go to production.&lt;/p&gt;

&lt;p&gt;It now seems like an obvious solution, but I wasn't using it before, and I couldn't find many articles suggesting this approach.&lt;/p&gt;

&lt;p&gt;Let me know in the comments if you've already been doing something like this.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/leading-edje"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fscroshyxang2b4bbiiqu.png" alt="Leading EDJE dev.to"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>devops</category>
      <category>cloud</category>
      <category>iac</category>
    </item>
    <item>
      <title>Cloud Migration 101: Developer Edition</title>
      <dc:creator>Andrew May</dc:creator>
      <pubDate>Sat, 13 Mar 2021 00:37:48 +0000</pubDate>
      <link>https://dev.to/leading-edje/cloud-migration-101-developer-edition-ol5</link>
      <guid>https://dev.to/leading-edje/cloud-migration-101-developer-edition-ol5</guid>
      <description>&lt;p&gt;&lt;em&gt;So your company is moving to the cloud, that's cool,&lt;br&gt;
but how does it affect &lt;strong&gt;YOU&lt;/strong&gt;?&lt;/em&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  How things work on-prem
&lt;/h1&gt;

&lt;p&gt;When applications are on-premises, developers typically have little say about the infrastructure their applications run on. The choice of languages and frameworks may be set in stone, and applications are more likely to be monolithic.&lt;/p&gt;

&lt;p&gt;Even with the increasing adoption of DevOps practices, there may still be a significant divide between application developers and the infrastructure/operations team - your deployments may be automated, but setting up a new server probably isn't.&lt;/p&gt;

&lt;h1&gt;
  
  
  How things change in the Cloud
&lt;/h1&gt;

&lt;p&gt;The &lt;a href="https://docs.aws.amazon.com/wellarchitected/latest/performance-efficiency-pillar/design-principles.html"&gt;Performance Pillar&lt;/a&gt; of the AWS Well Architected Framework includes these Design Principles:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Democratize advanced technologies: Make advanced technology implementation easier for your team&lt;/li&gt;
&lt;li&gt;Use serverless architectures&lt;/li&gt;
&lt;li&gt;Experiment more often&lt;/li&gt;
&lt;li&gt;Consider mechanical sympathy&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This isn't just true of AWS: the ability to quickly spin up new resources, initially for experimentation and then all the way to production, is a feature of all Cloud platforms.&lt;/p&gt;

&lt;p&gt;The architecture of applications built to take advantage of running in the cloud can be radically different from what you've been building previously, and while you can deploy an ASP.NET Core API to AWS Lambda (for example), how it behaves in that environment is significantly different from running it in IIS.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;AWS provides a NuGet package (see &lt;a href="https://github.com/aws/aws-lambda-dotnet/tree/master/Libraries/src/Amazon.Lambda.AspNetCoreServer"&gt;Amazon.Lambda.AspNetCoreServer&lt;/a&gt;) that makes it easy to do this. It works pretty well, but instead of one copy of the API serving many clients you have many copies of the API each serving one client at a time. There are also limitations in areas like file upload and download (which vary depending upon whether this is being used with API Gateway or an Application Load Balancer).&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h1&gt;
  
  
  Become a Cloud Developer
&lt;/h1&gt;

&lt;p&gt;For Leading EDJE I've worked on a number of cloud migration projects, as an architect, developer and cloud engineer. I've seen the difficulties that developers can face getting up to speed with the cloud, and the miscommunication that can occur between application and infrastructure teams.&lt;/p&gt;

&lt;p&gt;It's more important than ever for developers to understand how their applications run when deployed, and not just on their machine when developing. This requires developers to have an understanding of the Cloud platform that's being used and its features and quirks. It's also important (this has always been important) for developers to be involved in supporting applications all the way through their lifecycle, including production support.&lt;/p&gt;

&lt;h2&gt;
  
  
  Training and certifications
&lt;/h2&gt;

&lt;p&gt;Hopefully, if your company has decided to move to the cloud, they've recognized that there's a skills gap and have a plan for training. If so, make sure you take advantage of it - even if you don't use it immediately, it shows a willingness to learn that looks good at review time. If your company unfortunately has no plan in place, or limits access to the training to a select few (perhaps they think developers don't need to know the details), then it still behooves you to do what you can to learn about the cloud.&lt;/p&gt;

&lt;p&gt;This unfortunately may mean that you need to learn about the cloud in your own time, and I recognize that not everyone has free time to spend on this self-study, but if you do there can be many benefits for your career.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If your company doesn't want to make use of your new found cloud skills, there are lots of other companies that are looking for them.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Studying for a cloud certification is a good way of getting an overview of a cloud platform. There's a variety of free and paid training (much of it inexpensive), and you should also go hands-on and sign up for a free trial of the cloud platform your company is using so that you can try some things out.&lt;/p&gt;

&lt;h2&gt;
  
  
  Mechanical Sympathy
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;“You don’t have to be an engineer to be a racing driver, but you do have to have Mechanical Sympathy.” – Jackie Stewart, racing driver&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;If you don't understand how the cloud platform works, then you can build an application that &lt;em&gt;runs in the cloud&lt;/em&gt;, but you can't build a &lt;em&gt;cloud native application&lt;/em&gt; that makes efficient usage of cloud resources.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Cloud Native is an overused term that has multiple different definitions.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://www.cncf.io/about/who-we-are/"&gt;Cloud Native Computing Foundation&lt;/a&gt; has a heavy focus on Kubernetes and Linux, but when I talk about it I'm thinking more of what's described in this &lt;a href="https://azure.microsoft.com/en-us/overview/cloudnative/"&gt;Azure Cloud-Native overview&lt;/a&gt;, with a combination of managed services, containers and serverless.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;New applications you build in the cloud can take advantage of services provided by the platform, but you can only do this effectively if you understand the strengths and weaknesses of those services. It may make sense to decompose your application into microservices or even smaller units of work and use messaging to decouple processing.&lt;/p&gt;

&lt;p&gt;The implication of this is that as a developer you need to be more involved in architectural decisions even when you are working on relatively small applications. You will hopefully have someone in your organization setting guidelines for Cloud architecture, but each application has different needs.&lt;/p&gt;

&lt;p&gt;You may have the opportunity to get involved in provisioning the infrastructure using Infrastructure as Code tools like Terraform, CloudFormation (AWS), Azure Resource Manager or Cloud Deployment Manager (GCP). If you can help to build the infrastructure you are far less likely to be held up waiting on another team to do the work or to have difficulties communicating what you need.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Infrastructure as Code may be foreign and new to you, but it's the best way to really understand how the building blocks in a cloud platform fit together.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Organizational resistance
&lt;/h2&gt;

&lt;p&gt;A cloud migration is a journey for a company and it may require changes that not everyone is ready to accept.&lt;/p&gt;

&lt;p&gt;As a cloud developer who learns how the cloud platform works, you can be in a position to influence how your company adapts and this can be a great career opportunity. That may put you at odds with those who oppose change, but having a deep understanding of the platform allows you to educate those around you and overcome some of those difficulties.&lt;/p&gt;

&lt;h2&gt;
  
  
  A change of role?
&lt;/h2&gt;

&lt;p&gt;For much of my career I was a Java developer and then an application architect specializing in Java. I learned about AWS while I worked on my first Cloud project and found myself splitting my time between architecting/developing parts of the application, and building out the infrastructure using CloudFormation.&lt;/p&gt;

&lt;p&gt;Personally I enjoy the mix of all these different roles, but you might find out that you have an affinity for building cloud infrastructure. The relatively new field of Site Reliability Engineering can be a good fit for those with a development background.&lt;/p&gt;

&lt;h1&gt;
  
  
  You owe it to yourself
&lt;/h1&gt;

&lt;p&gt;I firmly believe that Cloud platforms are here to stay and will become the dominant deployment platform for the applications we all develop. Even those organizations who do not want to use a public platform are likely to adopt technologies where infrastructure can be provisioned on demand.&lt;/p&gt;

&lt;p&gt;So, if you consider yourself "just a developer" you might want to rethink the boundaries of what that means.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/leading-edje"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--adXLSpgt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/scroshyxang2b4bbiiqu.png" alt="Leading EDJE dev.to"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>migration</category>
      <category>training</category>
    </item>
    <item>
      <title>The relevance of the AWS Certified DevOps Engineer - Professional certification</title>
      <dc:creator>Andrew May</dc:creator>
      <pubDate>Sun, 21 Feb 2021 20:57:56 +0000</pubDate>
      <link>https://dev.to/leading-edje/the-relevance-of-the-aws-certified-devops-engineer-professional-certification-1nio</link>
      <guid>https://dev.to/leading-edje/the-relevance-of-the-aws-certified-devops-engineer-professional-certification-1nio</guid>
      <description>&lt;p&gt;I recently passed the &lt;a href="https://aws.amazon.com/certification/certified-devops-engineer-professional/?ch=sec&amp;amp;sec=rmg&amp;amp;d=1"&gt;AWS Certified DevOps Engineer - Professional&lt;/a&gt; certification, and honestly I did it more to get a sense of completion (and to finally add the sticker I picked up at re:Invent 2018) than an expectation that I would learn a lot, but I was surprised how many gaps in my knowledge it filled in and how relevant they would end up being.&lt;/p&gt;

&lt;h2&gt;
  
  
  Certification topics
&lt;/h2&gt;

&lt;p&gt;The certification focuses on a number of services that I haven't used a great deal, and that aren't necessarily the ones I would use for greenfield development, but which have an important role when it comes to cloud migrations of existing applications. In particular you can expect to see a lot of questions about Elastic Beanstalk and CodeDeploy, and learning their different deployment options is a large part of preparing for this certification.&lt;/p&gt;

&lt;p&gt;You also need to learn about CodeCommit, CodeBuild and CodePipeline, but while these services have their uses (and generally fairly attractive pay-as-you-go pricing), their functionality and ease of use are unfortunately far behind other CI/CD solutions (e.g. Azure DevOps, GitHub Actions, GitLab).&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;As part of my certification prep, I used CodeBuild to deploy the container image for &lt;a href="https://github.com/LeadingEDJE/stackmanager"&gt;stackmanager&lt;/a&gt; to the new &lt;a href="https://gallery.ecr.aws/leadingedje/stackmanager"&gt;Amazon ECR public gallery&lt;/a&gt;. Having to configure some elements within AWS (using CloudFormation) and others within the stackmanager repository (in GitHub) made configuring the build much more complicated and error prone than solutions where everything can be configured in one place.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;A large part of the certification focuses on deployments to EC2 instances within Auto-Scaling Groups, or best practices around migrating on-premises applications to EC2. Understanding some of the details of ASGs, in particular lifecycle hooks, is important to be able to pass the exam.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Honestly, I got tired of the repetition within the practice tests and the actual exam and the questions started to blur together. I had to take a brief break during the exam to take my hands away from the keyboard and close my eyes for a few minutes to re-focus - it would have been nice to get up and stretch, but that was not possible in the proctored exam.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;One other service that I really needed to focus on was EventBridge (aka CloudWatch Events). I've used CloudWatch Events for triggering other services (mostly Lambda functions and Step Functions) based upon a schedule or something occurring within the AWS account, but it wasn't until studying for this certification that I really appreciated the difference between services that natively create events and those where events are only available via CloudTrail. It's also interesting that there is a small number of API calls you can make directly from an event (e.g. creating an EC2 snapshot) that probably predate Lambda functions. There are also a lot more targets for events than I was aware of.&lt;/p&gt;
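
&lt;p&gt;For anyone unfamiliar with EventBridge, a rule matches events against a pattern; this is a greatly simplified sketch of the top-level matching (real patterns support nested fields, prefix matching and more):&lt;/p&gt;

```python
def matches(pattern: dict, event: dict) -> bool:
    """Greatly simplified EventBridge matching: every field in the pattern
    must appear in the event with one of the listed values."""
    return all(event.get(field) in allowed for field, allowed in pattern.items())

# Pattern for EC2 state-change events, a natively-published event type
pattern = {
    "source": ["aws.ec2"],
    "detail-type": ["EC2 Instance State-change Notification"],
}
event = {
    "source": "aws.ec2",
    "detail-type": "EC2 Instance State-change Notification",
    "detail": {"state": "running"},
}
```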

&lt;p&gt;There is a focus on monitoring and governance, with AWS Config and Trusted Advisor being used to trigger remediations for different types of issues.&lt;/p&gt;

&lt;p&gt;One of the hardest parts of the certification is remembering exactly what mechanisms the different services have for triggering other activities - whether it's AWS Config directly triggering SSM Automation, or the types of notifications CodePipeline supports. &lt;/p&gt;

&lt;p&gt;It's also important to be familiar with Lambda, ECS, CloudFormation and a number of other services, but these are services I use on a regular basis and were not a large part of the exam.&lt;/p&gt;

&lt;h2&gt;
  
  
  Relevance
&lt;/h2&gt;

&lt;p&gt;OK, so that's a lot about what the certification covers, but by itself that doesn't make a convincing case for relevance.&lt;/p&gt;

&lt;p&gt;Leading EDJE is an &lt;a href="https://partners.amazonaws.com/partners/0010L00001ml09OQAQ/Leading%20EDJE"&gt;AWS Select Consulting Partner&lt;/a&gt;, and as I become more involved in our partnership and work on projects at our clients, it has become clear how early many companies are in the overall cloud adoption process. &lt;/p&gt;

&lt;p&gt;The migration path can vary a lot, but it is common for enterprises to have applications that are moved to the cloud but are not immediately rearchitected to take full advantage of cloud functionality. These may be applications that are due to be retired or replaced but are still business critical and may still be updated on a regular basis.&lt;/p&gt;

&lt;p&gt;These applications will run on EC2 instances, either managed directly or using Elastic Beanstalk. Understanding the different deployment options as covered by this certification is essential to knowing how best to configure and manage applications that may have restrictions due to their architecture or licensing.&lt;/p&gt;

&lt;p&gt;Recently I've been able to offer advice on where CodeDeploy might be an appropriate tool for deployments, and some of the limitations it has (for example it is &lt;a href="https://docs.aws.amazon.com/codedeploy/latest/userguide/troubleshooting-auto-scaling.html#troubleshooting-multiple-depgroups"&gt;not recommended to have multiple deployment groups targeting a single ASG&lt;/a&gt;).&lt;/p&gt;

&lt;h1&gt;
  
  
  A convincing case?
&lt;/h1&gt;

&lt;p&gt;I learned quite a lot while studying for this certification, and there was less overlap with the other certifications than I expected. I was familiar with the services covered, but had never gone into this much depth with most of them before.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Most of my preparation was a combination of practice tests and going through the AWS documentation for the services included in the tests and making notes, plus my experimentation with CodeBuild mentioned above.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;If you are coming from a System Administrator background and are learning about AWS, there is a lot of important information here and it is the obvious choice after the AWS Certified SysOps Administrator - Associate certification.&lt;/p&gt;

&lt;p&gt;Let me know in the comments if you've taken this certification and felt it was worthwhile (or not).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/leading-edje"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--adXLSpgt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/scroshyxang2b4bbiiqu.png" alt="Leading EDJE dev.to"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>certification</category>
    </item>
    <item>
      <title>stackmanager: Yet another tool for CloudFormation</title>
      <dc:creator>Andrew May</dc:creator>
      <pubDate>Thu, 29 Oct 2020 00:55:29 +0000</pubDate>
      <link>https://dev.to/leading-edje/stackmanager-yet-another-tool-for-cloudformation-45o1</link>
      <guid>https://dev.to/leading-edje/stackmanager-yet-another-tool-for-cloudformation-45o1</guid>
      <description>&lt;p&gt;I recently started building &lt;a href="https://github.com/LeadingEDJE/stackmanager" rel="noopener noreferrer"&gt;stackmanager&lt;/a&gt;, an Open Source project for managing CloudFormation stacks.&lt;/p&gt;

&lt;p&gt;There are already a number of other tools for doing this, and I've used or tried several of them, but none quite fit how I wanted them to work.&lt;/p&gt;

&lt;h2&gt;
  
  
  Functional Requirements
&lt;/h2&gt;

&lt;h4&gt;
  
  
  Managing all the configuration used to create/update a CloudFormation stack in a single file
&lt;/h4&gt;

&lt;p&gt;The CloudFormation &lt;code&gt;create-stack&lt;/code&gt; command has lots of separate values you need to supply, and with the CLI it's a pain to supply Stack Name, Template File, Parameters, Targets, Capabilities, etc.&lt;/p&gt;
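
&lt;p&gt;To illustrate the problem, here's a hypothetical helper (not part of stackmanager) that shows how many arguments a single deployment's config has to fan out into:&lt;/p&gt;

```python
def to_cli_args(cfg: dict) -> list:
    """Build an `aws cloudformation create-stack` command line from a
    single config dict (hypothetical helper for illustration)."""
    args = [
        "aws", "cloudformation", "create-stack",
        "--stack-name", cfg["StackName"],
        "--template-body", f"file://{cfg['Template']}",
    ]
    params = [
        f"ParameterKey={k},ParameterValue={v}"
        for k, v in cfg.get("Parameters", {}).items()
    ]
    if params:
        args += ["--parameters", *params]
    if cfg.get("Capabilities"):
        args += ["--capabilities", *cfg["Capabilities"]]
    return args

cfg = {
    "StackName": "d-StackManager-PythonFunction",
    "Template": "template.yaml",
    "Parameters": {"Environment": "dev"},
    "Capabilities": ["CAPABILITY_IAM"],
}
```

&lt;p&gt;Keeping all of this in one file per stack, rather than assembling it on the command line, is the first requirement.&lt;/p&gt;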

&lt;h4&gt;
  
  
  Support for deploying the same stack to different environments (e.g. dev, prod, or different regions) while reducing duplication of configuration
&lt;/h4&gt;

&lt;p&gt;Almost all the CloudFormation I write is used in multiple different environments (often different accounts), and sometimes is deployed to multiple regions for failover.&lt;/p&gt;

&lt;h4&gt;
  
  
  Uses ChangeSets, with control over whether to immediately apply or apply later
&lt;/h4&gt;

&lt;p&gt;ChangeSets allow you to preview (to a limited degree) the changes you're making, and especially if you're using the &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/what-is-sam.html" rel="noopener noreferrer"&gt;AWS Serverless Application Model&lt;/a&gt; transformation, or a CloudFormation macro, you don't necessarily know exactly what resources are going to be created until the CloudFormation service processes the template.&lt;/p&gt;

&lt;p&gt;In some CI/CD workflows, there may be an approval requirement before changes go into production, and the ChangeSet allows you to preview changes.&lt;/p&gt;
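
&lt;p&gt;Conceptually, a ChangeSet is a diff between the deployed stack and the proposed template; a toy sketch of that idea:&lt;/p&gt;

```python
def preview_changes(current: dict, proposed: dict) -> list:
    """Toy version of what a ChangeSet reports: one Add/Remove/Modify
    action per logical resource id (real ChangeSets know far more,
    such as whether a modification forces replacement)."""
    changes = [("Add", rid) for rid in proposed if rid not in current]
    changes += [("Remove", rid) for rid in current if rid not in proposed]
    changes += [("Modify", rid) for rid in current
                if rid in proposed and current[rid] != proposed[rid]]
    return sorted(changes)
```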

&lt;h4&gt;
  
  
  Log progress
&lt;/h4&gt;

&lt;p&gt;Print a summary of the ChangeSet and show the CloudFormation events whether the stack update succeeds or fails. I really like how SAM shows this information.&lt;/p&gt;

&lt;h4&gt;
  
  
  Support Lambda Functions
&lt;/h4&gt;

&lt;p&gt;This is somewhat outside the core functionality, but as I started writing this while working on a project that deployed Lambda functions using CloudFormation (where the S3 Key is changed when there is new code, triggering an update), I wanted to be able to use a single tool to build, upload and deploy Lambda functions, similar to the SAM CLI, but better suited to more general CloudFormation management and CI/CD.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F45indxvy270c6v7wp03s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F45indxvy270c6v7wp03s.png" alt="Lambda"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Non-functional requirements
&lt;/h2&gt;

&lt;p&gt;This is not the first utility of this sort I've written, but previous versions have belonged to the clients I wrote them for. I wanted to create an Open Source utility that I could use at multiple clients and for Leading EDJE infrastructure.&lt;/p&gt;

&lt;p&gt;Writing stackmanager from scratch allowed me to improve on what I'd written previously, and include things I'd previously neglected like unit tests.&lt;/p&gt;

&lt;p&gt;Even if I end up being the only contributor to the code base, I wanted to run it like a regular open source project, with a list of issues, pull requests, and releases when I reached a suitable milestone.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fehpwh0w58zd9uklyzawh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fehpwh0w58zd9uklyzawh.png" alt="Issues"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Getting to 1.0
&lt;/h2&gt;

&lt;p&gt;A lot of Open Source projects never reach a 1.0 release, but I just wanted to release something relatively complete and well tested and not necessarily include everything that stackmanager could possibly do. The hardest thing to finish in the list of issues I'd included in the 1.0 milestone was to get acceptable code coverage, as my experience writing Python unit tests is somewhat limited, especially when it comes to mocking.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fuyne5bxbk6xe1lflsbzb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fuyne5bxbk6xe1lflsbzb.png" alt="Releases"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Using stackmanager
&lt;/h2&gt;

&lt;p&gt;I don't really want to rewrite the &lt;a href="https://github.com/LeadingEDJE/stackmanager/blob/master/README.md" rel="noopener noreferrer"&gt;README&lt;/a&gt; here, but stackmanager allows you to do things like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ stackmanager --profile dev --region us-east-1 \
build-lambda --source-dir integration/functions/python/hello_world/ --output-dir /c/dev/temp/ --runtime python3.7 \
upload --bucket stackmanager-lambda-files-us-east-1 --key python/hello_world.zip \
deploy --environment dev --config-file integration/functions/python/config.yaml --auto-apply

Building python3.7 Lambda function from integration/functions/python/hello_world/

Running PythonPipBuilder:ResolveDependencies
Running PythonPipBuilder:CopySource

Built Lambda Archive C:\dev\temp\hello_world.zip
2020-10-28 20:33:21 Found credentials in shared credentials file: ~/.aws/credentials

Uploaded C:\dev\temp\hello_world.zip to s3://stackmanager-lambda-files-us-east-1/python/hello_world.zip

Stack: d-StackManager-PythonFunction, Status: does not exist

Creating ChangeSet c8a1e3704d0da48f69903bafcaf239c86

Action    LogicalResourceId    ResourceType           Replacement
--------  -------------------  ---------------------  -------------
Add       FunctionRole         AWS::IAM::Role         -
Add       Function             AWS::Lambda::Function  -

Executing ChangeSet c8a1e3704d0da48f69903bafcaf239c86 for d-StackManager-PythonFunction

ChangeSet c8a1e3704d0da48f69903bafcaf239c86 for d-StackManager-PythonFunction successfully completed:

Timestamp                  LogicalResourceId              ResourceType                ResourceStatus      Reason
-------------------------  -----------------------------  --------------------------  ------------------  ---------------------------
2020-10-28 20:33:24-04:00  d-StackManager-PythonFunction  AWS::CloudFormation::Stack  REVIEW_IN_PROGRESS  User Initiated
2020-10-28 20:33:35-04:00  d-StackManager-PythonFunction  AWS::CloudFormation::Stack  CREATE_IN_PROGRESS  User Initiated
2020-10-28 20:33:39-04:00  FunctionRole                   AWS::IAM::Role              CREATE_IN_PROGRESS  -
2020-10-28 20:33:39-04:00  FunctionRole                   AWS::IAM::Role              CREATE_IN_PROGRESS  Resource creation Initiated
2020-10-28 20:33:53-04:00  FunctionRole                   AWS::IAM::Role              CREATE_COMPLETE     -
2020-10-28 20:33:56-04:00  Function                       AWS::Lambda::Function       CREATE_IN_PROGRESS  -
2020-10-28 20:33:57-04:00  Function                       AWS::Lambda::Function       CREATE_IN_PROGRESS  Resource creation Initiated
2020-10-28 20:33:57-04:00  Function                       AWS::Lambda::Function       CREATE_COMPLETE     -
2020-10-28 20:33:59-04:00  d-StackManager-PythonFunction  AWS::CloudFormation::Stack  CREATE_COMPLETE     -
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This built a Python Lambda function, uploaded the zip file to S3 and then deployed a CloudFormation stack that used the file in S3, previewing the change-set and then running it.&lt;/p&gt;

&lt;p&gt;In this case the functionality is similar to the SAM CLI for Lambda functions, but the focus of stackmanager is really the &lt;code&gt;deploy&lt;/code&gt; command that is for any CloudFormation stack and not just Lambda functions.&lt;/p&gt;

&lt;p&gt;At the heart of stackmanager is the configuration file that can encapsulate the configuration for multiple environments. For the example I ran above this is the configuration file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="nn"&gt;---&lt;/span&gt;
&lt;span class="na"&gt;Environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;all&lt;/span&gt;
&lt;span class="na"&gt;StackName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;{{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;EnvironmentCode&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}-StackManager-PythonFunction"&lt;/span&gt;
&lt;span class="na"&gt;Parameters&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;Environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;{{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Environment&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"&lt;/span&gt;
&lt;span class="na"&gt;Tags&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;Application&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;StackManager&lt;/span&gt;
  &lt;span class="na"&gt;Environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;{{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Environment&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}"&lt;/span&gt;
&lt;span class="na"&gt;Template&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;integration/functions/python/template.yaml&lt;/span&gt;
&lt;span class="na"&gt;Capabilities&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;CAPABILITY_IAM&lt;/span&gt;
&lt;span class="nn"&gt;---&lt;/span&gt;
&lt;span class="na"&gt;Environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;dev&lt;/span&gt;
&lt;span class="na"&gt;Region&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;us-east-1&lt;/span&gt;
&lt;span class="na"&gt;Variables&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;EnvironmentCode&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;d&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this simple example the inheritance between the &lt;code&gt;dev&lt;/code&gt; environment and the &lt;code&gt;all&lt;/code&gt; environment isn't particularly useful, but once I have multiple environments and a mix of common and unique parameters this really cuts down on the total amount of configuration (and the number of separate files to edit).&lt;/p&gt;
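
&lt;p&gt;A simplified sketch of how that inheritance might resolve - environment values override the &lt;code&gt;all&lt;/code&gt; section, and &lt;code&gt;{{ Variable }}&lt;/code&gt; placeholders are substituted (this is for illustration, not stackmanager's actual implementation):&lt;/p&gt;

```python
import re

def resolve(common: dict, env: dict) -> dict:
    """Merge the 'all' section with one environment section and
    substitute {{ Variable }} placeholders in the stack name."""
    merged = {**common, **env}
    variables = {**env.get("Variables", {}), "Environment": env["Environment"]}
    sub = lambda s: re.sub(r"\{\{\s*(\w+)\s*\}\}",
                           lambda m: variables[m.group(1)], s)
    merged["StackName"] = sub(merged["StackName"])
    return merged

common = {"StackName": "{{ EnvironmentCode }}-StackManager-PythonFunction",
          "Capabilities": ["CAPABILITY_IAM"]}
dev = {"Environment": "dev", "Region": "us-east-1",
       "Variables": {"EnvironmentCode": "d"}}
```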

&lt;h2&gt;
  
  
  Is stackmanager right for you?
&lt;/h2&gt;

&lt;p&gt;If you're already using CloudFormation, but you have a mess of parameter files and other arguments you're using to run your CloudFormation then I'd like to think this will help you clean up the mess.&lt;/p&gt;

&lt;p&gt;Moving forward, it's possible that the CDK will make writing CloudFormation directly obsolete, but there's a lot of CloudFormation already out there, so it's going to be around for a while.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/leading-edje"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fscroshyxang2b4bbiiqu.png" alt="Leading EDJE dev.to"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>showdev</category>
    </item>
    <item>
      <title>PyPI release using GitHub Actions</title>
      <dc:creator>Andrew May</dc:creator>
      <pubDate>Sun, 16 Aug 2020 18:53:18 +0000</pubDate>
      <link>https://dev.to/leading-edje/pypi-release-using-github-actions-4f9a</link>
      <guid>https://dev.to/leading-edje/pypi-release-using-github-actions-4f9a</guid>
      <description>&lt;h3&gt;
  
  
  My Workflow
&lt;/h3&gt;

&lt;p&gt;I'm making use of the &lt;code&gt;release&lt;/code&gt; event in a GitHub action to release my open source &lt;code&gt;stackmanager&lt;/code&gt; project to PyPI. This allows me to use the GitHub tag name for the PyPI release and just have one place to manage versions.&lt;/p&gt;

&lt;p&gt;I don't always want to release a new version after a PR is merged to master, so this gives me manual control over releases.&lt;/p&gt;

&lt;p&gt;This has been my first time using GitHub Actions and I've been generally pleased with how it worked but found some of the documentation to be a bit cryptic.&lt;/p&gt;

&lt;h3&gt;
  
  
  Submission Category:
&lt;/h3&gt;

&lt;p&gt;Maintainer Must-Haves&lt;/p&gt;

&lt;h3&gt;
  
  
  Yaml File or Link to Code
&lt;/h3&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vJ70wriM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://practicaldev-herokuapp-com.freetls.fastly.net/assets/github-logo-ba8488d21cd8ee1fee097b8410db9deaa41d0ca30b004c0c63de0a479114156f.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/LeadingEDJE"&gt;
        LeadingEDJE
      &lt;/a&gt; / &lt;a href="https://github.com/LeadingEDJE/stackmanager"&gt;
        stackmanager
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Utility for managing Cloudformation stacks
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;h1&gt;
stackmanager&lt;/h1&gt;
&lt;p&gt;&lt;a href="https://badge.fury.io/py/stackmanager" rel="nofollow"&gt;&lt;img src="https://camo.githubusercontent.com/2b0f0fdec32d65433d574d98297db4ec9788676a/68747470733a2f2f62616467652e667572792e696f2f70792f737461636b6d616e616765722e737667" alt="PyPI version"&gt;&lt;/a&gt;
&lt;a href="https://coveralls.io/github/LeadingEDJE/stackmanager?branch=master" rel="nofollow"&gt;&lt;img src="https://camo.githubusercontent.com/e3c0e22dad85d1fe3e5166f3bd77a0b5ee629c09/68747470733a2f2f636f766572616c6c732e696f2f7265706f732f6769746875622f4c656164696e6745444a452f737461636b6d616e616765722f62616467652e7376673f6272616e63683d6d6173746572" alt="Coverage Status"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Utility to manage CloudFormation stacks based upon a Template (either local or in S3) and a YAML configuration file.&lt;/p&gt;
&lt;p&gt;Uses ChangeSets to create or update CloudFormation stacks, allowing the ChangeSets to either be automatically
applied or applied later (e.g. during a later phase of a build pipeline after review of the ChangeSet).&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;There are also some utility methods for building a lambda zip file and uploading files to S3.
These are to provide some of the AWS SAM CLI functionality while fitting into the workflow and configuration
style of stackmanager.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h2&gt;
Configuration&lt;/h2&gt;
&lt;p&gt;The configuration file can either be a single YAML document containing the configuration for a stack
for a specific environment and region, or can contain multiple documents for different deployments
of that stack to different environments and regions.&lt;/p&gt;
&lt;h3&gt;
Single Environment&lt;/h3&gt;
&lt;p&gt;The configuration combines together the different values that are typically passed to the CloudFormation
command line when creating…&lt;/p&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/LeadingEDJE/stackmanager"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;Specifically, the &lt;a href="https://github.com/LeadingEDJE/stackmanager/blob/master/.github/workflows/release.yml"&gt;release.yml&lt;/a&gt; workflow is triggered on release.&lt;/p&gt;
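&lt;p&gt;The trigger is just a few lines at the top of the workflow; a minimal sketch (limiting it to the &lt;code&gt;published&lt;/code&gt; release type is my assumption, not necessarily what release.yml does):&lt;/p&gt;

```yaml
name: Release
# Run the workflow only when a release is published in GitHub
on:
  release:
    types: [published]
```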

&lt;h3&gt;
  
  
  Additional Info
&lt;/h3&gt;

&lt;p&gt;The version of the release is taken from the &lt;code&gt;github.event.release.tag_name&lt;/code&gt; property of the &lt;a href="https://docs.github.com/en/developers/webhooks-and-events/webhook-events-and-payloads#release"&gt;release event&lt;/a&gt; and passed to &lt;code&gt;setup.py&lt;/code&gt; as the &lt;code&gt;STACKMANAGER_VERSION&lt;/code&gt; environment variable.&lt;/p&gt;
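&lt;p&gt;In &lt;code&gt;setup.py&lt;/code&gt; that environment variable can be read with a fallback for local builds; a minimal sketch (the &lt;code&gt;resolve_version&lt;/code&gt; helper name and default value are my own, not necessarily how stackmanager does it):&lt;/p&gt;

```python
import os

def resolve_version(default="0.0.0"):
    """Return the release tag passed in by the GitHub Actions release workflow,
    falling back to a placeholder version for local development builds."""
    return os.environ.get("STACKMANAGER_VERSION", default)

# setup.py would then call: setup(name="stackmanager", version=resolve_version(), ...)
```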

&lt;p&gt;I'm using a PyPI &lt;a href="https://pypi.org/help/#apitoken"&gt;API token&lt;/a&gt; restricted to this project so that I don't have to store a username and password in GitHub secrets (this also allows me to turn on 2FA for my PyPI account).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight yaml"&gt;&lt;code&gt;    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Package and Upload&lt;/span&gt;
      &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;STACKMANAGER_VERSION&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ github.event.release.tag_name }}&lt;/span&gt;
        &lt;span class="na"&gt;TWINE_USERNAME&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;__token__&lt;/span&gt;
        &lt;span class="na"&gt;TWINE_PASSWORD&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.PYPI_APIKEY }}&lt;/span&gt;
      &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
        &lt;span class="s"&gt;python setup.py sdist bdist_wheel&lt;/span&gt;
        &lt;span class="s"&gt;twine upload dist/*&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



</description>
      <category>actionshackathon</category>
    </item>
    <item>
      <title>AWS Lambda Layer for Private Certificates</title>
      <dc:creator>Andrew May</dc:creator>
      <pubDate>Mon, 10 Aug 2020 20:02:55 +0000</pubDate>
      <link>https://dev.to/leading-edje/aws-lambda-layer-for-private-certificates-465j</link>
      <guid>https://dev.to/leading-edje/aws-lambda-layer-for-private-certificates-465j</guid>
      <description>&lt;h1&gt;
  
  
  The problem
&lt;/h1&gt;

&lt;p&gt;&lt;em&gt;How can code running in the managed AWS Lambda environment call services that use private certificates for HTTPS?&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The majority of enterprises moving to AWS or other cloud platforms have existing on-premises applications, and there is often a need for the new cloud based applications to talk back to services on-prem. Typically this is done with a hybrid network where the corporate network and AWS VPC(s) are connected using a VPN or Direct Connect.&lt;/p&gt;

&lt;p&gt;Let's assume that the network has been set-up and the on-prem service is either using public DNS or a solution like &lt;a href="https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/resolver.html" rel="noopener noreferrer"&gt;Route 53 resolver&lt;/a&gt; has been configured and it's possible to establish a connection to the on-premises service.&lt;/p&gt;

&lt;p&gt;If the service is using a public certificate (issued by a public certificate authority that the Lambda environment is aware of and has a copy of the root certificate public key) then there will be no problems, &lt;em&gt;but if the enterprise is using their own private Certificate Authority then there will be an error making HTTPS calls.&lt;/em&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Enterprises may have their own CA for services that were only ever expected to be used internally. It gives greater control over issuing and revoking certificates. Typically the CA certificate will be installed onto desktops and services at the enterprise to ensure that connections to these services are trusted in browsers and between services.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;If this were an EC2 instance calling back to on-prem then you would probably install the certificate as part of the instance set-up, but we have less control in the transient Lambda environment.&lt;/p&gt;

&lt;p&gt;It's possible to write code to handle the TLS failure, or even to disable certificate validation entirely (&lt;em&gt;please don't do this&lt;/em&gt;), but what if there was a better way to have the lambda function transparently trust the private certificate?&lt;/p&gt;

&lt;h1&gt;
  
  
  The solution
&lt;/h1&gt;

&lt;p&gt;&lt;em&gt;Lambda Layers to the rescue!&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Many TLS/SSL libraries or application frameworks have mechanisms to add additional root certificates to the "trust store". Once this is configured, any connections using certificates that were signed by the private root certificate will be automatically trusted.&lt;/p&gt;

&lt;p&gt;On Linux (and macOS) &lt;a href="https://docs.microsoft.com/en-us/dotnet/standard/security/cross-platform-cryptography" rel="noopener noreferrer"&gt;.NET Core uses OpenSSL for cryptography&lt;/a&gt; and OpenSSL allows you to add additional root certificates from a file (in PEM format) using the &lt;a href="https://www.openssl.org/docs/man1.1.0/man3/SSL_CTX_set_default_verify_paths.html" rel="noopener noreferrer"&gt;SSL_CERT_FILE&lt;/a&gt; environment variable. The root certificate doesn't need to be "installed" into the environment.&lt;/p&gt;

&lt;p&gt;Setting environment variables is &lt;a href="https://docs.aws.amazon.com/lambda/latest/dg/configuration-envvars.html" rel="noopener noreferrer"&gt;easy for Lambda functions&lt;/a&gt;, but where should this certificate file be, and how do we get it into the Lambda environment?&lt;/p&gt;

&lt;p&gt;The contents of the Lambda zip file are extracted to &lt;a href="https://alestic.com/2014/11/aws-lambda-environment/" rel="noopener noreferrer"&gt;/var/task&lt;/a&gt;, and it's possible to include the certificate file here and point &lt;code&gt;SSL_CERT_FILE&lt;/code&gt; to a location in this directory. However, this has the drawback that every lambda zip file needs to contain the certificate file, so you either need to include it in every repository or add it as part of your CI/CD process.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html" rel="noopener noreferrer"&gt;Lambda Layers&lt;/a&gt; have two main use-cases: sharing dependencies (typically code or libraries, but can be configuration like this) or creating a custom runtime. The contents of the layer are extracted to &lt;code&gt;/opt&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;If we build a layer containing our certificate file it can be re-used across as many lambda functions as we like.&lt;/p&gt;

&lt;h1&gt;
  
  
  Building and sharing the layer
&lt;/h1&gt;

&lt;p&gt;This &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/what-is-sam.html" rel="noopener noreferrer"&gt;SAM&lt;/a&gt; template will build a Certificate Lambda Layer including files in the &lt;code&gt;certs&lt;/code&gt; sub-directory in the layer zip file.&lt;/p&gt;

&lt;p&gt;The ARN for the layer is placed in a &lt;a href="https://docs.aws.amazon.com/systems-manager/latest/userguide/systems-manager-parameter-store.html" rel="noopener noreferrer"&gt;Parameter Store&lt;/a&gt; value that can be referenced by the templates for Lambda functions.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;

&lt;span class="na"&gt;AWSTemplateFormatVersion&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;2010-09-09'&lt;/span&gt;
&lt;span class="na"&gt;Transform&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::Serverless-2016-10-31&lt;/span&gt;
&lt;span class="na"&gt;Description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;SAM Template for certificate layer&lt;/span&gt;

&lt;span class="na"&gt;Resources&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;CertificateLayer&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::Serverless::LayerVersion&lt;/span&gt;
    &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;CompatibleRuntimes&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;dotnetcore2.1&lt;/span&gt;
        &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;dotnetcore3.1&lt;/span&gt;
      &lt;span class="na"&gt;ContentUri&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;./certs/&lt;/span&gt;
      &lt;span class="na"&gt;Description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Layer containing additional certificates&lt;/span&gt;
      &lt;span class="na"&gt;RetentionPolicy&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Retain&lt;/span&gt;

  &lt;span class="na"&gt;LayerParameter&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::SSM::Parameter&lt;/span&gt;
    &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;Description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ARN for latest Certificate Layer&lt;/span&gt;
      &lt;span class="na"&gt;Name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;/Lambda/Layers/Certificate&lt;/span&gt;
      &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;String&lt;/span&gt;
      &lt;span class="na"&gt;Value&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;CertificateLayer&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Within the &lt;code&gt;certs&lt;/code&gt; sub-directory there should be a file containing one or more root certificates in &lt;a href="https://en.wikipedia.org/wiki/Privacy-Enhanced_Mail" rel="noopener noreferrer"&gt;PEM format&lt;/a&gt;. Typically this file will have a &lt;code&gt;.crt&lt;/code&gt; extension, so let's call it &lt;code&gt;additional-certificates.crt&lt;/code&gt;.&lt;/p&gt;
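&lt;p&gt;A sketch of what &lt;code&gt;additional-certificates.crt&lt;/code&gt; looks like, with the base64-encoded bodies elided; PEM blocks for multiple root certificates are simply concatenated:&lt;/p&gt;

```text
-----BEGIN CERTIFICATE-----
(base64-encoded certificate data)
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
(base64-encoded certificate data)
-----END CERTIFICATE-----
```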

&lt;p&gt;If we need to add additional root certificates (or remove expired ones), the layer can be updated, which will create a new version. Existing functions will continue to use the old version, but the Parameter Store key will now point to the ARN of the new version so any new deployments will pick up the change.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Why not a CloudFormation export? CloudFormation exports are good for things that never change, but you can't update the value of an export that's in use by another stack.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;In the parameters of your CloudFormation stacks containing your Lambda functions, include a parameter to pull in the value from Parameter Store and then use it with the function, also setting the &lt;code&gt;SSL_CERT_FILE&lt;/code&gt; environment variable:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;

&lt;span class="na"&gt;AWSTemplateFormatVersion&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;2010-09-09'&lt;/span&gt;
&lt;span class="na"&gt;Transform&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::Serverless-2016-10-31&lt;/span&gt;
&lt;span class="na"&gt;Description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Lambda Function&lt;/span&gt;

&lt;span class="na"&gt;Parameters&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;CertificateLayer&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Default&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;/Lambda/Layers/Certificate&lt;/span&gt;
    &lt;span class="na"&gt;Description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Certificate Layer ARN&lt;/span&gt;
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::SSM::Parameter::Value&amp;lt;String&amp;gt;&lt;/span&gt;

&lt;span class="na"&gt;Resources&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;MyFunction&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::Serverless::Function&lt;/span&gt;
    &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;Environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;Variables&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;SSL_CERT_FILE&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;/opt/additional-certificates.crt&lt;/span&gt;
      &lt;span class="na"&gt;Layers&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;CertificateLayer&lt;/span&gt;
      &lt;span class="c1"&gt;# Plus all the other properties&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Once again we're using SAM here, although there is unfortunately an issue with the SAM CLI &lt;code&gt;build&lt;/code&gt; command where it &lt;a href="https://github.com/awslabs/aws-sam-cli/issues/1069" rel="noopener noreferrer"&gt;does not like the use of parameter store values for layers&lt;/a&gt;. There are some workarounds described in the GitHub issue.&lt;/p&gt;

&lt;h1&gt;
  
  
  Other runtimes
&lt;/h1&gt;

&lt;p&gt;I've used this approach with .NET Core lambda functions, but this layer may be useful for other lambda runtimes. &lt;/p&gt;

&lt;p&gt;For example, the popular Python &lt;code&gt;requests&lt;/code&gt; library also &lt;a href="https://requests.readthedocs.io/en/master/user/advanced/#ssl-cert-verification" rel="noopener noreferrer"&gt;allows you to configure certificates using the &lt;code&gt;REQUESTS_CA_BUNDLE&lt;/code&gt; environment variable&lt;/a&gt;.&lt;/p&gt;
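&lt;p&gt;Reusing the same layer for a Python function is just a matter of swapping the environment variable name; a minimal sketch of the function's &lt;code&gt;Environment&lt;/code&gt; block (the rest of the function definition is unchanged):&lt;/p&gt;

```yaml
# Same certificate layer, but pointing the Python requests library at it
Environment:
  Variables:
    REQUESTS_CA_BUNDLE: /opt/additional-certificates.crt
```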

&lt;p&gt;&lt;a href="https://dev.to/leading-edje"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5uo60qforg9yqdpgzncq.png" alt="Smart EDJE Image"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>serverless</category>
      <category>dotnet</category>
    </item>
    <item>
      <title>Using the AWS Serverless Application Model (SAM)</title>
      <dc:creator>Andrew May</dc:creator>
      <pubDate>Sat, 18 Jul 2020 21:38:27 +0000</pubDate>
      <link>https://dev.to/leading-edje/using-the-aws-serverless-application-model-sam-54gm</link>
      <guid>https://dev.to/leading-edje/using-the-aws-serverless-application-model-sam-54gm</guid>
      <description>&lt;p&gt;I've recently done some work to compare the AWS &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/what-is-sam.html" rel="noopener noreferrer"&gt;Serverless Application Model&lt;/a&gt; to the &lt;a href="https://www.serverless.com/" rel="noopener noreferrer"&gt;Serverless Framework&lt;/a&gt; for building and deploying .NET Core based Lambda functions to AWS, and have started to deploy a number of Lambda functions that were built using the Serverless Application Model (SAM).&lt;/p&gt;

&lt;p&gt;There are a number of existing articles out there that compare the two frameworks, so rather than write another one I'll cut to the chase and say that I prefer SAM over the Serverless Framework for a number of reasons. I would suggest you evaluate both frameworks for yourself because you may have different criteria for evaluation. This article is going to give an overview of SAM and at the end I'll offer a brief comparison with the Serverless Framework. Other articles in this series will talk more about deploying services built with SAM, and other things that we've encountered while building these applications.&lt;/p&gt;

&lt;h1&gt;
  
  
  AWS Serverless Application Model
&lt;/h1&gt;

&lt;p&gt;There are two main parts to SAM - the &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-specification.html" rel="noopener noreferrer"&gt;Template Specification&lt;/a&gt; and the &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-command-reference.html" rel="noopener noreferrer"&gt;Command Line Interface&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  SAM Templates
&lt;/h2&gt;

&lt;p&gt;If you're at all familiar with &lt;a href="https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/Welcome.html" rel="noopener noreferrer"&gt;CloudFormation&lt;/a&gt; then you will find writing SAM templates very familiar, because they are just CloudFormation templates with some custom resources and a Transform:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;AWSTemplateFormatVersion&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;2010-09-09"&lt;/span&gt;
&lt;span class="na"&gt;Transform&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::Serverless-2016-10-31&lt;/span&gt;
&lt;span class="na"&gt;Description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;A simple SAM Template&lt;/span&gt;

&lt;span class="na"&gt;Globals&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;Timeout&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;60&lt;/span&gt;
  &lt;span class="na"&gt;MemorySize&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;512&lt;/span&gt;

&lt;span class="na"&gt;Resources&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;ListFunction&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;                                                                  
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::Serverless::Function&lt;/span&gt;                                                       
    &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;                                                                           
      &lt;span class="na"&gt;CodeUri&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;./src/ListFunctions/&lt;/span&gt;                                                   
      &lt;span class="na"&gt;Handler&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ListFunctions::ListFunctions.LambdaEntryPoint::FunctionHandlerAsync&lt;/span&gt;
      &lt;span class="na"&gt;Runtime&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;dotnetcore3.1&lt;/span&gt;      
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Although the &lt;code&gt;AWS::Serverless::&lt;/code&gt; resources may look like custom &lt;a href="https://docs.aws.amazon.com/cloudformation-cli/latest/userguide/resource-types.html" rel="noopener noreferrer"&gt;Resource Providers&lt;/a&gt;, they actually have far more in common with &lt;a href="https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/template-macros.html" rel="noopener noreferrer"&gt;Macros&lt;/a&gt; that can manipulate a template using a transform built using a Lambda function. The &lt;a href="https://github.com/awslabs/serverless-application-model" rel="noopener noreferrer"&gt;SAM Translator&lt;/a&gt; is the code that is responsible for converting SAM templates into CloudFormation templates. &lt;/p&gt;

&lt;p&gt;Because this is a CloudFormation template with some additions, you can add standard CloudFormation resources intermingled with the SAM resources. Developer tools that work with CloudFormation templates for editing and linting (&lt;a href="https://pypi.org/project/cfn-lint/0.31.1/" rel="noopener noreferrer"&gt;cfn-lint&lt;/a&gt;) also support SAM templates.&lt;/p&gt;

&lt;p&gt;In fact you can combine your own macros with the SAM Serverless transform to manipulate regular CloudFormation resources or even SAM resources, which can be useful because there are gaps in what the SAM resources support.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Something that's important to note here is that you submit the template containing the &lt;code&gt;AWS::Serverless::&lt;/code&gt; resources to the CloudFormation service and it handles the transformation server-side. There are a few modifications that the SAM CLI may make to the template client-side to replace local references to code with an S3 Bucket and Key, but these are similar to the existing functionality that's part of the &lt;code&gt;aws cloudformation package&lt;/code&gt; command.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Differences between &lt;code&gt;AWS::Serverless::Function&lt;/code&gt; and &lt;code&gt;AWS::Lambda::Function&lt;/code&gt;
&lt;/h3&gt;

&lt;p&gt;Many properties of &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-resource-function.html" rel="noopener noreferrer"&gt;AWS::Serverless::Function&lt;/a&gt; match the properties of &lt;a href="https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-lambda-function.html" rel="noopener noreferrer"&gt;AWS::Lambda::Function&lt;/a&gt;, which is unsurprising because after the transform is applied the one is replaced with the other plus additional resources.&lt;/p&gt;

&lt;p&gt;Some of the biggest benefits of using &lt;code&gt;AWS::Serverless::Function&lt;/code&gt; come from these properties:&lt;/p&gt;

&lt;h4&gt;
  
  
  CodeUri
&lt;/h4&gt;

&lt;p&gt;The &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-resource-function.html#sam-function-codeuri" rel="noopener noreferrer"&gt;CodeUri&lt;/a&gt; property can be a link to a relative directory, and when used with the &lt;code&gt;sam build&lt;/code&gt; command it will perform the appropriate build for the specified &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-resource-function.html#sam-function-runtime" rel="noopener noreferrer"&gt;Runtime&lt;/a&gt; and then package the Lambda code.&lt;/p&gt;

&lt;h4&gt;
  
  
  Events
&lt;/h4&gt;

&lt;p&gt;The &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-resource-function.html#sam-function-events" rel="noopener noreferrer"&gt;Events&lt;/a&gt; property allows you to specify the triggers for a Lambda function that you would normally need to specify in separate CloudFormation resources. There are currently 14 different &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-property-function-eventsource.html" rel="noopener noreferrer"&gt;Event Sources&lt;/a&gt; including API Gateway (both the REST and HTTP versions), S3 and SQS.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;One big limitation of the S3 event is that it only works if the S3 bucket is declared in the same SAM template. This is because CloudFormation only allows you to specify S3 events directly on the S3 bucket resource. In a lot of cases the S3 bucket will have a different lifecycle than the lambda function, and you would not want CloudFormation to try to delete it (which will fail if it contains objects) when you delete the Lambda function.&lt;/p&gt;

&lt;p&gt;CloudFormation Custom Resources can be used to add S3 lambda triggers separately from the bucket creation.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;With the &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-property-function-api.html" rel="noopener noreferrer"&gt;Api&lt;/a&gt; and &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-property-function-httpapi.html" rel="noopener noreferrer"&gt;HttpApi&lt;/a&gt; events there's some extra magic going on - if you don't specify the &lt;code&gt;RestApiId&lt;/code&gt; (for the Api event) or &lt;code&gt;ApiId&lt;/code&gt; (for the HttpApi event) it will automatically create an API Gateway using the mappings in your events.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;Events&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;HttpApiEvent&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;HttpApi&lt;/span&gt;
    &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;Path&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;/&lt;/span&gt;
      &lt;span class="na"&gt;Method&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;GET&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Example HttpApi event&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;There is a lot more you can do with API Gateways using the &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-resource-api.html" rel="noopener noreferrer"&gt;AWS::Serverless::Api&lt;/a&gt; and &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-resource-httpapi.html" rel="noopener noreferrer"&gt;AWS::Serverless::HttpApi&lt;/a&gt; resources, but that's a whole separate topic.&lt;/p&gt;

&lt;p&gt;Unfortunately the Events section lacks support for &lt;a href="https://docs.aws.amazon.com/elasticloadbalancing/latest/application/lambda-functions.html" rel="noopener noreferrer"&gt;Application Load Balancers&lt;/a&gt;, which are becoming a popular alternative to API Gateway for Lambda functions because they are significantly cheaper than REST API Gateways and can be used to easily build internal APIs (something that is possible but more difficult with REST API Gateways and not currently possible with HTTP API Gateways). That doesn't mean you can't define a Lambda function using &lt;code&gt;AWS::Serverless::Function&lt;/code&gt; and target it with an ALB, but you will need to create the &lt;code&gt;AWS::Lambda::Permission&lt;/code&gt;, &lt;code&gt;AWS::ElasticLoadBalancingV2::TargetGroup&lt;/code&gt; and &lt;code&gt;AWS::ElasticLoadBalancingV2::ListenerRule&lt;/code&gt; resources yourself.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;This is one of those gaps that can be plugged with CloudFormation macros that operate on the SAM resources, like this example: &lt;a href="https://github.com/glassechidna/sam-alb" rel="noopener noreferrer"&gt;https://github.com/glassechidna/sam-alb&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
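
&lt;p&gt;A minimal sketch of those extra resources, assuming the function and an ALB listener are already defined elsewhere in the template (the names &lt;code&gt;MyFunction&lt;/code&gt; and &lt;code&gt;MyListener&lt;/code&gt; are hypothetical):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;MyInvokePermission:
  Type: AWS::Lambda::Permission
  Properties:
    Action: lambda:InvokeFunction
    FunctionName: !Ref MyFunction
    Principal: elasticloadbalancing.amazonaws.com
MyTargetGroup:
  Type: AWS::ElasticLoadBalancingV2::TargetGroup
  DependsOn: MyInvokePermission # the permission must exist before the target is registered
  Properties:
    TargetType: lambda
    Targets:
      - Id: !GetAtt MyFunction.Arn
MyListenerRule:
  Type: AWS::ElasticLoadBalancingV2::ListenerRule
  Properties:
    ListenerArn: !Ref MyListener
    Priority: 10
    Conditions:
      - Field: path-pattern
        PathPatternConfig:
          Values:
            - /api/*
    Actions:
      - Type: forward
        TargetGroupArn: !Ref MyTargetGroup
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;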

&lt;h4&gt;
  
  
  Policies
&lt;/h4&gt;

&lt;p&gt;Doesn't everyone love writing IAM policies? The &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-resource-function.html#sam-function-policies" rel="noopener noreferrer"&gt;Policies&lt;/a&gt; property can be used in a number of ways, but one really useful feature is being able to use &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-policy-templates.html" rel="noopener noreferrer"&gt;Policy Templates&lt;/a&gt; that provide well designed sets of permissions for certain resources.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;Policies&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;DynamoDBReadPolicy&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;TableName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;DyanamoDBTable&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;KMSDecryptPolicy&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;KeyId&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;DynamoDBKmsKey&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Granting a Lambda function permission to read from a specific table and decrypt data using the related KMS key&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;It will also add the necessary permissions for CloudWatch Logs, and for other settings declared on the &lt;code&gt;AWS::Serverless::Function&lt;/code&gt; resource - e.g. if you specify VPC subnets to run the function within your VPC, it will add the required networking permissions automatically.&lt;/p&gt;

&lt;h2&gt;
  
  
  SAM CLI
&lt;/h2&gt;

&lt;p&gt;The standard workflow for SAM applications is to run &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-cli-command-reference-sam-build.html" rel="noopener noreferrer"&gt;sam build&lt;/a&gt; to build and package any code referenced in &lt;code&gt;AWS::Serverless::Function&lt;/code&gt; resources, then &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-cli-command-reference-sam-deploy.html" rel="noopener noreferrer"&gt;sam deploy&lt;/a&gt; to upload the lambda zip files to S3 and then create a CloudFormation stack (by default using change sets) to deploy everything.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;code&gt;sam init&lt;/code&gt;
&lt;/h3&gt;

&lt;p&gt;Create a new SAM project. There are quick start examples for most if not all of the supported runtimes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sam init
Which template source would you like to use?
        1 - AWS Quick Start Templates
        2 - Custom Template Location
Choice: 1

Which runtime would you like to use?
        1 - nodejs12.x
        2 - python3.8
        3 - ruby2.7
        4 - go1.x
        5 - java11
        6 - dotnetcore3.1
        7 - nodejs10.x
        8 - python3.7
        9 - python3.6
        10 - python2.7
        11 - ruby2.5
        12 - java8
        13 - dotnetcore2.1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;While these aren't the most complicated examples, they set up the directory structure in a way that works well with SAM and include testing set-up. You can also use custom templates if you have your own way of organizing your projects.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;code&gt;sam build&lt;/code&gt;
&lt;/h3&gt;

&lt;p&gt;The &lt;code&gt;build&lt;/code&gt; command looks for &lt;code&gt;template.yaml&lt;/code&gt; or &lt;code&gt;template.yml&lt;/code&gt; in the current directory by default, and figures out what to build based upon the contents of the template. The template may contain several Lambda functions using different runtimes; the code pointed to by each function's &lt;code&gt;CodeUri&lt;/code&gt; property is packaged using an appropriate builder for its runtime. All of these files end up in an &lt;code&gt;.aws-sam/build&lt;/code&gt; directory.&lt;/p&gt;

&lt;p&gt;There are a few arguments you might need to specify in certain cases:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Set &lt;code&gt;--use-container&lt;/code&gt; for runtimes that have native dependencies. It will pull down a Docker image matching the Lambda environment for the function runtime and build within that. This is particularly important for Python, where some dependencies need to be compiled: if they are built on another platform (e.g. Windows) they will not work in the Lambda runtime, which is based upon Amazon Linux (1 or 2 depending upon the runtime).&lt;/li&gt;
&lt;li&gt;Set &lt;code&gt;--profile&lt;/code&gt;/&lt;code&gt;--region&lt;/code&gt; if using Lambda layers - it will retrieve the layer from your AWS account. If the Layer ARN is provided via a Parameter you will also need to provide the value using &lt;code&gt;--parameter-overrides&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Other parameters can override the build directory and the manifest file (e.g. requirements.txt for Python, a .csproj file for C#, pom.xml or build.gradle for Java) location or the template name if you're not using &lt;code&gt;template.[yaml|yml]&lt;/code&gt;.&lt;/p&gt;
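
&lt;p&gt;For example, building inside a container while supplying a layer ARN parameter (the profile name, parameter name and ARN below are hypothetical placeholders):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sam build --use-container \
    --profile dev --region us-east-2 \
    --parameter-overrides LayerArn=arn:aws:lambda:us-east-2:123456789012:layer:my-layer:1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;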

&lt;h3&gt;
  
  
  &lt;code&gt;sam package&lt;/code&gt;
&lt;/h3&gt;

&lt;p&gt;The &lt;code&gt;package&lt;/code&gt; command is less commonly used now because its functionality has been rolled into the &lt;code&gt;deploy&lt;/code&gt; command. It packages the results of the &lt;code&gt;build&lt;/code&gt;, uploads them to S3, replaces the local references in the template with references to the uploaded S3 objects, and returns the new template (either to standard out or a file). The command supports these resource types and attributes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  Resource : AWS::ServerlessRepo::Application | Location : LicenseUrl
  Resource : AWS::ServerlessRepo::Application | Location : ReadmeUrl
  Resource : AWS::Serverless::Function | Location : CodeUri
  Resource : AWS::Serverless::Api | Location : DefinitionUri
  Resource : AWS::Serverless::StateMachine | Location : DefinitionUri
  Resource : AWS::AppSync::GraphQLSchema | Location : DefinitionS3Location
  Resource : AWS::AppSync::Resolver | Location : RequestMappingTemplateS3Location
  Resource : AWS::AppSync::Resolver | Location : ResponseMappingTemplateS3Location
  Resource : AWS::AppSync::FunctionConfiguration | Location : RequestMappingTemplateS3Location
  Resource : AWS::AppSync::FunctionConfiguration | Location : ResponseMappingTemplateS3Location
  Resource : AWS::Lambda::Function | Location : Code
  Resource : AWS::ApiGateway::RestApi | Location : BodyS3Location
  Resource : AWS::ElasticBeanstalk::ApplicationVersion | Location : SourceBundle
  Resource : AWS::CloudFormation::Stack | Location : TemplateURL
  Resource : AWS::Serverless::Application | Location : Location
  Resource : AWS::Lambda::LayerVersion | Location : Content
  Resource : AWS::Serverless::LayerVersion | Location : ContentUri
  Resource : AWS::Glue::Job | Location : Command.ScriptLocation
  Resource : AWS::StepFunctions::StateMachine | Location : DefinitionS3Location
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You are required to specify the &lt;code&gt;--s3-bucket&lt;/code&gt; parameter and you can supply an optional &lt;code&gt;--s3-prefix&lt;/code&gt; (useful if you want to share a bucket between multiple serverless applications but keep related artifacts grouped). The S3 key is automatically generated from an MD5 hash of the uploaded resource, and by default the command will skip uploading if the MD5 of the new resource matches an existing S3 key.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;One thing to note about this approach is that it works much better running locally than from a CI/CD pipeline. Locally, the build may skip recompilation if nothing has changed, or copy files while retaining their timestamps; but a pipeline typically performs a fresh build from a fresh checkout, and even if all the files in the resulting artifact are identical, zipping them up for a Lambda function with different timestamps will produce an MD5 that differs from any previous build.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  &lt;code&gt;sam deploy&lt;/code&gt;
&lt;/h3&gt;

&lt;p&gt;The &lt;code&gt;deploy&lt;/code&gt; command includes the functionality of &lt;code&gt;package&lt;/code&gt; in newer versions of the SAM CLI, and also deploys the application using CloudFormation. It makes use of change sets, displays the pending changes before executing them, and then displays the CloudFormation events as they occur. The command has quite a lot of options (many of them familiar from the related CloudFormation commands), so when using SAM locally you can specify the &lt;code&gt;--guided&lt;/code&gt; option and it will prompt you to make certain decisions and ask for values for all the parameters in the CloudFormation template. These values are then stored in a &lt;a href="https://toml.io/en/" rel="noopener noreferrer"&gt;TOML&lt;/a&gt; configuration file which will be read the next time you run &lt;code&gt;sam deploy&lt;/code&gt;.&lt;/p&gt;
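
&lt;p&gt;The configuration file is named &lt;code&gt;samconfig.toml&lt;/code&gt; by default, and looks something like this (the stack, bucket and parameter names below are hypothetical placeholders):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version = 0.1
[default.deploy.parameters]
stack_name = "my-app"
s3_bucket = "my-sam-artifact-bucket"
s3_prefix = "my-app"
region = "us-east-2"
capabilities = "CAPABILITY_IAM"
confirm_changeset = true
parameter_overrides = "TableName=\"my-table\""
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;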

&lt;p&gt;Unlike the &lt;code&gt;package&lt;/code&gt; command, you are not required to specify the S3 bucket to upload the files to (although you can), and by default it will create a new bucket in your account for SAM deployments if one does not already exist.&lt;/p&gt;

&lt;h2&gt;
  
  
  Local Testing
&lt;/h2&gt;

&lt;p&gt;In addition to packaging and deploying to AWS, the &lt;code&gt;sam local&lt;/code&gt; commands allow you to run Lambda functions locally within a Docker container that matches the AWS runtime. It also has some basic support for REST API Gateway applications.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;sam local generate-event&lt;/code&gt; can generate sample events from 22 different services. This can save you from scouring the documentation to find examples, but typically each is just one of the possible events from the service and you will probably need to tweak it for use in testing your Lambda functions.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;sam local invoke&lt;/code&gt; will run your Lambda function in a Docker container. You supply an event from either &lt;code&gt;generate-event&lt;/code&gt; or a local file. If you have a single function in your template it will automatically call that, but if there are multiple functions you will need to supply its LogicalId from the template. The output from the function is almost identical to what you would see in CloudWatch Logs, including the timing and utilized memory information. Note that because it starts the container each time, you incur the Lambda cold start penalty on every invocation.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;sam local start-api&lt;/code&gt; can be used for templates containing a REST API Gateway either directly defined or where the Lambda functions use the &lt;code&gt;Api&lt;/code&gt; event type. It does not support HTTP API Gateways currently. It starts a server in Docker and listens on localhost:3000 by default. This way you can make HTTP calls rather than using the API Gateway events that you need to supply to &lt;code&gt;sam local invoke&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;sam local start-lambda&lt;/code&gt; also starts an HTTP server, but this one allows you to invoke the Lambda functions using the CLI or SDKs as if they were running in AWS. This may be useful for running a series of tests against a function without running &lt;code&gt;sam local invoke&lt;/code&gt; each time.&lt;/li&gt;
&lt;/ul&gt;
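
&lt;p&gt;As an example of the last option, &lt;code&gt;sam local start-lambda&lt;/code&gt; listens on 127.0.0.1:3001 by default, and you can point the AWS CLI at it (the function name and event file are hypothetical placeholders):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sam local start-lambda
# in another terminal:
$ aws lambda invoke --function-name MyFunction \
    --endpoint-url http://127.0.0.1:3001 \
    --payload file://event.json out.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;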

&lt;p&gt;Of course there are limitations to what this supports. If your lambda function calls another AWS service (as many do), then it will call out to AWS and you will need to have valid credentials (and those resources will need to have been deployed). It is possible with some work to invoke certain local versions of AWS services (e.g. those from &lt;a href="https://github.com/localstack/localstack" rel="noopener noreferrer"&gt;localstack&lt;/a&gt;), and this may be useful for automated testing, but nothing will be a perfect emulation of the AWS environment.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;This is particularly true when you need to verify that your IAM permissions for calling other services are correct, as when running a function locally and invoking services in AWS it will be using the AWS credentials you supply it with rather than the role that has been defined for the function.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;A number of IDEs (e.g. Visual Studio Code) have integrations that allow you to debug your Lambda function when it's run locally like this. I've had mixed luck getting this to work consistently with .NET Core, but it can be useful.&lt;/p&gt;

&lt;h2&gt;
  
  
  Together or Apart
&lt;/h2&gt;

&lt;p&gt;Because the Serverless Transform used in SAM templates is essentially a pre-defined macro that is implemented as part of the CloudFormation service, there is no requirement that you use the SAM CLI with your SAM templates.&lt;/p&gt;

&lt;p&gt;For example, you might be building the Lambda Zip files as part of your CI/CD pipeline and then use the same Zip file in multiple accounts (e.g. dev/test/prod) without rebuilding it. The SAM CLI doesn't really support this workflow, but you can still use the &lt;code&gt;AWS::Serverless::&lt;/code&gt; resources in your template.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;You will not be able to use the substitutions for local resources that the SAM CLI supports if you are using the CloudFormation commands instead, although &lt;code&gt;aws cloudformation package&lt;/code&gt; does support uploading local Zip files to S3 and returning a transformed template. If you upload to S3 as part of your CI/CD pipeline you can then make the bucket name and key CloudFormation parameters that the serverless resources reference.&lt;/p&gt;
&lt;/blockquote&gt;
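
&lt;p&gt;A sketch of that CI/CD workflow using the plain CloudFormation commands instead of the SAM CLI (the bucket, stack and file names are hypothetical placeholders):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ aws cloudformation package --template-file template.yaml \
    --s3-bucket my-artifact-bucket --output-template-file packaged.yaml
$ aws cloudformation deploy --template-file packaged.yaml \
    --stack-name my-app --capabilities CAPABILITY_IAM
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;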




&lt;h1&gt;
  
  
  Comparisons to the Serverless Framework
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Unlike SAM, the Serverless Framework supports multiple Cloud platforms. If you have a multi-cloud strategy this can be useful.&lt;/li&gt;
&lt;li&gt;Serverless supports a &lt;a href="https://www.serverless.com/plugins/" rel="noopener noreferrer"&gt;plugin&lt;/a&gt; model that extends what the platform can do - with SAM you are somewhat constrained by what has been implemented by the platform and it can sometimes be frustrating waiting for new functionality to be supported.&lt;/li&gt;
&lt;li&gt;There is a Commercial (&lt;a href="https://www.serverless.com/pro/" rel="noopener noreferrer"&gt;Pro&lt;/a&gt;) version that offers a monitoring platform, CI/CD support and other administration features. If your Cloud applications are largely built up of functions this might be a useful option instead of using Cloud specific or third party tools to monitor your applications.&lt;/li&gt;
&lt;li&gt;There are a lot of examples for using the Serverless Framework on their website, but they are dominated by Node.js (which I believe was the only language supported when the framework first launched) - if you're looking for C# examples you won't find anything more than the most basic Hello World.&lt;/li&gt;
&lt;li&gt;The CLI is responsible for interacting with the Cloud platform. In the case of AWS it still uses CloudFormation, but the templates are generated by the CLI from the &lt;code&gt;serverless.yml&lt;/code&gt; template.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For me this last point is the most significant. The full &lt;a href="https://www.serverless.com/framework/docs/providers/aws/guide/serverless.yml/" rel="noopener noreferrer"&gt;serverless.yml&lt;/a&gt; is pretty comprehensive and provides support for a lot of things that SAM does not, and you can even include raw CloudFormation resources in the &lt;code&gt;resources&lt;/code&gt; section of the template, but you are still reliant upon the serverless CLI to generate the CloudFormation template. This means you are required to use the CLI as part of your CI/CD pipelines, and if there are problems with the CloudFormation stack you are dealing with an auto-generated template and trying to understand how it relates back to your serverless template.&lt;/p&gt;

&lt;p&gt;Because I am &lt;strong&gt;very&lt;/strong&gt; familiar with CloudFormation, my bias is towards SAM, which allows me to write better CloudFormation, rather than Serverless, which just makes use of it behind the scenes. Like I said at the start of this article, that's my personal bias, and if you don't have the same background in CloudFormation you may find that Serverless works better for you.&lt;/p&gt;




&lt;p&gt;In future articles in this series I'll be writing about some different approaches to using .NET Core to build Lambda functions, and some of the challenges of deploying and using these functions in AWS.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/leading-edje"&gt;&lt;br&gt;
  &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5uo60qforg9yqdpgzncq.png" alt="Smart EDJE Image"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>serverless</category>
    </item>
    <item>
      <title>Success! Passing the AWS Solutions Architect Professional Certification</title>
      <dc:creator>Andrew May</dc:creator>
      <pubDate>Wed, 01 Jul 2020 21:37:30 +0000</pubDate>
      <link>https://dev.to/leading-edje/success-passing-the-aws-solutions-architect-professional-certification-12k4</link>
      <guid>https://dev.to/leading-edje/success-passing-the-aws-solutions-architect-professional-certification-12k4</guid>
      <description>&lt;p&gt;After much procrastination, I finally took and passed the AWS Solutions Architect Professional certification, using the "virtual" testing option from Pearson Vue. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.youracclaim.com/users/andrew-may.effcdb23" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fpmxve3gvfzeq80e7q32a.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I'd been reluctant to take the test virtually after reading a few horror stories about long waits for the proctor to be available. I also wasn't sure that I could get 3 hours undisturbed at home. However, I was able to take advantage of some time when our office was empty and the virtual experience went fairly smoothly.&lt;/p&gt;

&lt;h1&gt;
  
  
  Additional Preparation
&lt;/h1&gt;

&lt;p&gt;Having taken the &lt;a href="https://dev.to/leading-edje/training-review-advanced-architecting-on-aws-231j"&gt;Advanced Architecting on AWS&lt;/a&gt; course I felt somewhat prepared but needed a way to find the gaps in my knowledge. I'd seen a recommendation for the &lt;a href="https://portal.tutorialsdojo.com/courses/aws-certified-solutions-architect-professional-practice-exams/" rel="noopener noreferrer"&gt;Tutorials DOJO Practice Tests&lt;/a&gt;, and because they were inexpensive I decided to give them a go.&lt;/p&gt;

&lt;p&gt;I took 3 of the timed tests, and they helped me build up a list of topics where I wasn't confident in my knowledge:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F3e77okpevq0kisolcomi.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F3e77okpevq0kisolcomi.jpg" alt="Review Areas"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I then went through the AWS documentation and FAQs for these topics, and in a few cases tried a few things out (e.g. Patch baselines in System Manager). This was a lot to go through, but I used the questions from the test as a guide to which details were important to remember.&lt;/p&gt;

&lt;h1&gt;
  
  
  Testing Virtually
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Scheduling
&lt;/h2&gt;

&lt;p&gt;AWS currently has 2 testing providers, PSI and Pearson VUE. Both are in the process of re-opening their testing centers in Ohio, but with limited capacity and a backlog of people wanting to complete tests, there was a wait of a couple of months for availability at the local testing centers. So I opted to use the virtual testing option from Pearson VUE that has availability 6 days a week.&lt;/p&gt;

&lt;p&gt;I picked a time when our office was going to be empty and booked the test for a couple of weeks later. I knew that a conference room in our office was going to be clear of the forbidden clutter that fills my home office.&lt;/p&gt;

&lt;p&gt;You will receive an email after scheduling the test recommending that you do a system test on the hardware and in the location where you will be taking the test. This checks internet speed and access to your camera and microphone, but also takes you through the steps of taking photos with your phone of yourself, your id and test location. I did the hardware test at home, but skipped the rest knowing that I was going to be in a different location.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F3kgz3uaa9y7hsexni0tb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F3kgz3uaa9y7hsexni0tb.png" alt="System Test"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;You will need a webcam!&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Starting the test
&lt;/h2&gt;

&lt;p&gt;You can start the test up to half an hour before your allotted time, but being late can cause you to fail the test. I turned up at the office in good time and got myself set-up.&lt;/p&gt;

&lt;p&gt;I made the mistake of taking the entire system test before the test. This did the hardware check and then walked me through taking and uploading the photos I needed, but didn't save any of that information.&lt;/p&gt;

&lt;p&gt;Once I'd repeated this process the second time to take the test for real, the kiosk browser opened for the test and I was instructed to wait for a proctor to be available. Fortunately it was just a couple of minutes before they appeared on the chat and asked if they could start a call with me to prepare for the test.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;I can see why a delay at this point would be annoying - I'd already stowed my phone and everything else away, so I was just sitting there staring at the screen.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;I was asked a few questions about the location I was taking the test in, what I had plugged into USB (a mouse), and had to scan the room by turning my laptop around. They did ask if I could cover the windows of the conference room and unplug the TV on the wall, but neither of those were possible and they seemed OK with that.&lt;/p&gt;

&lt;h2&gt;
  
  
  The test itself
&lt;/h2&gt;

&lt;p&gt;When you're running the test there's a bar at the top showing your camera image, with a chat button to talk to the proctor. The format of the test is a little different to the software used for the official practice tests and at PSI testing locations (the button to submit an answer is annoyingly all the way down in the bottom right), but it basically works the same way, with a question counter and time countdown at the top right and an option to mark questions for review.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;At one point near the end of the 3 hours when I was reviewing one of the questions I'd marked, I had covered my mouth with my hand and the chat window popped up and I was told not to do that, but otherwise I had no further interactions with the proctor.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This was a difficult test, and there were only a small number of questions where I thought the answer was immediately obvious. For most of the questions it was a process of elimination with many answers being several sentences that are identical apart from a few words. Some of the answers are wrong because they suggest doing something that is not possible, but others are wrong because they don't meet the criteria in the question (e.g. most cost effective).&lt;/p&gt;

&lt;p&gt;I made sure to answer every question as I went, but marked some of them for review later in case I had extra time. I saw that I was doing OK for time, so I reviewed as I went, and after picking an answer I'd double check that it matched the criteria in the question, and in a few cases that caused me to change my answer because I wasn't really answering the question that had been asked.&lt;/p&gt;

&lt;p&gt;I had about 30 minutes left after answering the 75 questions, and wasn't sure whether I was going to pass or fail. I quickly reviewed the questions I'd marked (perhaps 10-15?), but only changed one answer.&lt;/p&gt;

&lt;p&gt;Then after confirming that I was done reviewing my answers I got the normal questionnaire about my testing experience and how long I've used AWS.&lt;/p&gt;

&lt;p&gt;Finally it told me I'd passed (yay!) and to click the button at the bottom left to end the test.&lt;/p&gt;

&lt;p&gt;I wish I'd been able to take a screenshot of the message saying that I'd passed, because honestly I started to doubt it as soon as it was no longer on the screen.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;I received the email confirmation 2 days after the test.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h1&gt;
  
  
  Question Topics
&lt;/h1&gt;

&lt;p&gt;Some of the topics that came up in my exam were:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Networking, in particular connectivity between enterprise networks and AWS using Direct Connect and VPNs&lt;/li&gt;
&lt;li&gt;Accessing S3 using VPC endpoints&lt;/li&gt;
&lt;li&gt;Controlling access across multiple accounts using Organizations and Service Control Policies&lt;/li&gt;
&lt;li&gt;Using AWS Config to track resources and verify compliance&lt;/li&gt;
&lt;li&gt;The pattern of using SQS and Lambda consumers to decouple applications and handle spikes in traffic&lt;/li&gt;
&lt;li&gt;Debugging network connectivity issues&lt;/li&gt;
&lt;li&gt;Restricting deployments to approved applications using Service Catalog&lt;/li&gt;
&lt;li&gt;Improving performance with various kinds of caching: CloudFront, API Gateway, Elasticache, DAX&lt;/li&gt;
&lt;li&gt;Reducing costs with reserved and/or spot instances&lt;/li&gt;
&lt;li&gt;Migrating databases to AWS&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  End of the journey?
&lt;/h1&gt;

&lt;p&gt;I finally managed to add the Solutions Architect Professional sticker to my laptop that I picked up at re:Invent 2017, but I've still got a sticker for the DevOps Professional certification, so perhaps that one is next?&lt;/p&gt;

&lt;p&gt;If you read this because you're getting ready to take the certification exam, I wish you the best of luck. It's difficult, but achievable.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/leading-edje"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fscroshyxang2b4bbiiqu.png" alt="Leading EDJE dev.to"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>aws</category>
    </item>
    <item>
      <title>The practice test for the AWS Certified Solutions Architect Professional</title>
      <dc:creator>Andrew May</dc:creator>
      <pubDate>Sun, 10 May 2020 18:15:21 +0000</pubDate>
      <link>https://dev.to/leading-edje/the-practice-test-for-the-aws-certified-solutions-architect-professional-550l</link>
      <guid>https://dev.to/leading-edje/the-practice-test-for-the-aws-certified-solutions-architect-professional-550l</guid>
      <description>&lt;p&gt;Last weekend I was working on preparing a talk about AWS certifications (it looks like I will be presenting it at the upcoming &lt;a href="https://midwestcommunityday.com/"&gt;AWS Community Day Midwest&lt;/a&gt;), and as part of that I was including sample questions for the different certifications.&lt;/p&gt;

&lt;p&gt;AWS provides 10 sample questions for each certification, and the ones for the SA Professional certification can be found &lt;a href="https://aws.amazon.com/certification/certified-solutions-architect-professional/"&gt;on the certification page&lt;/a&gt; alongside the exam guide.&lt;/p&gt;

&lt;p&gt;As I hadn't actually gone through this version of the sample questions before, I decided to treat them like a mini test, and was fairly pleased to get 8 out of 10 right.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;In the past AWS did not provide the answers for these sample questions, so you could never be quite sure whether you had them right, but now all of them appear to have the answers and a short explanation (that's the really useful bit for questions you get wrong).&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;After that positive result, I decided to dive in and take the official practice test that you can register for at &lt;a href="http://aws.training"&gt;http://aws.training&lt;/a&gt; by going into the certification sub-portal. The practice tests contain 20 questions and normally cost $20, but I had a coupon for a free practice test from a previous certification I'd taken.&lt;/p&gt;

&lt;p&gt;The practice test gives you an hour to answer 20 questions, so 3 minutes for each. The full test lasts for 3 hours, and &lt;a href="https://help.acloud.guru/hc/en-us/articles/115001390094-AWS-Certification-Exam-FAQs"&gt;apparently&lt;/a&gt; contains 75-80 questions, so you have less time per question. I've heard that a lot of people struggle with time on the test, so I tried to go through the questions quickly, only pausing when I was unsure.&lt;/p&gt;

&lt;p&gt;In the end I finished the 20 questions in about 35 minutes, and I felt pretty good about most of my answers and was surprised that some of the questions were relatively straightforward (of course what seems straightforward mostly had to do with which services I've been using for the last 5 years).&lt;/p&gt;

&lt;p&gt;Unfortunately my confidence was somewhat misplaced: when I got the results I had only 65% correct (13/20). I had some doubts about a few questions (and I looked up a few things afterwards and knew where I'd gone wrong in those cases), but I thought I was confident in answers to more than 13 of the questions.&lt;/p&gt;

&lt;p&gt;Here's the breakdown that they sent me:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Topic Level Scoring:&lt;br&gt;
1.0  Design for Organizational Complexity: 50%&lt;br&gt;
2.0  Design for New Solutions: 40%&lt;br&gt;
3.0  Migration Planning: 75%&lt;br&gt;
4.0  Cost Control: 50%&lt;br&gt;
5.0  Continuous Improvement for Existing Solutions: 85%&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Some of those (particularly Design for New Solutions) are a little painful.&lt;/p&gt;

&lt;p&gt;Because it isn't obvious which questions I got wrong and why, that makes it harder to know where to focus my preparation. I think the reality is that I've got a lot of studying still to do if I want to go into the exam with a high chance of passing (as it currently stands I think I've got about 50/50 odds).&lt;/p&gt;

&lt;p&gt;I will probably look into other sets of practice questions (even though I'm not convinced that all of them are very realistic) as it may be a good way to expose gaps in my knowledge for services that are likely to be on the test.&lt;/p&gt;

&lt;p&gt;Hopefully the next post in the series will be to say how I passed the test. The testing centers are starting to re-open in Ohio, and while I'm in no rush to take the test, the virtual tests seem to have been a nightmare for some people (stories of people waiting hours for the proctor to start the test, and having to purge their rooms of extra monitors, paper etc.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/leading-edje"&gt;&lt;br&gt;
  &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SfUhPiEd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/5uo60qforg9yqdpgzncq.png" alt="Smart EDJE Image"&gt;&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
    </item>
  </channel>
</rss>
