<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Chris Farris</title>
    <description>The latest articles on DEV Community by Chris Farris (@jcfarris).</description>
    <link>https://dev.to/jcfarris</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F940362%2F48b4054e-32fd-455d-b069-a45a259359e0.jpg</url>
      <title>DEV Community: Chris Farris</title>
      <link>https://dev.to/jcfarris</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/jcfarris"/>
    <language>en</language>
    <item>
      <title>Streamlining incident response investigations with Steampipe relationship graphs</title>
      <dc:creator>Chris Farris</dc:creator>
      <pubDate>Mon, 23 Jan 2023 14:23:07 +0000</pubDate>
      <link>https://dev.to/aws-builders/streamlining-incident-response-investigations-with-steampipe-relationship-graphs-39lf</link>
      <guid>https://dev.to/aws-builders/streamlining-incident-response-investigations-with-steampipe-relationship-graphs-39lf</guid>
      <description>&lt;p&gt;We covered how Steampipe can assist with &lt;em&gt;Detection and Analysis&lt;/em&gt; in &lt;a href="https://steampipe.io/blog/splunk-lookup-tables"&gt;Enrich Splunk events with Steampipe&lt;/a&gt;. Today we’re going to look at how leveraging &lt;a href="https://steampipe.io/blog/release-0-18-0"&gt;Steampipe relationship graphs&lt;/a&gt; can assist in the &lt;em&gt;Containment, Eradication, and Recovery&lt;/em&gt; phases of the incident response cycle.&lt;/p&gt;


&lt;center&gt;
  &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TlTEazFq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://steampipe.io/images/blog/2023-01-relationships-and-ir/NIST-800-61-IRCycle.svg" alt="NIST 800-61 Incident Response Cycle" width="569" height="283"&gt;
  &lt;center&gt;
    &lt;a href="https://csrc.nist.gov/publications/detail/sp/800-61/rev-2/final"&gt;NIST 800-61 Incident Response Cycle&lt;/a&gt;
  &lt;/center&gt;
&lt;/center&gt;

&lt;p&gt;Let's say you have evidence a cloud host was compromised. One of the key questions you need to answer is, “What could that attacker have done while inside our network?”&lt;/p&gt;

&lt;p&gt;Steampipe’s new relationship graphs are one method to determine the compromised host’s connectedness and &lt;a href="https://www.ncsc.gov.uk/guidance/preventing-lateral-movement#section_2"&gt;Lateral Movement&lt;/a&gt; possibilities.&lt;/p&gt;

&lt;h2&gt;Start with EC2 Instance Detail&lt;/h2&gt;

&lt;p&gt;A typical incident starts with an event, and that event is often associated with an EC2 Instance. We can select the suspected compromised host via the &lt;a href="https://hub.steampipe.io/mods/turbot/aws_insights/dashboards/dashboard.ec2_instance_detail"&gt;AWS EC2 Instance Detail&lt;/a&gt;. In addition to details about the instance size, IP address, and tags, you can see several related resources including the SSH KeyPair used to launch the instance, the EBS Volume, and most importantly the IAM Instance Profile and VPC Information.&lt;/p&gt;
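&lt;p&gt;If you prefer the CLI to the dashboard, you can pull the same pivot points with a raw query. This is a minimal sketch against the AWS plugin's &lt;code&gt;aws_ec2_instance&lt;/code&gt; table; the instance ID is a hypothetical example.&lt;/p&gt;

```sql
-- Key pivot points for a suspect instance: who it runs as (instance profile)
-- and where it sits (subnet/VPC). The instance ID below is illustrative.
select
  instance_id,
  instance_type,
  private_ip_address,
  key_name,
  iam_instance_profile_arn,
  subnet_id,
  vpc_id,
  tags
from
  aws_ec2_instance
where
  instance_id = 'i-0123456789abcdef0';
```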

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--oKW9w0Ga--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://steampipe.io/_next/image%3Furl%3D%252Fimages%252Fblog%252F2023-01-relationships-and-ir%252FEC2InstanceDetail_retina.png%26w%3D1920%26q%3D75" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--oKW9w0Ga--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://steampipe.io/_next/image%3Furl%3D%252Fimages%252Fblog%252F2023-01-relationships-and-ir%252FEC2InstanceDetail_retina.png%26w%3D1920%26q%3D75" alt="EC2 Instance Detail screenshot" width="880" height="790"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here we see that the compromised instance has the &lt;code&gt;RenderRole-dev&lt;/code&gt; IAM role and is located in the &lt;code&gt;VPC-Public2-Subnet&lt;/code&gt; in the &lt;code&gt;Dev VPC&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;Dive into the Render Role&lt;/h2&gt;

&lt;p&gt;Click RenderRole to visit the &lt;a href="https://hub.steampipe.io/mods/turbot/aws_insights/dashboards/dashboard.iam_role_detail"&gt;AWS IAM Role Detail&lt;/a&gt; dashboard.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IrPf37te--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://steampipe.io/_next/image%3Furl%3D%252Fimages%252Fblog%252F2023-01-relationships-and-ir%252FRenderRole_retina.png%26w%3D1920%26q%3D75" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IrPf37te--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://steampipe.io/_next/image%3Furl%3D%252Fimages%252Fblog%252F2023-01-relationships-and-ir%252FRenderRole_retina.png%26w%3D1920%26q%3D75" alt="IAM Role Detail screenshot" width="880" height="505"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The IAM Detail relationship diagram shows us the attached policies, along with the EC2 instances and Lambda function that use the role. These are the permissions that the attacker could get from the EC2 Metadata Service.&lt;/p&gt;
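&lt;p&gt;The same pivot can be scripted. Here's a hedged sketch using the plugin's &lt;code&gt;aws_iam_role&lt;/code&gt; table to list the managed policies attached to the role (the role name comes from this walkthrough; verify column names against the plugin docs for your version):&lt;/p&gt;

```sql
-- Which managed policies could the attacker exercise via this role?
select
  name,
  arn,
  attached_policy_arns
from
  aws_iam_role
where
  name = 'RenderRole-dev';
```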

&lt;h2&gt;Dive into the Dev VPC&lt;/h2&gt;

&lt;p&gt;In addition to viewing IAM actions an attacker might have had access to, you probably also want to know what network resources the compromised host could connect to. From the original EC2 Detail, we can also click into the &lt;a href="https://hub.steampipe.io/mods/turbot/aws_insights/dashboards/dashboard.vpc_detail"&gt;AWS VPC Detail&lt;/a&gt; for the &lt;code&gt;Dev VPC&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_dWlFP7l--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://steampipe.io/_next/image%3Furl%3D%252Fimages%252Fblog%252F2023-01-relationships-and-ir%252FDevVPC_retina2.png%26w%3D1920%26q%3D75" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_dWlFP7l--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://steampipe.io/_next/image%3Furl%3D%252Fimages%252Fblog%252F2023-01-relationships-and-ir%252FDevVPC_retina2.png%26w%3D1920%26q%3D75" alt="VPC Detail screenshot - Dev VPC" width="880" height="218"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The relationship diagram shows us that this VPC contains an RDS database, a mail relay server, and a payments server. Critically, there are also VPC peering connections to two other VPCs: the &lt;code&gt;Security VPC&lt;/code&gt; and the &lt;code&gt;Meme Factory VPC&lt;/code&gt;. Let’s click on the Security VPC to see what is in there.&lt;/p&gt;
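&lt;p&gt;To enumerate those lateral-movement paths without the dashboard, you can query the peering table directly. A minimal sketch, assuming the plugin's &lt;code&gt;aws_vpc_peering_connection&lt;/code&gt; table and an illustrative VPC ID:&lt;/p&gt;

```sql
-- VPCs reachable from the compromised VPC via peering connections
select
  id,
  requester_vpc_id,
  accepter_vpc_id,
  status_code
from
  aws_vpc_peering_connection
where
  requester_vpc_id = 'vpc-0a1b2c3d'
  or accepter_vpc_id = 'vpc-0a1b2c3d';
```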

&lt;h2&gt;Dive into the Security VPC&lt;/h2&gt;

&lt;p&gt;Because Steampipe was configured to scan the entire AWS organization, we can pivot our view across multiple AWS accounts without logging in to the AWS console a second time. When investigating an incident, a responder often has multiple browser windows open to investigate, and this can lead to mistakes. Relationship graphs are an easier and more reliable way to explore the entire cloud network.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--loMS0efi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://steampipe.io/_next/image%3Furl%3D%252Fimages%252Fblog%252F2023-01-relationships-and-ir%252FSecurityVPC.gif%26w%3D1920%26q%3D75" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--loMS0efi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://steampipe.io/_next/image%3Furl%3D%252Fimages%252Fblog%252F2023-01-relationships-and-ir%252FSecurityVPC.gif%26w%3D1920%26q%3D75" alt="VPC Detail screenshot - Security VPC" width="880" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the &lt;code&gt;Security VPC&lt;/code&gt;, we find the &lt;a href="https://www.sans.org/tools/sift-workstation/"&gt;SIFT forensics server&lt;/a&gt;, the Fooli &lt;a href="https://steampipe.io/docs/guides/aws-orgs"&gt;Organization Steampipe instance&lt;/a&gt;, and &lt;a href="https://steampipe.io/blog/splunk-lookup-tables"&gt;Splunk&lt;/a&gt;, which serves as the Fooli SIEM.&lt;/p&gt;

&lt;h2&gt;Preparation&lt;/h2&gt;

&lt;p&gt;For incident response, preparation is vital. To be ready to use Steampipe to investigate an incident, you need to ensure everything is set up ahead of time.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Follow the guidance on &lt;a href="https://steampipe.io/docs/guides/aws-orgs"&gt;Using Steampipe CLI with AWS Organizations&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Download the &lt;a href="https://github.com/turbot/steampipe-mod-aws-insights"&gt;AWS Insights Mod for Steampipe&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Run the Steampipe CLI with &lt;code&gt;steampipe dashboard&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;If using &lt;a href="https://cloud.steampipe.io"&gt;Steampipe Cloud&lt;/a&gt;, simply &lt;a href="https://steampipe.io/docs/cloud/mods#installing-mods"&gt;install&lt;/a&gt; the AWS Insights mod in your workspace to get started.&lt;/li&gt;
&lt;/ol&gt;
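&lt;p&gt;In shell terms, the CLI setup above boils down to a few commands (a sketch; the mod location and working directory are up to you, and the Steampipe CLI is assumed to be installed already):&lt;/p&gt;

```shell
# Install the AWS plugin
steampipe plugin install aws

# Fetch the AWS Insights mod and run the dashboard server from inside it
git clone https://github.com/turbot/steampipe-mod-aws-insights.git
cd steampipe-mod-aws-insights
steampipe dashboard
```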

&lt;h2&gt;Let's go exploring!&lt;/h2&gt;

&lt;p&gt;Once you’ve set up Steampipe or Steampipe Cloud to investigate resources, there are many other exciting things you can identify visually that you might miss in a tabular report.&lt;/p&gt;

&lt;p&gt;The Steampipe ecosystem now offers more than 90 plugins; each provides several, dozens, or even hundreds of tables. You've always been able to use SQL to join those tables, not only within an API family like AWS, but across diverse APIs. Now you can also use SQL to build relationship graphs across APIs and cloud tenants. Imagine an always accurate network map that links your AWS and Azure VPCs with the firewalls in your data center. The sky's the limit, and we look forward to hearing about your discoveries in our &lt;a href="https://steampipe-io-git-v18-release-nw-turbot.vercel.app/community/join"&gt;Slack community&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>security</category>
      <category>opensource</category>
    </item>
    <item>
      <title>Leveraging CodePipeline to deploy Terraform</title>
      <dc:creator>Chris Farris</dc:creator>
      <pubDate>Sun, 22 Jan 2023 18:25:49 +0000</pubDate>
      <link>https://dev.to/aws-builders/leveraging-codepipeline-to-deploy-terraform-3b4h</link>
      <guid>https://dev.to/aws-builders/leveraging-codepipeline-to-deploy-terraform-3b4h</guid>
      <description>&lt;p&gt;I've been on the &lt;a href="https://www.chrisfarris.com/terraform-vs-cloudformation/" rel="noopener noreferrer"&gt;CloudFormation side&lt;/a&gt; of the IaC Wars since 2014, when I started working in AWS. I've dabbled in terraform but never made it my &lt;em&gt;primary&lt;/em&gt; IaC choice. For the Fooli Meme Factory, I needed to mess things up and then quickly revert the state to what it was at deployment time. So for SECCDC 2023, I ported Meme Factory almost entirely over to &lt;a href="https://www.terraform.io/" rel="noopener noreferrer"&gt;Terraform&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This led me to two problems. The first was the perennial issue I've had with Terraform from day one: &lt;em&gt;"How do I manage state?"&lt;/em&gt; The second was how to use CI/CD tooling to take advantage of one of Terraform's biggest strengths: the &lt;code&gt;terraform plan&lt;/code&gt; capability. Since Fooli is an AWS product, I figured that I should be able to use AWS-native tools for this. I've used CodePipeline in the past to preview change-sets with &lt;a href="https://github.com/org-formation/org-formation-cli" rel="noopener noreferrer"&gt;aws-org-formation&lt;/a&gt;, so I thought it would be easy to find a well-worn pattern from AWS for doing it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Apparently, there is no canonical way to run Terraform in CodeBuild, with CodePipeline as the mechanism to review plans before applying them!&lt;/strong&gt; Seriously, WTF?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3xrx1t5yahobkv0odoh6.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3xrx1t5yahobkv0odoh6.JPG" alt="Loki Saying Very Sad, Anyway" width="200" height="263"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This made me sad. So I decided to do it my damn self. And now I'm documenting it here for everyone else.&lt;/p&gt;

&lt;p&gt;This solution will provide the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;a href="https://aws.amazon.com/cloudformation/" rel="noopener noreferrer"&gt;CloudFormation&lt;/a&gt; Template (CFT) to deploy a CodePipeline, CodeBuild Projects, and an S3 Bucket for state and artifact handling.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html" rel="noopener noreferrer"&gt;Buildspec files&lt;/a&gt; to perform the &lt;code&gt;terraform plan&lt;/code&gt; and &lt;code&gt;terraform apply&lt;/code&gt; steps.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://opensource.com/article/18/8/what-how-makefile" rel="noopener noreferrer"&gt;Makefiles&lt;/a&gt; because I'm old school like that.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Why a CloudFormation template for step 1? To get around the chicken-and-egg problem with state. The CFT will deploy and do the needful to get the account set up for the Terraform pipeline without needing a Terraform pipeline already in place.&lt;/p&gt;
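&lt;p&gt;For context, the state side of that chicken-and-egg problem is just a backend block in the Terraform code pointing at the bucket the CFT creates. A hypothetical example (bucket, key, and region are all illustrative):&lt;/p&gt;

```hcl
terraform {
  backend "s3" {
    bucket = "fooli-terraform-state"   # the state bucket created by the CloudFormation template
    key    = "dev/terraform.tfstate"
    region = "us-east-1"
  }
}
```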

&lt;h2&gt;How it works&lt;/h2&gt;

&lt;p&gt;When a push is made to a monitored GitHub repo, the CodePipeline triggers. AWS's &lt;a href="https://aws.amazon.com/blogs/devops/using-aws-codepipeline-and-aws-codestar-connections-to-deploy-from-bitbucket/" rel="noopener noreferrer"&gt;CodeStar Connections&lt;/a&gt; manage the integration between GitHub and CodePipeline. (As an aside: CodeStar Connections are so under-the-radar I can't even find a product page to link to, just some API docs and the above blog post.) Anyway, CodeStar Connections are a much better solution than previous methods, which required overly-privileged GitHub &lt;em&gt;Personal&lt;/em&gt; Access Tokens to be uploaded into &lt;em&gt;shared&lt;/em&gt; AWS accounts.&lt;/p&gt;

&lt;p&gt;So CodeStar Connections will monitor GitHub and fire the pipeline on a push to the specified branch. After that, the pipeline will execute the following stages:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Source Stage&lt;/strong&gt; - where it downloads the code package from GitHub and stores it in the S3 Bucket.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Terraform Plan Stage&lt;/strong&gt; - where CodeBuild will execute the &lt;code&gt;terraform plan&lt;/code&gt; and copy the &lt;code&gt;tfplan&lt;/code&gt; into S3.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Review Stage&lt;/strong&gt; - sends a message to SNS, which (if configured) will email someone to review the output of the Terraform plan in CodeBuild.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Apply Stage&lt;/strong&gt; - If approved, this stage will fire up CodeBuild to do the &lt;code&gt;terraform apply&lt;/code&gt; on the preexisting &lt;code&gt;tfplan&lt;/code&gt; file.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.chrisfarris.com%2Fpost%2Ftf-codepipeline%2FPipeline.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.chrisfarris.com%2Fpost%2Ftf-codepipeline%2FPipeline.png" title="CodePipeline as seen in the console" alt="CodePipeline" width="483" height="1107"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;CodePipeline &amp;amp; CodeBuild&lt;/h2&gt;

&lt;p&gt;The CodePipeline is defined entirely in the &lt;a href="https://gist.github.com/jchrisfarris/acc0cd200f2fe50ea56874b2f9ab93b7" rel="noopener noreferrer"&gt;CloudFormation Template&lt;/a&gt;. The CodeBuild projects are created by the CloudFormation Template, but the commands to be executed are defined in the &lt;a href="https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html" rel="noopener noreferrer"&gt;build spec files&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;Terraform Plan&lt;/h3&gt;

&lt;p&gt;The definition of the Plan stage in CodePipeline is:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;terraform-plan&lt;/span&gt;
  &lt;span class="na"&gt;Actions&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;terraform_plan&lt;/span&gt;
      &lt;span class="na"&gt;RunOrder&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;1&lt;/span&gt;
      &lt;span class="na"&gt;Namespace&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;TfPlan&lt;/span&gt;
      &lt;span class="na"&gt;InputArtifacts&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;GitHubCode&lt;/span&gt;
      &lt;span class="na"&gt;OutputArtifacts&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;TerraformPlan&lt;/span&gt;
      &lt;span class="na"&gt;ActionTypeId&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;Category&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Build&lt;/span&gt;
        &lt;span class="na"&gt;Provider&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;CodeBuild&lt;/span&gt;
        &lt;span class="na"&gt;Owner&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS&lt;/span&gt;
        &lt;span class="na"&gt;Version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;1'&lt;/span&gt;
      &lt;span class="na"&gt;Configuration&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;ProjectName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;TerraformPlanProject&lt;/span&gt;
        &lt;span class="na"&gt;EnvironmentVariables&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Sub&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
          &lt;span class="s"&gt;[&lt;/span&gt;
            &lt;span class="s"&gt;{"name": "EXECUTION_ID",    "value": "#{codepipeline.PipelineExecutionId}"},&lt;/span&gt;
            &lt;span class="s"&gt;{"name": "BRANCH",          "value": "#{GitHubSource.BranchName}"},&lt;/span&gt;
            &lt;span class="s"&gt;{"name": "REPO",            "value": "#{GitHubSource.FullRepositoryName}"},&lt;/span&gt;
            &lt;span class="s"&gt;{"name": "COMMIT_ID",       "value": "#{GitHubSource.CommitId}"},&lt;/span&gt;
            &lt;span class="s"&gt;{"name": "env",             "value": "${pEnvironment}"}&lt;/span&gt;
          &lt;span class="s"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And the Project is:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;TerraformPlanProject&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::CodeBuild::Project&lt;/span&gt;
  &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Sub&lt;/span&gt; &lt;span class="s"&gt;${AWS::StackName}-tf-plan&lt;/span&gt;
    &lt;span class="na"&gt;Artifacts&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;CODEPIPELINE&lt;/span&gt;
    &lt;span class="na"&gt;Source&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;CODEPIPELINE&lt;/span&gt;
      &lt;span class="na"&gt;BuildSpec&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;buildspec-tf-plan.yaml&lt;/span&gt;
    &lt;span class="na"&gt;Environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;ComputeType&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;BUILD_GENERAL1_SMALL&lt;/span&gt;
      &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;LINUX_CONTAINER&lt;/span&gt;
      &lt;span class="na"&gt;Image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;BuildImageName&lt;/span&gt;
    &lt;span class="na"&gt;ServiceRole&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!GetAtt&lt;/span&gt; &lt;span class="s"&gt;ProjectServiceRole.Arn&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;EnvironmentVariables&lt;/code&gt; in the pipeline are passed into the CodeBuild Project. CodePipeline substitutes the variables that begin with a &lt;code&gt;#&lt;/code&gt; at execution, and CloudFormation substitutes the ones beginning with &lt;code&gt;$&lt;/code&gt; at deployment. The BuildSpec exports some environment variables, and they're stored in the pipeline under the &lt;code&gt;TfPlan&lt;/code&gt; Namespace. The &lt;code&gt;BuildSpec&lt;/code&gt; property of the CodeBuild Project names the file that defines the commands CodeBuild will execute. &lt;a href="//buildspec-tf-plan.yaml"&gt;That file&lt;/a&gt; looks like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;0.2&lt;/span&gt;

&lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;exported-variables&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;BuildID&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;BuildTag&lt;/span&gt;

&lt;span class="na"&gt;phases&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;install&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;commands&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;curl&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-s&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;https://releases.hashicorp.com/terraform/1.3.6/terraform_1.3.6_linux_amd64.zip&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-o&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;terraform.zip"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;unzip&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;terraform.zip&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-d&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;/usr/local/bin"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;chmod&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;755&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;/usr/local/bin/terraform"&lt;/span&gt;
  &lt;span class="na"&gt;pre_build&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;commands&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;make&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;tf-init"&lt;/span&gt;
  &lt;span class="na"&gt;build&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;commands&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;make&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;tf-plan"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;export&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;BuildID=`echo&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;$CODEBUILD_BUILD_ID&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;|&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;cut&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-d:&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-f1`"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;export&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;BuildTag=`echo&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;$CODEBUILD_BUILD_ID&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;|&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;cut&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-d:&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-f2`"&lt;/span&gt;

&lt;span class="na"&gt;artifacts&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;TerraformPlan&lt;/span&gt;
  &lt;span class="na"&gt;files&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;terraform/$env-terraform.tfplan&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;exported-variables&lt;/code&gt; section at the top indicates that we will export two environment variables we want to pass back to CodePipeline, &lt;code&gt;BuildID&lt;/code&gt; and &lt;code&gt;BuildTag&lt;/code&gt;. These are needed to build the URL for reviewing the plan. The &lt;code&gt;artifacts&lt;/code&gt; section at the bottom defines the files created by CodeBuild that CodePipeline should store in S3 and pass between the pipeline stages.&lt;/p&gt;
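&lt;p&gt;The &lt;code&gt;make tf-init&lt;/code&gt; and &lt;code&gt;make tf-plan&lt;/code&gt; targets invoked by the buildspec aren't shown here; this is a minimal sketch of what they might look like, matching the plan artifact path above (&lt;code&gt;terraform/$env-terraform.tfplan&lt;/code&gt;). The target bodies are hypothetical.&lt;/p&gt;

```makefile
# Hypothetical Makefile targets; $(env) comes from the pipeline's EnvironmentVariables.
tf-init:
	cd terraform && terraform init -input=false

tf-plan: tf-init
	cd terraform && terraform plan -input=false -out=$(env)-terraform.tfplan
```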

&lt;h3&gt;Review Stage&lt;/h3&gt;

&lt;p&gt;The review stage consists of a manual step and looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Review-Plan&lt;/span&gt;
  &lt;span class="na"&gt;Actions&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;review-plan&lt;/span&gt;
      &lt;span class="na"&gt;RunOrder&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;1&lt;/span&gt;
      &lt;span class="na"&gt;ActionTypeId&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;Category&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Approval&lt;/span&gt;
        &lt;span class="na"&gt;Provider&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Manual&lt;/span&gt;
        &lt;span class="na"&gt;Owner&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS&lt;/span&gt;
        &lt;span class="na"&gt;Version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;1'&lt;/span&gt;
      &lt;span class="na"&gt;Configuration&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;NotificationArn&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;PipelineNotificationsTopic&lt;/span&gt;
        &lt;span class="na"&gt;CustomData&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Review&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;the&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Terraform&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Plan"&lt;/span&gt;
        &lt;span class="na"&gt;ExternalEntityLink&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Sub&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://${AWS::Region}.console.aws.amazon.com/codesuite/codebuild/${AWS::AccountId}/projects/#{TfPlan.BuildID}/build/#{TfPlan.BuildID}%3A#{TfPlan.BuildTag}/?region=${AWS::Region}"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here we construct the &lt;code&gt;ExternalEntityLink&lt;/code&gt; from the &lt;code&gt;BuildID&lt;/code&gt; and &lt;code&gt;BuildTag&lt;/code&gt; exported by the plan stage. Again, variables that begin with a &lt;code&gt;#&lt;/code&gt; are substituted by CodePipeline at execution, and the ones beginning with &lt;code&gt;$&lt;/code&gt; are substituted by CloudFormation at deployment. We send a message to the &lt;code&gt;PipelineNotificationsTopic&lt;/code&gt;, which triggers an email to the user:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fay8113ge815w1knd64uw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fay8113ge815w1knd64uw.png" title="The email from CodePipeline telling me I have changes to review" alt="Email from CodePipeline" width="800" height="482"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;Apply Stage&lt;/h3&gt;

&lt;p&gt;The apply stage is similar to the plan. In CodePipeline it looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ExecuteTerraform&lt;/span&gt;
  &lt;span class="na"&gt;Actions&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;terraform-apply&lt;/span&gt;
      &lt;span class="na"&gt;RunOrder&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;1&lt;/span&gt;
      &lt;span class="na"&gt;InputArtifacts&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;GitHubCode&lt;/span&gt;
        &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;TerraformPlan&lt;/span&gt;
      &lt;span class="na"&gt;ActionTypeId&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;Category&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Build&lt;/span&gt;
        &lt;span class="na"&gt;Provider&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;CodeBuild&lt;/span&gt;
        &lt;span class="na"&gt;Owner&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS&lt;/span&gt;
        &lt;span class="na"&gt;Version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;1'&lt;/span&gt;
      &lt;span class="na"&gt;Configuration&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;ProjectName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;ExecuteTerraformProject&lt;/span&gt;
        &lt;span class="na"&gt;PrimarySource&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;GitHubCode&lt;/span&gt;
        &lt;span class="na"&gt;EnvironmentVariables&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Sub&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
          &lt;span class="s"&gt;[&lt;/span&gt;
            &lt;span class="s"&gt;{"name": "EXECUTION_ID",    "value": "#{codepipeline.PipelineExecutionId}"},&lt;/span&gt;
            &lt;span class="s"&gt;{"name": "BRANCH",          "value": "#{GitHubSource.BranchName}"},&lt;/span&gt;
            &lt;span class="s"&gt;{"name": "REPO",            "value": "#{GitHubSource.FullRepositoryName}"},&lt;/span&gt;
            &lt;span class="s"&gt;{"name": "COMMIT_ID",       "value": "#{GitHubSource.CommitId}"},&lt;/span&gt;
            &lt;span class="s"&gt;{"name": "env",             "value": "${pEnvironment}"}&lt;/span&gt;
          &lt;span class="s"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this case, we pass in the input artifacts from both GitHub and the plan stage (lines 5-7). We set GitHubCode as the &lt;code&gt;PrimarySource&lt;/code&gt; on line 15, and that artifact becomes the working directory. The plan artifact's files are written to a separate directory, and we have to move them into place in the BuildSpec file (line 9 below).&lt;/p&gt;
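CodeBuild's directory convention is worth spelling out: the primary source lands in `CODEBUILD_SRC_DIR`, and each secondary input artifact lands in `CODEBUILD_SRC_DIR_<ArtifactName>` (which is why the buildspec below references `$CODEBUILD_SRC_DIR_TerraformPlan`). A small sketch, with `src_dir_env_var` as a hypothetical helper:

```python
# Sketch of CodeBuild's source-directory naming convention:
# the PrimarySource artifact is placed in $CODEBUILD_SRC_DIR, and every
# additional input artifact lands in $CODEBUILD_SRC_DIR_<ArtifactName>.
# src_dir_env_var is a hypothetical helper for illustration only.
def src_dir_env_var(artifact_name: str, primary: bool = False) -> str:
    return "CODEBUILD_SRC_DIR" if primary else f"CODEBUILD_SRC_DIR_{artifact_name}"

print(src_dir_env_var("GitHubCode", primary=True))  # CODEBUILD_SRC_DIR
print(src_dir_env_var("TerraformPlan"))             # CODEBUILD_SRC_DIR_TerraformPlan
```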

&lt;p&gt;The BuildSpec for the apply looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;0.2&lt;/span&gt;

&lt;span class="na"&gt;phases&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;install&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;commands&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;curl&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-s&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;https://releases.hashicorp.com/terraform/1.3.6/terraform_1.3.6_linux_amd64.zip&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-o&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;terraform.zip"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;unzip&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;terraform.zip&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;-d&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;/usr/local/bin"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;chmod&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;755&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;/usr/local/bin/terraform"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;mv&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;$CODEBUILD_SRC_DIR_TerraformPlan/terraform/$env-terraform.tfplan&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;terraform"&lt;/span&gt;
  &lt;span class="na"&gt;pre_build&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;commands&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;make&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;tf-init"&lt;/span&gt;
  &lt;span class="na"&gt;build&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;commands&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;make&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;tf-apply"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The buildspec file installs Terraform, moves the tfplan file back to where Terraform expects it, runs &lt;code&gt;make tf-init&lt;/code&gt; (because this is a fresh container), and then runs &lt;code&gt;terraform apply&lt;/code&gt; via &lt;code&gt;make tf-apply&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Makefiles
&lt;/h2&gt;

&lt;p&gt;I use Makefiles to simplify the deployment process, both in CodeBuild and when running Terraform locally.&lt;/p&gt;

&lt;p&gt;The root Makefile for the repo looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Copyright 2022 - Chris Farris (chrisf@primeharbor.com) - All Rights Reserved&lt;/span&gt;
&lt;span class="c"&gt;#&lt;/span&gt;
ifndef &lt;span class="nb"&gt;env&lt;/span&gt;
&lt;span class="si"&gt;$(&lt;/span&gt;error &lt;span class="nb"&gt;env &lt;/span&gt;is not &lt;span class="nb"&gt;set&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;
endif

include config.&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;env&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;
&lt;span class="nb"&gt;export&lt;/span&gt;

&lt;span class="c"&gt;#&lt;/span&gt;
&lt;span class="c"&gt;# Terraform&lt;/span&gt;
&lt;span class="c"&gt;#&lt;/span&gt;
tf-init:
    &lt;span class="nb"&gt;cd &lt;/span&gt;terraform &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="si"&gt;$(&lt;/span&gt;MAKE&lt;span class="si"&gt;)&lt;/span&gt; tf-init

tf-plan:
    &lt;span class="nb"&gt;cd &lt;/span&gt;terraform &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="si"&gt;$(&lt;/span&gt;MAKE&lt;span class="si"&gt;)&lt;/span&gt; tf-plan

tf-apply:
    &lt;span class="nb"&gt;cd &lt;/span&gt;terraform &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="si"&gt;$(&lt;/span&gt;MAKE&lt;span class="si"&gt;)&lt;/span&gt; tf-apply

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And the Makefile in the &lt;code&gt;terraform&lt;/code&gt; directory is:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Copyright 2022 - Chris Farris (chrisf@primeharbor.com) - All Rights Reserved&lt;/span&gt;
&lt;span class="c"&gt;#&lt;/span&gt;
tf-init:
    terraform init &lt;span class="nt"&gt;-backend-config&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;../&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;env&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;.tfbackend &lt;span class="nt"&gt;-reconfigure&lt;/span&gt;

tf-plan:
    terraform plan &lt;span class="nt"&gt;-out&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;env&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="nt"&gt;-terraform&lt;/span&gt;.tfplan &lt;span class="nt"&gt;-no-color&lt;/span&gt;

tf-apply:
    terraform apply &lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;env&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="nt"&gt;-terraform&lt;/span&gt;.tfplan
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;config.$(env)&lt;/code&gt; file contains the &lt;code&gt;TF_VAR&lt;/code&gt; exports that feed variables to Terraform, similar to:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;export &lt;/span&gt;&lt;span class="nv"&gt;TF_VAR_mail_relay_ami&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;ami-0e03dcd66f...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;$(env).tfbackend&lt;/code&gt; file contains the line that defines the state bucket:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;bucket="fooli-tf-state-test"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Done!
&lt;/h2&gt;

&lt;p&gt;There you have it: a complete solution for deploying Terraform via CodePipeline and CodeBuild, with a manual review of the changes before they are applied. You can tweak the Makefiles and buildspec files as you see fit. Here is the entire &lt;a href="https://gist.github.com/jchrisfarris/acc0cd200f2fe50ea56874b2f9ab93b7" rel="noopener noreferrer"&gt;CloudFormation Template&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I'm surprised that there isn't a CodeBuild container from HashiCorp or AWS with Terraform pre-installed. The effort of building one would be less than the collective expense of everyone downloading the Terraform binary and providers twice on every build.&lt;/p&gt;
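In the meantime, rolling your own is straightforward. A minimal sketch of such an image (the base image and Terraform version here are assumptions, not a published AWS or HashiCorp artifact):

```dockerfile
# Hypothetical custom CodeBuild image with Terraform preinstalled
FROM public.ecr.aws/amazonlinux/amazonlinux:2023
ARG TF_VERSION=1.3.6
RUN dnf install -y unzip make \
 && curl -s "https://releases.hashicorp.com/terraform/${TF_VERSION}/terraform_${TF_VERSION}_linux_amd64.zip" -o /tmp/terraform.zip \
 && unzip /tmp/terraform.zip -d /usr/local/bin \
 && chmod 755 /usr/local/bin/terraform \
 && rm /tmp/terraform.zip
```

Push the image to ECR and reference it in the CodeBuild project's environment, and the install phase shrinks to nothing.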

</description>
      <category>discuss</category>
      <category>community</category>
    </item>
    <item>
      <title>Mapping your AWS attack surface</title>
      <dc:creator>Chris Farris</dc:creator>
      <pubDate>Tue, 27 Dec 2022 02:47:00 +0000</pubDate>
      <link>https://dev.to/aws-builders/mapping-your-aws-attack-surface-339b</link>
      <guid>https://dev.to/aws-builders/mapping-your-aws-attack-surface-339b</guid>
      <description>&lt;h2&gt;
  
  
  What is a cloud attack surface?
&lt;/h2&gt;

&lt;p&gt;Traditional IT security protects the network perimeter with firewalls, vulnerability scanners, and patching processes. Discovering the attack surface of your network perimeter was a simple matter of knowing your public IP range and scanning it from an external host. With the introduction of public cloud technologies, the attack surface expanded to include the cloud providers’ APIs. "Identity is the new perimeter" became the new mantra.&lt;/p&gt;

&lt;p&gt;The old network perimeter still exists, of course, but it has evolved. No longer are all the IP addresses that make up your perimeter assigned to your company. In the cloud, your network perimeter is part of the cloud provider’s IP space, and cloud-provider-managed resources provide a direct path to your application code. This combination of cloud-provider IP addresses assigned to you and the URLs of cloud-provider resources that lead to your application and data comprises your cloud’s network perimeter and defines part of the cloud attack surface.&lt;/p&gt;

&lt;p&gt;An organization must monitor and understand the network perimeter of its cloud estate. The resources comprising the externally facing network components of your cloud attack surface can be broadly grouped into IP addresses, hostnames, and URLs. In this blog post, we provide step-by-step instructions for mapping the network aspects of the cloud attack surface using &lt;a href="https://steampipe.io/" rel="noopener noreferrer"&gt;Steampipe&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Enumerate IP Addresses using Steampipe
&lt;/h2&gt;

&lt;p&gt;For IP Addresses, we'll look mainly at the Elastic Network Interfaces. While other resources like &lt;a href="https://aws.amazon.com/cloudfront/" rel="noopener noreferrer"&gt;CloudFront&lt;/a&gt;, &lt;a href="https://aws.amazon.com/s3/" rel="noopener noreferrer"&gt;S3&lt;/a&gt;, and &lt;a href="https://aws.amazon.com/api-gateway/" rel="noopener noreferrer"&gt;API Gateway&lt;/a&gt; also leverage IP addresses, the network security protections of those services are the cloud provider's responsibility.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/turbot/steampipe-samples/tree/main/all/aws-cloud-perimeter/public_vpc_ips.sql" rel="noopener noreferrer"&gt;This query&lt;/a&gt; will download a list of all public IP addresses tied to the customer’s VPC.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt;
  &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;association_public_ip&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;public_ip&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt;
  &lt;span class="n"&gt;aws_ec2_network_interface&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;eni&lt;/span&gt;
&lt;span class="k"&gt;where&lt;/span&gt;
  &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;association_public_ip&lt;/span&gt; &lt;span class="k"&gt;is&lt;/span&gt; &lt;span class="k"&gt;not&lt;/span&gt; &lt;span class="k"&gt;Null&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Enumerate Hostnames using Steampipe
&lt;/h2&gt;

&lt;p&gt;To determine the DNS Hostnames used as part of your cloud perimeter, Steampipe can query all of the A records and CNAMEs in your &lt;a href="https://aws.amazon.com/route53/" rel="noopener noreferrer"&gt;Route 53&lt;/a&gt; Hosted Zones. A records point directly to IP addresses under your control. CNAMEs are references that can point to hosts or other cloud-provider-managed resources. In either case, you need to understand what exists in your environment.&lt;/p&gt;

&lt;p&gt;To query all of the hostnames that point to CNAME and A Records in your Route 53 Hosted Zones, use &lt;a href="https://github.com/turbot/steampipe-samples/tree/main/all/aws-cloud-perimeter/route53.sql" rel="noopener noreferrer"&gt;this SQL query&lt;/a&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt;
  &lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;hostname&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;jsonb_array_elements_text&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;records&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;resource_record&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt;
  &lt;span class="n"&gt;aws_route53_zone&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;z&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;aws_route53_record&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;r&lt;/span&gt;
&lt;span class="k"&gt;where&lt;/span&gt; &lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;zone_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;
  &lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;type&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'A'&lt;/span&gt; &lt;span class="k"&gt;OR&lt;/span&gt; &lt;span class="k"&gt;type&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'CNAME'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="n"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;private_zone&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;false&lt;/span&gt;
  &lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="n"&gt;jsonb_pretty&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;records&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;not&lt;/span&gt; &lt;span class="k"&gt;like&lt;/span&gt; &lt;span class="s1"&gt;'%dkim%'&lt;/span&gt;
  &lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="n"&gt;jsonb_pretty&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;records&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;not&lt;/span&gt; &lt;span class="k"&gt;like&lt;/span&gt; &lt;span class="s1"&gt;'%acm-validations.aws.%'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Review the list of hostnames returned and, if permitted by the terms of service, scan them for exposed ports and services using &lt;a href="https://nmap.org/" rel="noopener noreferrer"&gt;nmap&lt;/a&gt; or &lt;a href="https://www.tenable.com/downloads/nessus" rel="noopener noreferrer"&gt;Nessus&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Note: the above query excludes private DNS zones for VPCs (&lt;code&gt;z.private_zone=false&lt;/code&gt;) and common CNAME records needed for &lt;a href="https://aws.amazon.com/certificate-manager/" rel="noopener noreferrer"&gt;ACM&lt;/a&gt; and &lt;a href="https://en.wikipedia.org/wiki/DomainKeys_Identified_Mail" rel="noopener noreferrer"&gt;email validation&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Enumerate the URLs using Steampipe
&lt;/h2&gt;

&lt;p&gt;A number of AWS services, including &lt;a href="https://aws.amazon.com/cloudfront/" rel="noopener noreferrer"&gt;CloudFront&lt;/a&gt;, &lt;a href="https://aws.amazon.com/s3/" rel="noopener noreferrer"&gt;S3&lt;/a&gt;, &lt;a href="https://aws.amazon.com/api-gateway/" rel="noopener noreferrer"&gt;API Gateway&lt;/a&gt;, and &lt;a href="https://docs.aws.amazon.com/lambda/latest/dg/lambda-urls.html" rel="noopener noreferrer"&gt;AWS Lambda&lt;/a&gt;, produce URLs that can be vulnerable. For example, S3 Buckets exist as URLs on the public internet and can be accessed if the bucket is not properly secured. To get a list of all of the URLs for the &lt;em&gt;public&lt;/em&gt; buckets in your cloud environment, you can use &lt;a href="https://github.com/turbot/steampipe-samples/tree/main/all/aws-cloud-perimeter/s3_buckets.sql" rel="noopener noreferrer"&gt;this query&lt;/a&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt;
  &lt;span class="s1"&gt;'https://'&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="s1"&gt;'.s3.'&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="n"&gt;region&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="s1"&gt;'.amazonaws.com/'&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt;
  &lt;span class="n"&gt;aws_s3_bucket&lt;/span&gt;
&lt;span class="k"&gt;where&lt;/span&gt;
  &lt;span class="n"&gt;bucket_policy_is_public&lt;/span&gt; &lt;span class="k"&gt;is&lt;/span&gt; &lt;span class="k"&gt;True&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To request all of the URLs from CloudFront, &lt;a href="https://github.com/turbot/steampipe-samples/tree/main/all/aws-cloud-perimeter/cloudfront_distributions.sql" rel="noopener noreferrer"&gt;this SQL query&lt;/a&gt; will return both the distribution name and any aliases that are part of that distribution:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt;
  &lt;span class="s1"&gt;'https://'&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="n"&gt;domain_name&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt;
  &lt;span class="n"&gt;aws_cloudfront_distribution&lt;/span&gt;
&lt;span class="k"&gt;UNION&lt;/span&gt; &lt;span class="k"&gt;ALL&lt;/span&gt;
&lt;span class="k"&gt;select&lt;/span&gt;
  &lt;span class="s1"&gt;'https://'&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="n"&gt;jsonb_array_elements_text&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;aliases&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="s1"&gt;'Items'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt;
  &lt;span class="n"&gt;aws_cloudfront_distribution&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;API Gateways themselves do not have URLs; a URL is created when you create a "&lt;a href="https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-stages.html" rel="noopener noreferrer"&gt;stage&lt;/a&gt;". To list all of the API Gateway v2 URLs, use &lt;a href="https://github.com/turbot/steampipe-samples/tree/main/all/aws-cloud-perimeter/api_gatewayv2.sql" rel="noopener noreferrer"&gt;this SQL query&lt;/a&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="s1"&gt;'https://'&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="n"&gt;api_id&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="s1"&gt;'.execute-api.'&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="n"&gt;region&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="s1"&gt;'.amazonaws.com/'&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="n"&gt;stage_name&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;aws_api_gatewayv2_stage&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;API Gateway URLs typically have the following format:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;https://[api-id].execute-api.[region].amazonaws.com/[stage]/[path]&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;where [api-id] is the ID of the API Gateway API, [region] is the AWS region where the API is deployed, [stage] is the stage of the API (such as "prod" or "dev"), and [path] is the path to the specific Lambda function being accessed. For example, the URL "&lt;code&gt;https://abc123.execute-api.us-east-1.amazonaws.com/prod/hello&lt;/code&gt;" could be used to invoke a Lambda function named "hello" that is associated with the API Gateway API with ID "abc123", and is deployed in the "us-east-1" region.&lt;/p&gt;
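That URL format can be captured in a small sketch (`api_gateway_url` is a hypothetical helper, shown only to make the components concrete):

```python
# Assemble an API Gateway invoke URL from its components, following the
# format described above: https://[api-id].execute-api.[region].amazonaws.com/[stage]/[path]
def api_gateway_url(api_id: str, region: str, stage: str, path: str = "") -> str:
    url = f"https://{api_id}.execute-api.{region}.amazonaws.com/{stage}"
    return f"{url}/{path}" if path else url

print(api_gateway_url("abc123", "us-east-1", "prod", "hello"))
# https://abc123.execute-api.us-east-1.amazonaws.com/prod/hello
```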

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/lambda/latest/dg/lambda-urls.html" rel="noopener noreferrer"&gt;Lambda URLs&lt;/a&gt; are a new feature of AWS Lambda, released in April of 2022. AWS allows the creation of Lambda URLs that do not require authentication. Therefore, these Lambda URLs are part of your cloud attack surface. To find the Lambda URLs in your environment, you can use &lt;a href="https://github.com/turbot/steampipe-samples/tree/main/all/aws-cloud-perimeter/lambda_url.sql" rel="noopener noreferrer"&gt;this Steampipe query&lt;/a&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="n"&gt;url_config&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&amp;gt;&lt;/span&gt; &lt;span class="s1"&gt;'FunctionUrl'&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;aws_lambda_function&lt;/span&gt;
&lt;span class="k"&gt;where&lt;/span&gt; &lt;span class="n"&gt;url_config&lt;/span&gt; &lt;span class="k"&gt;is&lt;/span&gt; &lt;span class="k"&gt;not&lt;/span&gt; &lt;span class="k"&gt;Null&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can pull all of these together using SQL UNION queries, as demonstrated by &lt;a href="https://github.com/turbot/steampipe-samples/tree/main/all/aws-cloud-perimeter/all_urls.sql" rel="noopener noreferrer"&gt;this query&lt;/a&gt;. You can run it like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;steampipe query all_urls.sql

+-----------------------------------------------------------------------+-------------------------------+
| url                                                                   | type                          |
+-----------------------------------------------------------------------+-------------------------------+
| https://dev-FooliApiStack-780272371.us-east-1.elb.amazonaws.com:443   | application_load_balancer     |
| https://prod-FooliApiStack-1829703886.us-east-1.elb.amazonaws.com:443 | application_load_balancer     |
| https://rwpfqhvj6g.execute-api.us-east-1.amazonaws.com/dev_system     | api_gateway                   |
| https://v5johmngyd.execute-api.us-east-1.amazonaws.com/prod_system    | api_gateway                   |
| https://d7iaso3riah1ur.cloudfront.net                                 | cloudfront_distribution       |
| https://d65z8r284bm5f6.cloudfront.net                                 | cloudfront_distribution       |
| https://da6r321d6ljywr.cloudfront.net                                 | cloudfront_distribution       |
| https://d28s6da8wtxvjm.cloudfront.net                                 | cloudfront_distribution       |
| https://memes.dev1.fooli.media                                        | cloudfront_distribution_alias |
| https://memes.memes.fooli.media                                       | cloudfront_distribution_alias |
| https://oshehk7tpbxfqtkfhco5ojkdyq0fbenx.lambda-url.us-east-1.on.aws/ | lambda_url                    |
| https://f00li-prod-1.s3.us-east-1.amazonaws.com/                      | s3_bucket_url                 |
| https://f00li-dev-1.s3.us-east-1.amazonaws.com/                       | s3_bucket_url                 |
| https://f00li-dev-0.s3.us-east-1.amazonaws.com/                       | s3_bucket_url                 |
| https://f00li-public-bucket-policy.s3.us-east-1.amazonaws.com/        | s3_bucket_url                 |
| https://f00li-prod-0.s3.us-east-1.amazonaws.com/                      | s3_bucket_url                 |
| https://pht-blockchain.s3.eu-central-1.amazonaws.com/                 | s3_bucket_url                 |
+-----------------------------------------------------------------------+-------------------------------+
Time: 115.3s. Rows fetched: 215. Hydrate calls: 317.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Empowered with a list of all the exposed URLs in your organization, you can then set up a process to scan these using a number of web-focused &lt;a href="https://www.techopedia.com/definition/30958/dynamic-application-security-testing-dast" rel="noopener noreferrer"&gt;Dynamic Application Security Testing (DAST)&lt;/a&gt; tools and scanners such as &lt;a href="https://www.zaproxy.org/" rel="noopener noreferrer"&gt;Zed Attack Proxy&lt;/a&gt;, &lt;a href="https://github.com/maurosoria/dirsearch" rel="noopener noreferrer"&gt;dirsearch (Web path scanner)&lt;/a&gt;, &lt;a href="https://github.com/michenriksen/aquatone" rel="noopener noreferrer"&gt;Aquatone&lt;/a&gt;, and &lt;a href="https://cirt.net/Nikto2" rel="noopener noreferrer"&gt;Nikto2&lt;/a&gt;. The OWASP® Foundation maintains a &lt;a href="https://owasp.org/www-community/Vulnerability_Scanning_Tools" rel="noopener noreferrer"&gt;full list of scanning tools&lt;/a&gt; that could be used.&lt;/p&gt;

&lt;p&gt;If you are a larger organization that runs a bug bounty program, scanning your URLs for these low-hanging fruit is a quick and easy way to avoid payouts to researchers who use the same tools.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Steampipe is a powerful way to find and enumerate your AWS IP addresses, hostnames, and URLs. Follow the steps outlined here to gather the information needed to define and manage the network component of your cloud attack surface, ensure the security and integrity of your data and applications, and prevent unauthorized access to your network resources. Of course, everyone's situation is unique, and you may find a solution that works better for you. If so, please &lt;a href="https://steampipe.io/community/join" rel="noopener noreferrer"&gt;let us know&lt;/a&gt;: we love to learn from our community!&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>webdev</category>
      <category>discuss</category>
    </item>
    <item>
      <title>Enrich Splunk events with Steampipe</title>
      <dc:creator>Chris Farris</dc:creator>
      <pubDate>Tue, 20 Dec 2022 18:27:07 +0000</pubDate>
      <link>https://dev.to/aws-builders/enrich-splunk-events-with-steampipe-265j</link>
      <guid>https://dev.to/aws-builders/enrich-splunk-events-with-steampipe-265j</guid>
      <description>&lt;p&gt;When analyzing telemetry from AWS in a security operations role, context is key. What is this instance i-7ba5bed288a? Is this random AWS-owned IP address one of mine, or does it belong to someone else? Which account is 178901234562 again? AWS doesn't provide any of this context in CloudTrail or GuardDuty.&lt;/p&gt;

&lt;p&gt;If you use Splunk as your Security Information and Event Management (SIEM) platform, you've probably heard of Lookups. &lt;a href="https://docs.splunk.com/Documentation/Splunk/9.0.1/Knowledge/Aboutlookupsandfieldactions" rel="noopener noreferrer"&gt;Per Splunk&lt;/a&gt;: "&lt;strong&gt;Lookups&lt;/strong&gt; enrich your &lt;strong&gt;event data&lt;/strong&gt; by adding field-value combinations from &lt;strong&gt;lookup tables&lt;/strong&gt;". Lookups can be a great way to improve your detections and investigations by adding attributes and key business context to your CloudTrail, GuardDuty, and VPC Flow Log data.&lt;/p&gt;

&lt;p&gt;Steampipe can pull that context into &lt;a href="https://docs.splunk.com/Documentation/Splunk/8.0.4/Knowledge/ConfigureCSVlookups" rel="noopener noreferrer"&gt;Splunk lookup tables&lt;/a&gt;. In a pair of examples, we'll use Steampipe to query AWS accounts and the public and private IP addresses of Elastic Network Interfaces, then save that data as Splunk lookup tables. The Steampipe SQL queries and Splunk SPL queries provided here are examples you can build upon to create your own enrichment tables.&lt;/p&gt;

&lt;h2&gt;
  
  
  Context enrichment with Steampipe
&lt;/h2&gt;

&lt;p&gt;Steampipe is an open source project that provides a common interface for querying cloud APIs with SQL. With Steampipe and the &lt;a href="https://hub.steampipe.io/plugins/turbot/aws#configuration" rel="noopener noreferrer"&gt;AWS Plugin installed and configured&lt;/a&gt;, you can easily run SQL queries against AWS APIs represented as database tables, and export the results to CSV files that load into Splunk as lookup tables.&lt;/p&gt;

&lt;p&gt;Let's start with a simple example: a list of all &lt;a href="https://hub.steampipe.io/plugins/turbot/aws/tables/aws_organizations_account" rel="noopener noreferrer"&gt;AWS Accounts&lt;/a&gt; in an organization. This query (&lt;a href="https://github.com/turbot/steampipe-samples/tree/main/all/splunk-lookup-tables/accounts.sql" rel="noopener noreferrer"&gt;accounts.sql&lt;/a&gt;) pulls the twelve-digit account ID, account name, status (ACTIVE or SUSPENDED), and four specific tags on each account.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;tags&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&amp;gt;&lt;/span&gt; &lt;span class="s1"&gt;'ExecutiveOwner'&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;Executive_Owner&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;tags&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&amp;gt;&lt;/span&gt; &lt;span class="s1"&gt;'TechnicalContact'&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;Technical_Contact&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;tags&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&amp;gt;&lt;/span&gt; &lt;span class="s1"&gt;'DataClassification'&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;Data_Classification&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;tags&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&amp;gt;&lt;/span&gt; &lt;span class="s1"&gt;'environment'&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;Environment&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="n"&gt;aws_payer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;aws_organizations_account&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here's the command to extract this information into a CSV file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;steampipe query accounts.sql &lt;span class="nt"&gt;--output&lt;/span&gt; csv &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; sp_aws_accounts.csv
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The command extracts the account information from AWS Organizations into a CSV file that's ready to be used by Splunk.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;id,name,status,executive_owner,technical_contact,data_classification
877426665359,fooli-dev,ACTIVE,Richard Hendricks,Dinesh Chugtai,Public
352894534996,fooli-security,ACTIVE,Bertram Gilfoyle,Bertram Gilfoyle,Internal
152981771857,fooli-prod,ACTIVE,Erlich Bachman,Richard Hendricks ,PersonalInformation
540147993428,fooli-payer,ACTIVE,Monica Hall,Bertram Gilfoyle,None
747037951011,fooli-memefactory,ACTIVE,Erlich Bachman,Richard Hendricks,PersonalInformation
755629548949,fooli-sandbox,ACTIVE,Erlich Bachman,Dinesh Chugtai,None
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
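&lt;p&gt;Because accounts come and go, you'll want to regenerate this lookup on a schedule rather than export it once. As a hedged sketch of that refresh job (the directory paths and function names here are assumptions for illustration, not part of Steampipe or Splunk), the same export can be driven from Python:&lt;/p&gt;

```python
import subprocess
from pathlib import Path

# Hypothetical locations; adjust for your environment.
QUERY_DIR = Path("/opt/steampipe/queries")
LOOKUP_DIR = Path("/opt/splunk/etc/system/lookups")

def build_export_command(query_file):
    """Assemble the steampipe invocation that emits one lookup table as CSV on stdout."""
    return ["steampipe", "query", str(query_file), "--output", "csv"]

def refresh_lookup(query_name, csv_name):
    """Run one query and write its CSV output where Splunk search heads read lookups."""
    cmd = build_export_command(QUERY_DIR / query_name)
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    (LOOKUP_DIR / csv_name).write_text(result.stdout)

# e.g. refresh_lookup("accounts.sql", "sp_aws_accounts.csv"), run from cron or a CI job
```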



&lt;p&gt;One of the most useful Splunk lookup tables maps internal or external IP addresses to the cloud resources they belong to. This Steampipe query gets all of the &lt;a href="https://hub.steampipe.io/plugins/turbot/aws/tables/aws_ec2_network_interface" rel="noopener noreferrer"&gt;ENIs&lt;/a&gt; in your environment. It joins that data with your &lt;a href="https://hub.steampipe.io/plugins/turbot/aws/tables/aws_organizations_account" rel="noopener noreferrer"&gt;account&lt;/a&gt; and &lt;a href="https://hub.steampipe.io/plugins/turbot/aws/tables/aws_vpc" rel="noopener noreferrer"&gt;VPC&lt;/a&gt; data to provide account and VPC names. Finally, it populates the &lt;code&gt;attached_resource&lt;/code&gt; column with either the EC2 instance ID or the ENI description to tell you which resource each public or private IP address belongs to.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt;
  &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;network_interface_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;private_ip_address&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;vpc_id&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;vpc_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;region&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;status&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;interface_type&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;association_public_ip&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;public_ip&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="k"&gt;case&lt;/span&gt;
    &lt;span class="k"&gt;when&lt;/span&gt; &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;attached_instance_id&lt;/span&gt; &lt;span class="k"&gt;is&lt;/span&gt; &lt;span class="k"&gt;not&lt;/span&gt; &lt;span class="k"&gt;null&lt;/span&gt;
      &lt;span class="k"&gt;then&lt;/span&gt; &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;attached_instance_id&lt;/span&gt;
    &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;description&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;attached_resource&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;vpc&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tags&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&amp;gt;&lt;/span&gt; &lt;span class="s1"&gt;'Name'&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;vpc_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;org&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;account_name&lt;/span&gt;
&lt;span class="k"&gt;from&lt;/span&gt;
  &lt;span class="n"&gt;aws_ec2_network_interface&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;aws_vpc&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;vpc&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;aws_payer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;aws_organizations_account&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;org&lt;/span&gt;
&lt;span class="k"&gt;where&lt;/span&gt; &lt;span class="n"&gt;vpc&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;vpc_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;vpc_id&lt;/span&gt;
  &lt;span class="k"&gt;and&lt;/span&gt; &lt;span class="n"&gt;org&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;eni&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;account_id&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run &lt;code&gt;steampipe query eni.sql --output csv &amp;gt; sp_eni.csv&lt;/code&gt; to generate a CSV file like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;network_interface_id,private_ip_address,vpc_id,region,status,interface_type,public_ip,attached_resource,vpc_name,account_name
eni-0bb119c9b8271e13c,10.12.15.127,vpc-0baccbf3534d6a80c,us-east-1,in-use,interface,&amp;lt;null&amp;gt;,arn:aws:ecs:us-east-1:747037951011:attachment/2d6f0e6e-4e26-4dea-ae2f-7f0eac27d471,Prod VPC,fooli-memefactory
eni-03fb47e928c58f8c6,10.12.11.158,vpc-0baccbf3534d6a80c,us-east-1,in-use,interface,52.4.203.49,ELB app/prod-FooliApiStack/564122fa59bf64fe,Prod VPC,fooli-memefactory
eni-019de27cb07fe61f2,10.12.11.162,vpc-0baccbf3534d6a80c,us-east-1,in-use,interface,18.212.183.121,i-0a2af645e985e6aed,Prod VPC,fooli-memefactory
eni-0b0a0f4a74f50324b,10.12.15.233,vpc-0baccbf3534d6a80c,us-east-1,in-use,lambda,&amp;lt;null&amp;gt;,AWS Lambda VPC ENI-prod-FooliMailerStack-mailer-13aa6e45-264b-46cd-a91b-17cc79e7a011,Prod VPC,fooli-memefactory
eni-0d55432870ac355fc,10.12.11.119,vpc-0baccbf3534d6a80c,us-east-1,in-use,interface,54.86.186.254,i-0eeacb7ba5bed288a,Prod VPC,fooli-memefactory
eni-0dc96ad6321864543,10.12.11.151,vpc-0baccbf3534d6a80c,us-east-1,in-use,interface,34.231.250.0,RDSNetworkInterface,Prod VPC,fooli-memefactory
eni-06741026048ceb699,10.12.10.106,vpc-0baccbf3534d6a80c,us-east-1,in-use,interface,34.202.5.187,ELB app/prod-FooliApiStack/564122fa59bf64fe,Prod VPC,fooli-memefactory
eni-03831a874fce92713,10.12.10.104,vpc-0baccbf3534d6a80c,us-east-1,in-use,nat_gateway,44.196.140.136,Interface for NAT Gateway nat-0038ea0ba69861382,Prod VPC,fooli-memefactory
eni-0cb11b13ee1ded995,10.10.11.215,vpc-0c320ebb500ab616a,us-east-1,in-use,interface,3.87.77.3,i-08154eb5935852d50,Dev VPC,fooli-dev
eni-0265d9a496ff2b01e,10.10.10.218,vpc-0c320ebb500ab616a,us-east-1,in-use,interface,44.205.90.161,ELB app/dev-FooliApiStack/6fb5790bfca6ea6f,Dev VPC,fooli-dev
eni-074aac46384fe2501,10.10.10.217,vpc-0c320ebb500ab616a,us-east-1,in-use,nat_gateway,18.206.78.232,Interface for NAT Gateway nat-031b431e19aa13518,Dev VPC,fooli-dev
eni-0f45b8f63195340c5,10.10.11.66,vpc-0c320ebb500ab616a,us-east-1,in-use,interface,54.234.182.17,i-0d1bfdfc785de0619,Dev VPC,fooli-dev
eni-0bf128705f3c36d94,10.8.10.56,vpc-0da7e4ff31018985b,us-east-1,in-use,interface,&amp;lt;null&amp;gt;,i-06944d48a6262f7f9,SecurityVPC,fooli-security
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can now use this lookup table with the &lt;code&gt;public_ip&lt;/code&gt;, &lt;code&gt;private_ip_address&lt;/code&gt;, or &lt;code&gt;network_interface_id&lt;/code&gt; field.&lt;/p&gt;
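&lt;p&gt;Whichever field you key on, the lookup is a straight hash join against this CSV. As a sketch of what that matching looks like (column names come from the CSV above; the sample rows are abbreviated), here's the same index built in Python, keyed by every address a row carries:&lt;/p&gt;

```python
import csv
import io

# Two rows in the shape of sp_eni.csv, trimmed to the columns used here.
SP_ENI = """\
network_interface_id,private_ip_address,public_ip,attached_resource
eni-0d55432870ac355fc,10.12.11.119,54.86.186.254,i-0eeacb7ba5bed288a
eni-0bf128705f3c36d94,10.8.10.56,,i-06944d48a6262f7f9
"""

def index_enis(csv_text):
    """Key each row by its ENI id, private IP, and (when present) public IP."""
    index = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        for key in (row["network_interface_id"], row["private_ip_address"], row["public_ip"]):
            if key:  # skip empty public_ip values
                index[key] = row
    return index

enis = index_enis(SP_ENI)
print(enis["54.86.186.254"]["attached_resource"])  # i-0eeacb7ba5bed288a
```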

&lt;p&gt;&lt;a href="https://docs.splunk.com/Documentation/Splunk/9.0.1/Knowledge/Usefieldlookupstoaddinformationtoyourevents" rel="noopener noreferrer"&gt;Upload your file to SplunkWeb&lt;/a&gt; or drop the CSV outputs from Steampipe into &lt;code&gt;/opt/splunk/etc/system/lookups&lt;/code&gt; on your &lt;a href="https://docs.splunk.com/Splexicon:Searchhead" rel="noopener noreferrer"&gt;Splunk search heads&lt;/a&gt;, and you're ready to start using these lookups.&lt;/p&gt;

&lt;h2&gt;
  
  
  How Splunk lookups can simplify your life and help find threats in your environment
&lt;/h2&gt;

&lt;p&gt;Let's start with a simple example: Who is logging into your AWS accounts via the Web Console? Here we can decorate the ConsoleLogin events not just with the user and IP address, but also with the AWS account's &lt;em&gt;name&lt;/em&gt;, which isn't normally available in CloudTrail.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;index="aws_cloudtrail" eventName=ConsoleLogin
| LOOKUP sp_aws_accounts.csv id AS recipientAccountId
  OUTPUT name as account_name
| table account_name, userIdentity.arn, sourceIPAddress
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv1h87kpixicyt6optew4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv1h87kpixicyt6optew4.png" alt="Splunk Search Results - Console Logins" width="800" height="259"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Steampipe-generated lookup tables also work well when you're searching VPC Flow Logs. Here we can cross-reference the Flow Log record's &lt;code&gt;dvc&lt;/code&gt; (device) field with a lookup table of all the network interfaces to get the names of the resources this target IP address was talking to.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;index=aws_flowlogs 3.236.91.34
| LOOKUP sp_eni.csv network_interface_id AS dvc
  OUTPUT attached_resource as attached_resource, vpc_name as VPC, account_name as Account
| stats count by attached_resource, VPC, Account
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjpuz6dxoiuvp54w2chnb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjpuz6dxoiuvp54w2chnb.png" alt="Splunk Search Results - VPCs" width="800" height="260"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;One final example: using ENI data to cross-reference where an EC2 instance role's API calls are coming from:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;index=aws_cloudtrail  "userIdentity.arn"="arn:aws:sts::*:assumed-role/payments_role/*"
| LOOKUP sp_eni.csv public_ip AS sourceIPAddress
  OUTPUTNEW attached_resource as attached_resource
| fillnull attached_resource value="External IP"
| stats count by sourceIPAddress, eventName, attached_resource
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw59okx4wxel10g2y4zqx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw59okx4wxel10g2y4zqx.png" alt="Splunk Search Results - Network Addresses" width="800" height="341"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If a role used by an EC2 Instance is generating CloudTrail events from an IP address that isn't part of your VPCs, that's a major concern that needs to be investigated. Here we see a role that belongs to an EC2 Instance being used from IP addresses that are not tied to known EC2 Instances.&lt;/p&gt;
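&lt;p&gt;The &lt;code&gt;OUTPUTNEW&lt;/code&gt;-then-&lt;code&gt;fillnull&lt;/code&gt; pattern in that search boils down to: use the lookup value when one exists, otherwise fall back to a default that flags the address as external. A small Python sketch of that logic (the IP-to-resource table is an illustrative slice, not real data):&lt;/p&gt;

```python
# Illustrative slice of the sp_eni.csv public_ip to attached_resource mapping.
PUBLIC_IPS = {
    "54.86.186.254": "i-0eeacb7ba5bed288a",
    "18.212.183.121": "i-0a2af645e985e6aed",
}

def attached_resource(source_ip):
    """A lookup match wins; any unknown address is flagged for investigation."""
    return PUBLIC_IPS.get(source_ip, "External IP")

print(attached_resource("54.86.186.254"))  # i-0eeacb7ba5bed288a
print(attached_resource("203.0.113.99"))   # External IP
```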

&lt;h2&gt;
  
  
  Extend your enrichment activities with Steampipe
&lt;/h2&gt;

&lt;p&gt;AWS Accounts and Elastic Network Interfaces are only two examples of event enrichment for your SIEM. If you have a tagging strategy for your EC2 Instances, you can query instance names and contact teams to correlate them with your endpoint detection and response (EDR) alerts.&lt;/p&gt;

&lt;p&gt;Your organization is probably polycloud, and so is Steampipe. Plugins exist for all the major cloud providers; you can easily decorate your Azure activity logs and GCP audit logs.&lt;/p&gt;

&lt;p&gt;The sample queries here were only examples of the power of Steampipe to enrich logging data. If you try this technique, &lt;a href="https://steampipe.io/community/join" rel="noopener noreferrer"&gt;let us know&lt;/a&gt;. We love hearing how practitioners use Steampipe.&lt;/p&gt;

</description>
      <category>opensource</category>
      <category>nextjs</category>
      <category>tailwindcss</category>
      <category>saas</category>
    </item>
    <item>
      <title>Can't miss Security Sessions at re:Invent 2022</title>
      <dc:creator>Chris Farris</dc:creator>
      <pubDate>Mon, 10 Oct 2022 12:35:06 +0000</pubDate>
      <link>https://dev.to/aws-builders/cant-miss-security-sessions-at-reinvent-2022-1agl</link>
      <guid>https://dev.to/aws-builders/cant-miss-security-sessions-at-reinvent-2022-1agl</guid>
      <description>&lt;p&gt;Session registration for re:Invent 2022 opens in a few days, and &lt;a href="https://twitter.com/donkersgood/status/1579032604216791043?s=20&amp;amp;t=5i0BmhYI3vm8GbvWnKxm4Q"&gt;Luc van Donkersgoed&lt;/a&gt; inspired me with &lt;a href="https://bitesizedserverless.com/bite/session-guide-to-reinvent-2022/"&gt;his post&lt;/a&gt; to highlight what you can't miss in the Security, Compliance, and other cool topics.&lt;/p&gt;

&lt;p&gt;The session planner doesn't let me deep-link, so I'll be including the session numbers. I won't include time and place since I expect those to change. The description from the session catalog is included.&lt;/p&gt;

&lt;h2&gt;
  
  
  Can't miss security sessions
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;SEC401 - AWS Identity and Access Management (IAM) policy evaluation in action&lt;/strong&gt; (Workshop)&lt;br&gt;&lt;br&gt;
One of the most &lt;a href="https://ermetic.com/blog/aws/diving-deeply-into-iam-policy-evaluation-highlights-from-aws-reinforce-session-iam433/"&gt;popular and talked about chalk talks&lt;/a&gt; from re:Inforce was Matt Luttrell and Dan Peebles diving deep into policy evaluation. They provided new diagrams and animations to explain how SCPs, permission boundaries, and resource policies all interact with identity-based policies, conditions, and different effects. Matt is going to reprise and extend this work into a two-hour workshop. This session is a must if you're fighting SCPs, Permission Boundaries, and Resource Policies.&lt;/p&gt;

&lt;blockquote&gt;In this workshop, dive deep into the logic of AWS Identity and Access Management (IAM) policy evaluation. Gain experience with hands-on labs that walk through IAM use cases and learn how different policies interact with each other. Using identity- and resource-based policies within single- and cross-account scenarios, learn about the evaluation logic that you can apply in your own environment. You must bring your laptop to participate.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;SEC402 - The anatomy of a ransomware event targeting data residing in Amazon S3&lt;/strong&gt; (Chalk Talk)&lt;br&gt;&lt;br&gt;
This is a repeat of the chalk talk given at re:Inforce and was one of the best sessions from that event. Kyle works on the &lt;a href="https://aws.amazon.com/blogs/security/welcoming-the-aws-customer-incident-response-team/"&gt;Customer Incident Response Team&lt;/a&gt; and will dive into real-world cases that AWS has seen in the field of Ransomware attacks on customers. You'll come away from this session with action items based on actual threat intel.&lt;/p&gt;

&lt;blockquote&gt;Ransomware events can cost governments, nonprofits, and businesses billions of dollars and interrupt operations. Early detection and automated responses are important steps that can limit your organization’s exposure. In this chalk talk, walk through the anatomy of a ransomware event that targets data residing in Amazon S3 and hear detailed best practices for detection, response, recovery, and protection.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;BOA204 - When security, safety, and urgency all matter: Handling Log4Shell&lt;/strong&gt; (Breakout Session)&lt;br&gt;&lt;br&gt;
The indomitable Abby Fuller will talk through how AWS &lt;em&gt;internally&lt;/em&gt; handled Log4Shell. While how a $1.6T company handled Log4Shell and how &lt;em&gt;you&lt;/em&gt; handled Log4Shell are not really comparable, I think this session will have a lot to offer for the next time something like this hits. It's also in that quiet, sweet spot of Friday morning when nothing else is happening and everyone is wrapping up.&lt;/p&gt;

&lt;blockquote&gt;On December 9, 2021, there was a report of a potential remote code execution issue in the widely used open-source Apache logging library Log4j. This issue allowed a user to use Java Naming and Directory Interface (JNDI) and LDAP endpoints to execute arbitrary code on a system. Over the next 10 days, 5 additional common vulnerabilities and exposures affecting Log4j were made public. This event is now referred to as Log4Shell. In this session, learn about the response to Log4Shell, from initial notification to hot patch, fleet scanning, and customer communications.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;COP305 - Best practices for organizing and operating on AWS&lt;/strong&gt; (Breakout Session)&lt;br&gt;&lt;br&gt;
Full Disclosure: The customer speaker here is my former boss, and she's talking about the AWS Organizations governance work my old team and I did while I was at Discovery. Learn how we tamed 200+ AWS accounts and balanced the needs of security, finance, and the whims of half a dozen different business units.&lt;/p&gt;

&lt;blockquote&gt;Managing and operating cloud environments from multiple business units can be challenging. In this session, hear from Bianca Lankford, Senior Director/Global Head of Cloud Engineering &amp;amp; Governance from Warner Brothers Discovery, how they organized their cloud environment to allow teams to develop with agility while being able to manage and operate their applications in a secure, automated, reliable, and cost-effective way. See how you can use AWS Organizations and AWS Systems Manager to operate your applications at scale, manage mergers and acquisitions, and develop governance as a product for your environment.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;SEC206 - Security operations metrics that matter&lt;/strong&gt; (Chalk Talk)&lt;br&gt;&lt;br&gt;
One of the things I never finished at my last job was building out cloud security key risk indicators (KRIs) for my program. Anna was one of my partners in crime for my &lt;a href="https://d1.awsstatic.com/events/reinvent/2021/Adversary_emulation_for_incidentresponse_readiness_REPEAT_SEC309-R2.pdf"&gt;adversary emulation chalk talk&lt;/a&gt; at last year's re:Invent, so I'm keen to hear what she's built for her customers.&lt;/p&gt;

&lt;blockquote&gt;Security tooling can produce thousands of security findings to act on. But what are the most important items and metrics to focus on? In this chalk talk, learn about a framework you can use to develop and implement security operations metrics in order to prioritize the highest-risk issues across your AWS environment. This includes applying critical business context and capturing the metrics across your multi-account environment so you can take action with confidence.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;SEC330 - Harness the power of temporary credentials with IAM Roles Anywhere&lt;/strong&gt; (Chalk Talk)&lt;br&gt;&lt;br&gt;
I'll be honest. I have not yet kicked the tires on IAM Roles Anywhere. This service promises to eliminate the need for IAM Users, but the key management issues make me wonder how many enterprises will do it right. This is a chalk talk, so I hope for some lively discussion and to discover some of the risks and edge cases with this new service.&lt;/p&gt;

&lt;blockquote&gt;In this chalk talk, get an introduction to AWS Identity and Access Management (IAM) Roles Anywhere, and dive deep into how you can use IAM Roles Anywhere to access AWS services from outside of AWS. Learn how IAM Roles Anywhere securely delivers temporary AWS credentials to your workloads. Discover the different use cases that IAM Roles Anywhere is designed to address as well as best practices for using it.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;COP323 - Delegating access in a multi-account environment with IAM Identity Center&lt;/strong&gt; (Chalk Talk)&lt;br&gt;&lt;br&gt;
AWS SSO (I refuse to use the new name) has changed a lot since I first deployed it for the &lt;a href="https://www.chrisfarris.com/post/seccdc2022/"&gt;SECCDC&lt;/a&gt;. We started using it at my last job, and I want to see the additional capabilities they introduced.&lt;/p&gt;

&lt;blockquote&gt;In this chalk talk, learn about delegating access management with AWS Organizations and AWS Control Tower using AWS IAM Identity Center. Using customer-managed policies and permissions boundaries, you can enable a decentralized access management model with permissions guardrails that enforce coarse-grained authorization standards that apply in both role-based and attribute-based access control (RBAC and ABAC) models.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;MKT304 - Faster vendor risk assessments with AWS Marketplace Vendor Insights&lt;/strong&gt; (Chalk Talk)&lt;br&gt;&lt;br&gt;
One of the more under-appreciated announcements from AWS re:Inforce was AWS Marketplace Vendor Insights, or as I thought of it at the time &lt;em&gt;Simplify Vendor Risk Management as a Service&lt;/em&gt;. I'm hoping this chalk talk will help me understand the capabilities of this new service and how security practitioners can use it to make their lives easier.&lt;/p&gt;

&lt;blockquote&gt;Validating the compliance and security posture for third-party SaaS products is often a complex process. In this chalk talk, explore how AWS Marketplace Vendor Insights simplifies the SaaS risk assessment process to help enterprises procure trusted software. Built upon AWS Config, AWS Audit Manager, and AWS Artifact, Vendor Insights streamlines the process by providing on-demand access to security and compliance information via a web-based dashboard. Learn how vendors can provide customers with on-demand access to security and compliance information to showcase security and compliance excellence.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;SEC321 - Building your forensics capabilities on AWS&lt;/strong&gt; (Chalk Talk)&lt;br&gt;&lt;br&gt;
I've spent a lot of time recently giving talks and classes on &lt;a href="https://www.chrisfarris.com/post/aws-ir/"&gt;Incident Response in AWS&lt;/a&gt;, and EC2 Forensics is still a topic that lacks a definitive way of doing it. I look forward to Jonathon Poling's discussion of the topic in this chalk talk.&lt;/p&gt;

&lt;blockquote&gt;You have a compromised resource on AWS. How do you acquire evidence and artifacts? Where do you transfer the data, and how do you store it? How do you analyze it safely within an isolated environment? Join this chalk talk to walk through building a forensics lab on AWS, methods for implementing effective data acquisition and analysis, and how to make sure you are getting the most out of your investigations. Learn how to identify the tools and capabilities you need to effectively analyze it, as well as how to maintain least-privilege access to the evidence. Finally, walk through how to learn from your analysis and investigation to improve your security.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;SEC202 - Vulnerability management with Amazon Inspector and AWS Systems Manager&lt;/strong&gt; (Builder's Session)&lt;br&gt;&lt;br&gt;
Vulnerability Management sucks. I hate it. I'd rather farm alpaca than work in the VM space as it's implemented in most enterprises. That said, I've got ideas on how to improve it for cloud workloads (spoiler alert: that was one of the reasons I went to my current job). I've not done a builder's session before, and it's only a 200-level session, but I know one of the builders, so I'll be signing up for this one.&lt;/p&gt;

&lt;blockquote&gt;Join this builders’ session to learn how to use Amazon Inspector and AWS Systems Manager Patch Manager to scan and patch software vulnerabilities on Amazon EC2 instances. Walk through how to understand, prioritize, suppress, and patch vulnerabilities using AWS security services. You must bring your laptop to participate.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;SEC208 - Executive security simulation&lt;/strong&gt; (Workshop)&lt;br&gt;&lt;br&gt;
I've not done much with tabletop exercises, and I've not yet done one focused on Cloud Security, so I think this one is a better use of my time than cheap beer and big crowds on the expo floor.&lt;/p&gt;

&lt;blockquote&gt;This workshop features an executive security simulation, designed to take senior security management and IT or business executive teams through an experiential exercise that illuminates key decision points for a successful and secure cloud journey. During this team-based, game-like simulation, use an industry case study to make strategic security, risk, and compliance decisions and investments. Experience the impact of these investments and decisions on the critical aspects of your secure cloud adoption. Learn about the major success factors that impact security, risk, and compliance in the cloud and applicable decision and investment approaches to specific secure cloud adoption journeys. You must bring your laptop to participate.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;SEC329 - AWS security services for container threat detection&lt;/strong&gt; (Breakout Session)&lt;br&gt;&lt;br&gt;
I will admit I'm not an expert on container security. My personal development path jumped from EC2 straight to Lambda functions. So, I'm interested in a deeper dive into what my former colleague Mrunal did with the topic.&lt;/p&gt;

&lt;blockquote&gt;Containers are a cornerstone of many AWS customers’ application modernization strategies. The increased dependence on containers in production environments requires threat detection that is designed for container workloads. To help meet the container security and visibility needs of security and DevOps teams, new container-specific security capabilities have recently been added to Amazon GuardDuty, Amazon Inspector, and Amazon Detective. In this session, learn about these new capabilities and the deployment and operationalization best practices that can help you scale your AWS container workloads. Additionally, the head of cloud security at HBO Max shares container security monitoring best practices.&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Sessions highlighting the massive awesomeness of AWS
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;SEC404 - A day in the life of a billion requests&lt;/strong&gt; (Breakout Session)&lt;br&gt;&lt;br&gt;
Eric Brandwine will dive into the scale and operational considerations of IAM (ignore the ironically bad session number). Let's face it: there is nothing actionable here for you. This is just a chance to nerd-out on a cool topic.&lt;/p&gt;

&lt;blockquote&gt;Every day, sites around the world authenticate their callers. That is, they verify cryptographically that the requests are actually coming from who they claim to come from. In this session, learn about unique AWS requirements for scale and security that have led to some interesting and innovative solutions to this need. How did solutions evolve as AWS scaled multiple orders of magnitude and spread into many AWS Regions around the globe? Hear about some of the recent enhancements that have been launched to support new AWS features, and walk through some of the mechanisms that help ensure that AWS systems operate with minimal privileges.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;SEC327 - Zero-privilege operations: Running services without access to data&lt;/strong&gt; (Breakout Session)&lt;br&gt;&lt;br&gt;
If you recall the &lt;a href="https://en.wikipedia.org/wiki/2020_Twitter_account_hijacking"&gt;Twitter compromise in 2020&lt;/a&gt; or the more recent &lt;a href="https://www.okta.com/blog/2022/04/okta-concludes-its-investigation-into-the-january-2022-compromise/"&gt;Okta incident&lt;/a&gt;, both had the same thing in common: employees with high levels of access to production systems and customer data. In this session, &lt;a href="https://twitter.com/colmmacc"&gt;Colm MacCarthaigh&lt;/a&gt; will discuss how AWS keeps its employees out of your data. Who knows, I may even change my opinion on &lt;a href="https://www.chrisfarris.com/post/cloud-encryption/"&gt;Encryption in AWS&lt;/a&gt;.&lt;/p&gt;

&lt;blockquote&gt;AWS works with organizations and regulators to host some of the most sensitive workloads in industry and government. In this session, learn how AWS secures data, even from trusted AWS operators and services. Explore the AWS Nitro System and how it provides confidential computing and a trusted runtime environment, and dive deep into the cryptographic chains of custody that are built into AWS Identity and Access Management (IAM). Finally, hear how encryption is used to provide defense in depth and why we focus on verified isolation and customer transparency at AWS.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;ARC310 - Beyond five 9s: Lessons from our highest available data planes&lt;/strong&gt; (Breakout Session)&lt;br&gt;&lt;br&gt;
Another session from Colm MacCarthaigh on the scale and reliability of AWS Services. If I can't make this in-person, it will be at the top of my viewing list when I get home.&lt;/p&gt;

&lt;blockquote&gt;Updated with recent learning, this session dives deep into building and improvising resilience in AWS services. Every AWS service is designed to be highly available, but a small number of what are called Tier 0 services get extra-special attention. In this session, hear lessons from how AWS has built and architected Amazon Route 53 and the AWS authentication system to help them survive cataclysmic failures, enormous load increases, and more. Learn about the AWS approach to redundancy and resilience at the infrastructure, software, and team levels and how the teams tasked with keeping the internet running manage themselves and keep up with the pace of change that AWS customers demand.&lt;/blockquote&gt;

&lt;h2&gt;Other Sessions I'm going to&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;STG215 - Unlocking business value in media and entertainment with Amazon S3&lt;/strong&gt; (Breakout Session)&lt;br&gt;&lt;br&gt;
2016 was the year of the spreadsheet for me. I was designing the financial model for moving the (then 14PB) CNN Library into AWS S3. &lt;a href="https://aws.amazon.com/about-aws/whats-new/2016/11/access-your-amazon-glacier-data-in-minutes-with-new-retrieval-options/"&gt;Glacier Expedited Retrieval&lt;/a&gt; wasn't a thing yet, so it was lots of "how can we use S3 IA?". I'm excited to see how this project has moved forward, with former colleague Jay Brown talking about how they're using S3 today.&lt;/p&gt;

&lt;blockquote&gt;Media and entertainment companies are creating more content than ever to engage audiences and grow revenue, but many are overlooking the hidden value of content locked in their media archives. The proliferation of screens connected to the internet provides customers with more ways of consuming content whenever and wherever. Between the influx of choice and ubiquitous connectivity, storing media content is challenged to keep up with not only growth in storage but also capabilities needed to support this multiscreen world. In this session, Amazon S3 customers Warner Bros., CNN, and PGA Tour share how migrating media archives from on-premises systems to the cloud can unlock business value for your organization.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;DOP402 - Implementing DevSecOps pipelines with compliance in mind&lt;/strong&gt; (Chalk Talk)&lt;br&gt;&lt;br&gt;
I'll admit I don't know enough about CI/CD, and I don't practice what the cloud security community preaches in this regard. I can't pass up a 400-level chalk talk on a subject I need to do more with.&lt;/p&gt;

&lt;blockquote&gt;In this chalk talk, review a DevSecOps CI/CD pipeline that includes software composition analysis, static application security testing, and dynamic application security testing. Also learn about best practices for incorporating security checkpoints across various pipeline stages and aggregating vulnerability findings into a single pane of glass. Finally, hear about processes and tools that can increase an organization’s ability to deliver applications and services in a secure manner.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;COP325 - Migrate AWS accounts like an expert&lt;/strong&gt; (Chalk Talk)&lt;br&gt;&lt;br&gt;
While this isn't my job anymore, I spent a lot of time thinking through this process, and I'm curious about what AWS will suggest here.&lt;/p&gt;

&lt;blockquote&gt;This chalk talk is for you if you’ve ever had to migrate an AWS account from one organization to another during a merger and acquisition activity, while migrating to AWS Control Tower, or just while organizing your AWS environment according to best practices. In this talk, learn how to identify dependencies in your current organization that you can proactively address before the migration. The talk covers code for detecting resource policies and identity policies with dependencies. Walk through additional checks that can help you achieve a quick and efficient migration.&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;STG404 - Explore Amazon EBS direct APIs with flexible snapshot proxy&lt;/strong&gt; (Chalk Talk)&lt;br&gt;&lt;br&gt;
Until recently, the only way to get EBS data out of the cloud was to mount the volume to an instance and run &lt;code&gt;dd | ssh&lt;/code&gt;. The EBS direct APIs are new to me, and there are some interesting security implications to these capabilities that I want to know more about.&lt;/p&gt;

&lt;blockquote&gt;Amazon EBS snapshots are a feature-rich data protection function used by enterprises for block-level data. Join this chalk talk to learn how flexible snapshot proxy, an open-source project that uses Amazon EBS direct APIs, can enable you to efficiently move data between applications in a cross-Region, cross-organization, logically air-gapped replication scenario without temporary copies. Understand how a block of data moves through systems, services, and geographies. Also learn how to eliminate temporary copies, reduce transfer costs, improve RTO/RPO, and integrate your on-premises applications and systems with Amazon EBS. This talk dives into field-proven architectural patterns for building global-scale real-world solutions.&lt;/blockquote&gt;
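&lt;p&gt;For contrast with the old &lt;code&gt;dd | ssh&lt;/code&gt; approach: the EBS direct APIs serve a snapshot as fixed-size 512 KiB blocks over HTTPS, no attached volume required. As a rough sketch (the snapshot ID below is made up, and the caller would need the &lt;code&gt;ebs:ListSnapshotBlocks&lt;/code&gt; and &lt;code&gt;ebs:GetSnapshotBlock&lt;/code&gt; permissions), reconstructing a raw image with boto3 looks something like this:&lt;/p&gt;

```python
# Sketch: pulling snapshot data with the EBS direct APIs via boto3.
# EBS direct APIs serve snapshots as fixed-size blocks (512 KiB).
BLOCK_SIZE = 512 * 1024


def block_offset(block_index: int, block_size: int = BLOCK_SIZE) -> int:
    """Byte offset of a block within the reconstructed volume image."""
    return block_index * block_size


def dump_snapshot(snapshot_id: str, out_path: str) -> None:
    """Stream every allocated block of an EBS snapshot into a raw image file."""
    import boto3  # deferred import; needs AWS credentials at call time

    ebs = boto3.client("ebs")
    next_token = None
    with open(out_path, "wb") as img:
        while True:
            kwargs = {"SnapshotId": snapshot_id}
            if next_token:
                kwargs["NextToken"] = next_token
            page = ebs.list_snapshot_blocks(**kwargs)
            for block in page["Blocks"]:
                resp = ebs.get_snapshot_block(
                    SnapshotId=snapshot_id,
                    BlockIndex=block["BlockIndex"],
                    BlockToken=block["BlockToken"],
                )
                # Unallocated (sparse) blocks simply aren't listed, so seek
                # to each block's offset rather than writing sequentially.
                img.seek(block_offset(block["BlockIndex"], page["BlockSize"]))
                img.write(resp["BlockData"].read())
            next_token = page.get("NextToken")
            if not next_token:
                break
```

&lt;p&gt;Usage would be something like &lt;code&gt;dump_snapshot("snap-0123456789abcdef0", "evidence.img")&lt;/code&gt; — hypothetical snapshot ID. The forensic upside (and the security implication) is that anyone with those two API permissions can read snapshot contents without ever touching an instance.&lt;/p&gt;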

&lt;p&gt;&lt;strong&gt;re:Play&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
I joke that my annual pilgrimage to re:Invent is my &lt;em&gt;"Cloud Nerd Rave in the Desert"&lt;/em&gt;. Thursday night's re:Play is that rave. Even if you hate crowds and loud music, you should attend once to see the spectacle and realize that Frugality is a selective &lt;a href="https://www.amazon.jobs/en/principles"&gt;leadership principle&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;Open Source Zone and Steampipe&lt;/h2&gt;

&lt;p&gt;I didn't pitch a talk for re:Invent this year, but I will be presenting. Come by the Open Source Zone (third floor of the Venetian, near San Polo and the Press area) on Tuesday from 1pm to 3pm, and I'll be demoing our open source tool &lt;a href="https://steampipe.io/"&gt;Steampipe&lt;/a&gt; and a number of the nifty things it can do to help manage your cloud sprawl and reduce risk in your organization.&lt;/p&gt;

&lt;h2&gt;Pre-registration&lt;/h2&gt;

&lt;p&gt;Pre-registration opens on October 11th at 1pm EDT for a guaranteed seat in these sessions. I'm told that 25% of capacity is reserved for walk-ups, and in the past there have been a number of no-shows. I've also been denied entrance to a colleague's presentation because they ran out of open seats, and AWS policy doesn't allow anyone to stand in the back. So may the odds be ever in your favor.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloudsecurity</category>
    </item>
  </channel>
</rss>
