<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Pedram Hamidehkhan</title>
    <description>The latest articles on DEV Community by Pedram Hamidehkhan (@pedramha).</description>
    <link>https://dev.to/pedramha</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F692693%2F6db6e4c0-a294-4b41-ad2d-d3d681c82b0e.jpeg</url>
      <title>DEV Community: Pedram Hamidehkhan</title>
      <link>https://dev.to/pedramha</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/pedramha"/>
    <language>en</language>
    <item>
      <title>Wing It: Cloud CRUD with Winglang</title>
      <dc:creator>Pedram Hamidehkhan</dc:creator>
      <pubDate>Wed, 26 Nov 2025 20:55:13 +0000</pubDate>
      <link>https://dev.to/pedramha/wing-it-cloud-crud-with-winglang-1c0i</link>
      <guid>https://dev.to/pedramha/wing-it-cloud-crud-with-winglang-1c0i</guid>
      <description>&lt;p&gt;When it comes to infrastructure as code, tools like Terraform, AWS CDK, and Pulumi immediately come to mind. Although these tools include testing capabilities, they often lack a fully integrated local simulation environment that accurately mimics cloud behavior. This gap can lead to longer, more expensive iteration cycles during development. Winglang addresses this challenge by providing a local simulator that mimics cloud behavior, enabling developers to test their code locally before deploying it to the cloud. Additionally, its simple and intuitive syntax and concept of preflight (static infrastructure code) and inflight (the application logic) code makes it easy for developers to work with cloud resources, regardless of their level of experience. In this post, we'll build a CRUD API using Winglang, deploy it to AWS, and build a GitOps workflow for the deployment and explore some of Winglang's key capabilities.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites:&lt;/strong&gt; Docker, Node.js, Terraform, AWS account&lt;/p&gt;

&lt;p&gt;This post is organized into the following sections:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Introduction to Winglang&lt;/li&gt;
&lt;li&gt;Creating a CRUD API with Winglang&lt;/li&gt;
&lt;li&gt;Deploying the API to AWS&lt;/li&gt;
&lt;li&gt;Enabling a GitOps Workflow&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ul&gt;


&lt;h2&gt;
  
  
  Introduction to Winglang
&lt;/h2&gt;

&lt;p&gt;Winglang is a high-level language that abstracts away the complexity of cloud infrastructure so you can focus on application logic. It unifies resource provisioning (preflight code) with dynamic operations (inflight code):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Preflight:&lt;/strong&gt; Code that runs at compile time to set up static resources (e.g., VPCs, subnets).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Inflight:&lt;/strong&gt; Code that runs at runtime to handle dynamic operations, executing on cloud compute such as AWS Lambda or ECS containers.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Even though Winglang makes cloud development approachable for non-experts and folks with little to no infra exposure, I think its main superpower is its local simulator. The simulator stands up every resource in your code and lets you interact with them as if they were already deployed. For example, you can make API calls to verify that data gets persisted in your database or that a notification is triggered when a certain event occurs in the system. This can bring meaningful cost savings in the DevOps cycle as we shift testing to the left.&lt;/p&gt;
&lt;h2&gt;
  
  
  Creating a CRUD API with Winglang
&lt;/h2&gt;

&lt;p&gt;In this section, we will create a simple CRUD API using Winglang. First, ensure Winglang is installed on your machine by running the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-g&lt;/span&gt; Winglang
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, we can “import” the modules we want to interact with—or as Winglang expresses it, “bring” the modules we need:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;bring cloud;    // this will bring all the cloud modules
bring dynamodb; // for database
bring http;     // for testing our APIs in the local simulator
bring expect;   // for testing
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I'd like to persist the data in a DynamoDB table, but you can of course opt for a bucket or any other DB choice you are more comfortable with. Here is the code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;messagesTable&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nx"&gt;dynamodb&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Table&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="nx"&gt;attributes&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;id&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;S&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="nx"&gt;hashKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;id&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Notice that this code is considered preflight: it is static and runs at compile time to define the table, and the table itself is provisioned when the application is deployed. &lt;/p&gt;

&lt;p&gt;Next I am going to ask Wing to create an API for me that will interact with the table:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let api = new cloud.Api();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then I'd like to also create a counter, which generates unique IDs for the items being saved in DynamoDB:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let messageCounter = new cloud.Counter();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is again preflight code and static in nature. Now let's see how inflight code operates by creating a POST method on our API which persists data to our database.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;api&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;/messages&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nf"&gt;inflight &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;request&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;newMessage&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;Json&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;messageCounter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;inc&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="na"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;request&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;
  &lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;New message: {Json.stringify(newMessage)}&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;recordId&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;messageCounter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;inc&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="nx"&gt;messagesTable&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="na"&gt;Item&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;Json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;recordId&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
      &lt;span class="na"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;newMessage&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;status&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Content-Type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;text/html&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Access-Control-Allow-Origin&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;*&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;Json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;newMessage&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you can see, contrary to the previous objects, we annotated this one with the &lt;code&gt;inflight&lt;/code&gt; keyword, which means the code executes at runtime, in the cloud or in the simulator. &lt;/p&gt;

&lt;p&gt;Before deploying I'd like to add one final step, which is testing our API. To that end I added the following tests:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;validateResponse&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;inflight&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;http&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;Response&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;expectedContent&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Response status: {response.status}&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nx"&gt;expect&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;equal&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nf"&gt;assert&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nf"&gt;contains&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;expectedContent&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="nx"&gt;test&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;messages API returns correct response&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;http&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;{api.url}/messages&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;xyz&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Response body: {response.body}&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="nf"&gt;assert&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nf"&gt;contains&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;xyz&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now run the following command to start the local simulator (Docker must be running):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;wing it 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should now see the simulator, as depicted in the image below:&lt;br&gt;
&lt;a href="/images/winglang/localsim.png" class="article-body-image-wrapper"&gt;&lt;img src="/images/winglang/localsim.png" alt="Winglang simulator"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The simulator lets you exercise many aspects of your code to ensure it works as expected: you can make API calls, check the logs, and even interact with the database. This is a very powerful tool for validating your code before deploying it to the cloud.&lt;/p&gt;
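To make this concrete, here is a hedged sketch of exercising the running simulator from a terminal. The endpoint URL below is a placeholder: copy the actual URL the Wing Console prints for the cloud.Api resource.

```shell
# Placeholder endpoint; replace with the URL shown in the Wing Console
API_URL="${WING_API_URL:-http://localhost:3000}"

# POST a message and print whatever the handler echoes back
if command -v curl >/dev/null; then
  curl -s -X POST "$API_URL/messages" -d 'hello from the simulator' || true
fi
echo "tried $API_URL/messages"
```

A successful call should return the JSON body that the POST handler builds, and the new item should then be visible in the simulator's DynamoDB view.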
&lt;h2&gt;
  
  
  Deploying the API to AWS
&lt;/h2&gt;

&lt;p&gt;After successfully testing the code in the local simulator, we can deploy it to AWS. To do this, we compile the code for the platform of our choice. In this case I opted for Terraform, but AWS CDK should also be supported. Here is the command to compile the code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;wing compile &lt;span class="nt"&gt;-t&lt;/span&gt; tf-aws main.w
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command generates the Terraform configuration needed to deploy to AWS. You can navigate to &lt;code&gt;target/main.tfaws&lt;/code&gt; and examine the generated files. At this point you can run the following Terraform commands to deploy the changes to AWS:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;terraform init
terraform apply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Since this creates a local state file and therefore does not enable a GitOps workflow, I'd like to show you how to enable one with Winglang.&lt;/p&gt;

&lt;h2&gt;
  
  
  Enabling a GitOps Workflow
&lt;/h2&gt;

&lt;p&gt;To enable a GitOps workflow with Winglang, we need one addition: a custom platform, stored as a JavaScript file in the root of the project, that points Terraform at a remote backend. Here is the code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;exports&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;Platform&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kd"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;TFBackend&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nf"&gt;postSynth&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;config&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;config&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;terraform&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;backend&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;s3&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;bucket&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;TF_BACKEND_BUCKET&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;          
            &lt;span class="na"&gt;region&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;TF_BACKEND_REGION&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;          
            &lt;span class="na"&gt;key&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;state/terraform.tfstate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;dynamodb_table&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;TF_BACKEND_TABLE&lt;/span&gt;    
        &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;};&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;config&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This ensures that the Terraform state is stored in an S3 bucket and that a DynamoDB table is used to lock the state, which is what makes a GitOps workflow with Winglang possible. The &lt;code&gt;script.sh&lt;/code&gt; file in the repository should help with the creation of the S3 and DynamoDB resources.&lt;br&gt;
To improve things a bit further, we can add a GitHub Action that runs the Winglang commands and deploys the code to AWS. Here is a &lt;a href="https://github.com/pedramha/wing-crud-backend/blob/main/.github/workflows/deploy.yml" rel="noopener noreferrer"&gt;link&lt;/a&gt; to the GitHub Action file.&lt;br&gt;
Before everything falls into place, we need to create a GitHub repository and push the code to it. After that, we need to create repository secrets for the AWS credentials as well as the Terraform backend configuration. The recommended approach is to use &lt;a href="https://docs.github.com/en/actions/security-for-github-actions/security-hardening-your-deployments/configuring-openid-connect-in-amazon-web-services" rel="noopener noreferrer"&gt;GitHub OIDC&lt;/a&gt; to authenticate with AWS, but for simplicity I created a temporary &lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/cli-authentication-user.html" rel="noopener noreferrer"&gt;IAM user&lt;/a&gt; with minimal permissions and generated an access key for it.&lt;br&gt;
You can add these as secrets and variables in the repository settings. The required ones are:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;AWS_ACCESS_KEY_ID
AWS_REGION
AWS_SECRET_ACCESS_KEY
TF_BACKEND_BUCKET: [Terraform] the name of the S3 bucket
TF_BACKEND_REGION: [Terraform] the region of the S3 bucket
TF_BACKEND_TABLE: [Terraform] the name of the DynamoDB table
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
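If you prefer not to use the repository's script.sh, the backend resources can also be created with the AWS CLI. This is a minimal sketch with hypothetical names (bucket names must be globally unique), assuming your AWS credentials are already configured:

```shell
# Hypothetical names for the state bucket and lock table
TF_BACKEND_BUCKET="wing-crud-tf-state-example"
TF_BACKEND_TABLE="wing-crud-tf-locks"
TF_BACKEND_REGION="eu-central-1"

if command -v aws >/dev/null; then
  # State bucket; versioning makes it possible to roll back a bad state
  aws s3api create-bucket --bucket "$TF_BACKEND_BUCKET" \
    --region "$TF_BACKEND_REGION" \
    --create-bucket-configuration "LocationConstraint=$TF_BACKEND_REGION"
  aws s3api put-bucket-versioning --bucket "$TF_BACKEND_BUCKET" \
    --versioning-configuration Status=Enabled

  # Lock table: Terraform's S3 backend expects a string hash key named LockID
  aws dynamodb create-table --table-name "$TF_BACKEND_TABLE" \
    --attribute-definitions AttributeName=LockID,AttributeType=S \
    --key-schema AttributeName=LockID,KeyType=HASH \
    --billing-mode PAY_PER_REQUEST
fi
```

The bucket and table names you pick here are the values to store as TF_BACKEND_BUCKET and TF_BACKEND_TABLE in the repository settings.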

&lt;p&gt;Now you can push the code to the repository and the GitHub Action should run and deploy the code to AWS. You can check the status of the action in the Actions tab in the repository.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this post, we built a simple CRUD API using Winglang, deployed it to AWS, and enabled a GitOps workflow for the deployment. Winglang's simple, intuitive syntax makes it easy to work with cloud resources, and its local simulator lets developers test their code before deploying it to the cloud. I hope this post has been helpful and that you have learned something new. If you have any questions or comments, please feel free to leave them below. Thank you for reading!&lt;/p&gt;

</description>
      <category>cdk</category>
      <category>iac</category>
      <category>terraform</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Collaborative Infrastructure as Code using Terraform Cloud - Publishing Modules Private Registries</title>
      <dc:creator>Pedram Hamidehkhan</dc:creator>
      <pubDate>Wed, 22 Jun 2022 16:47:08 +0000</pubDate>
      <link>https://dev.to/pedramha/collaborative-infrastructure-as-code-using-terraform-cloud-publishing-modules-private-registries-3am3</link>
      <guid>https://dev.to/pedramha/collaborative-infrastructure-as-code-using-terraform-cloud-publishing-modules-private-registries-3am3</guid>
      <description>&lt;p&gt;I'd like to start this post with a great quote from The DevOps Handbook which says: "When new learnings are discovered locally, there must also be some mechanism&lt;br&gt;
to enable the rest of the organization to use and benefit from that knowledge." Infrastructure as Code is not a new concept and it is already used in many projects. However, collaborating on IaC has not always been easy for developers. Many organization are struggling with sharing the knowledge and expertise of their developers with other teams. This is a difficult problem that private registries Terraform Cloud/Enterprise is aiming to solve.&lt;/p&gt;

&lt;p&gt;In a previous post, we had a go at solving cold starts for Lambda functions. Now we want to help the next person on our team avoid reinventing the wheel.&lt;/p&gt;

&lt;p&gt;This tutorial assumes you already have a Terraform Cloud account. If you haven't opened an account yet, please use the link below to open one.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.hashicorp.com/products/terraform"&gt;https://cloud.hashicorp.com/products/terraform&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So how do you go about creating a private registry? &lt;/p&gt;

&lt;p&gt;We begin by creating the module that you'd like to publish. I am using &lt;a href="https://github.com/pedramha/terraform-aws-lambda"&gt;this&lt;/a&gt; module. I personally prefer adding variables with smart defaults as much as I can. The consumer of this module will also need some of its outputs in their application, e.g. the ARN of the Lambda function, so don't forget to declare them in the outputs section.&lt;/p&gt;

&lt;p&gt;After creating the module, we are ready to push it to GitHub. Please be aware of the naming convention for your module, which follows the "terraform-PROVIDER-NAME" pattern (e.g. "terraform-aws-lambda").&lt;br&gt;
Before we jump to Terraform Cloud, we need to create a release for our project in GitHub. Follow &lt;a href="https://docs.github.com/en/repositories/releasing-projects-on-github/managing-releases-in-a-repository"&gt;this guide&lt;/a&gt; to build the release, or: click on "Releases" in the right pane of your repository, click on "Draft a new release", choose a tag for the release (it should follow the x.y.z versioning pattern), give it a title, and click on "Publish release". Then we are good to go.&lt;/p&gt;
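As a quick sanity check, the naming convention and the release tag can be sketched like this (the repository name and tag here are examples, not values from the post):

```shell
# The private registry only accepts repos named terraform-PROVIDER-NAME
REPO_NAME="terraform-aws-lambda"
case "$REPO_NAME" in
  terraform-*-*) echo "registry-compatible name: $REPO_NAME" ;;
  *)             echo "rename the repo to terraform-PROVIDER-NAME" ;;
esac

# Releases are discovered from semver (x.y.z) tags, e.g.:
#   git tag 1.0.0
#   git push origin 1.0.0
```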

&lt;p&gt;Now jump to Terraform Cloud, click on "Registry", then "Publish", and choose "Module". If you don't see your VCS provider, you need to connect it with Terraform Cloud before proceeding; please visit &lt;a href="https://learn.hashicorp.com/tutorials/terraform/github-oauth?in=terraform/cloud"&gt;this link&lt;/a&gt; to do so.&lt;/p&gt;

&lt;p&gt;Choose your repository, and click on Publish module! &lt;/p&gt;

&lt;p&gt;Congratulations on publishing your first Terraform Module.&lt;/p&gt;

&lt;p&gt;In the next post, I will show you how to consume this module and integrate it into a bigger application.&lt;/p&gt;

&lt;p&gt;The link to the Video: &lt;a href="https://youtu.be/S5JTD3hWOug"&gt;https://youtu.be/S5JTD3hWOug&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Github repo: &lt;a href="https://github.com/pedramha/terraform-aws-lambda"&gt;https://github.com/pedramha/terraform-aws-lambda&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>terraform</category>
      <category>iac</category>
      <category>hashicorp</category>
    </item>
    <item>
      <title>Solving the Cold Start Challenge for Lambda Function Using Terraform Cloud</title>
      <dc:creator>Pedram Hamidehkhan</dc:creator>
      <pubDate>Fri, 10 Jun 2022 09:07:16 +0000</pubDate>
      <link>https://dev.to/pedramha/solving-the-cold-start-challenge-for-lambda-function-using-terraform-cloud-1o2m</link>
      <guid>https://dev.to/pedramha/solving-the-cold-start-challenge-for-lambda-function-using-terraform-cloud-1o2m</guid>
      <description>&lt;p&gt;In this post we are going to properly address the cold starts for Lambda Functions. In a later post we will create a private module which facilitates consumption of this module.&lt;/p&gt;

&lt;p&gt;Start in an empty directory and create the following files:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    "main.tf"
    "variables.tf"
    "output.tf"
    "terraform.auto.tfvars"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the "main.tf" file add the following to it.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    provider "aws" {
        region = "eu-central-1"
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
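Since the provider block above carries no credentials, Terraform will look for them in the environment. A minimal sketch with placeholder values (in Terraform Cloud these would be workspace environment variables, never committed to the repository):

```shell
# Placeholder credentials; do not hard-code real values in *.tf files
export AWS_ACCESS_KEY_ID="REPLACE_ME"
export AWS_SECRET_ACCESS_KEY="REPLACE_ME"
export AWS_DEFAULT_REGION="eu-central-1"

# terraform init / plan / apply pick these up automatically
echo "region set to $AWS_DEFAULT_REGION"
```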

&lt;p&gt;One quick note: please NEVER add your credentials here! Ideally, store credentials as environment variables in Terraform Cloud.&lt;br&gt;
Now open the terminal and execute the "terraform init" command. This will initialize our directory with the AWS provider.&lt;br&gt;
To deploy the Lambda function, we will need to upload the packaged code to S3, so I am also leveraging the random provider to make sure the S3 bucket name is unique.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    resource "random_pet" "lambda_bucket_name" {
      prefix = "test"
      length = 4
    }


    resource "aws_s3_bucket" "lambda_bucket" {
      bucket = random_pet.lambda_bucket_name.id
      acl    = "private"
        tags = {
        "env" = "test"
      }
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The two resources above simply create a bucket with a random name.&lt;br&gt;
The following data source packages the code before it is uploaded to S3 (I prefer to avoid the local-exec provisioner whenever possible):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    data "archive_file" "lambdaFunc_lambda_bucket" {
      type = "zip"

      source_dir  = var.src_path
      output_path = var.target_path
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After that I upload the deployment artifact to the bucket as an object:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    resource "aws_s3_bucket_object" "lambdaFunc_lambda_bucket" {
      bucket = aws_s3_bucket.lambda_bucket.id

      key    = var.target_path
      source = data.archive_file.lambdaFunc_lambda_bucket.output_path

      etag = filemd5(data.archive_file.lambdaFunc_lambda_bucket.output_path)
        tags = {
        "env" = "test"
      }
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now things get more interesting. When you deploy a Lambda function, you can specify a few parameters that matter for production deployments. One of these is, of course, reserved concurrency: AWS limits the number of concurrent executions per account and per region. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fysa34t25ia444488o2rj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fysa34t25ia444488o2rj.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Therefore, to make our concurrency predictable and to keep our Lambda function from being throttled because other Lambdas have consumed the shared account limit, we set this parameter.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    resource "aws_lambda_function" "lambdaFunc" {
      function_name = var.function_name

      s3_bucket = aws_s3_bucket.lambda_bucket.id
      s3_key    = aws_s3_bucket_object.lambdaFunc_lambda_bucket.key

      runtime = var.lambda_runtime
      handler = var.handler

      source_code_hash = data.archive_file.lambdaFunc_lambda_bucket.output_base64sha256

      role                           = aws_iam_role.lambda_exec.arn
      reserved_concurrent_executions = var.concurrent_executions
      tags = {
        "env" = "test"
      }
    }    
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
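&lt;p&gt;To build some intuition for what this cap does, here is a toy model (illustrative TypeScript only, not AWS SDK code): invocations that fit under the reserved limit run, and anything beyond it is throttled.&lt;/p&gt;

```typescript
// Toy model of reserved concurrency: invocations beyond the reserved
// limit are rejected, the way AWS throttles a function (HTTP 429).
// Illustrative sketch only, not AWS code.
class ReservedConcurrency {
  private inFlight = 0;
  constructor(private readonly limit: number) {}

  tryInvoke(): boolean {
    if (this.inFlight >= this.limit) return false; // throttled
    this.inFlight += 1;
    return true;
  }

  complete(): void {
    this.inFlight = Math.max(0, this.inFlight - 1);
  }
}

const pool = new ReservedConcurrency(2);
const results = [pool.tryInvoke(), pool.tryInvoke(), pool.tryInvoke()];
console.log(results); // [ true, true, false ]
```

&lt;p&gt;The third call is rejected because two invocations are already in flight; completing one frees a slot again. That predictability is exactly what reserved_concurrent_executions buys you.&lt;/p&gt;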

&lt;p&gt;Now we get to the most important part: cold starts. As officially suggested by AWS, I will leverage provisioned concurrency for the lambda function. There are many solutions that use CloudWatch rules to keep functions warm, but I really believe provisioned concurrency is much simpler and more comfortable to use. Moreover, in order to use provisioned concurrency, we need to create an alias. This alias can also serve as a way to do blue/green or canary deployments, which we will hopefully cover in a future post.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    resource "aws_lambda_alias" "con_lambda_alias" {
      name             = "lambda_alias"
      description      = "for blue green deployments OR for concurrency"
      function_name    = aws_lambda_function.lambdaFunc.arn
      function_version = var.function_version
    }

    resource "aws_lambda_provisioned_concurrency_config" "config" {
      function_name                     = aws_lambda_alias.con_lambda_alias.function_name
      provisioned_concurrent_executions = var.provisioned_concurrent_executions
      qualifier                         = aws_lambda_alias.con_lambda_alias.name
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Please note that I used variables for these parameters; they all have defaults, so feel free to change them as your use case requires. Also be aware that provisioned concurrency costs money, as AWS keeps warm instances of your application running for you.&lt;/p&gt;
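&lt;p&gt;To get a feeling for that cost, here is a back-of-the-envelope sketch. The rate below is a placeholder, not an official price; check current AWS pricing for your region. The formula itself is simply warm GB-seconds multiplied by the rate.&lt;/p&gt;

```typescript
// Rough monthly cost of provisioned concurrency: warm instances are
// billed per GB-second. PRICE_PER_GB_SECOND is an assumed placeholder
// rate; look up the real value for your region before relying on it.
const PRICE_PER_GB_SECOND = 0.000004;

function provisionedConcurrencyMonthlyCost(
  instances: number,
  memoryMb: number,
  hoursPerMonth: number = 730
): number {
  const gb = memoryMb / 1024;
  const seconds = hoursPerMonth * 3600;
  return instances * gb * seconds * PRICE_PER_GB_SECOND;
}

// One warm 128 MB instance, provisioned around the clock:
console.log(provisionedConcurrencyMonthlyCost(1, 128).toFixed(2));
```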

&lt;p&gt;Lastly, we need a basic policy for our lambda function.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    resource "aws_iam_role" "lambda_exec" {
      name = "serverless_lambda"
      assume_role_policy = jsonencode(
        {
          "Version" : "2012-10-17",
          "Statement" : [
            {
              "Effect" : "Allow",
              "Principal" : {
                "Service" : "lambda.amazonaws.com"
              },
              "Action" : "sts:AssumeRole"
            }
          ]
        }
      )
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the "variables.tf" file add the following to it:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    variable "function_name" {
      type    = string
      default = "test-function"
    }

    variable "src_path" {
      type = string
    }

    variable "target_path" {
      type = string
    }

    variable "lambda_runtime" {
      type    = string
      default = "nodejs12.x"
    }

    variable "handler" {
      type    = string
      default = "index.handler"
    }

    variable "region" {
      type    = string
      default = "eu-central-1"
    }

    variable "concurrent_executions" {
      type    = string
      default = "1"
    }

    variable "provisioned_concurrent_executions" {
      type    = string
      default = "1"
    }

    variable "function_version" {
      type    = string
      default = "1"
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;And in the "terraform.auto.tfvars" file add the following content:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    src_path    = "./src"  // the path to the source code of the lambda function
    target_path = "./artifacts/lambda_deployment.zip" // the path for the deployment artifact
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;You can of course bring your own lambda function, but if you don't have one, create a folder called "src", add a file to it called "index.js" (the handler value "index.handler" refers to the exported handler function in this file), and add the following:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    exports.handler = async (event) =&amp;gt; {
      const payload = {
        date: new Date(),
        message: 'Terraform is awesome!'
      };
      return JSON.stringify(payload);
    };
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now you can deploy the application using Terraform CLI or Terraform Cloud. For the CLI version, simply run&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    "terraform apply -auto-approve"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;If you want to use Terraform Cloud for a "GitOps" workflow, follow this documentation:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.hashicorp.com/resources/a-practitioner-s-guide-to-using-hashicorp-terraform-cloud-with-github" rel="noopener noreferrer"&gt;https://www.hashicorp.com/resources/a-practitioner-s-guide-to-using-hashicorp-terraform-cloud-with-github&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This problem might have been difficult to solve, but communicating the solution to someone properly is even harder.&lt;br&gt;
I will create another blog post explaining how you can best share this solution within your organization.&lt;/p&gt;

&lt;p&gt;GitHub Repository: &lt;a href="https://github.com/pedramha/terraform-aws-lambda" rel="noopener noreferrer"&gt;https://github.com/pedramha/terraform-aws-lambda&lt;/a&gt;&lt;br&gt;
Youtube: &lt;a href="https://www.youtube.com/watch?v=e0QplrqH0J4" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=e0QplrqH0J4&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Thank you!&lt;br&gt;
Pedram&lt;/p&gt;

</description>
      <category>aws</category>
      <category>lambda</category>
      <category>iac</category>
      <category>terraform</category>
    </item>
    <item>
      <title>Basic serverless CRUD Application with CDK for Terraform</title>
      <dc:creator>Pedram Hamidehkhan</dc:creator>
      <pubDate>Fri, 01 Apr 2022 14:00:50 +0000</pubDate>
      <link>https://dev.to/pedramha/basic-serverless-crud-application-with-cdk-for-terraform-5g3d</link>
      <guid>https://dev.to/pedramha/basic-serverless-crud-application-with-cdk-for-terraform-5g3d</guid>
      <description>&lt;p&gt;In this post, we will build a basic serverless CRUD application on AWS using CDK for Terraform. In a previous post we created a simple hello world application, but in this one we persist the data with DynamoDB and add more functionality to the application. Additionally, we add some security to our API. &lt;/p&gt;

&lt;p&gt;First things first, let's initialize a CDKTF project. Similar to the CloudFormation CDK, you can run the following command to initialize the project:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cdktf init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;It will then ask you for the language of your choice and a name and description for your project. I opted for TypeScript, but you can use whatever language you are comfortable with.&lt;br&gt;
The "cdktf init" command creates the project structure for us. As you might know, Terraform has a very large provider ecosystem, so to continue with the project you need to specify which provider you want to use. To do that, open cdktf.json and change the "terraformProviders" section to:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  "terraformProviders": [
    "aws@&amp;gt;3.0"
  ]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Of course, you can use the version of your choice. After that, you need to run the following command to pull down the providers:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cdktf get 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;OR:    &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; npm run get 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now we are ready to begin. In the main.ts file, add the following imports:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { Construct } from "constructs";
import { App, AssetType, TerraformAsset, TerraformStack, TerraformOutput } from "cdktf";
import { lambdafunction, s3, apigateway, iam, AwsProvider, dynamodb } from "@cdktf/provider-aws";
import * as path from "path";
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;It is worth mentioning that, unlike the CloudFormation CDK, CDKTF gives you a compile error if you have unused imports or variables in your code.&lt;br&gt;
Now add the following code to specify the provider:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;new AwsProvider(this, "aws", {
  region: "eu-west-1"
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;You also need to specify where your lambda function is. I created a folder called "src" and used TerraformAsset to import it as follows:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const asset = new TerraformAsset(this, "asset", {
  path: path.resolve(__dirname,'./src'),
  type:AssetType.ARCHIVE
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now we need to create the S3 bucket and object that will hold our packaged lambda function (the TerraformAsset):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const assetBucket = new s3.S3Bucket(this, "assetBucket", {
  bucket:"a-unique-bucket-name"
});

const lambdaArchive = new s3.S3BucketObject(this, "lambdaArchive", {
  bucket: assetBucket.bucket,
  key: asset.fileName,
  source: asset.path,
  sourceHash: asset.assetHash // to inform cdktf of changes in file
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
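&lt;p&gt;The sourceHash line deserves a word: cdktf computes a content hash of the archive, so Terraform only re-uploads (and redeploys) when the bytes actually change. The same idea in a few lines of Node, as a sketch of the mechanism rather than cdktf internals:&lt;/p&gt;

```typescript
import { createHash } from "crypto";

// Content-addressed hashing: identical bytes always produce the same
// hash, and any change produces a new one, which is what signals
// Terraform that the S3 object (and the lambda) must be updated.
function contentHash(contents: string): string {
  return createHash("sha256").update(contents).digest("hex");
}

const v1 = contentHash("exports.handler = async () => 'v1';");
const v2 = contentHash("exports.handler = async () => 'v2';");
console.log(v1 === v2); // false: a changed source means a new hash
```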

&lt;p&gt;Now we can create our database:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const db = new dynamodb.DynamodbTable(this, "db", {
  name: "my-table",
  billingMode: "PAY_PER_REQUEST",
  hashKey: "id",
  attribute: [
    {
      name: "id",
      type: "S"
    }
  ]
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then create the IAM Policy and Role for the lambda function:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const lambPolicy = {
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "Service": [
          "lambda.amazonaws.com"
        ]
      },
      "Action": "sts:AssumeRole"
    }
  ]
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const role = new iam.IamRole(this, "role", {
  assumeRolePolicy: JSON.stringify(lambPolicy),
  name: "my-lambda-role"
});


new iam.IamRolePolicyAttachment(this, "rolePolicy", {
  policyArn: "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
  role: role.name
});

//we can use a managed policy to access our db
new iam.IamRolePolicyAttachment(this, "rolePolicyDB", {
  policyArn: "arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess",
  role: role.name
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After that, we can specify the configuration of our lambda function:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const lambdaFunc = new lambdafunction.LambdaFunction(this, "lambdaFunc", {
  functionName: "my-lambda-function",
  runtime: "nodejs14.x",
  handler: "index.handler",
  role: role.arn,
  s3Bucket: assetBucket.bucket,
  s3Key: lambdaArchive.key,
  sourceCodeHash: lambdaArchive.sourceHash,
  environment: {
    variables: {
      "TABLE_NAME": db.name,
      "PRIMARY_KEY": 'itemId',
    }
  }
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;I added the table name and primary key as environment variables.&lt;/p&gt;
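&lt;p&gt;At runtime, the handler reads these values back from process.env. A minimal sketch (the helper name is mine, not part of the project):&lt;/p&gt;

```typescript
// Reads the configuration injected via the LambdaFunction environment
// block above. Sketch only; the helper name is hypothetical.
function readLambdaConfig(env: { [key: string]: string | undefined }) {
  return {
    tableName: env.TABLE_NAME || "",
    primaryKey: env.PRIMARY_KEY || "",
  };
}

const cfg = readLambdaConfig({ TABLE_NAME: "my-table", PRIMARY_KEY: "itemId" });
console.log(cfg.tableName, cfg.primaryKey); // my-table itemId
```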

&lt;p&gt;Then, we create an API Gateway to receive HTTP Requests from the Internet. We can also add the resources to our API and define HTTP Methods:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const restApi = new apigateway.ApiGatewayRestApi(this, "restApi", {
  name: "my-rest-api",
  description: "my-rest-api"
});

const resourceApi = new apigateway.ApiGatewayResource(this, "resourceApi", {
  restApiId: restApi.id,
  parentId: restApi.rootResourceId,
  pathPart: "my-resource",
});

const postApi = new apigateway.ApiGatewayMethod(this, "postApi", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: "POST",
  authorization: "NONE"
});

const getApi = new apigateway.ApiGatewayMethod(this, "getApi", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: "GET",
  authorization: "NONE"
});

const putApi = new apigateway.ApiGatewayMethod(this, "putApi", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: "PUT",
  authorization: "NONE"
});

const delApi = new apigateway.ApiGatewayMethod(this, "delApi", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: "DELETE",
  authorization: "NONE"
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;We also need to add API integrations to our methods as follows:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;new apigateway.ApiGatewayIntegration(this, "apiIntegration", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: postApi.httpMethod,      
  integrationHttpMethod: "POST",
  type: "AWS_PROXY",
  uri: lambdaFunc.invokeArn
});

new apigateway.ApiGatewayIntegration(this, "apiIntegration2", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: getApi.httpMethod,      
  integrationHttpMethod: "POST",
  type: "AWS_PROXY",
  uri: lambdaFunc.invokeArn
});

new apigateway.ApiGatewayIntegration(this, "apiIntegration3", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: putApi.httpMethod,
  integrationHttpMethod: "POST",
  type: "AWS_PROXY",
  uri: lambdaFunc.invokeArn
});

new apigateway.ApiGatewayIntegration(this, "apiIntegration4", {
  restApiId: restApi.id,
  resourceId: resourceApi.id,
  httpMethod: delApi.httpMethod,
  integrationHttpMethod: "POST",
  type: "AWS_PROXY",
  uri: lambdaFunc.invokeArn
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Note that integrationHttpMethod is always set to POST, because API Gateway invokes Lambda functions with a POST request regardless of the client-facing HTTP method.&lt;/p&gt;
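&lt;p&gt;Since the four integration blocks above differ only in the client-facing method they attach to, a small helper can produce the shared configuration. This is a sketch of my own (the helper name is not from the repository); it returns plain config objects you could spread into each ApiGatewayIntegration call:&lt;/p&gt;

```typescript
// Shared lambda-proxy integration config. integrationHttpMethod stays
// "POST" for every entry because API Gateway always calls Lambda with
// a POST, whatever method the client used. Helper name is mine.
interface IntegrationConfig {
  httpMethod: string;
  integrationHttpMethod: "POST";
  type: "AWS_PROXY";
}

function lambdaProxyIntegration(clientMethod: string): IntegrationConfig {
  return {
    httpMethod: clientMethod,
    integrationHttpMethod: "POST",
    type: "AWS_PROXY",
  };
}

const configs = ["POST", "GET", "PUT", "DELETE"].map(lambdaProxyIntegration);
console.log(configs.every((c) => c.integrationHttpMethod === "POST")); // true
```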

&lt;p&gt;Finally, you need to create a LambdaPermission Policy to allow the API Gateway to invoke our Lambda Function:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;new lambdafunction.LambdaPermission(this, "apig-lambda", {
  statementId: "AllowExecutionFromAPIGateway",
  action: "lambda:InvokeFunction",
  functionName: lambdaFunc.functionName,
  principal: "apigateway.amazonaws.com",
  sourceArn: `${restApi.executionArn}/*/*`      
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now we have our API and Lambda Function Integrations. We can go ahead and deploy our API into a Stage. However, we should secure our API, because at this point, it will be reachable from the Internet.&lt;/p&gt;

&lt;p&gt;So we create a deployment for our API and add a stage:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const apiDepl = new apigateway.ApiGatewayDeployment(this, "deployment", {
  restApiId: restApi.id,
  dependsOn: [lambdaFunc]
});

const apiStage = new apigateway.ApiGatewayStage(this, "stage", {
  restApiId : restApi.id,
  stageName: "test",
  deploymentId: apiDepl.id,
  dependsOn: [apiDepl]
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;I also added the "dependsOn" property to make sure certain resources will be created before others.&lt;/p&gt;

&lt;p&gt;Now we can create an API Key for our API:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const apiKey = new apigateway.ApiGatewayApiKey(this, "apiKey", {
  name: "my-api-key",
  description: "my-api-key",
  enabled: true
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After that, we need to create a usage plan and attach the API key to it. The usage plan gives you a lot of control over how your API is used.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const usagePlan = new apigateway.ApiGatewayUsagePlan(this, "usagePlan", {
  name: "my-usage-plan",
  description: "my-usage-plan",
  throttleSettings: {
    burstLimit: 10,
    rateLimit: 10
  },
  apiStages: [
    {
      apiId: restApi.id,
      stage: apiStage.stageName
    }
  ],
  dependsOn: [apiKey]
});

new apigateway.ApiGatewayUsagePlanKey(this, "usagePlanKey", {
  keyId: apiKey.id,
  keyType: "API_KEY",
  usagePlanId: usagePlan.id,
  dependsOn:[usagePlan]
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Note that I set low rate and burst limits to keep costs under control.&lt;/p&gt;
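&lt;p&gt;Conceptually, burstLimit and rateLimit describe a token bucket: the burst is the bucket size and the rate is how fast tokens flow back in. A toy simulation for intuition only (API Gateway enforces this for you):&lt;/p&gt;

```typescript
// Token-bucket model of API Gateway throttling: burst = bucket size,
// rate = refill speed in tokens per second. Requests without a token
// are rejected (HTTP 429). Illustrative sketch only.
class TokenBucket {
  private tokens: number;

  constructor(private readonly burst: number, private readonly ratePerSec: number) {
    this.tokens = burst;
  }

  refill(seconds: number): void {
    this.tokens = Math.min(this.burst, this.tokens + seconds * this.ratePerSec);
  }

  tryRequest(): boolean {
    if (this.tokens < 1) return false; // throttled
    this.tokens -= 1;
    return true;
  }
}

// burstLimit: 10, rateLimit: 10, as in the usage plan above.
const bucket = new TokenBucket(10, 10);
let accepted = 0;
for (let i = 0; i < 15; i++) {
  if (bucket.tryRequest()) accepted += 1;
}
console.log(accepted); // 10: the burst is served, the overflow is throttled
```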

&lt;p&gt;If you would like to see the endpoint of your API and test it with curl or Postman, you can add the following output:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;new TerraformOutput(this, "apiUrl", {
  value: apiStage.invokeUrl,
  description: "API URL"
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
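&lt;p&gt;When you call the deployed endpoint, remember to send the key in the x-api-key header, or API Gateway will answer 403. A small sketch of building such a request (the URL and key values below are placeholders):&lt;/p&gt;

```typescript
// Builds fetch-style request options carrying the API key. The key
// must travel in the x-api-key header once a usage plan key is
// attached. All concrete values here are placeholders.
function apiRequest(method: string, apiKey: string, body?: unknown) {
  return {
    method,
    headers: {
      "x-api-key": apiKey,
      "content-type": "application/json",
    },
    body: body === undefined ? undefined : JSON.stringify(body),
  };
}

// e.g. fetch(`${apiUrl}/my-resource`, apiRequest("POST", "your-key", { name: "demo" }))
console.log(apiRequest("GET", "your-key").headers["x-api-key"]); // your-key
```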

&lt;p&gt;Don't forget to add your lambda function in the "src" folder; create a file called index.ts and add the functionality that you want. Here is a handler covering all four CRUD operations:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import * as AWS from 'aws-sdk';
import { v4 as uuidv4 } from 'uuid';

const db = new AWS.DynamoDB.DocumentClient({ apiVersion: '2012-08-10', region: 'eu-west-1' });
const TABLE_NAME = process.env.TABLE_NAME || '';
const PRIMARY_KEY = process.env.PRIMARY_KEY || '';

export const handler = async (event: any = {}): Promise&amp;lt;any&amp;gt; =&amp;gt; {
  if (event.httpMethod === 'POST') {
    try {
      if (!event.body) {
        return { statusCode: 400, body: 'invalid request, you are missing the parameter body' };
      }
      const item = typeof event.body == 'object' ? event.body : JSON.parse(event.body);
      item[PRIMARY_KEY] = uuidv4();
      const params = {
        TableName: TABLE_NAME,
        Item: item
      };

      await db.put(params).promise();
      return { statusCode: 201, body: JSON.stringify(item) };
    } catch (error) {
      return { statusCode: 500, body: JSON.stringify(error) };
    }
  }
  else if (event.httpMethod === 'GET') {
    //get all items
    try {
      const params = {
        TableName: TABLE_NAME
      };
      const resp = await db.scan(params).promise()
      return { statusCode: 200, body: JSON.stringify(resp.Items) };
    } catch (error) {
      return { statusCode: 500, body: error };
    }
  }

  else if (event.httpMethod === 'PUT') {
    try {
      if (!event.body) {
        return { statusCode: 400, body: 'invalid request, you are missing the parameter body' };
      }
      const item = typeof event.body == 'object' ? event.body : JSON.parse(event.body);
      const params = {
        TableName: TABLE_NAME,
        Item: item
      };
      await db.put(params).promise();
      return { statusCode: 200, body: JSON.stringify(item) };
    } catch (error) {
      return { statusCode: 500, body: JSON.stringify(error) };
    }
  }
  else if (event.httpMethod === 'DELETE') {
    try {
      if (!event.pathParameters) {
        return { statusCode: 400, body: 'invalid request, you are missing the parameter pathParameters' };
      }
      const params = {
        TableName: TABLE_NAME,
        Key: {
          [PRIMARY_KEY]: event.pathParameters[PRIMARY_KEY]
        }
      };
      await db.delete(params).promise();
      return { statusCode: 200, body: '' };
    } catch (error) {
      return { statusCode: 500, body: JSON.stringify(error) };
    }
  }
  return { statusCode: 405, body: 'method not allowed' };
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;I will explain this lambda function in a separate post.&lt;br&gt;
You could deploy this lambda function without packaging it, but I like to package it before deploying to make sure all the dependencies I need are included.&lt;br&gt;
So I create a file called "package.json" and add the following:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "name": "lambda-crud-dynamodb",
    "version": "1.0.0",
    "description": "Lambdas to do CRUD operations on DynamoDB",
    "private": true,
    "license": "MIT",
    "devDependencies": {
      "@types/node": "*",
      "@types/uuid": "*"
    },
    "dependencies": {
      "aws-sdk": "*",
      "uuid": "*"
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Deploying this code will feel familiar if you have used the CloudFormation CDK or Terraform HCL.&lt;br&gt;
Run the following commands to deploy the project:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm run build
cdktf plan OR cdktf synth
cdktf apply OR cdktf deploy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The youtube video: &lt;a href="https://youtu.be/sf2EgmHRHiw" rel="noopener noreferrer"&gt;https://youtu.be/sf2EgmHRHiw&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Github: &lt;a href="https://github.com/pedramha/cdktf-aws-example" rel="noopener noreferrer"&gt;https://github.com/pedramha/cdktf-aws-example&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cdk</category>
      <category>terraform</category>
      <category>serverless</category>
    </item>
    <item>
      <title>Building a Sample Lambda Function and API Gateway with CDK for Terraform</title>
      <dc:creator>Pedram Hamidehkhan</dc:creator>
      <pubDate>Fri, 04 Feb 2022 15:35:37 +0000</pubDate>
      <link>https://dev.to/pedramha/building-a-sample-lambda-function-and-api-gateway-with-cdk-for-terraform-36c2</link>
      <guid>https://dev.to/pedramha/building-a-sample-lambda-function-and-api-gateway-with-cdk-for-terraform-36c2</guid>
      <description>&lt;p&gt;In this post, you learn how to use CDK for Terraform to build a sample serverless application on AWS. We are going to use AWS lambda and API Gateway to build this application. &lt;/p&gt;

&lt;p&gt;First things first, let's initialize a CDKTF project. Similar to the CloudFormation CDK, you can run the following command to initialize the project:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cdktf init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;It will then ask you for the language of your choice and a name and description for your project. I opted for TypeScript, but you can use whatever language you are comfortable with.&lt;br&gt;
The "cdktf init" command creates the project structure for us. As you might know, Terraform has a very large provider ecosystem, so to continue with the project you need to specify which provider you want to use. To do that, open cdktf.json and change the "terraformProviders" section to:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  "terraformProviders": [
    "aws@&amp;gt;3.0"
  ]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Of course, you can use the version of your choice. After that, you need to run the following command to pull down the providers:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cdktf get    
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now we are ready to begin. In the main.ts file, add the following imports:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { Construct } from "constructs";
import { App, AssetType, TerraformAsset, TerraformOutput, TerraformStack } from "cdktf";
import {lambdafunction, s3, apigatewayv2, iam, AwsProvider} from "./.gen/providers/aws"
import path = require("path");
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;It is worth mentioning that, unlike the CloudFormation CDK, CDKTF gives you a compile error if you have unused imports or variables in your code.&lt;br&gt;
Now add the following code to specify the provider:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;new AwsProvider(this, "aws", {
  region: "eu-west-1"
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;You also need to specify where your lambda function is. I created a folder called "src" and used TerraformAsset to import it as follows:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const asset = new TerraformAsset(this, "asset", {
  path: path.resolve(__dirname,'./src'),
  type:AssetType.ARCHIVE
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now we need to create the S3 bucket and object that will hold our packaged lambda function (the TerraformAsset):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const assetBucket = new s3.S3Bucket(this, "assetBucket", {
  bucket:"a-unique-bucket-name"
});

const lambdaArchive = new s3.S3BucketObject(this, "lambdaArchive", {
  bucket:assetBucket.bucket,
  key:asset.fileName,
  source:asset.path
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then create the IAM Policy and Role for the lambda function:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const lambdaRole = {
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole",
      "Sid": ""
    }
  ]
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const role = new iam.IamRole(this, "role", {
  assumeRolePolicy: JSON.stringify(lambdaRole),
  name: "my-lambda-role"
});

new iam.IamRolePolicyAttachment(this, "rolePolicy", {
  policyArn: "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
  role: role.name
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After that, we can specify the configuration of our lambda function:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const lambdaFunc = new lambdafunction.LambdaFunction(this, "lambdaFunc", {
  functionName: "my-lambda-function",
  runtime: "nodejs14.x",
  handler: "index.handler",
  role: role.arn,
  s3Bucket: assetBucket.bucket,
  s3Key: lambdaArchive.key
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then, we create an API Gateway to receive HTTP Requests from the Internet:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const api = new apigatewayv2.Apigatewayv2Api(this, "api", {
  name: "my-api", 
  protocolType: "HTTP",
  target: lambdaFunc.arn
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Finally, you need to create a LambdaPermission Policy to allow the API Gateway to invoke our Lambda Function:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;new lambdafunction.LambdaPermission(this, "apig-lambda", {
  functionName: lambdaFunc.functionName,
  action: "lambda:InvokeFunction",
  principal: "apigateway.amazonaws.com",
  sourceArn: `${api.executionArn}/*/*`,
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;If you would like to see the endpoint of your API and test it with curl or Postman, you can add the following output:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;new TerraformOutput(this, "apiUrl", {
  value: api.apiEndpoint
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Don't forget to add your lambda function in the "src" folder; create a file called index.ts and add the functionality that you want. I only wanted to send a hello world response:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export const handler = async () =&amp;gt; {
    return { statusCode: 200, body: 'hello world'  };
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Deploying this code will feel familiar if you have used the CloudFormation CDK or Terraform HCL.&lt;br&gt;
Run the following commands to deploy the project:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm run build
cdktf plan OR cdktf synth
cdktf apply OR cdktf deploy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The youtube video: &lt;a href="https://youtu.be/3rVgMYc7jkg"&gt;https://youtu.be/3rVgMYc7jkg&lt;/a&gt;&lt;br&gt;
Github: &lt;a href="https://github.com/pedramha/cdktf-aws-example"&gt;https://github.com/pedramha/cdktf-aws-example&lt;/a&gt;&lt;/p&gt;

</description>
      <category>cdk</category>
      <category>terraform</category>
      <category>aws</category>
      <category>typescript</category>
    </item>
    <item>
      <title>Create Static Website on AWS using Terraform</title>
      <dc:creator>Pedram Hamidehkhan</dc:creator>
      <pubDate>Wed, 19 Jan 2022 17:35:24 +0000</pubDate>
      <link>https://dev.to/pedramha/create-static-website-on-aws-using-terraform-3d9g</link>
      <guid>https://dev.to/pedramha/create-static-website-on-aws-using-terraform-3d9g</guid>
      <description>&lt;p&gt;In this post, you learn how to use Terraform to create a static website on AWS. This example is suitable for scenarios where a frontend team is working on a static web application (not for dynamic websites such as ASP.NET MVC applications). Moreover, you would generally use such a pattern in larger applications, not for small blogs or single-page applications. &lt;/p&gt;

&lt;p&gt;First things first, let's initialize a Terraform project. To do that, create a file called configuration.tf (the name doesn't really matter, only the suffix) and add the following:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;provider "aws" {
  region = "eu-central-1"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then run the following command to initialize the project:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Using the configuration.tf file, Terraform knows which provider you need and pulls down the dependencies for you.&lt;/p&gt;

&lt;p&gt;Then, in the main.tf file (create it if it doesn't exist), create an S3 bucket with the following configuration:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    resource "aws_s3_bucket" "bucket" {
      bucket = "youruniquebucketname"
      acl    = "public-read"

      provisioner "local-exec" {
          command = "aws s3 sync static/ s3://${aws_s3_bucket.bucket.bucket} --acl public-read --delete"

      }

      website {
        index_document = "index.html"
      }

    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In the first line, you create the bucket resource; the second line sets the bucket name, and the next sets the access control list (which must be public-read for a website). The provisioner runs a basic AWS CLI command, syncing the contents of the static folder (which is where you put your static content) to the bucket. You could also use the cp command, but I find sync more convenient, as it can also delete files in the target bucket. &lt;br&gt;
Finally, make sure that index_document points to your index.html file.&lt;/p&gt;
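&lt;p&gt;Optionally, you can also add an output so that Terraform prints the website URL after deployment. A small sketch, assuming the resource is named "bucket" as above (the output name website_endpoint is my own choice):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;output "website_endpoint" {
  value = aws_s3_bucket.bucket.website_endpoint
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;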

&lt;p&gt;Then go ahead and paste your static content into the static folder.&lt;br&gt;
Now we can deploy to AWS.&lt;br&gt;
Run "terraform plan" to see what will be deployed. Then, with "terraform apply", you can deploy to your AWS account.&lt;/p&gt;
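&lt;p&gt;That is:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform plan
terraform apply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;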

&lt;p&gt;The YouTube video: &lt;a href="https://www.youtube.com/watch?v=UU3drURkTtw"&gt;https://www.youtube.com/watch?v=UU3drURkTtw&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>terraform</category>
      <category>webdev</category>
      <category>html</category>
    </item>
    <item>
      <title>Easiest way to create a serverless backend in AWS using UI</title>
      <dc:creator>Pedram Hamidehkhan</dc:creator>
      <pubDate>Wed, 12 Jan 2022 17:30:39 +0000</pubDate>
      <link>https://dev.to/pedramha/easiest-way-to-create-a-serverless-backend-in-aws-using-ui-hpk</link>
      <guid>https://dev.to/pedramha/easiest-way-to-create-a-serverless-backend-in-aws-using-ui-hpk</guid>
      <description>&lt;p&gt;I've recently discovered a new way to create serverless backend in AWS using the GUI. Of course, creating lambda functions and dynamodb tables through the UI isn't really difficult. The main point is to get everything as Code at the end. This is a really long awaited feature in aws, which is finally being delivered. Now you can configure your infrastructure through the AWS console, and then export it as code. You can use CDK or SAM as the export target. A similar feature has been offered by azure (exporting ARM templates), but it is great to see AWS is offering this feature.&lt;/p&gt;

&lt;p&gt;As opposed to previous posts, in this one we only work with the GUI (AWS Console).&lt;/p&gt;

&lt;p&gt;So after logging in to the AWS console, go to the Lambda section and click on Applications.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--aSrMy_n8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8lvla7p6njtlg3kgwlx8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--aSrMy_n8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8lvla7p6njtlg3kgwlx8.png" alt="Image description" width="880" height="201"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After that you see the following options:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KnX1TC2D--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1hcurr4ipepl147n43ap.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KnX1TC2D--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1hcurr4ipepl147n43ap.png" alt="Image description" width="880" height="336"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As you can see, you cannot do everything using this method, but you have quite a few options.&lt;br&gt;
We go ahead and create a Serverless Backend API. This will generate a new API Gateway, a few Lambda functions, and a DynamoDB table.&lt;br&gt;
Please note that at this moment only Node.js 14 is supported. It is also worth mentioning that the generated code can be exported to GitHub or to AWS CodeCommit.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--AV3MsnE1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4v6yrodu23fii2fj7um2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--AV3MsnE1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4v6yrodu23fii2fj7um2.png" alt="Image description" width="880" height="461"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So go ahead and click on next to go to the page where you can specify your application name, your GitHub/CodeCommit repository, and whether you want the export to be SAM or CDK. If you want to export to GitHub, you need to make a connection between your AWS account and GitHub. Click on "Connect with CodeStar Connections". In the popup window, choose a name for the connection and click "connect to github". Another popup will then ask you for your credentials. After entering them, your connection is ready, and you can go ahead and click on "create" at the bottom of the page. &lt;/p&gt;

&lt;p&gt;And that is it! It takes a few minutes for all the resources to be created and for the code to be available in GitHub. The cherry on top is that using this method you already get a functioning CI/CD pipeline! As a matter of fact, AWS configures CodePipeline and CodeBuild and connects them to your GitHub repository for you. So every time you make a change and push it to GitHub, the change will be automatically deployed to AWS.&lt;/p&gt;

&lt;p&gt;It might be a bit difficult to follow using only text, so I went ahead and created a video, which hopefully will be easier to follow.&lt;/p&gt;

&lt;p&gt;Youtube: &lt;a href="https://youtu.be/z8lDx2l7f0w"&gt;https://youtu.be/z8lDx2l7f0w&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That is it. Thank you very much for reading this post.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>iac</category>
      <category>cdk</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Building on AWS Using Github Copilot and AWS CDK</title>
      <dc:creator>Pedram Hamidehkhan</dc:creator>
      <pubDate>Mon, 08 Nov 2021 19:21:47 +0000</pubDate>
      <link>https://dev.to/pedramha/building-on-aws-using-github-copilot-and-aws-cdk-4ja8</link>
      <guid>https://dev.to/pedramha/building-on-aws-using-github-copilot-and-aws-cdk-4ja8</guid>
      <description>&lt;p&gt;In this post we are going to have a look at Github's Copilot and how we can leverage it spin up some resources using AWS CDK.&lt;br&gt;
Copilot is still in technical preview. But if you already signed up for it and are lucky enough like me to have access to it, you can enable it as a simple VS Code extension.&lt;br&gt;
Of course, I do not expect for the Copilot to be very mature, as it is not publicly released yet, but I believe it seems very promising and arguably it changes the way we think about software development. Just like driving assistance systems, which are not a replacement for a driver, Github Copilot will also not replace developers, but rather empowers them.&lt;br&gt;
There are already many videos on youtube and blog posts explaining basic functions that Copilot automatagically generates. In this post though, we try to see how it can be used in a serverless/aws world.&lt;br&gt;
As you might already know, copilot uses billions of lines of code on Github, to give you suggestion while coding. It actually is even giving me suggestions a I am writing this post. What I am going to do is to perform the same tasks I did in  &lt;a href="https://github.com/pedramha/cdk-crudsample"&gt;this post &lt;/a&gt; using Copilot.&lt;br&gt;
Some of the things I explain here are a bit difficult to grasp in writing form, so I will explain them in a video in a later parts of this post.&lt;/p&gt;

&lt;p&gt;So as always, first things first, let's initialize an empty CDK TypeScript project:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cdk init app --language=typescript
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then we declare the dependencies for our project in package.json.&lt;br&gt;
Add the following modules to the dependencies by running:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install @aws-cdk/core @aws-cdk/aws-apigateway @aws-cdk/aws-lambda @aws-cdk/aws-dynamodb --save
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;By default, CDK deploys the VPC for you, but if you want control over your VPC, go ahead and import the EC2 module and add the VPC.&lt;br&gt;
Then go to the lib folder and open the TypeScript file for your application stack.&lt;/p&gt;

&lt;p&gt;Add the following import statements before the class declaration:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import cdk = require("@aws-cdk/core");
import apigateway = require("@aws-cdk/aws-apigateway");
import dynamodb = require("@aws-cdk/aws-dynamodb");
import lambda = require("@aws-cdk/aws-lambda");
import { RemovalPolicy } from "@aws-cdk/core";
import { BillingMode } from "@aws-cdk/aws-dynamodb";
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;As I write const dynamodbTable, Copilot automatically generates the following part.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;new dynamodb.Table(this, "dynamodbTable", {
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;So it only instantiates the table using the packages I provided. However, as I go to the next line to provide the partition key and other settings for the table, it surprisingly suggests the following code:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;partitionKey: { name: "id", type: dynamodb.AttributeType.STRING },
billingMode: BillingMode.PAY_PER_REQUEST,
removalPolicy: RemovalPolicy.DESTROY,
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Regarding the Lambda functions, I start with the get-all function. As I mentioned, it is important to give meaningful names to your functions, because Copilot uses those names to generate the code. So I only type const getAllItemsLambda, and Copilot generates the following for me:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  const getAllItemsLambda = new lambda.Function(this, "getAllItemsLambda", {
    runtime: lambda.Runtime.NODEJS_12_X,
    code: lambda.Code.fromAsset("lambda"),
    handler: "getAllItems.handler",
    environment: {
      TABLE_NAME: dynamodbTable.tableName,
    },
  });
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Very nice so far. Regarding the IAM roles for our Lambdas, as I start typing the name of the table, I get the following suggestions:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  dynamodbTable.grantReadWriteData(createItemsLambda);
  dynamodbTable.grantReadWriteData(getAllItemsLambda);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This is nice; however, it doesn't respect the principle of least privilege, so I amend the second line to grantReadData.&lt;/p&gt;

&lt;p&gt;Now let's go for the API Gateway. As I start typing api, I get the following suggestion:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const api = new apigateway.RestApi(this, "restApi", {
    restApiName: "cdkwithcopilot",
  });
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;I also get the following suggestions as I type the name of the resource:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  const root = api.root;
  const getAllItemsApi = new apigateway.LambdaIntegration(getAllItemsLambda);
  root.addMethod("GET", getAllItemsApi);

  const createItemsApi = new apigateway.LambdaIntegration(createItemsLambda);
  root.addMethod("POST", createItemsApi);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;All the resources were suggested as expected; only for the usage plan did Copilot wrongly suggest api.UsagePlane instead of api.addUsagePlan. It also had a little trouble when I added the API key.&lt;/p&gt;
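&lt;p&gt;For reference, the hand-corrected version of that part (matching what I wrote in my earlier CDK post) looks like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// corrected by hand: addUsagePlan, not UsagePlane
const plan = api.addUsagePlan("UsagePlan", {
  name: "EASY",
  throttle: { rateLimit: 20, burstLimit: 2 },
});
const key = api.addApiKey("ApiKey");
plan.addApiKey(key);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;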

&lt;p&gt;Getting to the lambda functions, I was able to generate the following using copilot:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    const AWS = require('aws-sdk');
    const db = new AWS.DynamoDB.DocumentClient();
    const TablName = process.env.TABLE_NAME;

    export const handler = async (event: any = {}): Promise&amp;lt;any&amp;gt; =&amp;gt; {
        const params = {
            TableName: TablName
        };
        const result = await db.scan(params).promise();
        return result.Items;
    }    
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Also for the other lambda function, which creates the items:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    const AWS = require('aws-sdk');
    const db = new AWS.DynamoDB.DocumentClient();
    const TABLE_NAME = process.env.TABLE_NAME || ''; //added manually

    export const handler = async (event: any = {}): Promise&amp;lt;any&amp;gt; =&amp;gt; {

        const item = typeof event.body === 'object' ? event.body : JSON.parse(event.body);
        const params = {
            TableName: TABLE_NAME,
            Item: item
        };
        try {
            await db.put(params).promise();
            return {status: 'success', data: item};
        }
        catch (dbError) {
            return {status: 'error', data: dbError};
        }
    };
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now the moment of truth: cdk synth.&lt;br&gt;
And it worked as expected. I was truly amazed by how powerful Copilot is. I even wrote a few lines of this blog post with it!&lt;br&gt;
Arguably Copilot is not a replacement for developers, but an amazing tool that helps us (developers) build better software.&lt;br&gt;
I hope you enjoyed this post.&lt;br&gt;
Thank you very much for reading.&lt;br&gt;
Pedram and Copilot &lt;/p&gt;

</description>
    </item>
    <item>
      <title>BASIC AWS CDK Backend API</title>
      <dc:creator>Pedram Hamidehkhan</dc:creator>
      <pubDate>Wed, 14 Jul 2021 19:14:00 +0000</pubDate>
      <link>https://dev.to/pedramha/basic-aws-cdk-backend-api-gi5</link>
      <guid>https://dev.to/pedramha/basic-aws-cdk-backend-api-gi5</guid>
      <description>&lt;p&gt;In this post, you learn how to use AWS CDK to create a basic yet powerful backend in AWS. The components used for this application are, AWS API Gateway which receives http requests as an entrance to our application, AWS Lambda where we implement our business logic, Dynamodb a NoSQL Database where we store our data.&lt;/p&gt;

&lt;p&gt;First things first, let's initialize an empty CDK TypeScript project:&lt;br&gt;
cdk init app --language=typescript&lt;/p&gt;

&lt;p&gt;Then we go to package.json to declare the dependencies for our project.&lt;br&gt;
Add the following modules to the dependencies:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"@aws-cdk/core": "*",
"@aws-cdk/aws-apigateway": "*",
"@aws-cdk/aws-lambda": "*",
"@aws-cdk/aws-dynamodb": "*",
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This is an optional step, but go ahead and run "npm install" in the root directory of your project. This will download the dependencies and give you IntelliSense, which comes in very handy.&lt;br&gt;
Then go to the lib folder and open the TypeScript file for your application stack.&lt;/p&gt;

&lt;p&gt;Add the following import statements before the class declaration:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import cdk = require("@aws-cdk/core");
import apigateway = require("@aws-cdk/aws-apigateway");
import dynamodb = require("@aws-cdk/aws-dynamodb");
import lambda = require("@aws-cdk/aws-lambda");
import { RemovalPolicy } from "@aws-cdk/core";
import { BillingMode } from "@aws-cdk/aws-dynamodb";
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In order to create our database, add the following:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const dynamoTable=new dynamodb.Table(this,'mytestDB',{
  partitionKey:{
    name:'itemId',
    type:dynamodb.AttributeType.STRING
  },
    removalPolicy:RemovalPolicy.RETAIN
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Here we defined a partition key 'itemId' of type string, and the removal policy of our database. As data is very sensitive (of course, not for this demo, but in real life), we chose the Retain policy in case our stack gets destroyed. You could also choose the Snapshot or Destroy policy if you like.&lt;br&gt;
Getting to the business logic of our application, we declare our Lambda functions.&lt;br&gt;
First, we create a get-all function, which returns everything in the database.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const getAll = new lambda.Function(this, 'getAllitems',{
  code: new lambda.AssetCode('./src'),
  handler:'get-all.handler',
  runtime:lambda.Runtime.NODEJS_10_X,
  environment:{
    TABLE_NAME: dynamoTable.tableName,
    PRIMARY_KEY:'itemId'
  }
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;First, we tell CDK where the code for our function is (the src folder), then the handler inside that code, the runtime, and some environment variables where we pass in our DynamoDB table name and the partition key.&lt;/p&gt;

&lt;p&gt;Then we add one more Lambda function, which creates items in our database, as follows.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const createLambda = new lambda.Function(this, 'createItem',{
  code: new lambda.AssetCode('./src'),
  handler:'create.handler',
  runtime:lambda.Runtime.NODEJS_10_X,
  environment:{
    TABLE_NAME: dynamoTable.tableName,
    PRIMARY_KEY:'itemId'
  },
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;As of now, our Lambdas cannot connect to the database, as the IAM roles for them are not yet created. Using the following commands, we grant access to our Lambdas while respecting the principle of least privilege.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;dynamoTable.grantReadData(getAll);
dynamoTable.grantReadWriteData(createLambda);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Having created our persistence layer and our business logic layer, we go ahead and create the gateway to our application. As we would like to serve our content via a REST API, we declare a REST API gateway as follows:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const api = new apigateway.RestApi(this,' testApi',{
  restApiName:'my test api'
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then we create a root resource on this API Gateway. When a request hits this 'root' resource, it returns all items in our db. Declaring the API is not sufficient; we need to connect it with our Lambda. For that we use the &lt;br&gt;
LambdaIntegration method of the API gateway. API Gateway has many integration types that are worth looking into.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const rootApi=api.root.addResource('root');
const getAllApi=new apigateway.LambdaIntegration(getAll);
rootApi.addMethod('GET',getAllApi);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Using the following code we can also create items in our database. As creating an item in the database translates to POST in REST, we add this functionality on the POST method of our API.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const createApi=rootApi.addResource('create');
const createApiIntegration=new apigateway.LambdaIntegration(createLambda);
createApi.addMethod('POST', createApiIntegration);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;As we are security-conscious developers, we don't expose APIs without authentication on the wild internet. Using the following code, we add throttling to our API, which means that our costs won't skyrocket unexpectedly. We also add an API key to our API gateway. This is, of course, a very basic authentication mechanism, but for the purpose of this demo it is sufficient. For production-grade applications you might want to have a look at AWS Cognito.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const plan = api.addUsagePlan('UsagePlan', {
  name:'EASY',
  throttle:{
    rateLimit:20,
    burstLimit:2
  }
});
const key =api.addApiKey('ApiKey');
plan.addApiKey(key);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now we get to the actual code of our lambda functions. Inside the root folder, create a folder called 'src'. Then create a file called 'create.ts'.&lt;/p&gt;

&lt;p&gt;Add the following to create.ts:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const AWS = require('aws-sdk');
const db = new AWS.DynamoDB.DocumentClient();
const TABLE_NAME = process.env.TABLE_NAME || '';
const PRIMARY_KEY = process.env.PRIMARY_KEY || '';
export const handler = async (event: any={}) : Promise &amp;lt;any&amp;gt; =&amp;gt; {

   const item =typeof event.body == 'object' ? event.body :JSON.parse(event.body);
   const ID = String.fromCharCode(65 + Math.floor(Math.random()*26));
   item[PRIMARY_KEY] = ID;
    const params = {
        TableName: TABLE_NAME,
        Item:item
    };

  try {
    await db.put(params).promise();
    return { statusCode: 200, body: 'success' };
  } catch (dbError) {
    return { statusCode: 500, body: JSON.stringify(dbError)};
  }
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The important part of this code is the ID. DynamoDB is designed differently from traditional relational databases, and therefore you cannot have an identity field that is auto-incremented by the db. You have to provide a unique ID when you insert items into your db.&lt;/p&gt;

&lt;p&gt;Having declared our create function, we then create our get all function. Add a file called get-all.ts and add the following to it.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const AWS = require('aws-sdk');
const db = new AWS.DynamoDB.DocumentClient();
const TABLE_NAME = process.env.TABLE_NAME || '';

export const handler = async () : Promise &amp;lt;any&amp;gt; =&amp;gt; {

  const params = {
    TableName: TABLE_NAME
  };

  try {
    const response = await db.scan(params).promise();
    return { statusCode: 200, body: JSON.stringify(response.Items) };
  } catch (dbError) {
    return { statusCode: 500, body: JSON.stringify(dbError)};
  }
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now we are ready to verify our CDK stack. First run 'npm run build', which compiles TypeScript to JavaScript.&lt;br&gt;
Then run 'cdk synth'. This should give you a CloudFormation template. If it returns an error, go back to the previous steps and verify your work.&lt;br&gt;
Before you can deploy this application, you need to bootstrap your environment. To do that, run the following.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cdk bootstrap aws://youraccountID/yourregion

set CDK_NEW_BOOTSTRAP=1
cdk bootstrap aws://youraccountID/yourregion
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now you are ready to deploy your cdk app. &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cdk deploy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Some more useful cdk commands:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cdk ls //lists the stacks in your application
cdk diff //shows the difference of your current deployed stack and your local stack
cdk doctor //*kind of* verifies your cdk stack for warnings
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;That is it for a very minimal backend in AWS. Of course, you can add more functions to do update and delete, but as they are very similar, they are left out of this demo.&lt;/p&gt;

&lt;p&gt;The source code on github: &lt;a href="https://github.com/pedramha/cdk-crudsample"&gt;https://github.com/pedramha/cdk-crudsample&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The links to the videos can be found at:&lt;/p&gt;

&lt;p&gt;Part1:    &lt;a href="https://www.youtube.com/watch?v=uhE3Z2lGGXQ"&gt;https://www.youtube.com/watch?v=uhE3Z2lGGXQ&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Part2:    &lt;a href="https://www.youtube.com/watch?v=5oKEti1W5BQ"&gt;https://www.youtube.com/watch?v=5oKEti1W5BQ&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
