<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Marcos Ferrero</title>
    <description>The latest articles on DEV Community by Marcos Ferrero (@maf1978).</description>
    <link>https://dev.to/maf1978</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F784110%2F2272f7f8-9469-47d5-ae4e-7d4439d4e80c.jpg</url>
      <title>DEV Community: Marcos Ferrero</title>
      <link>https://dev.to/maf1978</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/maf1978"/>
    <language>en</language>
    <item>
      <title>Hosting a static website on AWS and implementing CI/CD</title>
      <dc:creator>Marcos Ferrero</dc:creator>
      <pubDate>Wed, 24 May 2023 02:36:21 +0000</pubDate>
      <link>https://dev.to/maf1978/hosting-a-static-website-on-aws-and-implementing-cicd-2d27</link>
      <guid>https://dev.to/maf1978/hosting-a-static-website-on-aws-and-implementing-cicd-2d27</guid>
      <description>&lt;p&gt;A static website is a website that does not require any server-side processing. This means that the website's content is stored in files and served directly to the user without any need for a web server to process the files. Static website hosting: S3 provides a simple and easy way to host static websites. You can simply upload your website's files to an S3 bucket and configure S3 to serve your website.&lt;/p&gt;

&lt;p&gt;For this project, I want to show the benefits of using the following services: GitHub, AWS CodePipeline, AWS S3, and CloudFront. Together they speed up setting up a website and allow different developers to contribute changes to its features. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;GitHub&lt;/strong&gt; is a popular code hosting platform that offers a number of benefits for developers, including version control, collaboration, documentation, and security.&lt;/p&gt;

&lt;p&gt;Create a new GitHub repo for this project.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--y_9MPc17--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aumq7ldzjl0grj9ni37p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--y_9MPc17--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aumq7ldzjl0grj9ni37p.png" alt="Image description" width="800" height="63"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;AWS S3 is a fully managed object storage service that provides a durable and scalable storage solution, and it offers a number of features that make it ideal for hosting static websites.&lt;br&gt;
Scalability: S3 is a highly scalable service that can easily handle large volumes of traffic.&lt;br&gt;
Durability: S3 is a durable service that provides 99.999999999% (eleven nines) durability for objects stored in S3.&lt;/p&gt;

&lt;p&gt;Create a bucket, give it a name, and keep all the other default options.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jkOrc69f--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hkuq0lz8ze1bw2pgq46m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jkOrc69f--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hkuq0lz8ze1bw2pgq46m.png" alt="Image description" width="800" height="774"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the bucket is created, go to its Properties tab, scroll down to Static website hosting, and click Edit. Enable it and enter the name of the index file containing the HTML code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wxZCloy2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3o6skrquc13ehz4old6s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wxZCloy2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3o6skrquc13ehz4old6s.png" alt="Image description" width="800" height="496"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8TydL34R--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6qek7wodvb1jtrw7jupn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8TydL34R--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6qek7wodvb1jtrw7jupn.png" alt="Image description" width="800" height="704"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Even though the bucket can be accessed by the public, that does not mean your objects are also accessible. The objects themselves must be readable too, otherwise the S3 URL will return an error. So here we will define the policy that grants access at the object level, as shown below.&lt;/p&gt;

&lt;p&gt;In the Resource field, be sure to use your own bucket name.&lt;/p&gt;
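&lt;p&gt;For reference, a minimal public-read bucket policy looks like the following sketch (the bucket name is a placeholder to replace with your own):&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME/*"
    }
  ]
}
```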

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--O1ukmnV4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gx67ifgiauj0ay7is6ok.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--O1ukmnV4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gx67ifgiauj0ay7is6ok.png" alt="Image description" width="800" height="445"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now upload your files to your bucket as shown, and after the upload completes, click on the S3 URL.&lt;/p&gt;
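&lt;p&gt;If you prefer the command line, the same upload can be done with the AWS CLI (a sketch, assuming the CLI is installed and configured; the folder and bucket names are placeholders):&lt;/p&gt;

```shell
# Sync the local site folder to the bucket; --delete removes remote files
# that no longer exist locally. Replace the bucket name with your own.
aws s3 sync ./site s3://your-bucket-name --delete
```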

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zL1ZqjaK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/voqgc0sawsskmychhtes.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zL1ZqjaK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/voqgc0sawsskmychhtes.png" alt="Image description" width="800" height="263"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6PWBmboQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vl0n04r6d58kj5jblukd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6PWBmboQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vl0n04r6d58kj5jblukd.png" alt="Image description" width="800" height="354"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS CloudFront&lt;/strong&gt; is a content delivery network (CDN) that can be used to improve the performance and security of static websites. CloudFront caches your website's content in edge locations around the world, which can significantly improve the loading time of your website for users in different regions. CloudFront also provides a number of security features, such as:&lt;/p&gt;

&lt;p&gt;SSL/TLS encryption: CloudFront can encrypt all traffic between users and your website, which helps to protect user data.&lt;br&gt;
DDoS protection: CloudFront can help to protect your website from distributed denial-of-service (DDoS) attacks.&lt;/p&gt;

&lt;p&gt;By using AWS S3 and CloudFront together, you can easily create a highly scalable, durable, and secure static website. Below are some of the benefits of using AWS S3 and CloudFront to host a static website:&lt;/p&gt;

&lt;p&gt;Scalability: S3 and CloudFront are both highly scalable services, so you can easily add more capacity as your website grows.&lt;/p&gt;

&lt;p&gt;Durability: S3 and CloudFront are both highly durable services, so your website remains available even if an individual facility experiences an outage.&lt;/p&gt;

&lt;p&gt;Security: S3 and CloudFront both offer a number of security features, so you can be confident that your website is safe from attack.&lt;/p&gt;

&lt;p&gt;Cost-effectiveness: S3 and CloudFront are both very cost-effective services, so you can save money on hosting your website.&lt;br&gt;
If you are looking for a reliable and affordable way to host a static website, then AWS S3 and CloudFront are a great option.&lt;/p&gt;

&lt;p&gt;Now, go to CloudFront and create a distribution. For the origin, the S3 URL will automatically appear in the drop-down. Unmark Block all public access and check the acknowledgement box.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WUzknUzE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ivgp7gy0umd11mvdjemx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WUzknUzE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ivgp7gy0umd11mvdjemx.png" alt="Image description" width="800" height="977"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_HVSHYDY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q06wh3kxju31tg7nlnle.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_HVSHYDY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q06wh3kxju31tg7nlnle.png" alt="Image description" width="674" height="985"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Because CloudFront caches content, there can be a delay before your changed content is displayed. Therefore, I will be disabling caching as shown above.&lt;/p&gt;
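&lt;p&gt;Alternatively, if you keep caching enabled, stale content can be cleared with a CloudFront invalidation; a sketch with the AWS CLI (the distribution ID is a placeholder):&lt;/p&gt;

```shell
# Invalidate every cached path so the next request fetches fresh content
# from the S3 origin. Replace the distribution ID with your own.
aws cloudfront create-invalidation --distribution-id E1234EXAMPLE --paths "/*"
```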

&lt;p&gt;After this is created, you will get the Distribution domain name. Open your browser and check if your website is up and running.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0bRfDKCV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3yn1xfnfjljwjbmrcvi6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0bRfDKCV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3yn1xfnfjljwjbmrcvi6.png" alt="Image description" width="800" height="481"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;AWS CodePipeline is a continuous delivery service that helps you automate your release pipelines for fast and reliable application and infrastructure updates. It provides a visual pipeline editor that makes it easy to model your software release process.&lt;/p&gt;

&lt;p&gt;Now, I will create a pipeline to automate the release process, so any developer can push new features and have them deployed automatically.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ZsFyk0CL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7q13g88hp9l0d0xcf8h7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ZsFyk0CL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7q13g88hp9l0d0xcf8h7.png" alt="Image description" width="800" height="587"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the Source section, you can choose your repository provider. Since my code is hosted on GitHub, I will link my GitHub repo here.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5_Ak_27Z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/iwnbg43f0j4n4bicpos5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5_Ak_27Z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/iwnbg43f0j4n4bicpos5.png" alt="Image description" width="794" height="885"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--68akyBbh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dh1enruhbvm0q04xxivf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--68akyBbh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dh1enruhbvm0q04xxivf.png" alt="Image description" width="800" height="593"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click on Next, review the configuration, and click to create your pipeline. When you make changes to your source code and push them to the repo, CodePipeline will automatically detect them and trigger the pipeline.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1CDl-Rdb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gb12o5ty47b7omzartyq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1CDl-Rdb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gb12o5ty47b7omzartyq.png" alt="Image description" width="758" height="1149"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CFeAnuFx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nirul85c5pkgxlxb2w3t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CFeAnuFx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nirul85c5pkgxlxb2w3t.png" alt="Image description" width="684" height="705"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After creating the pipeline, I added some files to the GitHub repository and checked the pipeline, which successfully picked up and deployed the new updates. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YvMSba78--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fds5rvuyowhpb7gekzbi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YvMSba78--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fds5rvuyowhpb7gekzbi.png" alt="Image description" width="800" height="252"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Lz-IIKhm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l6nklmvxax4ut8xejfph.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Lz-IIKhm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l6nklmvxax4ut8xejfph.png" alt="Image description" width="800" height="551"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Thank you for reading this blog.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Serverless API CRUD with lambda and DynamoDB</title>
      <dc:creator>Marcos Ferrero</dc:creator>
      <pubDate>Wed, 26 Apr 2023 01:54:46 +0000</pubDate>
      <link>https://dev.to/maf1978/serverless-api-crud-with-lambda-and-dynamodb-5bme</link>
      <guid>https://dev.to/maf1978/serverless-api-crud-with-lambda-and-dynamodb-5bme</guid>
      <description>&lt;p&gt;In this blog post, I will show you how to build a serverless CRUD API with AWS Lambda and Amazon DynamoDB using the AWS Console. This is a great way to build APIs quickly and easily without having to worry about managing servers or infrastructure.&lt;/p&gt;

&lt;p&gt;An Amazon API Gateway is a collection of resources and methods. For this tutorial, you create one resource (DynamoDBManager) and define one method (POST) on it. The method is backed by a Lambda function (LambdaFunctionOverHttps). That is, when you call the API through an HTTPS endpoint, Amazon API Gateway invokes the Lambda function.&lt;/p&gt;

&lt;p&gt;The POST method on the DynamoDBManager resource supports the following DynamoDB operations:&lt;/p&gt;

&lt;p&gt;Create, update, and delete an item.&lt;br&gt;
Read an item.&lt;br&gt;
Scan an item.&lt;br&gt;
Other operations (echo, ping), not related to DynamoDB, that you can use for testing.&lt;/p&gt;

&lt;p&gt;Prerequisites&lt;br&gt;
An AWS account&lt;br&gt;
The AWS Console&lt;/p&gt;

&lt;p&gt;Step 1: Create Lambda IAM Role&lt;br&gt;
Create the execution role that gives your function permission to access AWS resources.&lt;/p&gt;

&lt;p&gt;To create an execution role: open the Roles page in the IAM console and choose Create role. Create a role with the following properties. Trusted entity: Lambda. Role name: lambda-apigateway-role.&lt;br&gt;
Permissions: a custom policy granting access to DynamoDB and CloudWatch Logs. This custom policy has the permissions that the function needs to write data to DynamoDB and upload logs.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
"Version": "2012-10-17",
"Statement": [
{
  "Sid": "Stmt1428341300017",
  "Action": [
    "dynamodb:DeleteItem",
    "dynamodb:GetItem",
    "dynamodb:PutItem",
    "dynamodb:Query",
    "dynamodb:Scan",
    "dynamodb:UpdateItem"
  ],
  "Effect": "Allow",
  "Resource": "*"
},
{
  "Sid": "",
  "Resource": "*",
  "Action": [
    "logs:CreateLogGroup",
    "logs:CreateLogStream",
    "logs:PutLogEvents"
  ],
  "Effect": "Allow"
}
]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Step 2: Create a DynamoDB table&lt;br&gt;
First, we need to create a DynamoDB table to store our data. We can do this using the AWS Console:&lt;/p&gt;

&lt;p&gt;To create a DynamoDB table:&lt;br&gt;
Open the DynamoDB console.&lt;br&gt;
Choose Create table.&lt;br&gt;
Create a table with the following settings:&lt;br&gt;
Table name: lambda-apigateway&lt;br&gt;
Partition key: id (String)&lt;br&gt;
Click on "Create".&lt;/p&gt;
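&lt;p&gt;The same table can also be created from the AWS CLI; a sketch, assuming the CLI is installed and configured (the on-demand billing mode is my assumption, not part of the original walkthrough):&lt;/p&gt;

```shell
# Create the lambda-apigateway table with "id" (string) as the partition key.
aws dynamodb create-table \
    --table-name lambda-apigateway \
    --attribute-definitions AttributeName=id,AttributeType=S \
    --key-schema AttributeName=id,KeyType=HASH \
    --billing-mode PAY_PER_REQUEST
```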

&lt;p&gt;Step 3: Create a Lambda function&lt;br&gt;
Next, we need to create a Lambda function to handle our API requests. We can do this using the AWS Console:&lt;/p&gt;

&lt;p&gt;Go to the Lambda console.&lt;br&gt;
Click on "Create Function".&lt;br&gt;
In the "Function name" field, enter a name for your function.&lt;br&gt;
In the "Runtime" drop-down, select "Python 3.7".&lt;br&gt;
Add the script with python code.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from __future__ import print_function

import boto3
import json

print('Loading function')


def lambda_handler(event, context):
    '''Provide an event that contains the following keys:

      - operation: one of the operations in the operations dict below
      - tableName: required for operations that interact with DynamoDB
      - payload: a parameter to pass to the operation being performed
    '''
    #print("Received event: " + json.dumps(event, indent=2))

    operation = event['operation']

    if 'tableName' in event:
        dynamo = boto3.resource('dynamodb').Table(event['tableName'])

    operations = {
        'create': lambda x: dynamo.put_item(**x),
        'read': lambda x: dynamo.get_item(**x),
        'update': lambda x: dynamo.update_item(**x),
        'delete': lambda x: dynamo.delete_item(**x),
        'list': lambda x: dynamo.scan(**x),
        'echo': lambda x: x,
        'ping': lambda x: 'pong'
    }

    if operation in operations:
        return operations[operation](event.get('payload'))
    else:
        raise ValueError('Unrecognized operation "{}"'.format(operation))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
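&lt;p&gt;The dispatch pattern above can be smoke-tested locally before any AWS resources exist. A minimal sketch restricted to the operations that never touch DynamoDB, so no boto3 or credentials are needed (the function name handle is illustrative):&lt;/p&gt;

```python
def handle(event, context=None):
    # Same dispatch pattern as the Lambda handler above, restricted to the
    # echo and ping operations, which do not require a DynamoDB table.
    operations = {
        'echo': lambda x: x,
        'ping': lambda x: 'pong',
    }
    operation = event['operation']
    if operation in operations:
        return operations[operation](event.get('payload'))
    raise ValueError('Unrecognized operation "{}"'.format(operation))

print(handle({'operation': 'echo', 'payload': {'somekey1': 'somevalue1'}}))
# {'somekey1': 'somevalue1'}
print(handle({'operation': 'ping'}))
# pong
```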



&lt;p&gt;Attach the existing role lambda-apigateway-role, which allows the function to create, update, and delete objects in the DynamoDB table.&lt;br&gt;
Click on "Create".&lt;/p&gt;

&lt;p&gt;Test Lambda Function&lt;br&gt;
Let's test our newly created function. We haven't created DynamoDB and the API yet, so we'll do a sample echo operation. The function should output whatever input we pass.&lt;/p&gt;

&lt;p&gt;Paste the following JSON into the test event. The "operation" field dictates what the Lambda function will perform; in this case, it simply returns the payload from the input event as output. Click "Create" to save.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "operation": "echo",
    "payload": {
        "somekey1": "somevalue1",
        "somekey2": "somevalue2"
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--sX_8dMC5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/12tbewmst1tfgd93csrr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--sX_8dMC5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/12tbewmst1tfgd93csrr.png" alt="Image description" width="800" height="333"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Step 4: Create an API Gateway API&lt;br&gt;
Finally, we need to create an API Gateway API to expose our Lambda function to the internet. We can do this using the AWS Console:&lt;/p&gt;

&lt;p&gt;Go to the API Gateway console.&lt;br&gt;
Click on "Create API".&lt;br&gt;
In the "API Name" field, enter a name for your API.&lt;br&gt;
In the "API Type" drop-down, select "REST API".&lt;br&gt;
Click on "Create".&lt;/p&gt;

&lt;p&gt;Step 5: Create a method&lt;br&gt;
Now that we have created an API, we need to create a method to handle our API requests. We can do this using the AWS Console:&lt;/p&gt;

&lt;p&gt;In the "Methods" tab, click on "Create Method".&lt;br&gt;
In the "Method Type" drop-down, select "POST".&lt;br&gt;
In the "Path" field, enter the path for your method.&lt;br&gt;
Click on "Create".&lt;/p&gt;

&lt;p&gt;Step 6: Integrate the method with Lambda&lt;br&gt;
Now that we have created a method, we need to integrate it with our Lambda function. We can do this using the AWS Console:&lt;/p&gt;

&lt;p&gt;In the "Integrations" tab, click on "Create Integration".&lt;br&gt;
In the "Integration Type" drop-down, select "Lambda Function".&lt;br&gt;
In the "Lambda Function" field, select the name of your Lambda function.&lt;br&gt;
Click on "Create".&lt;/p&gt;

&lt;p&gt;Step 7: Deploy the API&lt;br&gt;
Now that we have configured our API, we need to deploy it. We can do this using the AWS Console:&lt;/p&gt;

&lt;p&gt;In the "Actions" menu, click on "Deploy API".&lt;br&gt;
In the "Deployment stage" drop-down, select a stage.&lt;br&gt;
Click on "Deploy".&lt;/p&gt;

&lt;p&gt;Testing the API&lt;br&gt;
The Lambda function supports using the create operation to create an item in your DynamoDB table. To request this operation, use the following JSON:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "operation": "create",
    "tableName": "lambda-apigateway",
    "payload": {
        "Item": {
            "id": "1234ABCD",
            "number": 5
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To test the solution, I used a VS Code extension called Thunder Client, a lightweight REST API client extension for Visual Studio Code, hand-crafted by Ranga Vadhineni with a simple and clean design.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TVlFVKh8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rg8ecnntns3cncb019oi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TVlFVKh8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rg8ecnntns3cncb019oi.png" alt="Image description" width="800" height="363"&gt;&lt;/a&gt;&lt;br&gt;
Or also can use the Code snippet&lt;br&gt;
curl -X GET https://{API_GATEWAY_ENDPOINT}/{PATH}&lt;br&gt;
Use code with caution.&lt;br&gt;
This will return the response from our Lambda function.&lt;/p&gt;

&lt;p&gt;To validate that the item was indeed inserted into the DynamoDB table, go to the DynamoDB console, select the "lambda-apigateway" table, select the "Items" tab, and the newly inserted item should be displayed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--94EJgh93--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zdp31poyo82khliwkar2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--94EJgh93--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zdp31poyo82khliwkar2.png" alt="Image description" width="800" height="278"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Cleanup&lt;br&gt;
Let's clean up the resources we have created for this lab.&lt;/p&gt;

&lt;p&gt;Cleaning up DynamoDB&lt;br&gt;
To delete the table, from the DynamoDB console, select the table "lambda-apigateway" and click "Delete table".&lt;/p&gt;

&lt;p&gt;To delete the Lambda function, from the Lambda console, select "LambdaFunctionOverHttps", click "Actions", then click "Delete".&lt;/p&gt;

&lt;p&gt;To delete the API we created, in the API Gateway console, under APIs, select the "DynamoDBOperations" API, click "Actions", then "Delete".&lt;/p&gt;

&lt;p&gt;Conclusion&lt;br&gt;
In this blog post, we showed you how to build a serverless CRUD API with AWS Lambda and Amazon DynamoDB using the AWS Console. This is a great way to build APIs quickly and easily without having to worry about managing servers or infrastructure.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>AWS Internship 2022</title>
      <dc:creator>Marcos Ferrero</dc:creator>
      <pubDate>Tue, 29 Nov 2022 03:29:46 +0000</pubDate>
      <link>https://dev.to/maf1978/aws-internship-2022-3pfp</link>
      <guid>https://dev.to/maf1978/aws-internship-2022-3pfp</guid>
      <description>&lt;p&gt;Summer of 2022, I completed an internship with AWS in Herndon, VA. I was assigned to a Data Lake Accelerator project for one of the government agencies which was very exciting and a great experience.&lt;/p&gt;

&lt;p&gt;During my internship, I learned AWS processes, different coding practices, cloud architecture, and how to code better. In addition, I experienced the corporate culture and engaged with the knowledgeable developers who work for AWS. &lt;/p&gt;

&lt;p&gt;My role responsibilities consisted of deploying data lake infrastructure and automatically building, testing, and deploying new or changed versions of the application from the version control system. I utilized AWS CodeCommit and CodePipeline to set up CI/CD, automating the steps to build, push code to a repository, and deploy the updated code to AWS. This helps minimize potential mistakes compared with running multiple manual steps, and it automates deployment to different stages (dev, staging, and production). &lt;/p&gt;

&lt;p&gt;I learned AWS CDK and used it to deploy pipelines. The AWS Cloud Development Kit (AWS CDK) is an open-source software development framework to define cloud infrastructure in familiar programming languages and provision it through AWS CloudFormation. The AWS CDK consists of three major components: the core framework for modeling reusable infrastructure components; a CLI for deploying CDK applications; and the AWS Construct Library, a set of high-level components that abstract cloud resources and encapsulate proven defaults. The CDK makes it easy to deploy an application to the AWS Cloud from your workstation by simply running cdk deploy. This is good when you’re doing initial development and testing, but you should deploy production workloads through more reliable, automated pipelines.&lt;/p&gt;
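&lt;p&gt;A typical CDK session from the workstation looks roughly like this (a sketch, assuming the CDK CLI is installed and AWS credentials are configured):&lt;/p&gt;

```shell
cdk init app --language python   # scaffold a new CDK application
cdk synth                        # synthesize the CloudFormation template
cdk deploy                       # provision the stack through CloudFormation
```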

&lt;p&gt;At some point in the project, I was instructed to test the Glue ETL process and to query data using Athena. AWS Glue is a serverless data integration service that makes it easy for analytics users to discover, prepare, move, and integrate data from multiple sources. You can use it for analytics, machine learning, and application development. It also includes additional productivity and DataOps tooling for authoring, running jobs, and implementing business workflows. &lt;/p&gt;

&lt;p&gt;Athena helps to analyze unstructured, semi-structured, and structured data stored in Amazon S3. Athena integrates with the AWS Glue Data Catalog, which offers a persistent metadata store for your data in Amazon S3. This allows you to create tables and&lt;br&gt;
query data in Athena based on a central metadata store available throughout your Amazon Web Services account.&lt;/p&gt;

&lt;p&gt;As part of my internship, I attended daily 1:1s and scheduled team meetings tracked in Jira (project management software), and gathered with my internship team to discuss other projects of interest. This gave me the opportunity to take in all of the feedback and put it into practice. &lt;/p&gt;

&lt;p&gt;Coming into this internship with zero experience, I definitely feel this internship provided me with the tools I need to grow my career as a solution architect and developer.&lt;/p&gt;

</description>
      <category>internship</category>
    </item>
    <item>
      <title>Hands-On Approach with AWS Cloud Resume Challenge</title>
      <dc:creator>Marcos Ferrero</dc:creator>
      <pubDate>Tue, 26 Apr 2022 01:54:33 +0000</pubDate>
      <link>https://dev.to/maf1978/how-i-develop-practical-skills-in-aws-with-the-cloud-resume-challenge-1o08</link>
      <guid>https://dev.to/maf1978/how-i-develop-practical-skills-in-aws-with-the-cloud-resume-challenge-1o08</guid>
      <description>&lt;p&gt;At this stage in my life, I decided to switch careers to focus on the Tech world and what it had to offer. I enrolled at Miami-Dade College with basic knowledge of Cloud Computing. I knew coming into this new world, I had to take a hands-on approach if I wanted to succeed in this new career. &lt;/p&gt;

&lt;p&gt;By attending classes, studying, and exposing myself to the unlimited resources, I became a certified &lt;a href="https://www.credly.com/badges/08e6028b-48f2-4209-a7f8-7a472f8256b5/public_url"&gt;AWS Cloud Practitioner&lt;/a&gt; (September 2021) and &lt;a href="https://www.credly.com/badges/eb36423a-1440-48db-bfb0-17f0899b83cc/public_url"&gt;AWS Solution Architect&lt;/a&gt; (April 2022). One of my first projects that I was tasked with was the Cloud Resume Challenge by Forrest Brazeal.&lt;/p&gt;

&lt;p&gt;Starting the Challenge&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Front end- HTML / CSS&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I found a Bootstrap template I liked and modified it to showcase my projects by removing additional pages and links. I tried to keep it as simple as possible. I didn't have previous experience in HTML/CSS; therefore, online resources helped me with editing the template. The HTML page would also include a JavaScript snippet which updated and fetched the visitor count from the "back end."&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Static S3 Website&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Based on the knowledge I gained from the unlimited resources and from preparing for the AWS certifications, I learned how to host a static website on S3 and use CloudFront for content distribution. I purchased a domain name using AWS Route 53 and configured it to point to the CloudFront distribution. I also used AWS Certificate Manager (ACM) to procure an SSL certificate for the site.&lt;/p&gt;
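&lt;p&gt;As a rough sketch, the S3 side of that setup boils down to enabling website hosting and public reads. In CloudFormation terms it might look like the fragment below; the bucket name is a placeholder, and the CloudFront, Route 53, and ACM pieces are omitted:&lt;/p&gt;

```yaml
# Hypothetical CloudFormation fragment for the S3 website hosting described above
Resources:
  SiteBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: example-resume-site      # placeholder; bucket names are global
      WebsiteConfiguration:
        IndexDocument: index.html
        ErrorDocument: error.html
  SiteBucketPolicy:
    Type: AWS::S3::BucketPolicy
    Properties:
      Bucket: !Ref SiteBucket
      PolicyDocument:
        Statement:
          - Effect: Allow
            Principal: '*'
            Action: s3:GetObject
            Resource: !Sub '${SiteBucket.Arn}/*'
```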

&lt;p&gt;&lt;strong&gt;Back-end&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;DynamoDB&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I had some knowledge of DynamoDB but very little practical, hands-on experience. I created a table with a partition key (date) and a count attribute, so each day's visitors are tracked in their own item. For the code to retrieve items from the table and update the count, I searched online and found examples in Python.&lt;/p&gt;
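&lt;p&gt;A minimal sketch of that update logic, assuming a table whose partition key is &lt;code&gt;date&lt;/code&gt; and whose counter attribute is &lt;code&gt;count&lt;/code&gt; (in real code, &lt;code&gt;table&lt;/code&gt; would come from &lt;code&gt;boto3&lt;/code&gt;; the names here are placeholders, not necessarily the project's own):&lt;/p&gt;

```python
def increment_visit_count(table, day, by=1):
    """Atomically add `by` to a day's visitor count and return the new total.

    `table` is any object with a DynamoDB-style update_item method, e.g.
    boto3.resource("dynamodb").Table("VisitorCount") -- "VisitorCount"
    is an assumed name for illustration.
    """
    response = table.update_item(
        Key={"date": day},
        # ADD creates the attribute at 0 if the item doesn't exist yet,
        # so the first visit of a day returns 1.
        UpdateExpression="ADD #c :inc",
        # "count" collides with a DynamoDB reserved word, hence the alias
        ExpressionAttributeNames={"#c": "count"},
        ExpressionAttributeValues={":inc": by},
        ReturnValues="UPDATED_NEW",
    )
    return int(response["Attributes"]["count"])
```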

&lt;p&gt;&lt;strong&gt;Lambda&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The Lambda function was also challenging for me because I had no experience with it. After some research, I created a function in Python that updates the count in the table and, at the same time, returns the new count. I added an IAM role so the function could access the DynamoDB table and tested that it worked.&lt;/p&gt;
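&lt;p&gt;The handler might look roughly like this; the table name and the exact response shape are assumptions, and &lt;code&gt;table&lt;/code&gt; can be injected so the logic is testable without AWS credentials:&lt;/p&gt;

```python
import json
from datetime import date

def lambda_handler(event, context, table=None):
    """Increment today's visitor count and return it as a JSON response."""
    if table is None:
        # Deferred import so the module loads without AWS credentials;
        # "VisitorCount" is a placeholder table name.
        import boto3
        table = boto3.resource("dynamodb").Table("VisitorCount")
    today = date.today().isoformat()
    response = table.update_item(
        Key={"date": today},
        UpdateExpression="ADD #c :inc",
        ExpressionAttributeNames={"#c": "count"},
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(response["Attributes"]["count"])
    return {
        "statusCode": 200,
        # CORS header so the browser accepts the cross-origin response
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```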

&lt;p&gt;&lt;strong&gt;API Gateway&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;API Gateway was another service that was new to me, but I was determined to learn it. I had to familiarize myself with the concept of an API before I understood its goal and function.&lt;br&gt;
The API Gateway exposes a REST API endpoint, which is called by the JavaScript snippet embedded in the front-end HTML page on every page visit/refresh to update and fetch the visitor count from the DynamoDB table through the Lambda function. Enabling CORS (Cross-Origin Resource Sharing) on the API Gateway resource is mandatory for the browser to accept the response.&lt;br&gt;
After this, the back end was done: calling the API endpoint triggered the Lambda function, which increased the count by 1 and returned the new value.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tests&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I had previous experience with unit tests, so I wrote some tests for the Lambda function.&lt;/p&gt;
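&lt;p&gt;As an illustration of what such tests look like, here is a small &lt;code&gt;unittest&lt;/code&gt; sketch; the &lt;code&gt;increment&lt;/code&gt; function is a simplified stand-in for the Lambda's counting logic, not the project's actual code:&lt;/p&gt;

```python
import unittest

def increment(counts, day):
    """Simplified stand-in for the Lambda's counting logic."""
    counts[day] = counts.get(day, 0) + 1
    return counts[day]

class TestVisitorCount(unittest.TestCase):
    def test_first_visit_of_a_day_starts_at_one(self):
        self.assertEqual(increment({}, "2022-04-26"), 1)

    def test_repeat_visits_accumulate(self):
        counts = {}
        increment(counts, "2022-04-26")
        self.assertEqual(increment(counts, "2022-04-26"), 2)

    def test_days_are_counted_independently(self):
        counts = {"2022-04-26": 5}
        self.assertEqual(increment(counts, "2022-04-27"), 1)
```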

&lt;p&gt;&lt;strong&gt;Infrastructure as Code&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;At this point, the back-end infrastructure creation was fully automated with AWS SAM, while the front end was still deployed entirely by hand.&lt;/p&gt;
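&lt;p&gt;A stripped-down sketch of what such a SAM template can look like; the resource names, runtime, paths, and route are placeholders, not my exact template:&lt;/p&gt;

```yaml
# Hypothetical SAM template fragment for the back end described above
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  VisitorCountTable:
    Type: AWS::Serverless::SimpleTable
    Properties:
      PrimaryKey:
        Name: date
        Type: String

  VisitorCountFunction:
    Type: AWS::Serverless::Function
    Properties:
      Runtime: python3.9
      Handler: app.lambda_handler
      CodeUri: backend/
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref VisitorCountTable
      Events:
        GetCount:
          Type: Api            # implicit API Gateway REST endpoint
          Properties:
            Path: /count
            Method: get
```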

&lt;p&gt;&lt;strong&gt;Source control&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I used Git from the beginning and stored my code and all my files on GitHub →(&lt;a href="https://github.com/maf1978/CloudResumeChallenge.git"&gt;https://github.com/maf1978/CloudResumeChallenge.git&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;CI/CD&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I had never used GitHub Actions, but I had used a similar CI/CD tool, so this step was quite fast and fun. I created my workflow and first ran the unit tests for the Lambda function. In the second step, I used the predefined SAM action to build and deploy the infrastructure to AWS. In the last step, I uploaded the static files to S3.&lt;/p&gt;
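&lt;p&gt;The workflow followed roughly this shape; the file paths, bucket name, region, and secret names below are placeholders for illustration, not the values from my repository:&lt;/p&gt;

```yaml
# .github/workflows/deploy.yml -- hypothetical sketch of the pipeline
name: deploy
on:
  push:
    branches: [main]

jobs:
  test-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.9'
      - name: Run Lambda unit tests
        run: python -m unittest discover backend/tests
      - uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: Build and deploy with SAM
        run: sam build && sam deploy --no-confirm-changeset
      - name: Upload static files to S3
        run: aws s3 sync frontend/ s3://example-resume-site   # placeholder bucket
```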

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Although this was my first project, it gave me confidence: I familiarized myself with new terminology, gained experience in the areas where I lacked skills, and learned how to overcome the challenges of completing each step in the process, all with a hands-on approach in AWS.&lt;/p&gt;

&lt;p&gt;Thanks for reading! Please visit my CV website (&lt;a href="http://www.mafcloudsolutions.com"&gt;www.mafcloudsolutions.com&lt;/a&gt;) for more of my Cloud adventures!&lt;/p&gt;

&lt;p&gt;My Github repo with the challenge:(&lt;a href="https://github.com/maf1978/CloudResumeChallenge.git"&gt;https://github.com/maf1978/CloudResumeChallenge.git&lt;/a&gt;) &lt;/p&gt;

</description>
      <category>aws</category>
      <category>beginners</category>
      <category>serverless</category>
    </item>
  </channel>
</rss>
