<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Deepak Sharma</title>
    <description>The latest articles on DEV Community by Deepak Sharma (@ideepaksharma).</description>
    <link>https://dev.to/ideepaksharma</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F827851%2Fb491f630-9b35-4dce-a503-cf9869c36329.jpg</url>
      <title>DEV Community: Deepak Sharma</title>
      <link>https://dev.to/ideepaksharma</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ideepaksharma"/>
    <language>en</language>
    <item>
      <title>Building Scalable Event Processing with Fan-out Pattern using the Serverless Framework</title>
      <dc:creator>Deepak Sharma</dc:creator>
      <pubDate>Fri, 03 Jan 2025 10:05:59 +0000</pubDate>
      <link>https://dev.to/ideepaksharma/building-scalable-event-processing-with-fan-out-pattern-using-the-serverless-framework-1o9f</link>
      <guid>https://dev.to/ideepaksharma/building-scalable-event-processing-with-fan-out-pattern-using-the-serverless-framework-1o9f</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;Want to process the same event across multiple services simultaneously? The fan-out pattern using AWS Serverless services might be exactly what you need. In this tutorial, we’ll build a robust asynchronous fan-out system using AWS Lambda, SNS, and SQS, all orchestrated with the Serverless Framework.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;What is the Fan-out Pattern?&lt;/strong&gt;&lt;br&gt;
The fan-out pattern is a messaging pattern in which a single published message triggers multiple parallel processing flows. A single topic pushes each message it receives to every subscribed queue, so one publish call from your code results in multiple functions being invoked asynchronously. Think of it as a broadcaster sending the same message to many receivers at once. This pattern is particularly useful when you need to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Process the same data in different ways&lt;/li&gt;
&lt;li&gt;Trigger multiple workflows from a single event&lt;/li&gt;
&lt;li&gt;Distribute notifications to multiple subscribers&lt;/li&gt;
&lt;li&gt;Scale your event processing horizontally&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Node.js installed on your machine&lt;/li&gt;
&lt;li&gt;AWS account with appropriate permissions&lt;/li&gt;
&lt;li&gt;Basic understanding of JavaScript/Node.js&lt;/li&gt;
&lt;li&gt;AWS CLI installed and configured&lt;/li&gt;
&lt;li&gt;Serverless Framework CLI installed&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Architecture Overview&lt;/strong&gt;&lt;br&gt;
The Serverless Framework lets you declare, in one place, the resources of an asynchronous microservice, the triggers that invoke it (for example, a message arriving on a queue or published to a topic), and the messaging components that wire the asynchronous communication together.&lt;/p&gt;

&lt;p&gt;Our solution uses these AWS services:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;AWS Lambda&lt;/strong&gt;: For processing events&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon SNS (Simple Notification Service)&lt;/strong&gt;: For message broadcasting&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon SQS (Simple Queue Service)&lt;/strong&gt;: For reliable message delivery&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Serverless Framework&lt;/strong&gt;: For infrastructure as code&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;At a high level, the implementation works as follows (architecture diagram below):&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A trigger Lambda receives an event&lt;/li&gt;
&lt;li&gt;The event is published to an SNS topic&lt;/li&gt;
&lt;li&gt;SNS broadcasts the message to multiple SQS queues&lt;/li&gt;
&lt;li&gt;Separate Lambda functions process messages from each queue&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw22ood3wbi8v3zveuszs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw22ood3wbi8v3zveuszs.png" alt=" " width="765" height="646"&gt;&lt;/a&gt;&lt;/p&gt;
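&lt;p&gt;Before wiring up AWS, the flow above can be sketched in a few lines of plain Node.js, with the topic and queues simulated in memory (illustration only; the real implementation using SNS and SQS follows below):&lt;/p&gt;

```javascript
// In-memory sketch of the fan-out flow: one publish call delivers a copy of
// the message to every subscribed queue (illustration only, no AWS calls).
class Topic {
    constructor() { this.queues = []; }
    subscribe(queue) { this.queues.push(queue); }
    publish(message) {
        // Each subscriber gets its own copy, so consumers stay independent
        this.queues.forEach(queue => queue.push(JSON.stringify(message)));
    }
}

const queueOne = [];
const queueTwo = [];
const topic = new Topic();
topic.subscribe(queueOne);
topic.subscribe(queueTwo);

// Steps 1 and 2: the trigger publishes a single event to the topic
topic.publish({ id: 1, data: 'test data 1' });

// Steps 3 and 4: each processor consumes its own copy from its own queue
console.log('processorOne sees:', JSON.parse(queueOne[0]));
console.log('processorTwo sees:', JSON.parse(queueTwo[0]));
```

&lt;p&gt;In AWS terms, Topic plays the role of SNS, the arrays play the role of the SQS queues, and the two console.log lines stand in for the processor Lambdas.&lt;/p&gt;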

&lt;p&gt;&lt;strong&gt;Common Use Cases&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Event notification systems&lt;/li&gt;
&lt;li&gt;Data processing pipelines&lt;/li&gt;
&lt;li&gt;Microservices communication&lt;/li&gt;
&lt;li&gt;Log processing and analytics&lt;/li&gt;
&lt;li&gt;Real-time data distribution&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Implementation&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Step 1: Project Setup&lt;/strong&gt;&lt;br&gt;
First, let’s create our project structure and install the necessary dependencies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir serverless-async-fanout
cd serverless-async-fanout
npm init -y
npm install aws-sdk serverless
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the dependencies are installed, create the project structure shown below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.
├── serverless.yml
├── src
│   ├── trigger.js
│   ├── processorOne.js
│   └── processorTwo.js
└── package.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Project Structure:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;serverless.yml: Configuration for functions, resources, and permissions. It is the backbone of the application, defining our infrastructure as code.&lt;/li&gt;
&lt;li&gt;src/trigger.js: Handles incoming requests and publishes to SNS&lt;/li&gt;
&lt;li&gt;src/processorOne.js: Processes individual messages from SQS One&lt;/li&gt;
&lt;li&gt;src/processorTwo.js: Processes individual messages from SQS Two&lt;/li&gt;
&lt;li&gt;package.json: Project dependencies&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Configure serverless.yml&lt;/strong&gt;&lt;br&gt;
The serverless.yml file configures the runtime, the trigger and processor functions, the AWS Region, and the SNS and SQS resources. Here is what my file looks like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;service: serverless-async-fanout

provider:
  name: aws
  runtime: nodejs18.x
  region: eu-central-1
  iam:
    role:
      statements:
        - Effect: Allow
          Action:
            - sns:Publish
            - sqs:SendMessage
            - sqs:ReceiveMessage
            - sqs:DeleteMessage
          Resource:
            - !Ref NotificationTopic
            - !GetAtt ProcessingQueueOne.Arn
            - !GetAtt ProcessingQueueTwo.Arn

functions:
  trigger:
    handler: src/trigger.handler
    events:
      - http:
          path: /trigger
          method: post
    environment:
      SNS_TOPIC_ARN: !Ref NotificationTopic

  processorOne:
    handler: src/processorOne.handler
    events:
      - sqs:
          arn: !GetAtt ProcessingQueueOne.Arn
          batchSize: 1

  processorTwo:
    handler: src/processorTwo.handler
    events:
      - sqs:
          arn: !GetAtt ProcessingQueueTwo.Arn
          batchSize: 1

resources:
  Resources:
    NotificationTopic:
      Type: AWS::SNS::Topic
      Properties:
        TopicName: ${self:service}-notification-topic

    ProcessingQueueOne:
      Type: AWS::SQS::Queue
      Properties:
        QueueName: ${self:service}-processing-queue-one
        VisibilityTimeout: 60

    ProcessingQueueTwo:
      Type: AWS::SQS::Queue
      Properties:
        QueueName: ${self:service}-processing-queue-two
        VisibilityTimeout: 60

    QueueOnePolicy:
      Type: AWS::SQS::QueuePolicy
      Properties:
        Queues:
          - !Ref ProcessingQueueOne
        PolicyDocument:
          Version: "2012-10-17"
          Statement:
            - Effect: Allow
              Principal:
                Service: sns.amazonaws.com
              Action: sqs:SendMessage
              Resource: !GetAtt ProcessingQueueOne.Arn
              Condition:
                ArnEquals:
                  aws:SourceArn: !Ref NotificationTopic

    QueueTwoPolicy:
      Type: AWS::SQS::QueuePolicy
      Properties:
        Queues:
          - !Ref ProcessingQueueTwo
        PolicyDocument:
          Version: "2012-10-17"
          Statement:
            - Effect: Allow
              Principal:
                Service: sns.amazonaws.com
              Action: sqs:SendMessage
              Resource: !GetAtt ProcessingQueueTwo.Arn
              Condition:
                ArnEquals:
                  aws:SourceArn: !Ref NotificationTopic

    NotificationTopicSubscriptionOne:
      Type: AWS::SNS::Subscription
      Properties:
        TopicArn: !Ref NotificationTopic
        Protocol: sqs
        Endpoint: !GetAtt ProcessingQueueOne.Arn
        # Deliver the payload as-is; without this, SQS receives an SNS envelope
        RawMessageDelivery: true

    NotificationTopicSubscriptionTwo:
      Type: AWS::SNS::Subscription
      Properties:
        TopicArn: !Ref NotificationTopic
        Protocol: sqs
        Endpoint: !GetAtt ProcessingQueueTwo.Arn
        RawMessageDelivery: true
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step 3: Implement the Lambda Functions&lt;/strong&gt;&lt;br&gt;
Create your trigger function (src/trigger.js):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const AWS = require('aws-sdk');
const sns = new AWS.SNS();

module.exports.handler = async (event) =&amp;gt; {
    try {
        const body = JSON.parse(event.body);
        const items = body.items || [];

        // Publish each message to SNS - it will automatically fan out to all subscribed queues
        const publishPromises = items.map(item =&amp;gt;
            sns.publish({
                TopicArn: process.env.SNS_TOPIC_ARN,
                Message: JSON.stringify(item)
            }).promise()
        );

        await Promise.all(publishPromises);

        return {
            statusCode: 200,
            body: JSON.stringify({
                message: `Successfully initiated processing for ${items.length} items`
            })
        };
    } catch (error) {
        console.error('Error:', error);
        return {
            statusCode: 500,
            body: JSON.stringify({
                message: 'Error processing request',
                error: error.message
            })
        };
    }
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
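&lt;p&gt;The handler’s core logic can be exercised locally without AWS credentials by factoring out the publish call (a hypothetical refactor for testing, not part of the deployed code; the stub below stands in for sns.publish(...).promise()):&lt;/p&gt;

```javascript
// Core of the trigger handler with the publish function injected, so it can
// be unit-tested locally (hypothetical refactor; names are illustrative).
async function fanOut(items, publish) {
    // One publish per item; SNS fans each message out to all subscribed queues
    await Promise.all(items.map(item => publish(JSON.stringify(item))));
    return {
        statusCode: 200,
        body: JSON.stringify({
            message: `Successfully initiated processing for ${items.length} items`
        })
    };
}

// Local stub standing in for sns.publish(...).promise()
const published = [];
const stubPublish = async (message) => { published.push(message); };

fanOut([{ id: 1 }, { id: 2 }], stubPublish).then(res => {
    console.log(res.statusCode, published.length); // 200 2
});
```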



&lt;p&gt;Create your first receiver function (src/processorOne.js):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module.exports.handler = async (event) =&amp;gt; {
    try {
        for (const record of event.Records) {
            const item = JSON.parse(record.body);
            console.log('Processing item in Queue One:', item);

            // Queue One specific processing
            await processItemInQueueOne(item);
        }

        return {
            statusCode: 200,
            body: JSON.stringify({
                message: 'Successfully processed messages in Queue One'
            })
        };
    } catch (error) {
        console.error('Error in Queue One processor:', error);
        throw error;
    }
};

async function processItemInQueueOne(item) {
    // Add your Queue One specific processing logic
    console.log('Queue One processing:', item);
    await new Promise(resolve =&amp;gt; setTimeout(resolve, 1000));
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
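&lt;p&gt;One thing to watch: by default, an SNS-to-SQS subscription wraps each message in an SNS notification envelope, so record.body is not the item itself unless RawMessageDelivery is enabled on the subscription. A small helper (extractItem is a hypothetical name) can handle both cases:&lt;/p&gt;

```javascript
// Extract the original payload from an SQS record body, whether or not
// RawMessageDelivery is enabled on the SNS subscription (hypothetical helper).
function extractItem(recordBody) {
    const parsed = JSON.parse(recordBody);
    // Without RawMessageDelivery, SNS delivers an envelope of the shape
    // { Type: 'Notification', Message: 'original JSON string', ... }
    if (typeof parsed.Message === 'string') {
        if (parsed.Type === 'Notification') {
            return JSON.parse(parsed.Message);
        }
    }
    return parsed; // with RawMessageDelivery: true, the body is the payload itself
}

// Example: an enveloped body, as delivered by a default subscription
const enveloped = JSON.stringify({
    Type: 'Notification',
    MessageId: 'abc-123',
    Message: JSON.stringify({ id: 1, data: 'test data 1' })
});
console.log(extractItem(enveloped)); // { id: 1, data: 'test data 1' }
```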



&lt;p&gt;Create your second receiver function (src/processorTwo.js):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module.exports.handler = async (event) =&amp;gt; {
    try {
        for (const record of event.Records) {
            const item = JSON.parse(record.body);
            console.log('Processing item in Queue Two:', item);

            // Queue Two specific processing
            await processItemInQueueTwo(item);
        }

        return {
            statusCode: 200,
            body: JSON.stringify({
                message: 'Successfully processed messages in Queue Two'
            })
        };
    } catch (error) {
        console.error('Error in Queue Two processor:', error);
        throw error;
    }
};

async function processItemInQueueTwo(item) {
    // Add your Queue Two specific processing logic
    console.log('Queue Two processing:', item);
    await new Promise(resolve =&amp;gt; setTimeout(resolve, 1000));
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step 4: Deploy and Test&lt;/strong&gt;&lt;br&gt;
Deploy the service with the following commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Configure AWS credentials
aws configure

# Deploy to AWS
serverless deploy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Note: On Windows PowerShell you may encounter the following error on deployment:&lt;br&gt;
serverless.ps1 cannot be loaded. The file ..\npm\serverless.ps1 is not digitally signed.&lt;br&gt;
To resolve it, run the command below and retry the deployment:&lt;br&gt;
&lt;/p&gt;
&lt;/blockquote&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the deployment is successful, the Serverless Framework prints the generated API endpoint and functions (the SNS topic and SQS queues are created as well):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;endpoint: POST - https://xxxxx.execute-api.eu-central-1.amazonaws.com/dev/trigger
functions:
  trigger: async-fanout-service-dev-trigger (23 MB)
  processorOne: async-fanout-service-dev-processorOne (23 MB)
  processorTwo: async-fanout-service-dev-processorTwo (23 MB)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Test the fan-out pattern using curl or Postman:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#create an event/message
curl -X POST -H "Content-Type: application/json" \
  -d '{"items":[{"id":1,"data":"test data 1"}, 
                {"id":2,"data":"test data 2"}
               ]}' \
  https://your-api-endpoint/dev/trigger
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "message": "Successfully initiated processing for 2 items"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When you publish a message to SNS, it will automatically be delivered to both SQS queues, and each processor will handle the message according to its own logic.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Verifying the Implementation&lt;/strong&gt;&lt;br&gt;
To verify that your fan-out pattern is working:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Check CloudWatch Logs for both processor functions: in the AWS Console, go to CloudWatch → Log groups and look for /aws/lambda/async-fanout-service-dev-processorOne and /aws/lambda/async-fanout-service-dev-processorTwo. You should see matching messageIds in both log groups.&lt;/li&gt;
&lt;li&gt;Monitor the SQS queue metrics in CloudWatch.&lt;/li&gt;
&lt;li&gt;View the SNS topic delivery metrics.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Cleanup&lt;/strong&gt;&lt;br&gt;
To remove all deployed resources:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;serverless remove
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command removes everything the deployment created (you can verify in your AWS Console):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Delete the Lambda functions&lt;/li&gt;
&lt;li&gt;Remove the API Gateway endpoints&lt;/li&gt;
&lt;li&gt;Delete the SNS Topic and SQS Queues&lt;/li&gt;
&lt;li&gt;Remove the IAM roles and policies&lt;/li&gt;
&lt;li&gt;Clean up any CloudWatch log groups&lt;/li&gt;
&lt;li&gt;Remove any other AWS resources that were created as part of your service&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
The fan-out pattern using AWS Serverless services provides a powerful way to build scalable, event-driven architectures. By combining SNS and SQS, we get the benefits of both broadcast-style messaging and reliable message delivery.&lt;/p&gt;

&lt;p&gt;Please don’t forget to clap and follow me on &lt;a href="https://www.linkedin.com/in/ideepaksharma/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt; and &lt;a href="https://github.com/ideepaksharma/" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt; for more such posts.&lt;/p&gt;

</description>
      <category>serverless</category>
      <category>serverlessframework</category>
      <category>sns</category>
      <category>sqs</category>
    </item>
    <item>
      <title>Build a highly scalable Serverless CRUD Microservice with AWS Lambda and the Serverless Framework</title>
      <dc:creator>Deepak Sharma</dc:creator>
      <pubDate>Sun, 29 Dec 2024 22:02:12 +0000</pubDate>
      <link>https://dev.to/ideepaksharma/build-a-highly-scalable-serverless-crud-microservice-with-aws-lambda-and-the-serverless-framework-1eac</link>
      <guid>https://dev.to/ideepaksharma/build-a-highly-scalable-serverless-crud-microservice-with-aws-lambda-and-the-serverless-framework-1eac</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;In today’s cloud-native world, serverless architecture has become increasingly popular for building scalable and cost-effective applications. This blog post will guide you through creating a production-ready serverless CRUD (Create, Read, Update, Delete) microservice using AWS Lambda, DynamoDB, and the Serverless Framework.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Agenda:&lt;/strong&gt;&lt;br&gt;
Create a fully functional REST API that can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create new items&lt;/li&gt;
&lt;li&gt;Retrieve items (both individual and list)&lt;/li&gt;
&lt;li&gt;Update existing items&lt;/li&gt;
&lt;li&gt;Delete items&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Node.js installed on your machine&lt;/li&gt;
&lt;li&gt;AWS account with appropriate permissions&lt;/li&gt;
&lt;li&gt;Basic understanding of JavaScript/Node.js&lt;/li&gt;
&lt;li&gt;AWS CLI installed and configured&lt;/li&gt;
&lt;li&gt;Serverless Framework CLI installed&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Architecture:&lt;/strong&gt;&lt;br&gt;
The implementation consists of the following components (high-level architecture diagram below):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS Lambda functions for each CRUD operation&lt;/li&gt;
&lt;li&gt;DynamoDB for data storage&lt;/li&gt;
&lt;li&gt;API Gateway for HTTP endpoints&lt;/li&gt;
&lt;li&gt;Proper IAM roles and permissions&lt;/li&gt;
&lt;li&gt;CloudWatch Logs for function logs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8hxuem0893rsq7qedsxb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8hxuem0893rsq7qedsxb.png" alt=" " width="677" height="514"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Project Setup:&lt;/strong&gt;&lt;br&gt;
First, let’s create our project structure and install the necessary dependencies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir serverless-crud-ms
cd serverless-crud-ms
npm init -y
npm install aws-sdk uuid
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the dependencies are installed, create the project structure shown below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.
├── serverless.yml
├── handlers
│   ├── create.js
│   ├── get.js
│   ├── update.js
│   └── delete.js
└── package.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Project Structure:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;serverless.yml: Configuration for functions, resources, and permissions. It is the backbone of the application, defining our infrastructure as code.&lt;/li&gt;
&lt;li&gt;handlers/: Contains Lambda function handlers&lt;/li&gt;
&lt;li&gt;package.json: Project dependencies&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Implementing CRUD Operations:&lt;/strong&gt;&lt;br&gt;
For each database operation, a corresponding handler (.js) file is created under the handlers directory: create.js, get.js, update.js, and delete.js. We will walk through each of them in the sections that follow.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create Operation&lt;/strong&gt;&lt;br&gt;
A new file, create.js, handles the POST operation, which inserts a new item into the DynamoDB table using the AWS SDK DocumentClient. The operation provides:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Input validation&lt;/li&gt;
&lt;li&gt;Error handling&lt;/li&gt;
&lt;li&gt;CORS support&lt;/li&gt;
&lt;li&gt;Unique ID generation&lt;/li&gt;
&lt;li&gt;Timestamps for created/updated items&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Below is the code snippet for reference:&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const AWS = require('aws-sdk');
const { v4: uuidv4 } = require('uuid');

const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.create = async (event) =&amp;gt; {
    try {
        const timestamp = new Date().getTime();
        const data = JSON.parse(event.body);

        if (!data.name || !data.description) {
            return {
                statusCode: 400,
                headers: {
                    'Content-Type': 'application/json',
                    'Access-Control-Allow-Origin': '*'
                },
                body: JSON.stringify({
                    message: 'Missing required fields'
                })
            };
        }

        const params = {
            TableName: process.env.DYNAMODB_TABLE,
            Item: {
                id: uuidv4(),
                name: data.name,
                description: data.description,
                createdAt: timestamp,
                updatedAt: timestamp
            }
        };

        await dynamoDb.put(params).promise();

        return {
            statusCode: 201,
            headers: {
                'Content-Type': 'application/json',
                'Access-Control-Allow-Origin': '*'
            },
            body: JSON.stringify(params.Item)
        };
    } catch (error) {
        console.error(error);
        return {
            statusCode: error.statusCode || 500,
            headers: {
                'Content-Type': 'application/json',
                'Access-Control-Allow-Origin': '*'
            },
            body: JSON.stringify({
                message: error.message || 'Internal server error'
            })
        };
    }
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;&lt;strong&gt;Read Operation&lt;/strong&gt;&lt;br&gt;
A new file, get.js, handles the GET operations, which fetch either all item records or a single item record from the DynamoDB table.&lt;/p&gt;

&lt;p&gt;Below is the code snippet for reference:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.getAll = async (event) =&amp;gt; {
    try {
        const params = {
            TableName: process.env.DYNAMODB_TABLE
        };

        const result = await dynamoDb.scan(params).promise();

        return {
            statusCode: 200,
            headers: {
                'Content-Type': 'application/json',
                'Access-Control-Allow-Origin': '*'
            },
            body: JSON.stringify(result.Items)
        };
    } catch (error) {
        console.error(error);
        return {
            statusCode: error.statusCode || 500,
            headers: {
                'Content-Type': 'application/json',
                'Access-Control-Allow-Origin': '*'
            },
            body: JSON.stringify({
                message: error.message || 'Internal server error'
            })
        };
    }
};

module.exports.getOne = async (event) =&amp;gt; {
    try {
        const params = {
            TableName: process.env.DYNAMODB_TABLE,
            Key: {
                id: event.pathParameters.id
            }
        };

        const result = await dynamoDb.get(params).promise();

        if (!result.Item) {
            return {
                statusCode: 404,
                headers: {
                    'Content-Type': 'application/json',
                    'Access-Control-Allow-Origin': '*'
                },
                body: JSON.stringify({
                    message: 'Item not found'
                })
            };
        }

        return {
            statusCode: 200,
            headers: {
                'Content-Type': 'application/json',
                'Access-Control-Allow-Origin': '*'
            },
            body: JSON.stringify(result.Item)
        };
    } catch (error) {
        console.error(error);
        return {
            statusCode: error.statusCode || 500,
            headers: {
                'Content-Type': 'application/json',
                'Access-Control-Allow-Origin': '*'
            },
            body: JSON.stringify({
                message: error.message || 'Internal server error'
            })
        };
    }
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
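&lt;p&gt;One caveat with scan: DynamoDB returns at most 1 MB of data per call, so larger tables require following LastEvaluatedKey across pages. Below is a sketch of that loop (scanAll is a hypothetical helper; the scan function is injected so the sketch runs without AWS):&lt;/p&gt;

```javascript
// Paginate a DynamoDB scan by following LastEvaluatedKey until it is absent.
// scanFn stands in for (params) => dynamoDb.scan(params).promise().
async function scanAll(params, scanFn) {
    const items = [];
    let ExclusiveStartKey;
    do {
        const page = await scanFn({ ...params, ExclusiveStartKey });
        items.push(...page.Items);
        ExclusiveStartKey = page.LastEvaluatedKey; // undefined on the last page
    } while (ExclusiveStartKey);
    return items;
}

// Fake two-page scan result to demonstrate the loop
const pages = [
    { Items: [{ id: '1' }], LastEvaluatedKey: { id: '1' } },
    { Items: [{ id: '2' }] }
];
let call = 0;
scanAll({ TableName: 'items' }, async () => pages[call++])
    .then(all => console.log(all.length)); // 2
```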



&lt;p&gt;&lt;strong&gt;Update Operation&lt;/strong&gt;&lt;br&gt;
A new file, update.js, handles the PUT operation, which updates an existing item record in the DynamoDB table.&lt;/p&gt;

&lt;p&gt;Below is the code snippet for reference:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.update = async (event) =&amp;gt; {
    try {
        const timestamp = new Date().getTime();
        const data = JSON.parse(event.body);

        if (!data.name || !data.description) {
            return {
                statusCode: 400,
                headers: {
                    'Content-Type': 'application/json',
                    'Access-Control-Allow-Origin': '*'
                },
                body: JSON.stringify({
                    message: 'Missing required fields'
                })
            };
        }

        const params = {
            TableName: process.env.DYNAMODB_TABLE,
            Key: {
                id: event.pathParameters.id
            },
            ExpressionAttributeNames: {
                '#item_name': 'name'
            },
            ExpressionAttributeValues: {
                ':name': data.name,
                ':description': data.description,
                ':updatedAt': timestamp
            },
            UpdateExpression: 'SET #item_name = :name, description = :description, updatedAt = :updatedAt',
            ReturnValues: 'ALL_NEW'
        };

        const result = await dynamoDb.update(params).promise();

        return {
            statusCode: 200,
            headers: {
                'Content-Type': 'application/json',
                'Access-Control-Allow-Origin': '*'
            },
            body: JSON.stringify(result.Attributes)
        };
    } catch (error) {
        console.error(error);
        return {
            statusCode: error.statusCode || 500,
            headers: {
                'Content-Type': 'application/json',
                'Access-Control-Allow-Origin': '*'
            },
            body: JSON.stringify({
                message: error.message || 'Internal server error'
            })
        };
    }
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
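&lt;p&gt;Note the #item_name alias in ExpressionAttributeNames: it is required because name is a DynamoDB reserved word. When more fields need updating, the expression can be built generically; here is a sketch (buildUpdate is a hypothetical helper, shown for illustration):&lt;/p&gt;

```javascript
// Build an UpdateExpression from a plain object, aliasing every attribute
// name so reserved words like `name` never clash (hypothetical helper).
function buildUpdate(data) {
    const names = {};
    const values = {};
    const sets = [];
    for (const [key, value] of Object.entries(data)) {
        names[`#${key}`] = key;
        values[`:${key}`] = value;
        sets.push(`#${key} = :${key}`);
    }
    return {
        UpdateExpression: `SET ${sets.join(', ')}`,
        ExpressionAttributeNames: names,
        ExpressionAttributeValues: values
    };
}

console.log(buildUpdate({ name: 'Widget', description: 'A part' }).UpdateExpression);
// SET #name = :name, #description = :description
```

&lt;p&gt;The result can be merged with TableName, Key, and ReturnValues before calling dynamoDb.update(params).promise().&lt;/p&gt;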



&lt;p&gt;&lt;strong&gt;Delete Operation&lt;/strong&gt;&lt;br&gt;
A new file, delete.js, handles the DELETE operation, which removes from the DynamoDB table the item record whose id is provided as input.&lt;/p&gt;

&lt;p&gt;Below is the code snippet for reference:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.delete = async (event) =&amp;gt; {
    try {
        const params = {
            TableName: process.env.DYNAMODB_TABLE,
            Key: {
                id: event.pathParameters.id
            }
        };

        await dynamoDb.delete(params).promise();

        return {
            statusCode: 204,
            headers: {
                'Content-Type': 'application/json',
                'Access-Control-Allow-Origin': '*'
            },
            body: ''
        };
    } catch (error) {
        console.error(error);
        return {
            statusCode: error.statusCode || 500,
            headers: {
                'Content-Type': 'application/json',
                'Access-Control-Allow-Origin': '*'
            },
            body: JSON.stringify({
                message: error.message || 'Internal server error'
            })
        };
    }
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;serverless.yml&lt;/strong&gt;&lt;br&gt;
The serverless.yml file configures the runtime, the handler functions, the AWS Region, and the DynamoDB resources. Here is what my file looks like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;service: serverless-crud-ms

provider:
  name: aws
  runtime: nodejs18.x
  stage: ${opt:stage, 'dev'}
  region: eu-central-1
  environment:
    DYNAMODB_TABLE: ${self:service}-${self:provider.stage}
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
        - dynamodb:DeleteItem
      Resource: "arn:aws:dynamodb:${self:provider.region}:*:table/${self:provider.environment.DYNAMODB_TABLE}"

functions:
  create:
    handler: handlers/create.create
    events:
      - http:
          path: items
          method: post
          cors: true

  getAll:
    handler: handlers/get.getAll
    events:
      - http:
          path: items
          method: get
          cors: true

  getOne:
    handler: handlers/get.getOne
    events:
      - http:
          path: items/{id}
          method: get
          cors: true

  update:
    handler: handlers/update.update
    events:
      - http:
          path: items/{id}
          method: put
          cors: true

  delete:
    handler: handlers/delete.delete
    events:
      - http:
          path: items/{id}
          method: delete
          cors: true

resources:
  Resources:
    ItemsTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: ${self:provider.environment.DYNAMODB_TABLE}
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
        KeySchema:
          - AttributeName: id
            KeyType: HASH
        BillingMode: PAY_PER_REQUEST
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Deployment:&lt;/strong&gt;&lt;br&gt;
Deploy the service using the commands below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Configure AWS credentials
aws configure

# Deploy to AWS
serverless deploy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Note: You may encounter the following error on deployment:&lt;br&gt;
serverless.ps1 cannot be loaded. The file ..\npm\serverless.ps1 is not digitally signed.&lt;br&gt;
To resolve it, execute the command below and retry the deployment:&lt;br&gt;
&lt;/p&gt;
&lt;/blockquote&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the deployment succeeds, API endpoints are generated for each function, in a format like the one below:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create new items (POST /items)&lt;/li&gt;
&lt;li&gt;Retrieve all items (GET /items)&lt;/li&gt;
&lt;li&gt;Get specific items by ID (GET /items/{id})&lt;/li&gt;
&lt;li&gt;Update existing items (PUT /items/{id})&lt;/li&gt;
&lt;li&gt;Delete items (DELETE /items/{id})
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;endpoints:
  POST - https://xxxxx.execute-api.eu-central-1.amazonaws.com/dev/items
  GET - https://xxxxx.execute-api.eu-central-1.amazonaws.com/dev/items
  GET - https://xxxxx.execute-api.eu-central-1.amazonaws.com/dev/items/{id}
  PUT - https://xxxxx.execute-api.eu-central-1.amazonaws.com/dev/items/{id}
  DELETE - https://xxxxx.execute-api.eu-central-1.amazonaws.com/dev/items/{id}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Testing the API&lt;/strong&gt;&lt;br&gt;
Use curl or Postman to test your endpoints:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Create an item
curl -X POST https://your-api-url/dev/items \
  -H "Content-Type: application/json" \
  -d '{"name": "Test Item 1", "description": "This is a test item 1"}'

# Get all items
curl https://your-api-url/dev/items

# Get one item
curl https://your-api-url/dev/items/{id}

# Update an item
curl -X PUT https://your-api-url/dev/items/{id} \
  -H "Content-Type: application/json" \
  -d '{"name": "Updated Item 1", "description": "This is an updated item 1"}'

# Delete an item
curl -X DELETE https://your-api-url/dev/items/{id}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Cleanup&lt;/strong&gt;&lt;br&gt;
To remove all deployed resources:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;serverless remove
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command will do the following (verify in your AWS Console):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Delete the Lambda functions&lt;/li&gt;
&lt;li&gt;Remove the API Gateway endpoints&lt;/li&gt;
&lt;li&gt;Delete the DynamoDB table&lt;/li&gt;
&lt;li&gt;Remove the IAM roles and policies&lt;/li&gt;
&lt;li&gt;Clean up any CloudWatch log groups&lt;/li&gt;
&lt;li&gt;Remove any other AWS resources that were created as part of your service&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
We’ve built a production-ready serverless CRUD microservice that’s scalable, maintainable, and follows best practices. This architecture can serve as a foundation for more complex applications, handling millions of requests while maintaining cost efficiency through the serverless model.&lt;/p&gt;

</description>
      <category>serverless</category>
      <category>serverlessframework</category>
      <category>lambda</category>
      <category>microservices</category>
    </item>
    <item>
      <title>AWS certification vs Marathon Race — the analogy</title>
      <dc:creator>Deepak Sharma</dc:creator>
      <pubDate>Fri, 24 Mar 2023 09:40:00 +0000</pubDate>
      <link>https://dev.to/ideepaksharma/aws-certification-vs-marathon-race-the-analogy-55ma</link>
      <guid>https://dev.to/ideepaksharma/aws-certification-vs-marathon-race-the-analogy-55ma</guid>
<description>&lt;p&gt;Many industry experts draw an analogy between AWS certifications and a marathon, which made me think of writing this blog.&lt;br&gt;
Photo by Steward Masweneng on Unsplash&lt;br&gt;
What is a marathon? A marathon is a road-running event over a distance of 42.195 km, held in several countries.&lt;/p&gt;

&lt;p&gt;Preparing for a marathon requires endurance, patience, encouragement, and support for several months before the event. Once you register for an event, you generally follow a training plan of approximately 12 to 18 weeks for this long, tedious, but extremely rewarding journey.&lt;/p&gt;

&lt;p&gt;A marathon finisher, credit: Berlin Marathon Instagram account&lt;br&gt;
And you will see the smiles and joy once you have crossed the finish line, knowing that your hard work has paid off, with a finisher medal, of course! 😊&lt;/p&gt;

&lt;p&gt;What are AWS Certifications? AWS Certifications are available to learners at any level, whether in a technical role or not, to build cloud skills for a particular role or domain.&lt;/p&gt;

&lt;p&gt;Preparing for an AWS certification is much like training for a marathon. It is a long and challenging journey that requires dedication, persistence, and hard work, and it takes a lot of time and effort to train, learn, and prepare for both the race and the exam.&lt;/p&gt;

&lt;p&gt;When you set your mind on one of the certification exams, you start with a plan, just as you would for a marathon. You don’t run a full marathon on the first day; you start slowly with a progressive approach. The certification works the same way: small steps at a time.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Start with a plan&lt;/strong&gt;&lt;br&gt;
Just like you need to create a training plan for a marathon, you should also create a study plan for your AWS certification exam. You need to identify the resources you need, set your goals, and determine the time and effort you need to put in.&lt;/p&gt;

&lt;p&gt;The first step is to enrol in an online course that provides detailed information about the AWS services, such as the AWS Skill Builder program or Udemy courses from well-known authors.&lt;/p&gt;

&lt;p&gt;Plan to spend at least one hour per day studying the course, and if possible, invest more time over the weekend. Just like in a marathon training plan, Sundays are usually long-run days :)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Consistency is key&lt;/strong&gt;&lt;br&gt;
To achieve success in both a marathon and an AWS certification exam, you need to be consistent in your efforts. Consistent practice and study sessions will help you build your stamina and knowledge.&lt;/p&gt;

&lt;p&gt;If you have enrolled in one of the training courses above, it most probably includes practical hands-on labs. Work through each lab to understand how the services work; doing so gives you deeper insight and helps you remember the behaviour of the AWS services. After all, it is the practical experience that matters in the end.&lt;/p&gt;

&lt;p&gt;PS: Do not plan to skip the hands-on labs if you are new to the cloud technology domain.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Overcome challenges&lt;/strong&gt;&lt;br&gt;
Just like a marathon, preparing for an AWS certification exam will come with its challenges. You need to be prepared to face these challenges and develop the skills and knowledge required to overcome them.&lt;/p&gt;

&lt;p&gt;The certification courses are generally very long, and by the time you reach the end, you have often forgotten what you studied at the beginning. This is normal, and you don’t have to panic about it 🤭.&lt;/p&gt;

&lt;p&gt;The idea is to revise the course at least twice, or until you are confident enough to appear for the exam.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Check for the preparedness / Don’t give up&lt;/strong&gt;&lt;br&gt;
Both a marathon and an AWS certification exam will require mental and physical toughness. You need to be determined to finish what you started, even when it gets tough.&lt;/p&gt;

&lt;p&gt;This is the crucial phase where you get insights into your preparation so far. To check your readiness, take the practice exams available on Udemy or in the Skill Builder program, and keep taking them until you consistently score above 80%. Go through every incorrect question in review mode and understand the reasoning behind the correct answer.&lt;/p&gt;

&lt;p&gt;Revisit the sections that need more attention and fill the gaps in your knowledge. Repeat the practice exams until you score consistently and improve on your weak areas.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Celebrate your achievement&lt;/strong&gt;&lt;br&gt;
Finally, both a marathon runner and someone who earns an AWS certification can feel a tremendous sense of accomplishment and pride in what they’ve achieved. In both cases, the journey is as important as the destination, and the process of learning, growing, and pushing oneself to new heights is what makes the experience so rewarding.&lt;/p&gt;

&lt;p&gt;Crossing the finish line of a marathon or passing an AWS certification exam is a great achievement. You should celebrate your hard work and dedication to achieve your goal.&lt;/p&gt;

&lt;p&gt;Overall, both preparing for an AWS certification exam and running a marathon require commitment, persistence, and dedication. With the right mindset and approach, you can achieve success in both 😊&lt;/p&gt;

&lt;p&gt;Good luck and keep going!&lt;/p&gt;

&lt;p&gt;Please don’t forget to clap and follow me on Medium and LinkedIn for more such posts.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>awscertifications</category>
      <category>analogy</category>
      <category>awscertificationjourney</category>
    </item>
    <item>
      <title>How not to fail your AWS Solution Architect Associate (SAA) exam</title>
      <dc:creator>Deepak Sharma</dc:creator>
      <pubDate>Sat, 27 Aug 2022 21:53:00 +0000</pubDate>
      <link>https://dev.to/ideepaksharma/how-not-to-fail-your-aws-solution-architect-associate-saa-exam-49</link>
      <guid>https://dev.to/ideepaksharma/how-not-to-fail-your-aws-solution-architect-associate-saa-exam-49</guid>
      <description>&lt;p&gt;After passing my AWS Cloud Practitioner exam, I decided to move to the Associate level and appear for the certification exam.&lt;/p&gt;

&lt;p&gt;I started my journey of learning for the AWS SAA by purchasing courses on the &lt;a href="https://www.udemy.com/" rel="noopener noreferrer"&gt;Udemy&lt;/a&gt; online platform. It took me almost 3-4 months of preparation to feel confident enough to appear for the exam. Since I am a working professional, I used to study at night, making sure to spend at least one hour after dinner. Of course, not every day was the same, but I tried to be as consistent as possible. During the weekends I would stretch a little longer, just to hit my targets.&lt;/p&gt;

&lt;p&gt;Note: Every individual is different and has their own pace; your preparation time may differ.&lt;/p&gt;

&lt;p&gt;The two authors I referred to were Stéphane Maarek and Neal Davis.&lt;br&gt;
At first, I started my course study with Neal Davis: &lt;a href="https://www.udemy.com/course/aws-certified-solutions-architect-associate-hands-on/" rel="noopener noreferrer"&gt;AWS Certified Solutions Architect Associate Training&lt;/a&gt;. I found the training material very useful and easy to understand, with its diagrams and hands-on labs. I performed the hands-on labs in each section, which helped me retain the material through practice. The exam crams and architecture patterns in each section help you revise what you learned in that particular section.&lt;/p&gt;

&lt;p&gt;PS: The course is quite long, and you tend to forget what you learned earlier. The exam crams are therefore a must to revisit regularly, to keep the previous sections fresh in memory.&lt;/p&gt;

&lt;p&gt;After I finished the course, I went on to another one, by Stephane Maarek: &lt;a href="https://www.udemy.com/course/aws-certified-solutions-architect-associate-saa-c03/" rel="noopener noreferrer"&gt;Ultimate AWS Certified Solutions Architect Associate 2022&lt;/a&gt;. I felt Stephane's course covered more services than the previously mentioned one, and it also served as a revision of the topics I had already gone through.&lt;/p&gt;

&lt;p&gt;Lastly, I also used the practice exams by Neal Davis: &lt;a href="https://www.udemy.com/course/aws-certified-solutions-architect-associate-practice-tests-k/" rel="noopener noreferrer"&gt;AWS Certified Solutions Architect Associate Practice Exams&lt;/a&gt;. I practiced until I was consistently scoring above 80%. After every course I finished, I would take the practice exams to test my knowledge and see which sections required more attention. I failed those tests every time, but they gave me a sense of how I was progressing. At the end of each test you get a per-section score summary, and based on it I revised the sections where I scored lower.&lt;/p&gt;

&lt;p&gt;Now it was time to go for the exam. I scheduled my online exam with PSI once I was sure enough; I didn't want to fail at any cost. On the day of my exam, I faced a technical challenge validating my ID: my webcam was not able to focus on the ID, so I had to repeatedly take a photo of my ID card and submit it. This process took almost 30 minutes or more (it made me panic a bit), but in the end I was able to submit it successfully.&lt;br&gt;
Tip: If the external camera doesn't work, use the laptop camera instead.&lt;/p&gt;

&lt;p&gt;Once all the hassle was over, the exam started. I needed every last minute to finish it. To be frank, it was a tough one, and in my mind I kept wondering whether I was going to fail. The results are generally released within 24 hours, and the moment I saw the pass in the email, I was relieved. All my efforts of the previous months had not gone in vain, and I was happy with the score.&lt;/p&gt;

&lt;p&gt;To whoever is preparing for the certification, my suggestion would be to stay consistent in your learning. It does not matter whether you have experience working with AWS services or not; over time you will gain knowledge that will help you grow further in your career.&lt;/p&gt;

&lt;p&gt;Good luck and keep going!&lt;br&gt;
My certification can be viewed from this &lt;a href="https://www.linkedin.com/feed/update/urn:li:activity:6960176187927109632/" rel="noopener noreferrer"&gt;post&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F10e73vu6v1jin701mazk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F10e73vu6v1jin701mazk.png" alt=" " width="340" height="340"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Please don't forget to clap and follow me on &lt;a href="https://www.linkedin.com/in/ideepaksharma/" rel="noopener noreferrer"&gt;LinkedIn &lt;/a&gt;and &lt;a href="https://github.com/ideepaksharma/" rel="noopener noreferrer"&gt;Github &lt;/a&gt; for more such posts.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>awscertification</category>
      <category>learning</category>
    </item>
  </channel>
</rss>
