<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Amruta Pardeshi</title>
    <description>The latest articles on DEV Community by Amruta Pardeshi (@seeyouoncloud).</description>
    <link>https://dev.to/seeyouoncloud</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F975337%2F4f38608a-2d62-44bc-bd4c-8f13a36d0b6e.jpg</url>
      <title>DEV Community: Amruta Pardeshi</title>
      <link>https://dev.to/seeyouoncloud</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/seeyouoncloud"/>
    <language>en</language>
    <item>
      <title>AWS Organizations: The Hidden Backbone of Enterprise Security</title>
      <dc:creator>Amruta Pardeshi</dc:creator>
      <pubDate>Sat, 22 Nov 2025 11:51:20 +0000</pubDate>
      <link>https://dev.to/aws-builders/aws-organizations-the-hidden-backbone-of-enterprise-security-30ng</link>
      <guid>https://dev.to/aws-builders/aws-organizations-the-hidden-backbone-of-enterprise-security-30ng</guid>
      <description>&lt;p&gt;Securing a rapidly expanding AWS environment requires more than just IAM users, roles, and policies. As teams, accounts, and workloads grow, a more foundational approach is necessary — a means to centrally govern, restrict, and secure all resources across accounts.&lt;/p&gt;

&lt;p&gt;That “hidden backbone” is &lt;strong&gt;AWS Organizations&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;In this guide, we clarify how AWS Organizations integrates with IAM, SCPs, ABAC, managed policies, and resource policies, providing practical examples that you can apply in real environments.&lt;/p&gt;

&lt;p&gt;🚀 &lt;strong&gt;Features of AWS Organizations:&lt;/strong&gt;&lt;br&gt;
AWS Organizations does more than create multiple AWS accounts. It serves as the control plane for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Managing your AWS accounts&lt;/li&gt;
&lt;li&gt;Defining and managing your organization&lt;/li&gt;
&lt;li&gt;Securing and monitoring your accounts&lt;/li&gt;
&lt;li&gt;Controlling access and permissions&lt;/li&gt;
&lt;li&gt;Sharing resources across accounts&lt;/li&gt;
&lt;li&gt;Auditing your environment for compliance&lt;/li&gt;
&lt;li&gt;Centrally managing billing and costs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;🚀 &lt;strong&gt;Use cases for AWS Organizations:&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Automate the creation of AWS accounts and categorize workloads&lt;/strong&gt;&lt;br&gt;
You can automate the creation of AWS accounts to quickly launch new workloads. Add the accounts to user-defined groups for instant security policy application, touchless infrastructure deployments, and auditing.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Define and enforce audit and compliance policies&lt;/strong&gt;&lt;br&gt;
You can implement service control policies (SCPs) to ensure that your users engage only in actions that align with your security and compliance requirements.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Provide tools and access for your security teams without slowing development&lt;/strong&gt;&lt;br&gt;
For example, give the security team a dedicated group with read-only access to all resources, so audits never block delivery.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Share common resources across accounts&lt;/strong&gt;&lt;br&gt;
Organizations makes it easy for you to share critical central resources across your accounts.&lt;/p&gt;

&lt;p&gt;It is the foundation of an enterprise-grade AWS security model.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. How AWS Organizations Works With IAM&lt;/strong&gt;&lt;br&gt;
The relationship between AWS Organizations and IAM is often misunderstood. Here’s the simplest mental model:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;IAM = What a principal can do inside an account&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;(Users, roles, groups)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;SCP = What is allowed at the organizational level&lt;/strong&gt;&lt;br&gt;
(Sets the boundary of maximum permissions)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example of the interaction:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;IAM says: “You can delete S3 buckets.”&lt;/p&gt;

&lt;p&gt;SCP says: “No one is allowed to delete S3 buckets.”&lt;/p&gt;

&lt;p&gt;Final result: Bucket deletion is denied.&lt;/p&gt;

&lt;p&gt;SCPs never grant access — they only restrict access.&lt;/p&gt;
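&lt;p&gt;The deny-wins interaction above can be sketched in a few lines of Python. This is an illustrative simplification, not AWS's actual policy evaluation algorithm:&lt;/p&gt;

```python
# Simplified sketch of how an SCP bounds IAM permissions.
# Real AWS evaluation is more involved; this models only
# "a request succeeds when IAM grants it AND the SCP permits it".

def is_action_allowed(iam_allows: bool, scp_allows: bool) -> bool:
    """An action is allowed only if IAM grants it and the SCP permits it."""
    return iam_allows and scp_allows

# IAM says: "You can delete S3 buckets."  SCP says: "No one may."
print(is_action_allowed(iam_allows=True, scp_allows=False))   # False: denied

# The SCP alone never grants anything; IAM must still allow it.
print(is_action_allowed(iam_allows=False, scp_allows=True))   # False: denied
```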

&lt;p&gt;&lt;strong&gt;2. Managing Access Permissions in an Organization&lt;/strong&gt;&lt;br&gt;
AWS Organizations provides multiple tools for governing security.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;a) Service Control Policies (SCPs)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;SCPs apply at:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the organization root&lt;/li&gt;
&lt;li&gt;organizational units (OUs)&lt;/li&gt;
&lt;li&gt;individual accounts&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;They define &lt;strong&gt;maximum permissions&lt;/strong&gt;, regardless of IAM.&lt;/p&gt;

&lt;p&gt;Useful for enforcing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;mandatory encryption&lt;/li&gt;
&lt;li&gt;mandatory logging&lt;/li&gt;
&lt;li&gt;region restrictions&lt;/li&gt;
&lt;li&gt;disallowing IAM user creation&lt;/li&gt;
&lt;li&gt;preventing CloudTrail deletion&lt;/li&gt;
&lt;li&gt;protecting guardrails (e.g., security tooling) from being disabled&lt;/li&gt;
&lt;/ul&gt;
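&lt;p&gt;As a concrete example of such a guardrail, a region-restriction SCP looks roughly like the sketch below. The aws:RequestedRegion condition key is real; the Sid and the region list are placeholders to adapt:&lt;/p&gt;

```python
import json

# Sketch of an SCP that denies actions outside approved regions.
# The Sid and region list are examples; adapt to your organization.
region_restriction_scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyOutsideApprovedRegions",
        "Effect": "Deny",
        "Action": "*",
        "Resource": "*",
        "Condition": {
            "StringNotEquals": {
                "aws:RequestedRegion": ["us-east-1", "eu-west-1"]
            }
        }
    }]
}

print(json.dumps(region_restriction_scp, indent=2))
```

&lt;p&gt;In practice you would usually also exempt global services (such as IAM and Organizations) from this deny; AWS's published example SCPs show the full pattern.&lt;/p&gt;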

&lt;p&gt;&lt;strong&gt;b) IAM Roles for Cross-Account Access&lt;/strong&gt;&lt;br&gt;
Instead of creating admin users in each account, you create &lt;strong&gt;one central IAM role per use case.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Examples:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;SecurityAuditRole for the security team&lt;/li&gt;
&lt;li&gt;AutomationRole for CI/CD pipelines&lt;/li&gt;
&lt;li&gt;BillingRole for finance&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Roles are assumed cross-account using the AWS CLI or IAM Identity Center (AWS SSO).&lt;/p&gt;
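&lt;p&gt;A cross-account assumption of such a role can be sketched as follows. The account ID, role name, and session name are placeholders, and the boto3 call is shown only in a comment so the sketch stays self-contained:&lt;/p&gt;

```python
# Sketch: assuming a central SecurityAuditRole in another account.
# The account ID, role name, and session name are placeholders.
assume_role_params = {
    "RoleArn": "arn:aws:iam::111122223333:role/SecurityAuditRole",
    "RoleSessionName": "security-audit",
    "DurationSeconds": 3600,
}

# With boto3 (not executed here), the call would be:
#   import boto3
#   creds = boto3.client("sts").assume_role(**assume_role_params)["Credentials"]
# The returned temporary credentials are then used to build clients
# that operate in the target account.
print(assume_role_params["RoleArn"])
```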

&lt;p&gt;&lt;strong&gt;c) IAM Identity Center (AWS SSO)&lt;/strong&gt;&lt;br&gt;
The recommended identity solution.&lt;br&gt;
It provides:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Centralized users and groups&lt;/li&gt;
&lt;li&gt;MFA enforced everywhere&lt;/li&gt;
&lt;li&gt;Fine-grained permission sets&lt;/li&gt;
&lt;li&gt;Automatic access provisioning per account&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This can eliminate long-lived IAM users entirely.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. AWS Managed Policies&lt;/strong&gt;&lt;br&gt;
AWS provides prebuilt policies so you don’t reinvent the wheel.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Common managed policies:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;SecurityAudit → read-only security insights&lt;/li&gt;
&lt;li&gt;AdministratorAccess → full control (use carefully)&lt;/li&gt;
&lt;li&gt;ViewOnlyAccess → non-intrusive visibility&lt;/li&gt;
&lt;li&gt;AmazonS3FullAccess → full access to S3 (often broader than needed)&lt;/li&gt;
&lt;li&gt;AWSLambdaExecute → basic Lambda execution permissions&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Advantages&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✔ Automatically updated by AWS&lt;br&gt;
✔ Quick to apply&lt;br&gt;
✔ Good starting point for least privilege&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Caution&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Many managed policies are too broad.&lt;br&gt;
Prefer custom policies for production.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Attribute-Based Access Control (ABAC) with Tags&lt;/strong&gt;&lt;br&gt;
Attribute-based access control enables you to use tags attached to AWS resources and identities to manage access. For instance, a user can access a resource only if both share the same tag value.&lt;/p&gt;

&lt;p&gt;Perfect for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;multi-team platforms&lt;/li&gt;
&lt;li&gt;dynamic environments&lt;/li&gt;
&lt;li&gt;large-scale automation&lt;/li&gt;
&lt;/ul&gt;
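&lt;p&gt;A minimal ABAC policy might look like the sketch below. The tag key team is an example; the aws:ResourceTag and aws:PrincipalTag condition keys are the real ABAC building blocks:&lt;/p&gt;

```python
import json

# Sketch of an ABAC identity policy: a principal may act on a resource
# only when the principal's "team" tag matches the resource's "team" tag.
# The tag key "team" and the EC2 actions are illustrative choices.
abac_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["ec2:StartInstances", "ec2:StopInstances"],
        "Resource": "*",
        "Condition": {
            "StringEquals": {
                "aws:ResourceTag/team": "${aws:PrincipalTag/team}"
            }
        }
    }]
}

print(json.dumps(abac_policy, indent=2))
```

&lt;p&gt;Because access follows tags rather than hard-coded ARNs, new resources are covered automatically as long as they are tagged correctly.&lt;/p&gt;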

&lt;p&gt;&lt;strong&gt;5. Identity-Based IAM Policies&lt;/strong&gt;&lt;br&gt;
Identity policies attach to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;IAM users&lt;/li&gt;
&lt;li&gt;IAM roles&lt;/li&gt;
&lt;li&gt;IAM groups&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;They grant permissions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Identity policies&lt;/strong&gt; = grant permissions&lt;br&gt;
&lt;strong&gt;SCPs&lt;/strong&gt; = restrict permissions&lt;/p&gt;
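&lt;p&gt;For illustration, a small identity-based policy granting read access to a single (hypothetical) bucket could look like this:&lt;/p&gt;

```python
import json

# Sketch of an identity-based policy granting read-only access to one
# bucket. The bucket name is a placeholder.
identity_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::example-reports-bucket",      # bucket itself (ListBucket)
            "arn:aws:s3:::example-reports-bucket/*"     # objects (GetObject)
        ]
    }]
}

print(json.dumps(identity_policy, indent=2))
```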

&lt;p&gt;&lt;strong&gt;6. Resource-Based Policies&lt;/strong&gt;&lt;br&gt;
Resource policies protect resources like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;S3 buckets&lt;/li&gt;
&lt;li&gt;KMS keys&lt;/li&gt;
&lt;li&gt;Lambda functions&lt;/li&gt;
&lt;li&gt;SQS queues&lt;/li&gt;
&lt;li&gt;API Gateway APIs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;They allow access from:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;other AWS accounts&lt;/li&gt;
&lt;li&gt;specific IAM roles or services&lt;/li&gt;
&lt;li&gt;external integrations&lt;/li&gt;
&lt;/ul&gt;
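&lt;p&gt;For example, a bucket policy that lets a role in another account read objects might be sketched as follows (the account ID, role, and bucket name are placeholders):&lt;/p&gt;

```python
import json

# Sketch of a resource-based (S3 bucket) policy allowing a role in
# another account to read objects. All identifiers are placeholders.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowCrossAccountRead",
        "Effect": "Allow",
        "Principal": {
            "AWS": "arn:aws:iam::111122223333:role/SecurityAuditRole"
        },
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-shared-bucket/*"
    }]
}

print(json.dumps(bucket_policy, indent=2))
```

&lt;p&gt;Note the Principal element: it is what distinguishes a resource-based policy from an identity-based one, since the policy must say who may access the resource.&lt;/p&gt;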

&lt;p&gt;🔥 &lt;strong&gt;Final thoughts&lt;/strong&gt;&lt;br&gt;
AWS Organizations serves as the critical foundation for securing, managing, and scaling large AWS environments. &lt;br&gt;
By integrating IAM, SCPs, resource policies, managed policies, and ABAC, you establish a comprehensive security model that is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Centralized&lt;/li&gt;
&lt;li&gt;Least-privilege by default&lt;/li&gt;
&lt;li&gt;Automated&lt;/li&gt;
&lt;li&gt;Compliant&lt;/li&gt;
&lt;li&gt;Enterprise-ready&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Reference: &lt;a href="https://docs.aws.amazon.com/organizations/latest/userguide/orgs_introduction.html" rel="noopener noreferrer"&gt;AWS Documentation&lt;/a&gt;&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>security</category>
      <category>devops</category>
      <category>aws</category>
    </item>
    <item>
      <title>AWS Locksmith: Encrypting S3 and EBS with Amazon KMS</title>
      <dc:creator>Amruta Pardeshi</dc:creator>
      <pubDate>Mon, 05 Feb 2024 12:29:54 +0000</pubDate>
      <link>https://dev.to/aws-builders/aws-locksmith-encrypting-s3-and-ebs-with-amazon-kms-bl1</link>
      <guid>https://dev.to/aws-builders/aws-locksmith-encrypting-s3-and-ebs-with-amazon-kms-bl1</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction:&lt;/strong&gt;&lt;br&gt;
As businesses and organizations increasingly store sensitive data in the cloud, security becomes a top priority. To address this need, Amazon Web Services (AWS) offers the Amazon Key Management Service (KMS) – a robust encryption solution. In this blog post, we will explore how you can use AWS KMS to enhance the security of your data that is stored in Amazon Simple Storage Service (S3) and Elastic Block Store (EBS).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Understanding AWS Key Management Service:&lt;/strong&gt;&lt;br&gt;
Amazon KMS is a comprehensive encryption service that manages the creation and control of cryptographic keys used to encrypt data. It streamlines the process of integrating encryption into your applications and workflows by providing a secure and centralized location for managing keys.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Enabling CloudTrail and S3 logging with encryption&lt;/strong&gt;&lt;br&gt;
AWS CloudTrail is a service that allows you to track and log API events from the AWS console, API, or CLI (command line interface). CloudTrail is compatible with various AWS services, including KMS (Key Management Service). The JSON-formatted log files produced by CloudTrail are delivered to an S3 bucket. Once you enable and configure CloudTrail, the JSON logs will contain KMS events that can be used for monitoring, auditing, governance, and compliance. &lt;/p&gt;

&lt;p&gt;Let's see how to activate CloudTrail and create a customer managed key (historically called a Customer Master Key, or CMK) that will encrypt the CloudTrail log data delivered to S3.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
In the AWS Management Console search bar, enter CloudTrail, and click the CloudTrail result to navigate to the AWS CloudTrail console:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo57xr3mr2e8dp5failzl.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo57xr3mr2e8dp5failzl.PNG" alt="Cloudtrail console" width="800" height="453"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click on Create Trail.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Fill out the following details:&lt;/em&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Trail name:&lt;/strong&gt; Key-Trail&lt;br&gt;
&lt;strong&gt;Enable for all accounts in my organization:&lt;/strong&gt; Unchecked&lt;br&gt;
&lt;strong&gt;Storage location:&lt;/strong&gt; Create new S3 bucket&lt;br&gt;
&lt;strong&gt;Trail log bucket and folder:&lt;/strong&gt; keytrail-bucket-unique_number&lt;br&gt;
&lt;em&gt;Note: S3 bucket names must be globally unique and lowercase. Append a unique number to "keytrail-bucket".&lt;/em&gt;&lt;br&gt;
&lt;strong&gt;Log file SSE-KMS encryption:&lt;/strong&gt; Checked&lt;br&gt;
&lt;strong&gt;AWS KMS Key:&lt;/strong&gt; New&lt;br&gt;
&lt;strong&gt;AWS KMS alias:&lt;/strong&gt; S3-CloudTrail&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6bd2rb0qmn8x3o3dnuvy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6bd2rb0qmn8x3o3dnuvy.png" alt="Create trail" width="800" height="944"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Scroll to the bottom, click Next, and fill out the form:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Event type: Ensure Management events is checked&lt;br&gt;
Click Next&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Review your settings and click Create trail.&lt;/p&gt;
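&lt;p&gt;If you prefer the API to the console, the steps above map roughly onto the CreateTrail parameters below. This is a sketch: the bucket suffix is a placeholder, and the boto3 call is shown only in a comment:&lt;/p&gt;

```python
# Sketch: the console walkthrough above expressed as CreateTrail
# parameters (real CloudTrail API fields; the bucket suffix is a
# placeholder). With boto3 (not executed here), the call would be:
#   boto3.client("cloudtrail").create_trail(**create_trail_params)
create_trail_params = {
    "Name": "Key-Trail",
    "S3BucketName": "keytrail-bucket-12345",
    "KmsKeyId": "alias/S3-CloudTrail",   # SSE-KMS encryption of log files
    "IsMultiRegionTrail": True,
}
print(create_trail_params["Name"])
```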

&lt;p&gt;&lt;strong&gt;In the left-hand menu, click Event history:&lt;/strong&gt;&lt;br&gt;
After API activity completes, it may take up to 15 minutes for events to appear here. The most recent events are at the top.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: If you encounter a significant delay, you can proceed with the following steps. However, make sure to review the Event history later to see important events recorded by CloudTrail.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In the AWS search bar at the top, enter KMS, and under Services, click the KMS result to navigate there:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fugeo2mn3htq99frgumlz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fugeo2mn3htq99frgumlz.png" alt="AWS KMS" width="800" height="261"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the left-hand menu, click Customer managed keys.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsfc4rdejp15sh6wo2lf7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsfc4rdejp15sh6wo2lf7.png" alt="CMK" width="549" height="284"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select the key you just created (S3-CloudTrail) when enabling CloudTrail:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;In the row of tabs, click Key rotation.&lt;br&gt;
(Enabling this checkbox turns on automatic rotation of the key material.)&lt;/p&gt;
&lt;/blockquote&gt;
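&lt;p&gt;The same rotation setting can be toggled through the API. A sketch (the key ID is a placeholder, and the boto3 call is shown only in a comment):&lt;/p&gt;

```python
# Sketch: the rotation checkbox corresponds to the EnableKeyRotation
# API. The key ID is a placeholder. With boto3 (not executed here):
#   boto3.client("kms").enable_key_rotation(KeyId=key_id)
# and the current state can be read back with get_key_rotation_status.
key_id = "1234abcd-12ab-34cd-56ef-1234567890ab"  # placeholder key ID
enable_rotation_call = {"KeyId": key_id}
print(enable_rotation_call["KeyId"])
```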

&lt;p&gt;In the AWS search bar at the top, enter S3, and under Services, click the S3 result to navigate there:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2vdgirksaeq0lv2bgddu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2vdgirksaeq0lv2bgddu.png" alt="S3" width="800" height="275"&gt;&lt;/a&gt;&lt;br&gt;
Select the unique bucket name you created earlier (keytrail-bucket-#)&lt;/p&gt;

&lt;p&gt;To access your data, navigate through the folders and follow the path including your AWS account ID, region, and date.&lt;br&gt;
&lt;em&gt;If you see two folders, click on CloudTrail and not CloudTrail-Digest.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Upon checking, you will find compressed JSON files in the directory. These are the log files sent from CloudTrail to S3:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgqfug2vjv4w0syj04a4b.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgqfug2vjv4w0syj04a4b.PNG" alt="s3-log" width="800" height="432"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Customer Master Key (CMK) has been successfully created and is now available for use in both encrypting and decrypting data that is at rest, whether it's on S3 (as in the example above) or an EBS volume. You can also use a similar process to search for other KMS-related events within a given day's CloudTrail log files, such as EnableKey, DisableKey, and so on.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create a Customer Master Key (CMK)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the AWS Management Console search bar, enter KMS, and click on KMS result to navigate there:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fikgeiirgepzywjwucdfx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fikgeiirgepzywjwucdfx.png" alt="KMS" width="800" height="261"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select Customer managed keys in the left side-bar of the KMS console.&lt;/p&gt;

&lt;p&gt;Click Create Key&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Key type:&lt;/strong&gt; Symmetric (Symmetric keys are suitable for most data encryption applications. The same key is used for both encrypt and decrypt operations with symmetric key algorithms.)&lt;br&gt;
&lt;strong&gt;Key usage:&lt;/strong&gt; Encrypt and decrypt&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1b7nr5fut7r0im6bezx0.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1b7nr5fut7r0im6bezx0.PNG" alt="configure-KMS" width="673" height="380"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Scroll down, then expand Advanced options and set the following values:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Key Material Origin:&lt;/strong&gt;  Leave as KMS (default).&lt;br&gt;
&lt;strong&gt;Regionality:&lt;/strong&gt; Single-Region key&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff5fknkrormje7bcpyfrz.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff5fknkrormje7bcpyfrz.PNG" alt="Advance KMS" width="668" height="396"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click Next&lt;/p&gt;

&lt;p&gt;Set the following values before clicking Next (leave the default values for other fields)&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Alias: Test-CMK-key&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Click Next to reach the Define key administrative permissions page and leave the default values.&lt;/p&gt;

&lt;p&gt;Click Next to reach the Define key usage permissions page.&lt;br&gt;
Click Next to preview the key policy, and then click Finish when ready.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The CMK is created.&lt;/em&gt;&lt;/p&gt;
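&lt;p&gt;For reference, the console choices above correspond roughly to these CreateKey parameters (a sketch; in the API, the alias is attached with a separate CreateAlias call):&lt;/p&gt;

```python
# Sketch: the console walkthrough expressed as KMS API parameters
# (real CreateKey/CreateAlias fields; the target key ID is a placeholder
# for the ID returned by CreateKey).
create_key_params = {
    "KeySpec": "SYMMETRIC_DEFAULT",   # Key type: Symmetric
    "KeyUsage": "ENCRYPT_DECRYPT",    # Key usage: Encrypt and decrypt
    "Origin": "AWS_KMS",              # Key material origin: KMS (default)
    "MultiRegion": False,             # Regionality: Single-Region key
}
create_alias_params = {
    "AliasName": "alias/Test-CMK-key",
    "TargetKeyId": "KEY_ID_RETURNED_BY_CREATE_KEY",  # placeholder
}
print(create_key_params["KeySpec"])
```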

&lt;p&gt;&lt;strong&gt;Create a simple EC2 instance with an unencrypted EBS volume.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create an Encrypted EBS Volume&lt;/strong&gt;&lt;br&gt;
First, check if your running instance uses a non-encrypted EBS-backed root device. Then, create an encrypted EBS volume using a CMK, attach it, and confirm encryption on the volume from the console and CloudTrail.&lt;/p&gt;

&lt;p&gt;Click Volumes in the left side-bar below Elastic Block Store in EC2 console.&lt;/p&gt;

&lt;p&gt;Click Create Volume and fill out the dialog box:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Size:&lt;/strong&gt; 3 GiB (since we are not storing much data, a small volume is fine)&lt;br&gt;
&lt;strong&gt;Availability Zone:&lt;/strong&gt; us-west-2a (Select the same AZ as your running instance.)&lt;br&gt;
&lt;strong&gt;Encryption:&lt;/strong&gt; Check this (Once selected, it will expand to show Key information.)&lt;br&gt;
&lt;strong&gt;Master Key:&lt;/strong&gt; Test-CMK-key (Select the CMK you created earlier.)&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2d4hgn4jua7ocflro9z9.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2d4hgn4jua7ocflro9z9.PNG" alt="Encrypt-Volume" width="800" height="531"&gt;&lt;/a&gt;&lt;br&gt;
Click Create volume.&lt;/p&gt;
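&lt;p&gt;The same dialog maps roughly onto these CreateVolume parameters (a sketch; the volume type is an assumption, since the walkthrough leaves it at the default):&lt;/p&gt;

```python
# Sketch: the volume dialog expressed as EC2 CreateVolume parameters
# (real API fields; the AZ and key alias come from the walkthrough,
# the volume type is an assumption).
create_volume_params = {
    "Size": 3,                          # GiB
    "AvailabilityZone": "us-west-2a",   # same AZ as the running instance
    "Encrypted": True,
    "KmsKeyId": "alias/Test-CMK-key",   # the CMK created earlier
    "VolumeType": "gp3",                # assumed; pick what fits your workload
}
print(create_volume_params["Encrypted"])
```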

&lt;p&gt;Select the EBS volume, and click Actions &amp;gt; Attach Volume. Click the Instance field and select your running instance.&lt;/p&gt;

&lt;p&gt;After the Instance is selected, a default Device is set.&lt;/p&gt;

&lt;p&gt;Click Attach when ready. The State transitions from available to in-use.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;In this way, we have encrypted data at rest in both S3 and EBS.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>security</category>
      <category>devops</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Securing Your AWS Infrastructure: VPCs, Security Groups, and NACLs</title>
      <dc:creator>Amruta Pardeshi</dc:creator>
      <pubDate>Wed, 27 Dec 2023 07:44:22 +0000</pubDate>
      <link>https://dev.to/aws-builders/securing-your-aws-infrastructure-vpcs-security-groups-and-nacls-1c43</link>
      <guid>https://dev.to/aws-builders/securing-your-aws-infrastructure-vpcs-security-groups-and-nacls-1c43</guid>
      <description>&lt;p&gt;When it comes to Amazon Web Services (AWS), the first step towards protecting your applications and data is securing your infrastructure. It is essential to establish strong security measures at the network level, keeping the AWS shared responsibility model in mind. In this blog post, we will discuss the fundamental elements of securing your AWS infrastructure, including Virtual Private Clouds (VPCs), Security Groups, and Network Access Control Lists (NACLs).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Understanding AWS Virtual Private Cloud (VPC)&lt;/strong&gt;&lt;br&gt;
Amazon Virtual Private Cloud (VPC) is an essential component of your network infrastructure in AWS. It provides you with the ability to create a secure, isolated section of the AWS Cloud where you can deploy your AWS resources. Consider it as your cloud-based virtual data center, allowing you to have complete control over your network environment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Best Practices for VPC Security:&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;1. Custom CIDR Blocks:&lt;/strong&gt;&lt;br&gt;
Define custom CIDR blocks to ensure that your VPC's IP address space doesn't overlap with on-premises networks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Subnet Design:&lt;/strong&gt;&lt;br&gt;
Use multiple subnets across Availability Zones for high availability and fault tolerance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Internet and VPN Gateways:&lt;/strong&gt;&lt;br&gt;
Securely connect your VPC to the internet using an Internet Gateway. Use Virtual Private Network (VPN) connections for secure, private access.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. VPC Peering:&lt;/strong&gt;&lt;br&gt;
Implement VPC peering for communication between VPCs. Ensure that peering connections follow the principle of least privilege.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Securing with AWS Security Groups&lt;/strong&gt;&lt;br&gt;
AWS Security Groups act as virtual firewalls for instances, controlling inbound and outbound traffic at the instance level. Because they are stateful, response traffic for an allowed connection is automatically permitted in the other direction.&lt;/p&gt;
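&lt;p&gt;As a sketch of what a least-privilege rule looks like at the API level, here is an ingress rule allowing HTTPS from a single CIDR range. The group ID and CIDR are placeholders; the parameter shape matches the EC2 AuthorizeSecurityGroupIngress call:&lt;/p&gt;

```python
# Sketch of a least-privilege ingress rule: allow HTTPS only, and only
# from a specific CIDR. Group ID and CIDR are placeholders; the shape
# matches EC2 AuthorizeSecurityGroupIngress. With boto3 (not executed):
#   boto3.client("ec2").authorize_security_group_ingress(**ingress_params)
ingress_params = {
    "GroupId": "sg-0123456789abcdef0",
    "IpPermissions": [{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": "203.0.113.0/24",
                      "Description": "office network only"}],
    }],
}
print(ingress_params["IpPermissions"][0]["ToPort"])
```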

&lt;p&gt;&lt;strong&gt;Best Practices for Security Groups:&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;1. Principle of Least Privilege:&lt;/strong&gt;&lt;br&gt;
Only open the ports and protocols necessary for your application to function. Follow the principle of least privilege to minimize attack surfaces.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Refined Inbound and Outbound Rules:&lt;/strong&gt;&lt;br&gt;
Define specific rules for inbound and outbound traffic based on the type of traffic required. Avoid leaving unnecessary ports open.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Dynamic Port Ranges:&lt;/strong&gt;&lt;br&gt;
When using applications that require dynamic port ranges, use security group rules to define these ranges rather than leaving all ports open.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Security Group Logging:&lt;/strong&gt;&lt;br&gt;
Enable VPC Flow Logs to capture information about the IP traffic going to and from network interfaces in your VPC.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Utilizing Network Access Control Lists (NACLs)&lt;/strong&gt;&lt;br&gt;
NACLs are stateless firewalls associated with subnets that control traffic entering and leaving them. Unlike security groups, they do not track connections, so return traffic must be explicitly allowed, and each NACL has separate inbound and outbound rules.&lt;/p&gt;
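&lt;p&gt;A numbered NACL rule can be sketched at the API level as follows (the ACL ID and CIDR are placeholders). Because NACLs are stateless, a matching outbound rule for the ephemeral response ports is also required:&lt;/p&gt;

```python
# Sketch of a numbered NACL entry: allow inbound HTTPS at rule 100.
# The ACL ID and CIDR are placeholders; the shape matches EC2
# CreateNetworkAclEntry. With boto3 (not executed here):
#   boto3.client("ec2").create_network_acl_entry(**nacl_entry_params)
nacl_entry_params = {
    "NetworkAclId": "acl-0123456789abcdef0",
    "RuleNumber": 100,          # rules are evaluated lowest number first
    "Protocol": "6",            # protocol number as a string; 6 = TCP
    "RuleAction": "allow",
    "Egress": False,            # False = inbound rule
    "CidrBlock": "0.0.0.0/0",
    "PortRange": {"From": 443, "To": 443},
}
print(nacl_entry_params["RuleNumber"])
```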

&lt;p&gt;&lt;strong&gt;Best Practices for NACLs:&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;1. Default Deny Rule:&lt;/strong&gt;&lt;br&gt;
Start with a default deny rule and only add rules that are necessary for your application's functionality.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Sequential Rule Numbers:&lt;/strong&gt;&lt;br&gt;
Number your rules sequentially, leaving gaps (e.g., 100, 200), to maintain clarity and make it easier to add or remove rules in the future.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Limited Number of Rules:&lt;/strong&gt;&lt;br&gt;
Keep the number of rules in NACLs to a minimum. Complicated rule sets can lead to confusion and potential security oversights.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Regular Auditing:&lt;/strong&gt;&lt;br&gt;
Regularly audit your NACL rules to ensure they align with your security policies. Remove any rules that are no longer necessary.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
Securing your AWS infrastructure is a continuous and dynamic process. To implement a robust security framework that protects your applications and data from potential threats, it is crucial to understand the roles of VPCs, Security Groups, and NACLs. It is always recommended to stay informed about AWS security best practices, and regularly review and update your security configurations to adapt to the evolving threat landscape. A secure AWS infrastructure is not only a best practice but also a fundamental requirement for a successful and resilient cloud deployment.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>security</category>
      <category>cloud</category>
    </item>
    <item>
      <title>AWS Penetration Testing Insights</title>
      <dc:creator>Amruta Pardeshi</dc:creator>
      <pubDate>Mon, 02 Oct 2023 07:33:27 +0000</pubDate>
      <link>https://dev.to/aws-builders/aws-penetration-testing-insights-30g4</link>
      <guid>https://dev.to/aws-builders/aws-penetration-testing-insights-30g4</guid>
      <description>&lt;p&gt;In today's digital landscape, ensuring security is crucial, and Amazon Web Services (AWS) recognizes this significance by offering robust security measures. To provide the utmost protection, AWS provides a comprehensive guide on &lt;a href="https://aws.amazon.com/security/penetration-testing/" rel="noopener noreferrer"&gt;penetration testing&lt;/a&gt;. In this detailed blog post, we will delve into AWS penetration testing, aligning ourselves with AWS's guidelines, to help you effectively safeguard your AWS infrastructure.&lt;/p&gt;

&lt;p&gt;It's important to conduct AWS penetration testing, also known as ethical hacking. This proactive approach helps identify vulnerabilities and security weaknesses in your AWS infrastructure. By resolving these issues before they're exploited, you can significantly reduce the risk of security breaches and data compromises.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;I wanted to share some of the important reasons why AWS Penetration Testing is crucial:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Enhancing Security:&lt;/strong&gt; Identifying vulnerabilities in advance can help you improve your overall security posture proactively.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Regulatory Compliance:&lt;/strong&gt; It is often required by various industries and regulatory bodies to conduct regular penetration testing as part of compliance efforts.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Protecting Sensitive Data:&lt;/strong&gt; Since AWS frequently hosts sensitive data, penetration tests can ensure the security of this information.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Building Trust:&lt;/strong&gt; Regularly conducting penetration testing shows your dedication to security, which can help establish trust with customers and partners.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Customer Service Policy for Penetration Testing&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;&lt;em&gt;Permitted Services&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Amazon EC2 instances, WAF, NAT Gateways, and Elastic Load Balancers&lt;/li&gt;
&lt;li&gt;Amazon RDS&lt;/li&gt;
&lt;li&gt;Amazon CloudFront&lt;/li&gt;
&lt;li&gt;Amazon Aurora&lt;/li&gt;
&lt;li&gt;Amazon API Gateways&lt;/li&gt;
&lt;li&gt;AWS AppSync&lt;/li&gt;
&lt;li&gt;AWS Lambda and Lambda Edge functions&lt;/li&gt;
&lt;li&gt;Amazon Lightsail resources&lt;/li&gt;
&lt;li&gt;Amazon Elastic Beanstalk environments&lt;/li&gt;
&lt;li&gt;Amazon Elastic Container Service&lt;/li&gt;
&lt;li&gt;AWS Fargate&lt;/li&gt;
&lt;li&gt;Amazon Elasticsearch&lt;/li&gt;
&lt;li&gt;Amazon FSx&lt;/li&gt;
&lt;li&gt;Amazon Transit Gateway&lt;/li&gt;
&lt;li&gt;S3 hosted applications (targeting S3 buckets is strictly prohibited)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Prohibited Activities&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;DNS zone walking via Amazon Route 53 Hosted Zones&lt;/li&gt;
&lt;li&gt;DNS hijacking via Route 53&lt;/li&gt;
&lt;li&gt;DNS Pharming via Route 53&lt;/li&gt;
&lt;li&gt;Denial of Service (DoS), Distributed Denial of Service (DDoS), Simulated DoS, Simulated DDoS (these are subject to the DDoS Simulation Testing policy)&lt;/li&gt;
&lt;li&gt;Port flooding&lt;/li&gt;
&lt;li&gt;Protocol flooding&lt;/li&gt;
&lt;li&gt;Request flooding (login request flooding, API request flooding) &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Customers seeking to test non-approved services will need to work directly with their AWS Support Team.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Please keep in mind that when performing security testing, it is important to follow the AWS Security Testing Terms and Conditions:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Only perform security testing within the agreed-upon scope: services, network bandwidth, requests per minute, and instance types.&lt;/li&gt;
&lt;li&gt;Use security assessment tools and services in accordance with AWS's policy.&lt;/li&gt;
&lt;li&gt;Security testing is subject to the &lt;a href="https://aws.amazon.com/agreement/" rel="noopener noreferrer"&gt;Amazon Web Services Customer Agreement&lt;/a&gt; between you and AWS.&lt;/li&gt;
&lt;li&gt;If any vulnerabilities or issues are discovered during the testing that are a direct result of AWS's tools or services, please report them to AWS Security &lt;a href="mailto:aws-security@amazon.com"&gt;aws-security@amazon.com&lt;/a&gt; within 24 hours of completing the testing.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;AWS has a policy that outlines how to use security assessment tools and services.&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A security tool that remotely queries your AWS asset to determine a software name and version is not a violation. A tool or service that crashes a running process temporarily for remote or local exploitation as part of the security assessment is not in violation. &lt;/li&gt;
&lt;li&gt;However, you can't use tools or services that perform DoS attacks or simulations against any AWS asset. You also can't use tools or services that create, determine, or demonstrate a DoS condition in any other manner. Customers wishing to perform a DDoS simulation test should review AWS's &lt;a href="https://aws.amazon.com/security/ddos-simulation-testing/" rel="noopener noreferrer"&gt;DDoS Simulation Testing policy&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;It's your responsibility to validate, before assessing any AWS assets, that the tools and services you use do not perform DoS attacks or simulations in any form.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Reference: &lt;a href="https://aws.amazon.com/security/penetration-testing/" rel="noopener noreferrer"&gt;AWS Security Documentation on Penetration Testing&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>security</category>
      <category>awscloud</category>
    </item>
    <item>
      <title>Simplifying Data Transfer: Methods for Moving Large Amounts of Data with Ease</title>
      <dc:creator>Amruta Pardeshi</dc:creator>
      <pubDate>Tue, 08 Aug 2023 12:26:44 +0000</pubDate>
      <link>https://dev.to/seeyouoncloud/simplifying-data-transfer-methods-for-moving-large-amounts-of-data-with-ease-494o</link>
      <guid>https://dev.to/seeyouoncloud/simplifying-data-transfer-methods-for-moving-large-amounts-of-data-with-ease-494o</guid>
      <description>&lt;p&gt;In the ever-evolving world of technology, the need to transfer large amounts of data swiftly and securely has become a crucial aspect of various industries. Whether you're a business dealing with massive datasets or an individual moving precious memories to the cloud, understanding the different data transfer strategies can save you time, effort, and headaches. In this blog, we'll explore some simple methods for transferring large amounts of data in and out of Amazon S3, one of the most popular cloud storage solutions. Let's dive in!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. AWS DataSync: A Syncing Marvel&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Imagine you have a treasure trove of data that needs to be constantly updated between your on-premises storage and Amazon S3. Enter AWS DataSync, a magical tool that synchronizes your data seamlessly. It's like having a data butler, ensuring your files are up-to-date without you lifting a finger. Whether it's a bunch of photos or a database, AWS DataSync efficiently transfers your data over the internet, saving you time and bandwidth. It will easily and efficiently transfer hundreds of terabytes and millions of files into AWS.&lt;br&gt;
Refer to this link to check how AWS DataSync works &lt;a href="https://docs.aws.amazon.com/datasync/latest/userguide/getting-started.html"&gt;https://docs.aws.amazon.com/datasync/latest/userguide/getting-started.html&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Snowball: Your Data's Arctic Expedition&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;What if your data is so massive that transferring it over the internet seems like waiting for a snail to cross a racetrack? Here's where Snowball comes to the rescue. Snowball is like a rugged storage box that you fill with your data. Amazon sends you this box, and you transfer your data into it. Once your data is snugly inside, you send the Snowball back to Amazon. They'll plug it into their massive servers, and voilà! Your data is transferred much faster than if you had tried to send it through the internet. You can transfer hundreds of terabytes or petabytes of data between your on-premises data centers and Amazon Simple Storage Service (Amazon S3).&lt;br&gt;
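&lt;p&gt;To see why shipping a device can beat the wire, a quick back-of-the-envelope calculation helps. The sketch below is illustrative only; the link speed and the 80% utilization figure are assumptions, not AWS numbers:&lt;/p&gt;

```python
# Back-of-the-envelope: how long does it take to push N terabytes over a link?
# The 80% utilization figure is an illustrative assumption, not an AWS number.
def transfer_days(terabytes: float, link_gbps: float, utilization: float = 0.8) -> float:
    """Days needed to move `terabytes` (decimal TB) over a `link_gbps` link."""
    bits = terabytes * 1e12 * 8
    seconds = bits / (link_gbps * 1e9 * utilization)
    return seconds / 86_400

# 100 TB over a well-utilized 1 Gbps link:
print(f"{transfer_days(100, 1):.1f} days")  # prints "11.6 days"
```

&lt;p&gt;At that rate, 100 TB needs well over a week of sustained, uncontended bandwidth, which is exactly the scenario where Snowball shines.&lt;/p&gt;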
To read more about it check out this link &lt;a href="https://docs.aws.amazon.com/snowball/latest/developer-guide/whatisedge.html"&gt;https://docs.aws.amazon.com/snowball/latest/developer-guide/whatisedge.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Direct Data Transfers: The Straightforward Route&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Sometimes, you might prefer the simplicity of transferring data directly without any third-party tools. This is where direct data transfers come into play. It's like sending an email attachment, but on a much larger scale. You can use tools like &lt;em&gt;AWS Command Line Interface (CLI)&lt;/em&gt; or even &lt;em&gt;scripting languages&lt;/em&gt; to move your data from your computer to Amazon S3. While this method might require a bit more technical know-how, it gives you fine-grained control over the transfer process.&lt;/p&gt;
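&lt;p&gt;For large objects, the AWS CLI and SDKs transparently split transfers into parts using S3 multipart upload, which allows at most 10,000 parts of at least 5 MiB each (except the last). If you script transfers yourself, the part-size arithmetic can be sketched like this; the 500 GiB object size is just an example:&lt;/p&gt;

```python
import math

# S3 multipart upload limits from the AWS documentation:
# a part must be at least 5 MiB (except the last), and an upload
# can have at most 10,000 parts.
MIN_PART = 5 * 1024 * 1024
MAX_PARTS = 10_000

def choose_part_size(object_size: int) -> int:
    """Pick a part size that keeps the upload within S3's part-count limit."""
    return max(MIN_PART, math.ceil(object_size / MAX_PARTS))

def part_count(object_size: int, part_size: int) -> int:
    return max(1, math.ceil(object_size / part_size))

size = 500 * 1024**3                 # a hypothetical 500 GiB object
part = choose_part_size(size)
print(part, part_count(size, part))  # stays within both limits
```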

&lt;p&gt;&lt;strong&gt;In Conclusion: Finding Your Data's Path&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When it comes to transferring large amounts of data in and out of Amazon S3, you have a variety of strategies at your disposal. Choosing the right method depends on factors such as the volume of data, your technical expertise, and your timeline. &lt;em&gt;AWS DataSync&lt;/em&gt; is perfect for keeping your data in sync effortlessly, while &lt;em&gt;Snowball&lt;/em&gt; offers a robust solution for huge datasets. If you're comfortable with a bit more technical involvement, &lt;em&gt;direct data transfers&lt;/em&gt; provide a straightforward route.&lt;/p&gt;

&lt;p&gt;As technology continues to advance, these methods might evolve too. However, for now, you have these powerful tools to make your data transfer journey smoother. Whether you're a business striving for efficiency or an individual safeguarding precious memories, exploring these strategies can help you find the perfect path for your data's journey to and from Amazon S3.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>storage</category>
      <category>data</category>
      <category>transfer</category>
    </item>
    <item>
      <title>Leveraging S3 Lifecycle Policies for Data Tiering and Cost Reduction</title>
      <dc:creator>Amruta Pardeshi</dc:creator>
      <pubDate>Tue, 04 Jul 2023 06:46:35 +0000</pubDate>
      <link>https://dev.to/seeyouoncloud/leveraging-s3-lifecycle-policies-for-data-tiering-and-cost-reduction-d50</link>
      <guid>https://dev.to/seeyouoncloud/leveraging-s3-lifecycle-policies-for-data-tiering-and-cost-reduction-d50</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;br&gt;
Organizations are creating and storing enormous volumes of data in the data-driven world of today. Keeping track of storage expenses becomes more crucial as data volumes increase. Amazon Web Services (AWS) provides the highly scalable and economical Amazon Simple Storage Service (S3) as a cloud storage option. Lifecycle policies, a potent feature of S3, let you automatically move data across various storage classes in accordance with established conditions. Organizations can save storage expenses while assuring data availability and durability by utilizing S3 lifecycle policies for data tiering. The advantages and best practices of using S3 lifecycle policies to cut expenses and enhance data management will be covered in this blog.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Understanding S3 Storage Classes&lt;/strong&gt;&lt;br&gt;
Before getting into lifecycle policies, let's quickly cover the S3 storage classes. S3 provides several storage classes, each designed to meet particular performance, cost, and durability needs:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;S3 Standard: The default storage class, offering high durability, high availability, and low latency. It is appropriate for frequently accessed data.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;S3 Intelligent-Tiering: Automatically moves objects between frequent- and infrequent-access tiers based on observed access patterns. It is ideal for data with unpredictable access patterns.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;S3 Standard-IA (Infrequent Access): Intended for less frequently accessed data that still needs to be immediately available when requested. It offers lower storage costs than S3 Standard.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;S3 One Zone-IA: Similar to S3 Standard-IA, but stores data in a single Availability Zone. It is less resilient than S3 Standard-IA but offers additional cost savings.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;S3 Glacier: A secure, durable, and low-cost storage class for archiving infrequently accessed data. Retrieval times are configurable and range from minutes to hours.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;S3 Glacier Deep Archive: The lowest-cost storage class, intended for data accessed only once or twice a year. Retrieving data can take many hours.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Using S3 Lifecycle Policies for Data Tiering&lt;/strong&gt;&lt;br&gt;
S3 lifecycle policies let you define rules that automatically transition objects between storage classes as they age. Used effectively, lifecycle policies can reduce storage costs while preserving data availability. Here's how it works:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Define transition rules: Start by establishing when objects should move to a different storage class. For instance, you can migrate objects older than 30 days from S3 Standard to S3 Intelligent-Tiering or S3 Standard-IA, and set similar rules for moving data onward from those classes to Glacier.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Configure expiration actions: In addition to transitioning objects between storage classes, you can define rules that automatically delete or archive objects after a certain period. This is especially useful for compliance or data retention; for instance, you can archive objects to S3 Glacier Deep Archive or delete objects older than seven years.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Consider access patterns: When designing transition rules, take your data's access patterns into account. Frequently used objects should stay in performance-optimized storage classes, while less frequently accessed data can move to cheaper classes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Review and refine: Regularly check your lifecycle policies to ensure they still match your evolving data management needs, and adjust the transition and expiration rules as necessary to minimize costs while keeping data available.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
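&lt;p&gt;The transition and expiration rules described above are expressed as a JSON lifecycle configuration. Here is a minimal sketch; the prefix, rule ID, and day thresholds are example values:&lt;/p&gt;

```python
import json

# Example lifecycle configuration: move objects under logs/ to Standard-IA
# after 30 days, to Glacier after 90 days, and expire them after ~7 years.
# The prefix, rule ID, and day thresholds are illustrative values.
lifecycle = {
    "Rules": [
        {
            "ID": "tier-then-expire-logs",
            "Status": "Enabled",
            "Filter": {"Prefix": "logs/"},
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
            "Expiration": {"Days": 2555},
        }
    ]
}

print(json.dumps(lifecycle, indent=2))
```

&lt;p&gt;You can apply a configuration like this with &lt;em&gt;aws s3api put-bucket-lifecycle-configuration&lt;/em&gt; or through the S3 console.&lt;/p&gt;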

&lt;p&gt;&lt;strong&gt;Benefits of Using S3 Lifecycle Policies&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Cost optimization: By automatically moving data to cheaper storage classes as it is accessed less often, you can significantly reduce storage costs while balancing availability and performance requirements.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Simplified data management: Lifecycle policies automate data tiering and eliminate the need for manual intervention. You define the rules once and let S3 handle the rest, saving time and effort.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Compliance and data retention: Lifecycle rules help enforce compliance and retention requirements by automatically deleting or archiving data after a defined period, ensuring data is handled consistently and in line with legal standards.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Performance improvement: Moving rarely used data to cheaper storage classes keeps performance-optimized classes focused on the data you access most, helping maintain fast retrieval times.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Best Practices for Using S3 Lifecycle Policies&lt;/strong&gt;&lt;br&gt;
To make the most of S3 lifecycle policies, consider the following best practices:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Create a data lifecycle plan: Understand your data's lifecycle and define suitable transition and expiration rules, taking into account access patterns, retention needs, and compliance obligations.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Test and validate: Try lifecycle policies in a non-production environment before rolling them out. This helps ensure the policies are configured correctly and produce the results you expect.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Monitor and improve: Continuously track the effectiveness and cost impact of your lifecycle policies. Analyze access patterns and adjust the rules as needed to optimize costs and data availability.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Leverage analytics: Use tools such as Amazon S3 Storage Lens and S3 Storage Class Analysis to gain insight into your data and choose the right storage classes and lifecycle rules.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
Using S3 lifecycle policies for data tiering is a powerful way to reduce storage costs while preserving data availability. By automating the movement of objects between storage classes based on predefined rules, organizations can manage their data lifecycle efficiently and ensure data is stored in the most cost-effective class. By following best practices and routinely reviewing and refining lifecycle policies, businesses can strike the right balance between cost savings, data availability, and performance in their AWS S3 storage.&lt;/p&gt;

</description>
      <category>storage</category>
      <category>s3</category>
      <category>costoptimization</category>
      <category>aws</category>
    </item>
    <item>
      <title>Guarding Your AWS Credentials: Identifying Compromises and Mitigating Damage</title>
      <dc:creator>Amruta Pardeshi</dc:creator>
      <pubDate>Fri, 24 Mar 2023 10:11:09 +0000</pubDate>
      <link>https://dev.to/seeyouoncloud/guarding-your-aws-credentials-identifying-compromises-and-mitigating-damage-46kp</link>
      <guid>https://dev.to/seeyouoncloud/guarding-your-aws-credentials-identifying-compromises-and-mitigating-damage-46kp</guid>
      <description>&lt;p&gt;As more businesses move their infrastructure to the cloud, security becomes a critical issue. One of the significant security risks for cloud infrastructure is the compromise of AWS (Amazon Web Services) credentials. AWS credentials are used to access and manage AWS resources, and if they fall into the wrong hands, an attacker can cause significant damage to your organization. In this blog post, we'll explore how to identify compromised AWS credentials using GuardDuty and the steps you can take to mitigate the damage.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Identifying the AWS credentials compromised using GuardDuty&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;GuardDuty is a threat detection service provided by AWS that continuously monitors your AWS environment for malicious activity and unauthorized behavior. GuardDuty analyzes event logs and network traffic to detect potential security threats in real-time. Here are some ways GuardDuty can help you identify compromised AWS credentials:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Unusual API calls: GuardDuty can detect unusual API calls made using AWS credentials, such as calls from an unusual location or an unusual time of day.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Credential stuffing: GuardDuty can detect credential stuffing attacks, where an attacker uses a list of stolen credentials to try to gain access to an AWS account.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Brute-force attacks: GuardDuty can detect brute-force attacks, where an attacker tries to guess an AWS account password or access key.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Password spraying: GuardDuty can detect password spraying attacks, where an attacker tries a small number of common passwords against many AWS accounts.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Reconnaissance activities: GuardDuty can detect reconnaissance activities, where an attacker tries to gather information about an AWS environment, such as by running port scans or making DNS queries.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Taking Steps to Mitigate the Damage of Compromised AWS Credentials&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If GuardDuty detects that your AWS credentials have been compromised, you should take the following steps to mitigate the damage:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Revoke the compromised credentials immediately: Go to the AWS Management Console, navigate to the IAM (Identity and Access Management) service, and revoke the compromised credentials.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Change all related credentials: Change all related credentials, including access keys, secret keys, and passwords.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Check for any unauthorized changes: Check for any unauthorized changes that may have been made to your AWS resources, such as new EC2 instances, S3 buckets, or other resources created by the attacker.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enable Multi-Factor Authentication (MFA): Enabling Multi-Factor Authentication (MFA) is an effective way to prevent unauthorized access to your AWS resources.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Review your security policies and procedures: Review your security policies and procedures to ensure that they are robust enough to prevent similar attacks in the future. This includes reviewing your access control policies, monitoring your logs regularly, and providing regular security awareness training to your employees.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In addition to the steps mentioned above, it is essential to verify your AWS account information to ensure that the attacker has not made any unauthorized changes or accessed any sensitive data. Here are the steps you should take to verify your account information:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Check your billing information: Verify that your billing information is correct and that there are no unexpected charges or unusual activity.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Check your CloudTrail logs: CloudTrail is a service that provides event history of your AWS account activity. Review your CloudTrail logs to ensure that there are no unauthorized activities or unusual patterns of activity.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Review your security groups: Security groups control inbound and outbound traffic to your AWS resources. Check your security groups to ensure that there are no unauthorized changes or unusual traffic patterns.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Check your S3 buckets: Amazon S3 (Simple Storage Service) is a scalable and secure object storage service. Verify that there are no unauthorized changes or unusual activity in your S3 buckets.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Review your IAM policies: IAM (Identity and Access Management) policies control access to your AWS resources. Check your IAM policies to ensure that there are no unauthorized changes or unusual activity.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Verify your contact information: Make sure that your contact information, such as email addresses and phone numbers, is up to date and that you can receive notifications about any suspicious activity.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
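&lt;p&gt;As a small illustration of step 2, CloudTrail delivers events as JSON records that you can filter for activity tied to a compromised key. The sample events and the watchlist of API calls below are made up for demonstration:&lt;/p&gt;

```python
import json

# Sketch: scan CloudTrail records for activity tied to a compromised access
# key, plus API calls that often follow a credential compromise.
# The sample events and the watchlist below are illustrative, not exhaustive.
SUSPICIOUS_CALLS = {"CreateUser", "CreateAccessKey", "PutUserPolicy", "RunInstances"}

sample_records = json.loads("""
{"Records": [
  {"eventName": "DescribeInstances", "userIdentity": {"accessKeyId": "AKIAEXAMPLEOK"}},
  {"eventName": "CreateAccessKey",  "userIdentity": {"accessKeyId": "AKIAEXAMPLEBAD"}}
]}
""")

def flag_events(records, compromised_key):
    """Return (eventName, accessKeyId) pairs worth a closer look."""
    hits = []
    for record in records["Records"]:
        key = record.get("userIdentity", {}).get("accessKeyId")
        if key == compromised_key or record["eventName"] in SUSPICIOUS_CALLS:
            hits.append((record["eventName"], key))
    return hits

print(flag_events(sample_records, "AKIAEXAMPLEBAD"))
# prints [('CreateAccessKey', 'AKIAEXAMPLEBAD')]
```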

&lt;p&gt;By verifying your AWS account information, you can ensure that there are no unauthorized changes or activities and that your account is secure. If you notice any suspicious activity or unauthorized changes, you should report them immediately to AWS support and take appropriate steps to mitigate the damage.&lt;/p&gt;

&lt;p&gt;In conclusion, GuardDuty is an essential tool for monitoring the security of your AWS environment, and it can help you detect compromised AWS credentials. If GuardDuty detects that your AWS credentials have been compromised, it is crucial to take immediate action to revoke the credentials, change related credentials, check for unauthorized changes, enable MFA, and review your security policies and procedures to prevent similar attacks in the future.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloudsecurity</category>
      <category>devops</category>
      <category>security</category>
    </item>
    <item>
      <title>Detecting Security Threats in Real-time with AWS GuardDuty</title>
      <dc:creator>Amruta Pardeshi</dc:creator>
      <pubDate>Mon, 06 Mar 2023 12:28:02 +0000</pubDate>
      <link>https://dev.to/seeyouoncloud/detecting-security-threats-in-real-time-with-aws-guardduty-5a9e</link>
      <guid>https://dev.to/seeyouoncloud/detecting-security-threats-in-real-time-with-aws-guardduty-5a9e</guid>
      <description>&lt;p&gt;AWS GuardDuty is a threat detection service offered by Amazon Web Services (AWS) that helps you monitor your AWS environment for malicious activity and unauthorized behavior. With GuardDuty, you can continuously monitor and analyze logs and events from various sources such as AWS CloudTrail logs, Amazon VPC Flow Logs, and DNS logs to detect potential security threats.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Benefits of AWS GuardDuty&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Improved security: GuardDuty helps you improve the security of your AWS environment by detecting potential security threats in real-time and providing actionable insights that can help you remediate any security issues.&lt;/li&gt;
&lt;li&gt;Cost-effective: GuardDuty is a cost-effective solution for threat detection as it does not require any additional hardware or software, and you only pay for the data that is analyzed.&lt;/li&gt;
&lt;li&gt;Easy integration with other AWS services: GuardDuty integrates seamlessly with other AWS services such as AWS CloudTrail, Amazon VPC Flow Logs, and AWS Lambda, making it easy to incorporate into your existing security workflow.&lt;/li&gt;
&lt;li&gt;Scalable: GuardDuty is designed to scale with your AWS environment, and it can handle large amounts of data and traffic.&lt;/li&gt;
&lt;li&gt;Compliance: GuardDuty helps you meet compliance requirements by providing continuous monitoring and alerting for potential security threats.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Ways of Accessing GuardDuty&lt;/strong&gt;&lt;br&gt;
There are several ways to access AWS GuardDuty, depending on your preferences and needs. Here are some of the most common ways:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;AWS Management Console: The AWS Management Console is a web-based interface that you can use to access and manage GuardDuty. You can view and analyze findings, configure settings, and take action on potential security threats. &lt;a href="https://console.aws.amazon.com/guardduty"&gt;https://console.aws.amazon.com/guardduty&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;AWS CLI: The AWS Command Line Interface (CLI) is a tool that allows you to interact with AWS services using command-line commands. You can use the AWS CLI to manage GuardDuty, including enabling and disabling the service, retrieving findings, and updating settings.&lt;/li&gt;
&lt;li&gt;AWS SDKs: AWS provides software development kits (SDKs) for various programming languages, such as Python, Java, and Ruby. You can use the SDKs to integrate GuardDuty into your applications and automate GuardDuty-related tasks.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;How to enable AWS GuardDuty&lt;/strong&gt;&lt;br&gt;
Enabling GuardDuty is a simple process. Here are the steps to enable AWS GuardDuty:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Log in to the AWS Management Console and navigate to the GuardDuty console.&lt;/li&gt;
&lt;li&gt;If this is your first time using GuardDuty, you will see a welcome screen. Click on the "Get started" button to begin.&lt;/li&gt;
&lt;li&gt;In the top-right corner of the console, select the Region you want to enable GuardDuty in.&lt;/li&gt;
&lt;li&gt;Click the "Enable GuardDuty" button.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Once GuardDuty is enabled, it will begin analyzing logs and events from various sources, such as CloudTrail logs, VPC Flow Logs, and DNS logs. GuardDuty will generate findings based on the analysis of the data and will alert you in real-time if it detects any potential security threats.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Finding types&lt;/strong&gt;&lt;br&gt;
GuardDuty generates a finding whenever it detects unexpected and potentially malicious activity in your AWS environment. AWS GuardDuty currently generates findings for &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;EC2 &lt;/li&gt;
&lt;li&gt;IAM &lt;/li&gt;
&lt;li&gt;Kubernetes audit logs &lt;/li&gt;
&lt;li&gt;Malware Protection &lt;/li&gt;
&lt;li&gt;RDS Protection &lt;/li&gt;
&lt;li&gt;S3 &lt;/li&gt;
&lt;/ol&gt;
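&lt;p&gt;Each finding also carries a numeric severity, which the GuardDuty documentation groups into Low (1.0-3.9), Medium (4.0-6.9), and High (7.0-8.9) bands. A simple triage sketch, with illustrative findings:&lt;/p&gt;

```python
# Triage sketch: bucket GuardDuty findings by numeric severity.
# Bands follow the GuardDuty documentation: Low 1.0-3.9, Medium 4.0-6.9,
# High 7.0-8.9. The findings and scores below are examples for illustration.
def severity_label(score: float) -> str:
    if score >= 7.0:
        return "HIGH"
    if score >= 4.0:
        return "MEDIUM"
    return "LOW"

findings = [
    {"type": "Recon:EC2/PortProbeUnprotectedPort", "severity": 2.0},
    {"type": "UnauthorizedAccess:IAMUser/MaliciousIPCaller", "severity": 5.0},
    {"type": "CryptoCurrency:EC2/BitcoinTool.B!DNS", "severity": 8.0},
]

for finding in findings:
    print(severity_label(finding["severity"]), finding["type"])
```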

&lt;p&gt;&lt;strong&gt;Amazon GuardDuty pricing&lt;/strong&gt;&lt;br&gt;
AWS GuardDuty comes with a free 30-day trial that gives you access to all features and detection findings. During the preview period, GuardDuty RDS Protection for Amazon Aurora databases is available to GuardDuty customers at no additional cost in some supported AWS Regions. For more information on pricing, check out this link &lt;a href="https://aws.amazon.com/guardduty/pricing/"&gt;https://aws.amazon.com/guardduty/pricing/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
AWS GuardDuty is an excellent solution for threat detection in your AWS environment. It is easy to set up and manage, and it provides continuous monitoring and alerting for potential security threats. With GuardDuty, you can improve the security of your AWS environment, reduce the risk of security breaches, and meet compliance requirements.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>security</category>
      <category>awsguardduty</category>
      <category>devops</category>
    </item>
    <item>
      <title>Protecting Your Sensitive Data in the Cloud with Amazon S3 Encryption</title>
      <dc:creator>Amruta Pardeshi</dc:creator>
      <pubDate>Sat, 25 Feb 2023 12:46:35 +0000</pubDate>
      <link>https://dev.to/seeyouoncloud/protecting-your-sensitive-data-in-the-cloud-with-amazon-s3-encryption-28ld</link>
      <guid>https://dev.to/seeyouoncloud/protecting-your-sensitive-data-in-the-cloud-with-amazon-s3-encryption-28ld</guid>
      <description>&lt;p&gt;Amazon Web Services (AWS) provides a wide range of services to help you securely store, manage, and access your data in the cloud. One such service is Amazon S3, which is a highly scalable, durable, and secure object storage service. In addition to providing robust data protection mechanisms, S3 also allows you to encrypt your data at rest and in transit. In this blog post, we will discuss S3 encryption, its benefits, and how to use it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Benefits of S3 Encryption&lt;/strong&gt;&lt;br&gt;
The benefits of S3 encryption include:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Increased Data Security: Encryption adds a layer of security that protects your data from unauthorized access and malicious activities.&lt;/li&gt;
&lt;li&gt;Compliance with Industry Regulations: S3 encryption can help you comply with regulations that require encryption for sensitive data.&lt;/li&gt;
&lt;li&gt;Protection Against Internal Threats: Encryption safeguards your data from internal threats such as data breaches or rogue employees.&lt;/li&gt;
&lt;li&gt;Ease of Use: AWS offers simple options for encrypting your S3 data, making it easy to secure your sensitive information in the cloud.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;S3 Encryption Options&lt;/strong&gt;&lt;br&gt;
Objects in the S3 bucket can be encrypted using one of the following methods:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Server-side encryption with Amazon S3 managed keys (SSE-S3)&lt;/li&gt;
&lt;li&gt;Server-side encryption with AWS Key Management Service (AWS KMS) keys (SSE-KMS)&lt;/li&gt;
&lt;li&gt;Server-side encryption with customer-provided keys (SSE-C)&lt;/li&gt;
&lt;li&gt;Client-side encryption &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Server-Side Encryption (SSE)&lt;/strong&gt;&lt;br&gt;
SSE is a built-in S3 encryption feature that encrypts your data at rest using either AWS-managed keys or customer-managed keys. It is easy to use.&lt;/p&gt;

&lt;h5&gt;Server-side encryption with Amazon S3 managed keys (SSE-S3):&lt;/h5&gt;

&lt;p&gt;&lt;em&gt;All Amazon S3 buckets have encryption configured by default. Each object is encrypted with a unique key. As an additional safeguard, SSE-S3 encrypts the key itself with a root key that it regularly rotates. SSE-S3 is ideal for most use cases and provides strong encryption at no additional cost.&lt;/em&gt;&lt;/p&gt;

&lt;h5&gt;Server-side encryption with AWS Key Management Service (AWS KMS) keys (SSE-KMS):&lt;/h5&gt;

&lt;p&gt;&lt;em&gt;SSE-KMS is done using the integration of AWS KMS with Amazon S3. When you use SSE-KMS encryption with an S3 bucket, the AWS KMS keys must be in the same Region as the bucket. And there are additional charges for using AWS KMS keys.&lt;/em&gt;&lt;/p&gt;

&lt;h5&gt;Server-side encryption with customer-provided keys (SSE-C):&lt;/h5&gt;

&lt;p&gt;&lt;em&gt;By using SSE-C, you generate and manage your encryption keys and provide them to AWS when you upload your data to S3. When you request access to your data, you provide your encryption keys to AWS to decrypt your data. AWS does not store your encryption keys, ensuring that only you have access to your data.&lt;/em&gt; &lt;/p&gt;

&lt;h5&gt;Client-side encryption:&lt;/h5&gt;

&lt;p&gt;&lt;em&gt;Client-side encryption means encrypting your data before it is uploaded to S3. You manage your own encryption keys, and AWS plays no role in encrypting or decrypting the data.&lt;/em&gt;&lt;/p&gt;
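&lt;p&gt;&lt;em&gt;As a minimal sketch of client-side encryption (the file names, key file, and bucket name below are hypothetical), you can encrypt with openssl before uploading, so only ciphertext ever reaches S3:&lt;/em&gt;&lt;/p&gt;

```shell
# Create a sample file and a random 256-bit key; AWS never sees the key.
printf 'account_id,balance\n42,100.00\n' > report.csv
openssl rand -out s3-data.key 32

# Encrypt locally before anything leaves the machine.
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in report.csv -out report.csv.enc -pass file:s3-data.key

# Decrypt with the same key (this is what you would do after download).
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in report.csv.enc -out report.decrypted.csv -pass file:s3-data.key

# Upload only the ciphertext, e.g.:
#   aws s3 cp report.csv.enc s3://my-example-bucket/report.csv.enc
```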

&lt;p&gt;&lt;strong&gt;How to Use S3 Encryption&lt;/strong&gt;&lt;br&gt;
Encrypting your S3 data is straightforward. You can enable server-side encryption when you create a new S3 bucket or apply it to an existing one: to use SSE-S3 or SSE-KMS, select the appropriate option when you create or modify the bucket.&lt;br&gt;
To use client-side encryption, encrypt your data with an encryption library or tool before uploading it to S3. Several open-source and commercial encryption tools are available.&lt;/p&gt;
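&lt;p&gt;&lt;em&gt;Default encryption can also be set from the AWS CLI. A sketch, assuming a configured CLI with sufficient permissions (the bucket name and key alias below are placeholders):&lt;/em&gt;&lt;/p&gt;

```shell
# Make SSE-KMS the default for all new objects in an existing bucket.
aws s3api put-bucket-encryption \
  --bucket my-example-bucket \
  --server-side-encryption-configuration '{
    "Rules": [{
      "ApplyServerSideEncryptionByDefault": {
        "SSEAlgorithm": "aws:kms",
        "KMSMasterKeyID": "alias/my-app-key"
      }
    }]
  }'

# Confirm the configuration took effect.
aws s3api get-bucket-encryption --bucket my-example-bucket
```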

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
In conclusion, securing your data is a critical aspect of managing your data in the cloud, and Amazon S3 encryption provides a reliable and straightforward solution. AWS offers several options for encrypting your data, including server-side encryption and client-side encryption. By taking advantage of AWS S3 encryption, you can confidently store, manage, and access your sensitive data in the cloud.&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>frontend</category>
      <category>browser</category>
      <category>howto</category>
    </item>
    <item>
      <title>Install Docker on Ubuntu 22.04</title>
      <dc:creator>Amruta Pardeshi</dc:creator>
      <pubDate>Tue, 10 Jan 2023 13:33:05 +0000</pubDate>
      <link>https://dev.to/seeyouoncloud/install-docker-on-ubuntu-2204-55hn</link>
      <guid>https://dev.to/seeyouoncloud/install-docker-on-ubuntu-2204-55hn</guid>
<description>&lt;p&gt;Docker is a software platform that simplifies building, running, managing, and distributing applications. It does this by packaging an application and its dependencies into containers that share the kernel of the host operating system.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites:&lt;/strong&gt;&lt;br&gt;
&lt;em&gt;Ubuntu 20.04 or higher&lt;br&gt;
A non-root user with sudo privileges&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Update package list&lt;/strong&gt;&lt;br&gt;
&lt;em&gt;The first step is to update the package list on the Ubuntu machine. This is done by running the following command in the terminal:&lt;/em&gt;&lt;br&gt;
$ sudo apt update&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Install packages to allow apt to use a repository over HTTPS&lt;/strong&gt;&lt;br&gt;
&lt;em&gt;Next, we will install the necessary packages to allow apt to use a repository over HTTPS:&lt;/em&gt;&lt;br&gt;
$ sudo apt install apt-transport-https ca-certificates curl software-properties-common&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3: Add Docker's official GPG key&lt;/strong&gt;&lt;br&gt;
&lt;em&gt;To add Docker's official GPG key, run the following command in the terminal:&lt;/em&gt;&lt;br&gt;
$ curl -fsSL &lt;a href="https://download.docker.com/linux/ubuntu/gpg" rel="noopener noreferrer"&gt;https://download.docker.com/linux/ubuntu/gpg&lt;/a&gt; | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4: Add the stable Docker repository to the system&lt;/strong&gt;&lt;br&gt;
&lt;em&gt;To add the stable Docker repository to the system, run the following command:&lt;/em&gt;&lt;br&gt;
$ echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] &lt;a href="https://download.docker.com/linux/ubuntu" rel="noopener noreferrer"&gt;https://download.docker.com/linux/ubuntu&lt;/a&gt; $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list &amp;gt; /dev/null&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 5: Update the package list again and check the install candidate&lt;/strong&gt;&lt;br&gt;
&lt;em&gt;Refresh the package list, then confirm that docker-ce will be installed from the Docker repository rather than the default Ubuntu one:&lt;/em&gt;&lt;br&gt;
$ sudo apt update&lt;br&gt;
$ apt-cache policy docker-ce&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 6: Finally, install Docker&lt;/strong&gt;&lt;br&gt;
$ sudo apt install docker-ce&lt;/p&gt;
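&lt;p&gt;&lt;em&gt;After the installation finishes, a quick sanity check (exact output varies by version) confirms the daemon is running and can start containers:&lt;/em&gt;&lt;/p&gt;

```shell
# Confirm the service is active and the client can reach the daemon.
sudo systemctl status docker --no-pager
docker --version

# End-to-end test: pull and run a throwaway container.
sudo docker run --rm hello-world
```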

</description>
      <category>watercooler</category>
    </item>
    <item>
      <title>Install SSM-Agent on AWS EC2 Instance</title>
      <dc:creator>Amruta Pardeshi</dc:creator>
      <pubDate>Tue, 29 Nov 2022 08:11:22 +0000</pubDate>
      <link>https://dev.to/seeyouoncloud/install-ssm-agent-on-aws-ec2-instance-21o2</link>
      <guid>https://dev.to/seeyouoncloud/install-ssm-agent-on-aws-ec2-instance-21o2</guid>
<description>&lt;p&gt;&lt;em&gt;Create an AWS EC2 instance&lt;br&gt;
Create an IAM role with the AmazonSSMManagedInstanceCore policy (the older AmazonEC2RoleforSSM policy also works but is deprecated)&lt;br&gt;
Attach this IAM role to your EC2 instance&lt;br&gt;
Now SSH into the EC2 instance&lt;br&gt;
Check whether the SSM Agent is already installed with these commands:&lt;/em&gt;&lt;br&gt;
&lt;strong&gt;$ sudo systemctl status amazon-ssm-agent&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;$ yum info amazon-ssm-agent&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Command to install SSM-Agent&lt;/em&gt;&lt;br&gt;
&lt;strong&gt;$ sudo yum install -y &lt;a href="https://s3.region.amazonaws.com/amazon-ssm-region/latest/linux_amd64/amazon-ssm-agent.rpm" rel="noopener noreferrer"&gt;https://s3.region.amazonaws.com/amazon-ssm-region/latest/linux_amd64/amazon-ssm-agent.rpm&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Replace region in the command with your Region, then run it to install the SSM Agent; for example, for us-east-1:&lt;/em&gt;&lt;br&gt;
&lt;strong&gt;$ sudo yum install -y &lt;a href="https://s3.us-east-1.amazonaws.com/amazon-ssm-us-east-1/latest/linux_amd64/amazon-ssm-agent.rpm" rel="noopener noreferrer"&gt;https://s3.us-east-1.amazonaws.com/amazon-ssm-us-east-1/latest/linux_amd64/amazon-ssm-agent.rpm&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Now you can open the instance in the console and log in using Session Manager&lt;/em&gt;&lt;/p&gt;
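&lt;p&gt;&lt;em&gt;Alternatively, if the Session Manager plugin for the AWS CLI is installed, you can open a shell without SSH at all (the instance ID below is a placeholder):&lt;/em&gt;&lt;/p&gt;

```shell
# List the instances whose SSM Agent has registered with Systems Manager.
aws ssm describe-instance-information \
  --query 'InstanceInformationList[].InstanceId'

# Open an interactive shell session on one of them.
aws ssm start-session --target i-0123456789abcdef0
```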

</description>
      <category>interview</category>
      <category>career</category>
    </item>
    <item>
      <title>Install Mongodb in AWS Ubuntu Server 22.04</title>
      <dc:creator>Amruta Pardeshi</dc:creator>
      <pubDate>Sat, 26 Nov 2022 09:35:19 +0000</pubDate>
      <link>https://dev.to/seeyouoncloud/install-mongodb-in-aws-ubuntu-server-2204-360b</link>
      <guid>https://dev.to/seeyouoncloud/install-mongodb-in-aws-ubuntu-server-2204-360b</guid>
      <description>&lt;p&gt;&lt;em&gt;Login to your AWS Ubuntu Server&lt;br&gt;
Install the dependencies&lt;/em&gt;&lt;br&gt;
$ sudo apt update&lt;br&gt;
$ sudo apt install dirmngr gnupg apt-transport-https ca-certificates software-properties-common&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Download and add the MongoDB GPG key with the following command&lt;/em&gt;&lt;br&gt;
$ wget -qO - &lt;a href="https://www.mongodb.org/static/pgp/server-5.0.asc" rel="noopener noreferrer"&gt;https://www.mongodb.org/static/pgp/server-5.0.asc&lt;/a&gt; | sudo apt-key add -&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Create a list file for MongoDB (MongoDB 5.0 does not publish packages for Ubuntu 22.04 "jammy", so the focal repository is used)&lt;/em&gt;&lt;br&gt;
$ echo "deb [ arch=amd64,arm64 ] &lt;a href="https://repo.mongodb.org/apt/ubuntu" rel="noopener noreferrer"&gt;https://repo.mongodb.org/apt/ubuntu&lt;/a&gt; focal/mongodb-org/5.0 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb-org-5.0.list&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Update the local package database&lt;/em&gt;&lt;br&gt;
$ sudo apt-get update&lt;/p&gt;

&lt;p&gt;$ echo "deb &lt;a href="http://security.ubuntu.com/ubuntu" rel="noopener noreferrer"&gt;http://security.ubuntu.com/ubuntu&lt;/a&gt; focal-security main" | sudo tee /etc/apt/sources.list.d/focal-security.list&lt;br&gt;
$ sudo apt-get update &lt;br&gt;
$ sudo apt-get install libssl1.1&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Now install MongoDB with the following command&lt;/em&gt;&lt;br&gt;
$ sudo apt-get install -y mongodb-org&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Start the MongoDB service and enable it to start automatically after rebooting the system.&lt;/em&gt;&lt;br&gt;
$ sudo systemctl start mongod&lt;br&gt;
$ sudo systemctl enable mongod&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Now, check the status of the MongoDB service&lt;/em&gt;&lt;br&gt;
$ sudo systemctl status mongod&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Verify that the installation completed successfully by running the following command.&lt;/em&gt;&lt;br&gt;
$ mongo --eval 'db.runCommand({ connectionStatus: 1 })'&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Output:&lt;/em&gt;&lt;br&gt;
  ubuntu@ip-172-31-7-45:~$ mongo --eval 'db.runCommand({ connectionStatus: 1 })'&lt;br&gt;
MongoDB shell version v5.0.13&lt;br&gt;
connecting to: mongodb://127.0.0.1:27017/?compressors=disabled&amp;amp;gssapiServiceName=mongodb&lt;br&gt;
Implicit session: session { "id" : UUID("66cf060a-dbd1-419f-9317-980163b0b34f") }&lt;br&gt;
MongoDB server version: 5.0.13&lt;br&gt;
{&lt;br&gt;
        "authInfo" : {&lt;br&gt;
                "authenticatedUsers" : [ ],&lt;br&gt;
                "authenticatedUserRoles" : [ ]&lt;br&gt;
        },&lt;br&gt;
        "ok" : 1&lt;br&gt;
}&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Use this command, to access the MongoDB shell.&lt;/em&gt;&lt;br&gt;
$ mongo&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Output:&lt;/em&gt;&lt;br&gt;
ubuntu@ip-172-31-7-45:~$ mongo&lt;br&gt;
MongoDB shell version v5.0.13&lt;br&gt;
connecting to: mongodb://127.0.0.1:27017/?compressors=disabled&amp;amp;gssapiServiceName=mongodb&lt;br&gt;
Implicit session: session { "id" : UUID("61a1f8c5-2e86-449a-a7ec-792bcd1d5b93") }&lt;/p&gt;

&lt;p&gt;MongoDB server version: 5.0.13&lt;/p&gt;

&lt;p&gt;Warning: the "mongo" shell has been superseded by "mongosh",&lt;br&gt;
which delivers improved usability and compatibility.The "mongo" shell has been deprecated and will be removed in&lt;br&gt;
an upcoming release.&lt;br&gt;
For installation instructions, see&lt;/p&gt;

&lt;h1&gt;
  
  
  &lt;a href="https://docs.mongodb.com/mongodb-shell/install/" rel="noopener noreferrer"&gt;https://docs.mongodb.com/mongodb-shell/install/&lt;/a&gt;
&lt;/h1&gt;

&lt;p&gt;Welcome to the MongoDB shell.&lt;br&gt;
For interactive help, type "help".&lt;br&gt;
For more comprehensive documentation, see&lt;br&gt;
        &lt;a href="https://docs.mongodb.com/" rel="noopener noreferrer"&gt;https://docs.mongodb.com/&lt;/a&gt;&lt;br&gt;
Questions? Try the MongoDB Developer Community Forums&lt;br&gt;
        &lt;a href="https://community.mongodb.com" rel="noopener noreferrer"&gt;https://community.mongodb.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The server generated these startup warnings when booting:&lt;br&gt;
        2022-11-15T08:01:08.164+00:00: Using the XFS filesystem is strongly recommended with the WiredTiger storage engine. See &lt;a href="http://dochub.mongodb.org/core/prodnotes-filesystem" rel="noopener noreferrer"&gt;http://dochub.mongodb.org/core/prodnotes-filesystem&lt;/a&gt;&lt;br&gt;
        2022-11-15T08:01:08.921+00:00: Access control is not enabled for the database. Read and write access to data and configuration is unrestricted&lt;/p&gt;




&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    Enable MongoDB's free cloud-based monitoring service, which will then receive and display
    metrics about your deployment (disk utilization, CPU, operation statistics, etc).

    The monitoring data will be available on a MongoDB website with a unique URL accessible to you
    and anyone you share the URL with. MongoDB may use this information to make product
    improvements and to suggest MongoDB products and deployment options to you.

    To enable free monitoring, run the following command: db.enableFreeMonitoring()
    To permanently disable this reminder, run the following command: db.disableFreeMonitoring()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;




&lt;p&gt;&lt;em&gt;Connect to the admin database (from inside the mongo shell).&lt;/em&gt;&lt;br&gt;
&gt; use admin&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Output:&lt;/em&gt;&lt;br&gt;
    switched to db admin&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Run the following command to create a new user and set the password for the user.&lt;/em&gt;&lt;br&gt;
&gt; db.createUser(&lt;br&gt;
   {&lt;br&gt;
     user: "mongoAdmin",&lt;br&gt;
     pwd: "KAb3747d",&lt;br&gt;
     roles: [ { role: "userAdminAnyDatabase", db: "admin" } ]&lt;br&gt;
   }&lt;br&gt;
  )&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Output:&lt;/em&gt;&lt;br&gt;
   Successfully added user: {&lt;br&gt;
        "user" : "mongoAdmin",&lt;br&gt;
        "roles" : [&lt;br&gt;
                {&lt;br&gt;
                        "role" : "userAdminAnyDatabase",&lt;br&gt;
                        "db" : "admin"&lt;br&gt;
                }&lt;br&gt;
        ]&lt;br&gt;
}&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Exit the mongo shell.&lt;/em&gt;&lt;br&gt;
&gt; quit()&lt;/p&gt;
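&lt;p&gt;&lt;em&gt;The startup warning earlier noted that access control is not enabled. Now that an administrative user exists, you can enforce it; a minimal sketch, assuming /etc/mongod.conf has no existing security section:&lt;/em&gt;&lt;/p&gt;

```shell
# Enable role-based access control (mongod.conf is YAML; indentation matters),
# then restart mongod so the change takes effect.
printf 'security:\n  authorization: enabled\n' | sudo tee -a /etc/mongod.conf
sudo systemctl restart mongod
```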

&lt;p&gt;&lt;em&gt;To test the changes, access the mongo shell as the newly created administrative user.&lt;/em&gt;&lt;br&gt;
$ mongo -u mongoAdmin -p --authenticationDatabase admin&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Output:&lt;/em&gt;&lt;br&gt;
    ubuntu@ip-172-31-7-45:~$ mongo -u mongoAdmin -p --authenticationDatabase admin&lt;br&gt;
    MongoDB shell version v5.0.13&lt;br&gt;
    Enter password:&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Switch to the admin database.&lt;/em&gt;&lt;br&gt;
&gt; use admin&lt;/p&gt;

&lt;p&gt;&lt;em&gt;List the users and confirm the newly created user appears.&lt;/em&gt;&lt;br&gt;
&gt; show users&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Output:&lt;/em&gt;&lt;br&gt;
    {&lt;br&gt;
        "_id" : "admin.mongoAdmin",&lt;br&gt;
        "userId" : UUID("ef767177-be14-488f-8eb9-947941a16d0f"),&lt;br&gt;
        "user" : "mongoAdmin",&lt;br&gt;
        "db" : "admin",&lt;br&gt;
        "roles" : [&lt;br&gt;
                {&lt;br&gt;
                        "role" : "userAdminAnyDatabase",&lt;br&gt;
                        "db" : "admin"&lt;br&gt;
                }&lt;br&gt;
        ],&lt;br&gt;
        "mechanisms" : [&lt;br&gt;
                "SCRAM-SHA-1",&lt;br&gt;
                "SCRAM-SHA-256"&lt;br&gt;
        ]&lt;br&gt;
}&lt;/p&gt;

</description>
      <category>java</category>
      <category>springboot</category>
      <category>learning</category>
      <category>help</category>
    </item>
  </channel>
</rss>
