<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Aman Singh</title>
    <description>The latest articles on DEV Community by Aman Singh (@amaze_singh41).</description>
    <link>https://dev.to/amaze_singh41</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F456877%2F75703c3b-de6f-4451-9b4e-eea271fccf90.jpg</url>
      <title>DEV Community: Aman Singh</title>
      <link>https://dev.to/amaze_singh41</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/amaze_singh41"/>
    <language>en</language>
    <item>
      <title>Migrating Projects Between Google Cloud Organizations</title>
      <dc:creator>Aman Singh</dc:creator>
      <pubDate>Mon, 10 Feb 2025 18:46:05 +0000</pubDate>
      <link>https://dev.to/amaze_singh41/migrating-projects-between-google-cloud-organizations-3k44</link>
      <guid>https://dev.to/amaze_singh41/migrating-projects-between-google-cloud-organizations-3k44</guid>
      <description>&lt;p&gt;This blog outlines the process of migrating a project from one Google Cloud organization to another, focusing on necessary updates to organizational policies, project migration, VPC and subnet updates, and post-migration test.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Organizational Policies for Export and Import&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To allow the export and import of projects between organizations, certain organizational policies need to be configured.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Source Organization (Exporting Project)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You need to configure the policy &lt;code&gt;constraints/resourcemanager.allowedExportDestinations&lt;/code&gt; to allow the export of projects to the destination organization.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Target Organization (Import Project)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The target organization must allow imports using the &lt;code&gt;constraints/resourcemanager.allowedImportSources&lt;/code&gt; policy.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Steps to Set Organization policies&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Enable Exporting in the Source Organization - Run the following command to configure the source organization&amp;#39;s &lt;code&gt;resourcemanager.allowedExportDestinations&lt;/code&gt; policy to allow exports to the target organization&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;gcloud resource-manager org-policies allow --organization "SOURCE_ORG_ID" \
resourcemanager.allowImportSources "under:organizations/TARGET_ORG_ID" 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Enable Importing in the Target Organization&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Similarly, configure the target organization&amp;#39;s &lt;code&gt;resourcemanager.allowedImportSources&lt;/code&gt; policy to allow imports from the source organization.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;glcoud resource-manager org-policies allow --organization "TARGET_ORG_ID" \
resourcemanager.allowedImportSources "under:organizations/SOURCE_ORG_ID"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Analyze resource migration blockers&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;Before migrating a project, it’s crucial to analyze potential blockers that could prevent a smooth migration. You can use the following command to analyze resource migration blockers.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;glcoud asset analyze-move --project=PROJECT_ID \
--destination-folder=DESTINATION_FOLDER_ID
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Link Billing Account to the Project&lt;/strong&gt; &lt;br&gt;
Make sure the project to be migrated is linked to the correct billing account. Use the following command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;gcloud billing project link PROJECT_ID --billing-account=BILLING_ACCOUNT_ID
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Migrate Project to New Organization&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once the necessary policies are set, you can migrate the project using the following gcloud command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;gcloud beta projects move PROJECT_ID --organization=TARGET_ORG_ID
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Run Tests&lt;/strong&gt;&lt;br&gt;
Once the migration and configuration changes are complete, it’s important to run tests to verify that the migration was successful, that the project is functioning properly in the new organization, and that the required networking configurations (VPC, subnets) are correct.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Conduct smoke tests to validate that the project’s services and APIs are running smoothly in the target organization.&lt;/li&gt;
&lt;li&gt;Ensure that all dependencies and network connections are intact.&lt;/li&gt;
&lt;/ul&gt;
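&lt;p&gt;As a minimal sketch of such a smoke test (the health-check URL and function name here are illustrative, not part of any GCP API), assuming a service in the migrated project exposes an HTTP endpoint:&lt;/p&gt;

```python
import urllib.request
import urllib.error

def smoke_test(url, timeout=5.0):
    """Return True if the endpoint answers with HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, timeout, or non-2xx response
        return False
```

&lt;p&gt;Run it against each critical endpoint after the move; a failing check usually points at the VPC, subnet, or firewall configuration in the new organization.&lt;/p&gt;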

&lt;p&gt;&lt;strong&gt;Summary of Steps&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Update organization policies: Enable export and import permissions between the source and target organizations.&lt;/li&gt;
&lt;li&gt;Analyze migration blockers: Use gcloud asset analyze-move to identify any potential migration blockers.&lt;/li&gt;
&lt;li&gt;Link billing account: Ensure the project is linked to the correct billing account.&lt;/li&gt;
&lt;li&gt;Migrate project: Use gcloud beta projects move to move the project to the new organization.&lt;/li&gt;
&lt;li&gt;Run tests: Verify that everything works properly in the new organization.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>googlecloud</category>
      <category>google</category>
      <category>devops</category>
    </item>
    <item>
      <title>AWS Cert Manager integration with Prometheus with Domain Name</title>
      <dc:creator>Aman Singh</dc:creator>
      <pubDate>Tue, 18 Jun 2024 20:29:34 +0000</pubDate>
      <link>https://dev.to/amaze_singh41/aws-cert-manager-integration-with-prometheus-with-domain-name-4a2a</link>
      <guid>https://dev.to/amaze_singh41/aws-cert-manager-integration-with-prometheus-with-domain-name-4a2a</guid>
      <description>&lt;p&gt;&lt;strong&gt;Problem&lt;/strong&gt;&lt;br&gt;
When using CloudWatch metrics for ACM (AWS Certificate Manager), there is a limitation in that only the ARN (Amazon Resource Name) of ACM certificates is displayed. This makes it difficult to easily identify which domain is expiring, as ARNs are not human-friendly and are hard to interpret at a glance. Instead, having the domain name displayed would be more user-friendly and would make it easier to manage and monitor certificate expirations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Solution&lt;/strong&gt;&lt;br&gt;
One effective solution to this problem is to integrate Prometheus for monitoring ACM certificates. Prometheus allows for more customizable and readable metrics. Here is how you can set up Prometheus to monitor ACM certificates by domain name:&lt;/p&gt;

&lt;p&gt;Install Prometheus: First, install Prometheus on your monitoring server or use a managed service like Amazon Managed Service for Prometheus.&lt;br&gt;
Set Up Exporter: Use a custom exporter or an existing one that can fetch ACM certificate details, including domain names. The exporter will query AWS ACM and transform the ARN-based metrics into domain-based metrics.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites&lt;/strong&gt;&lt;br&gt;
Before we dive into the integration process, ensure you have the following:&lt;/p&gt;

&lt;p&gt;An AWS account with access to AWS Certificate Manager.&lt;br&gt;
A running Prometheus instance.&lt;br&gt;
A basic understanding of AWS IAM roles and permissions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Setting Up AWS IAM Permissions&lt;/strong&gt;&lt;br&gt;
To allow Prometheus to access ACM, you’ll need to set up the appropriate IAM permissions.&lt;/p&gt;

&lt;p&gt;Create an IAM Policy:&lt;br&gt;
Navigate to the IAM console in AWS.&lt;/p&gt;

&lt;p&gt;Click on “Policies” and then “Create policy”.&lt;/p&gt;

&lt;p&gt;Add the following JSON to allow read-only access to ACM:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "acm:ListCertificates",
        "acm:DescribeCertificate"
      ],
      "Resource": "*"
    }
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Create an IAM Role:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Go to the IAM console and click on “Roles” &amp;gt; “Create role”.&lt;/p&gt;

&lt;p&gt;Select the “EC2” service, assuming Prometheus is running on an EC2 instance. Attach the policy you created in the previous step and assign the role to the instance running Prometheus.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Create a Python script to fetch ACM details from AWS&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import boto3
import http.server
import socketserver
import json
import time
import logging

PORT = 9102

# Set up logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

class ACMExporter(http.server.SimpleHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/metrics':
            try:
                # Generate the payload before sending headers, so a failure
                # can still be reported as a 500
                metrics = self.generate_metrics()
            except Exception as e:
                logging.error(f"Error generating metrics: {e}")
                self.send_response(500)
                self.end_headers()
                return
            self.send_response(200)
            self.send_header('Content-type', 'text/plain')
            self.end_headers()
            self.wfile.write(metrics.encode())
        else:
            self.send_response(404)
            self.end_headers()

    def generate_metrics(self):
        acm_client = boto3.client('acm', region_name='us-west-1')  # Change the region if necessary
        try:
            response = acm_client.list_certificates()
        except Exception as e:
            logging.error(f"Error listing certificates: {e}")
            raise

        certificates = response.get('CertificateSummaryList', [])

        metrics = []
        for cert in certificates:
            try:
                expiration_time = int(cert['NotAfter'].timestamp())
                cert_arn = cert['CertificateArn']
                domain_name = self.get_certificate_domain(acm_client, cert_arn)
                metrics.append(f'acm_cert_expiration_timestamp{{domain_name="{domain_name}"}} {expiration_time}')
            except Exception as e:
                logging.error(f"Error processing certificate {cert['CertificateArn']}: {e}")

        return '\n'.join(metrics)

    def get_certificate_domain(self, acm_client, certificate_arn):
        try:
            response = acm_client.describe_certificate(CertificateArn=certificate_arn)
            certificate_detail = response['Certificate']
            domain_name = certificate_detail['DomainName']
            return domain_name
        except Exception as e:
            logging.error(f"Error describing certificate {certificate_arn}: {e}")
            raise

def run(server_class=socketserver.TCPServer, handler_class=ACMExporter):
    server_address = ('', PORT)
    httpd = server_class(server_address, handler_class)
    logging.info(f'Starting ACM exporter on port {PORT}...')
    httpd.serve_forever()

if __name__ == "__main__":
    run()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, run the script using the nohup command. Before running it, install the required libraries using pip3 (for example, &lt;code&gt;pip3 install boto3&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;&lt;code&gt;nohup python3 cert-manager-metric.py &amp;amp;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Once the script is running on port 9102, use curl to check whether it is serving metrics. Note that the exporter only responds on the &lt;code&gt;/metrics&lt;/code&gt; path.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;curl http://localhost:9102/metrics&lt;/code&gt;&lt;/p&gt;
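&lt;p&gt;Each line the exporter emits has the form &lt;code&gt;acm_cert_expiration_timestamp{domain_name="example.com"} 1718750000&lt;/code&gt;. As a small stdlib-only sketch (the function name is illustrative), you can turn that scraped text into days-until-expiry per domain:&lt;/p&gt;

```python
import re
import time

# Matches the metric lines produced by the exporter above
METRIC_RE = re.compile(
    r'acm_cert_expiration_timestamp\{domain_name="([^"]+)"\}\s+(\d+)'
)

def days_until_expiry(metrics_text, now=None):
    """Map each domain in the exporter output to days remaining before expiry."""
    now = time.time() if now is None else now
    result = {}
    for match in METRIC_RE.finditer(metrics_text):
        domain, ts = match.group(1), int(match.group(2))
        result[domain] = (ts - now) / 86400  # seconds per day
    return result
```

&lt;p&gt;Feeding it the body returned by the curl command above gives you a quick, human-readable view of which domains are closest to expiring.&lt;/p&gt;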

&lt;p&gt;&lt;strong&gt;Step 3: Configure Prometheus to read metrics&lt;/strong&gt;&lt;br&gt;
Edit the Prometheus configuration file and add the following lines under &lt;code&gt;scrape_configs&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;scrape_configs

  - job_name: 'acm-exporter'
    static_configs:
      - targets: ['localhost:9102']
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once you are done with the above configuration, restart the Prometheus service.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;systemctl restart prometheus&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;And you are done. Now go to Prometheus and check the metrics.&lt;/p&gt;
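&lt;p&gt;To make the metric actionable, you can graph or alert on it. As a sketch, a PromQL expression (using the metric name defined by the exporter above) that flags certificates expiring within the next 30 days:&lt;/p&gt;

```
acm_cert_expiration_timestamp - time() &amp;lt; 30 * 86400
```

&lt;p&gt;This compares each certificate’s expiration timestamp against the current time; any series returned is a domain with fewer than 30 days left.&lt;/p&gt;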

</description>
      <category>aws</category>
      <category>monitoring</category>
      <category>prometheus</category>
      <category>sre</category>
    </item>
    <item>
      <title>GCP Landing zone for FI</title>
      <dc:creator>Aman Singh</dc:creator>
      <pubDate>Fri, 09 Sep 2022 21:24:46 +0000</pubDate>
      <link>https://dev.to/amaze_singh41/gcp-landing-zone-for-fi-48f9</link>
      <guid>https://dev.to/amaze_singh41/gcp-landing-zone-for-fi-48f9</guid>
      <description>&lt;p&gt;HI Everyone,&lt;/p&gt;

&lt;p&gt;With the continuous efforts and lots of teamwork. We are able to achieve the end product. From past 9 months I was working on a project to create landing zone for Google Cloud Platform which should be Complient to fit for financial institution. &lt;br&gt;
 Today, I will not write much but requesting you all to Kindly visit below GitHub link and try it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/cldcvr/gcifi-lz"&gt;GitHub Link&lt;/a&gt;&lt;/p&gt;

</description>
      <category>gcp</category>
      <category>googlecloud</category>
      <category>terraform</category>
      <category>cicd</category>
    </item>
    <item>
      <title>Scan Terraform with OPA</title>
      <dc:creator>Aman Singh</dc:creator>
      <pubDate>Thu, 02 Jun 2022 13:42:45 +0000</pubDate>
      <link>https://dev.to/amaze_singh41/scan-terraform-with-opa-9d7</link>
      <guid>https://dev.to/amaze_singh41/scan-terraform-with-opa-9d7</guid>
      <description>&lt;p&gt;Recently, I started working on compliance projects, and it became necessity to discover any tool which can show the written IAC Terraform code is compliant to certain standards such as CIS, SOC2 etc. And There I got introduced to regula, and my journey started in the world of compliance.&lt;/p&gt;

&lt;p&gt;In this blog, I will talk about what Open Policy Agent is, what Rego is, and how they can be used with Regula to generate a report.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is OPA?&lt;/strong&gt;&lt;br&gt;
The Open Policy Agent (OPA, pronounced “oh-pa”) is an open source, general-purpose policy engine that unifies policy enforcement across the stack. OPA provides a high-level declarative language that lets you specify policy as code and simple APIs to offload policy decision-making from your software. You can use OPA to enforce policies in microservices, Kubernetes, CI/CD pipelines, API gateways, and more.&lt;/p&gt;

&lt;p&gt;To understand this better, we need to understand rego.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Introduction to Rego&lt;/strong&gt;&lt;br&gt;
Rego was inspired by Datalog, which is a well understood, decades old query language. Rego extends Datalog to support structured document models such as JSON.&lt;/p&gt;

&lt;p&gt;Rego queries are assertions on data stored in OPA. These queries can be used to define policies that enumerate instances of data that violate the expected state of the system.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why use Rego&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Easy to read and write.&lt;/li&gt;
&lt;li&gt;Powerful support for referencing nested documents.&lt;/li&gt;
&lt;li&gt;Declarative.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Regula&lt;/strong&gt;&lt;br&gt;
Regula is a tool that evaluates infrastructure as code files for potential AWS, Azure, Google Cloud, and Kubernetes security and compliance violations prior to deployment.&lt;/p&gt;

&lt;p&gt;Regula supports the following file types:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;CloudFormation JSON/YAML templates&lt;/li&gt;
&lt;li&gt;Terraform HCL code&lt;/li&gt;
&lt;li&gt;Terraform JSON plans&lt;/li&gt;
&lt;li&gt;Kubernetes YAML manifests&lt;/li&gt;
&lt;li&gt;Azure Resource Manager (ARM) JSON templates (in preview)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Install Regula&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Download the Regula archive for your platform from the Releases page.&lt;br&gt;
Extract the downloaded archive.&lt;br&gt;
Move the extracted regula binary to somewhere in your PATH.&lt;br&gt;
Alternatively, you can use &lt;code&gt;Homebrew&lt;/code&gt;.&lt;br&gt;
&lt;strong&gt;Mac:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#brew install regula
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Linux:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#sudo mv regula /usr/local/bin
OR
#brew install regula
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Windows cmd:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;md C:\regula\bin 
move regula.exe C:\regula\bin 
setx PATH "%PATH%;C:\regula\bin"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, let’s jump into the main scenario of this blog. In today’s world, we use IaC to write and deploy our infrastructure to cloud environments. But in most cases we need to check whether our code is compliant and has all parameters secured before deploying to the cloud; this is where OPA helps.&lt;/p&gt;

&lt;p&gt;Let’s write some Terraform. Here, we are creating an AWS EC2 instance with certain details.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_instance" "instance" {
  ami                         = "ami-0ddb956ac6be95761"
  instance_type               = "t2.small"
  key_name                    = "key-pair-name"
  vpc_security_group_ids      = ["sg-name"]
  subnet_id                   = "subnet-xxxxxxx"
  associate_public_ip_address = true

  root_block_device {
    volume_size           = 50
    delete_on_termination = true
  }

  tags = {
    Name = "demo-server"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let’s compare the above Terraform code with an OPA rule. To write a rule, we define it in a file with the &lt;code&gt;.rego&lt;/code&gt; extension.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;package rules.tf_aws_ec2_instance_no_public_ip

__rego__metadoc__ := {
   "id": "POLICY_ID01",
   "title": "Ensure instance have no public IP associated",
   "description": "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
   "custom": {"severity": "Low"}
}

resource_type := "aws_instance"

default allow = false

allow {
   input.associate_public_ip_address == false
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let’s go through the above rule:&lt;/p&gt;

&lt;p&gt;package :- Packages group the rules defined in one or more modules into a particular namespace. Because rules are namespaced, they can be safely shared across projects.&lt;br&gt;
__rego__metadoc__ :- Enhances rule or policy reporting with details like id, title, description, and a custom severity.&lt;br&gt;
resource_type :- The Terraform resource type the rule applies to.&lt;br&gt;
The above rule checks whether associate_public_ip_address is set to false. If so, the scan passes; if not, it prompts an error like the one below.&lt;/p&gt;
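&lt;p&gt;The decision this rule encodes can be illustrated with a small Python sketch (this only mirrors the rule’s logic for intuition; it is not how OPA evaluates Rego):&lt;/p&gt;

```python
def allow(resource):
    """Mirror of the Rego rule: allow only when the instance has no public IP."""
    # "default allow = false": anything other than an explicit false fails
    return resource.get("associate_public_ip_address") is False
```

&lt;p&gt;The Terraform resource above sets associate_public_ip_address = true, so the scan fails for it.&lt;/p&gt;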

&lt;p&gt;Run regula command against the main.tf file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#regula run main.tf
POLICY_ID01: Ensure instance have no public IP associated [Low]

       in main.tf:1:1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If required, you can get this output in JSON format as well.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#regula run main.tf -f json
"rule_id": "POLICY_ID01",
      "rule_message": "",
      "rule_name": "tf_aws_ec2_instance_no_public_ip",
      "rule_raw_result": false,
      "rule_result": "FAIL",
      "rule_severity": "Low",
      "rule_summary": "Ensure instance have no public IP associated",
      "source_location": [
        {
          "path": "main.tf",
          "line": 1,
          "column": 1
        }
      ]
    },
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can also run Regula against a Terraform plan file. For that, run terraform plan and export the plan output to JSON.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#terraform plan -out="plan.tfplan"
#terraform show -json plan.tfplan &amp;gt; plan.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once you have the plan.json file, you can scan it with Regula. The output will be the same.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#regula run plan.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;It is very important to use IaC to make your deployments faster and more robust. But at the same time, it’s also important to check whether your code is compliant with standards. OPA is a good solution to help you get compliant &amp;amp; one can mold it to their own use case.&lt;/p&gt;

&lt;h2&gt;
  
  
  Reference
&lt;/h2&gt;

&lt;p&gt;OPA — &lt;a href="https://www.openpolicyagent.org/docs/latest/"&gt;https://www.openpolicyagent.org/docs/latest/&lt;/a&gt;&lt;br&gt;
regula — &lt;a href="https://regula.dev/"&gt;https://regula.dev/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>opa</category>
      <category>iac</category>
      <category>terraform</category>
      <category>compliance</category>
    </item>
  </channel>
</rss>
