<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: cloudy</title>
    <description>The latest articles on DEV Community by cloudy (@cloudy).</description>
    <link>https://dev.to/cloudy</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F689577%2Fa24a1f02-70e4-4174-b915-7b7685333ccf.png</url>
      <title>DEV Community: cloudy</title>
      <link>https://dev.to/cloudy</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/cloudy"/>
    <language>en</language>
    <item>
      <title>Run RDS start of month</title>
      <dc:creator>cloudy</dc:creator>
      <pubDate>Mon, 30 Aug 2021 11:53:23 +0000</pubDate>
      <link>https://dev.to/cloudy/run-rds-start-of-month-3c2f</link>
      <guid>https://dev.to/cloudy/run-rds-start-of-month-3c2f</guid>
      <description>&lt;p&gt;If you have a requirement to keep the AWS RDS running only few days at the start or end of the month and unable to use Instance Scheduler, setup a lambda function to trigger based on cronjob using the following python script.&lt;/p&gt;

&lt;p&gt;The following Lambda Python script stops the RDS instance named by an environment variable configured on the function. You can set up a separate start function by changing &lt;code&gt;rds.stop_db_instance&lt;/code&gt; to &lt;code&gt;rds.start_db_instance&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import sys
import os
import botocore
import boto3
import datetime
from botocore.exceptions import ClientError


def lambda_handler(event, context):
    # TODO implement

    rds = boto3.client('rds', region_name='ap-southeast-2')
    DBinstance = os.environ.get('DBInstanceName')

    try:
        dt = datetime.datetime.today()
        print(dt)
        if dt.day == 1 or dt.day == 2:
            print("Skip...")
        else:
            print("Continue...")
            response = rds.stop_db_instance(DBInstanceIdentifier=DBinstance)
            print(response)
    except ClientError as e:
        print(e.response['Error']['Message']) 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
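
The Lambda above still needs a trigger. One option is an Amazon EventBridge rule with a cron expression targeting the function. A minimal sketch using boto3; the rule name, Lambda ARN, and the daily 12:00 UTC schedule below are all placeholders, not part of the original setup:

```python
# Daily at 12:00 UTC; the Lambda itself decides whether to stop the instance.
# EventBridge cron has six fields: minute hour day-of-month month day-of-week year,
# and exactly one of day-of-month / day-of-week must be '?'.
SCHEDULE = 'cron(0 12 * * ? *)'


def create_schedule(rule_name, lambda_arn, region='ap-southeast-2'):
    """Create an EventBridge rule and attach the Lambda as its target.

    rule_name and lambda_arn are placeholders for your own resources.
    """
    import boto3  # imported here so the sketch can be read without AWS access
    events = boto3.client('events', region_name=region)
    events.put_rule(Name=rule_name, ScheduleExpression=SCHEDULE, State='ENABLED')
    events.put_targets(Rule=rule_name,
                       Targets=[{'Id': '1', 'Arn': lambda_arn}])
```

Note that EventBridge must also be granted permission to invoke the Lambda (for example via lambda add-permission) before the rule fires.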



</description>
      <category>aws</category>
    </item>
    <item>
      <title>AWS Security Analytics Bootstrap</title>
      <dc:creator>cloudy</dc:creator>
      <pubDate>Mon, 30 Aug 2021 11:37:36 +0000</pubDate>
      <link>https://dev.to/cloudy/aws-security-analytics-bootstrap-2n01</link>
      <guid>https://dev.to/cloudy/aws-security-analytics-bootstrap-2n01</guid>
      <description>&lt;p&gt;AWS recently released AWS Security Analytics Bootstrap which is an open source framework designed to quickly setup Athena to perform analysis on AWS service logs archived in Amazon S3 buckets.&lt;/p&gt;

&lt;p&gt;More info here: &lt;a href="https://github.com/awslabs/aws-security-analytics-bootstrap/blob/main/AWSSecurityAnalyticsBootstrap/docs/aws_security_analytics_bootstrap_deployment_guide.md"&gt;Link&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The solution currently supports CloudTrail, VPC Flow Logs, and Route 53 Resolver query logs. Ensure the S3 bucket policy is modified before deploying the solution.&lt;/p&gt;
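
Once deployed, the tables can be queried through the Athena API as well as the console. A hedged sketch with boto3; the database name, table name, and results location below are assumptions, since the actual names come from the bootstrap deployment:

```python
# Example: recent access-denied events from the CloudTrail table.
# 'security_analytics' and 'cloudtrail' are assumed names; check your deployment.
QUERY = """
SELECT eventtime, eventname, useridentity.arn
FROM cloudtrail
WHERE errorcode = 'AccessDenied'
LIMIT 10
"""


def run_query(database='security_analytics',
              output='s3://my-athena-results/',
              region='ap-southeast-2'):
    """Start the query and return its execution id (results land in S3)."""
    import boto3
    athena = boto3.client('athena', region_name=region)
    resp = athena.start_query_execution(
        QueryString=QUERY,
        QueryExecutionContext={'Database': database},
        ResultConfiguration={'OutputLocation': output},
    )
    return resp['QueryExecutionId']
```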

</description>
      <category>aws</category>
      <category>security</category>
      <category>serverless</category>
    </item>
    <item>
      <title>AWS Security Hub Findings to Azure Blob</title>
      <dc:creator>cloudy</dc:creator>
      <pubDate>Wed, 25 Aug 2021 10:56:58 +0000</pubDate>
      <link>https://dev.to/cloudy/aws-security-hub-findings-to-azure-blob-hdi</link>
      <guid>https://dev.to/cloudy/aws-security-hub-findings-to-azure-blob-hdi</guid>
      <description>&lt;p&gt;AWS Security Hub provides comprehensive view of your high-priority security alerts and your compliance status across AWS accounts.  &lt;/p&gt;

&lt;p&gt;AWS Security Hub collects data from Guard Duty, Inspector, Config, Macie, Firewall Manager, Systems Manager and other connected partner services to provide a single view of the resources in your account.&lt;/p&gt;

&lt;p&gt;The integration with AWS Organizations allows you to automatically enable Security Hub and its automated security checks in any existing and newly created accounts in the organization. This enable you to centrally view security findings form all the member aws accounts.&lt;/p&gt;

&lt;p&gt;We had a requirement to visualise the finding using PowerBI for this I created the following to send the existing findings to Azure Blob Storage. &lt;/p&gt;

&lt;p&gt;Once the security findings are sent to Azure Blob storage it can be processed and consumed within PowerBI.&lt;/p&gt;

&lt;p&gt;Following requires:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Azure Blob Storage container&lt;/li&gt;
&lt;li&gt;Azure Blob SAS URL for secure authentication&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json
import boto3
import logging
import os
from datetime import datetime, timedelta
from azure.storage.blob import BlobServiceClient, generate_account_sas, ResourceTypes, AccountSasPermissions, BlobClient, ContainerClient


date = datetime.now().strftime("%Y%m%d%H%M%S")

filename = "findings-"+date+".json"

connect_str = 'BlobEndpoint=https://url'
container = 'containername'

blob_client = BlobClient.from_connection_string(conn_str=connect_str, container_name=container, blob_name =filename)


securityhub_client = boto3.client('securityhub', region_name='ap-southeast-2')

_filter = {
    'ComplianceStatus': [
        {
            'Value': 'FAILED',
            'Comparison': 'EQUALS'
        }
    ],
    'RecordState': [
        {
            'Value': 'ACTIVE',
            'Comparison': 'EQUALS'
        }
    ],
}


results = []
token = ''
while True:
    response = securityhub_client.get_findings(
        NextToken=token,
        Filters=_filter
    )

    results.extend(response["Findings"])

    token = response.get('NextToken')

    if token == '' or token == None:
        break


print (json.dumps(results, indent=4))


blob_client.upload_blob(json.dumps(results))

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
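
If the export runs on a schedule, the same filter can be narrowed to recently updated findings so each run uploads a delta instead of the full set. A sketch; the seven-day window is an assumption to match a weekly schedule:

```python
# Same filter as above, plus a date range on UpdatedAt (last 7 days is assumed)
delta_filter = {
    'ComplianceStatus': [{'Value': 'FAILED', 'Comparison': 'EQUALS'}],
    'RecordState': [{'Value': 'ACTIVE', 'Comparison': 'EQUALS'}],
    'UpdatedAt': [{'DateRange': {'Value': 7, 'Unit': 'DAYS'}}],
}
```

Passing delta_filter in place of _filter keeps the rest of the script unchanged.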



</description>
      <category>aws</category>
      <category>security</category>
      <category>azure</category>
      <category>python</category>
    </item>
  </channel>
</rss>
