<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: 🍌🍌🍌</title>
    <description>The latest articles on DEV Community by 🍌🍌🍌 (@0xbanana).</description>
    <link>https://dev.to/0xbanana</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F239001%2F694398d7-918d-459e-848a-2c7337f1564e.png</url>
      <title>DEV Community: 🍌🍌🍌</title>
      <link>https://dev.to/0xbanana</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/0xbanana"/>
    <language>en</language>
    <item>
      <title>Hackernews clone...but for cloud news!</title>
      <dc:creator>🍌🍌🍌</dc:creator>
      <pubDate>Tue, 29 Sep 2020 18:44:36 +0000</pubDate>
      <link>https://dev.to/0xbanana/hackernews-clone-but-for-cloud-news-39gf</link>
      <guid>https://dev.to/0xbanana/hackernews-clone-but-for-cloud-news-39gf</guid>
      <description>&lt;p&gt;Hello everyone on DEV! &lt;/p&gt;

&lt;p&gt;I'm really excited to share with you all a project I completed, called &lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://CloudWeekly.News"&gt;Cloud Weekly ... News!&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;It's a news aggregator for all things related to "The Cloud".&lt;/p&gt;

&lt;p&gt;It features news from the big three cloud vendors &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS&lt;/li&gt;
&lt;li&gt;Azure&lt;/li&gt;
&lt;li&gt;GCP&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As well as posts from the DEV.to community! Any posts that are cloud- and DevOps-specific get shared with the larger community, and the traffic is brought back to you, the post writers.&lt;/p&gt;

&lt;p&gt;I hope some of you will take a look and give me some feedback!&lt;/p&gt;

&lt;p&gt;I look forward to seeing what more of you write and share with the world! &lt;/p&gt;

</description>
      <category>showdev</category>
      <category>webdev</category>
      <category>news</category>
      <category>devops</category>
    </item>
    <item>
      <title>Automating Deploys with Bash scripting and Google Cloud SDK</title>
      <dc:creator>🍌🍌🍌</dc:creator>
      <pubDate>Sat, 15 Aug 2020 00:10:44 +0000</pubDate>
      <link>https://dev.to/0xbanana/automating-deploys-with-bash-scripting-and-google-cloud-sdk-4976</link>
      <guid>https://dev.to/0xbanana/automating-deploys-with-bash-scripting-and-google-cloud-sdk-4976</guid>
      <description>&lt;p&gt;You don’t have to know terraform, ansible, chef, puppet, or any other Infrastructure-as-Code (IAC) tools to automate your deployments on Google Cloud Platform. &lt;/p&gt;

&lt;p&gt;Google offers several command-line tools in their &lt;a href="https://cloud.google.com/sdk"&gt;Software Development Kit (SDK)&lt;/a&gt; that can be used to create and destroy resources, interact with storage buckets, create firewall rules, and more; you can automate your entire deployment! &lt;/p&gt;

&lt;p&gt;Having a solid understanding of the SDK tools will pay dividends in your career, and they’re a requirement for every GCP certification; it’s very beneficial to spend some time getting comfortable with them.&lt;/p&gt;

&lt;p&gt;This isn’t a tutorial on how to use gcloud so if you’re generally unfamiliar with the tool, please see the resources below to fill in any knowledge gaps.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/sdk/gcloud"&gt;gcloud command-line tool overview  | Cloud SDK Documentation&lt;/a&gt;&lt;br&gt;
&lt;a href="https://cloud.google.com/sdk/docs/images/gcloud-cheat-sheet.pdf"&gt;gcloud Cheat Sheet  |  Cloud SDK Documentation&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To automate calling multiple gcloud commands to build our infrastructure, I’m going to use Bash scripting. If you’re unfamiliar with Bash scripting, here’s the TL;DR:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Shell commands listed in a file that get executed in sequence&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Let’s look at a small deploy script.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;create_instance.sh&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/bin/bash

ZONE="us-east1-a"

echo "Welcome to our tiny script - let's create an instance"
gcloud compute instances create instance-1 --zone $ZONE

echo "Done!"
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;This script gives us some console output to let us know it’s running, and creates an instance named &lt;code&gt;instance-1&lt;/code&gt; in the zone defined by the variable &lt;code&gt;ZONE&lt;/code&gt;, in this case &lt;code&gt;us-east1-a&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;We can expand this concept to larger, more complicated deployments, and the same approach can be used to tear down infrastructure in an expedient, programmatic fashion.&lt;/p&gt;
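&lt;p&gt;As a sketch of that idea (the resource names here are illustrative): wrapping each gcloud call in a function keeps creation and teardown symmetric, and a dry-run flag lets you preview the commands before spending money.&lt;/p&gt;

```shell
#!/bin/bash
# Sketch: symmetric create/destroy helpers with a dry-run mode.
# Set DRY_RUN=1 to print each command instead of executing it.

ZONE="us-east1-a"

run() {
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "DRY RUN: $*"
  else
    "$@"
  fi
}

create() { run gcloud compute instances create instance-1 --zone "$ZONE"; }
destroy() { run gcloud compute instances delete instance-1 --zone "$ZONE" -q; }

DRY_RUN=1 create
# prints: DRY RUN: gcloud compute instances create instance-1 --zone us-east1-a
```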

&lt;p&gt;To ensure our systems are properly configured once created in the cloud, we’re going to use another feature offered by GCP: a “startup script”. A startup script is executed when the system is created and spins up for the first time. We can configure a startup script to run in a few different ways:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A remote file stored on a public server&lt;/li&gt;
&lt;li&gt;A file stored in a Google Cloud Storage bucket&lt;/li&gt;
&lt;li&gt;Inlined into our deploy script&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The example below sets a metadata key-value pair in the gcloud command to pull a script from a remote server.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;gcloud compute instances create instance-1 --metadata startup-script-url=https://server.com/script.sh&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/bin/bash
# script.sh

# Update and install packages
sudo apt-get update
sudo apt-get install git build-essential apache2 -y

# Update index.html page
echo "&amp;lt;html&amp;gt;&amp;lt;body&amp;gt;&amp;lt;h1&amp;gt;We're live!&amp;lt;/h1&amp;gt;&amp;lt;/body&amp;gt;&amp;lt;/html&amp;gt;" &amp;gt; /var/www/html/index.html

echo "DONE!"
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Our script is located on a remote (public) server; it is pulled down and executed, and our custom server is up and running!&lt;/p&gt;
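&lt;p&gt;For completeness, the other two options from the list above look very similar; the bucket and script names below are illustrative. These commands only make sense against a live GCP project, so take them as a sketch.&lt;/p&gt;

```shell
# Option 2: startup script stored in a Cloud Storage bucket
gcloud compute instances create instance-1 \
  --metadata startup-script-url=gs://my-bucket/script.sh

# Option 3: startup script inlined directly into the deploy command
gcloud compute instances create instance-1 \
  --metadata startup-script='#!/bin/bash
apt-get update'
```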

&lt;p&gt;This overview should give you some new insight into the power of automating network creation and system bootstrapping. &lt;/p&gt;

&lt;p&gt;Taking the idea to the next level, I wanted to have deployment scripts for various use cases ready at my disposal when needed. One of those, which I want to walk through with you today, is a simple forensics laboratory: a place where files can be dissected, malware stored, and everything segmented from my personal and work equipment.&lt;/p&gt;

&lt;p&gt;Here are my requirements:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Forensics workstation&lt;/li&gt;
&lt;li&gt;Kali Linux workstation&lt;/li&gt;
&lt;li&gt;Honey pot 

&lt;ul&gt;
&lt;li&gt;Segmented from other workstations&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Bucket for malware and potential bad stuff&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;After a small artistic endeavor, this is the network diagram I came up with.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9-Gsc3Po--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/EjRFxqD.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9-Gsc3Po--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/EjRFxqD.png" alt="architecture screenshot"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This network diagram gives us more insight into how we want our resources structured and grouped; let’s build out our design documentation further.&lt;/p&gt;

&lt;p&gt;A VPC contains two subnets: safe and unsafe. Safe will contain our instances used for analysis and tooling, tagged with “admin” for firewall rules that allow ingress on ports 22, 80, and 443, plus ICMP (ping). &lt;/p&gt;

&lt;p&gt;The unsafe subnet will contain one host, a honeypot, tagged “insecure” for firewall rules that allow connections on all ports and protocols, but only for hosts with the appropriate tag. &lt;/p&gt;

&lt;p&gt;Though not an object that lives in our VPC, we’ll also have a Cloud Storage bucket for any analysis files or reports; this will be a project-level resource.&lt;/p&gt;

&lt;p&gt;With these design considerations in mind, and using the console UI to help us create &lt;code&gt;gcloud&lt;/code&gt; commands, we end up with the following deployment script (note: for blog-post clarity, startup scripts are inlined).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/bin/bash

echo "Create VPC and subnets"
gcloud compute networks create lab-net  --subnet-mode=custom --bgp-routing-mode=regional

gcloud compute networks subnets create safe  --range=192.168.0.0/24 --network=lab-net --region=us-east1

gcloud compute networks subnets create unsafe --range=192.168.128.0/29 --network=lab-net --region=us-east1

echo "Creating SIFT instance"
gcloud compute instances create sift-1 --tags=admin \
  --zone=us-east1-b --subnet=safe \
  --metadata startup-script='#!/bin/bash
  apt-get update
  # TO DO: Install SIFT'

echo "Creating Kali instance"
gcloud compute instances create kali-1 --tags=admin \
  --zone=us-east1-b --subnet=safe \
  --metadata startup-script='#!/bin/bash
  apt-get update
  # TO DO: Install KALI Tools'

echo "Creating Honeypot instance"
gcloud compute instances create honeypot-1 --tags=insecure \
  --zone=us-east1-b --subnet=unsafe \
  --metadata startup-script='#!/bin/bash
  apt-get update
  # TO DO: Install Honeypots'

echo "Create Bucket for storage"
gsutil mb gs://bucket-of-bad-stuff

echo "Adding firewall rules"

gcloud compute firewall-rules create allow-ingress-admin-lab-net --direction=INGRESS --priority=1000 --network=lab-net --action=ALLOW --rules=tcp:22,tcp:80,tcp:443,icmp --source-ranges=0.0.0.0/0 --target-tags=admin

gcloud compute firewall-rules create allow-ingress-insecure-lab-net --direction=INGRESS --priority=1000 --network=lab-net --action=ALLOW --rules=all --source-ranges=0.0.0.0/0 --target-tags=insecure

echo "Done"
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;I will leave script comprehension as an exercise to the reader! &lt;/p&gt;

&lt;p&gt;For the lab to be complete there are still some missing pieces; none of the systems have been configured, and they are still in their default state. That’s going to be an effort for another day.&lt;/p&gt;

&lt;p&gt;Additionally, if our instances don’t need to persist for extended periods of time, we can cut costs by using preemptible instances, something I talk about in &lt;a href="https://0xbanana.com/blog/cut-your-gcp-compute-costs-by-80-with-this-simple-tip-they-don-t-want-you-to-know/"&gt;this&lt;/a&gt; blog post.&lt;/p&gt;

&lt;p&gt;Once we’ve collected all our samples, performed our exhaustive forensic analysis, written a detailed report, and shipped it off to our client, we can start to tear down our infrastructure. This is something we can also automate with a small Bash script.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/bin/bash

echo "Deleting instances..."
gcloud compute instances delete sift-1 -q
gcloud compute instances delete kali-1 -q
gcloud compute instances delete honeypot-1 -q

echo "Removing bucket..."
gsutil rm -r gs://bucket-of-bad-stuff

echo "Removing firewall rules..."
gcloud compute firewall-rules delete allow-ingress-admin-lab-net -q 
gcloud compute firewall-rules delete allow-ingress-insecure-lab-net -q 

echo "Removing subnets and network..."
gcloud compute networks subnets delete unsafe -q
gcloud compute networks subnets delete safe -q
gcloud compute networks delete lab-net -q

echo "Done"
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Each resource can be referenced by name and deleted using the appropriate &lt;code&gt;gcloud&lt;/code&gt; command module. The &lt;code&gt;-q&lt;/code&gt; flag suppresses the “Are you sure?” prompt and runs the action.&lt;/p&gt;

&lt;p&gt;As a last step, I was curious how long it takes to set up and destroy this VPC, so I ran some not-so-scientific timing tests.&lt;/p&gt;

&lt;p&gt;On average, VPC and resource creation completed in under 1m30s*&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;bash DEPLOY_FORENSICS_ENV.sh  7.95s user 1.65s system 11% cpu 1:22.75 total
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;*Note: Creation times do not take into account system setup time; if there are any startup scripts configured, your system may be instantiated but not fully ready for action.&lt;/p&gt;

&lt;p&gt;Destruction took longer.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;bash DESTROY_FORENSICS_ENV.sh  7.81s user 1.69s system 7% cpu 2:10.41 total
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;These times will also vary based on factors like machine type, disk size, and whether any shutdown scripts are configured.&lt;/p&gt;

&lt;p&gt;I hope this gives you an idea of what’s possible with automated deployments, and that you’ll begin thinking about building your own disposable infrastructure to accomplish tasks rather than being tied down to owning a small fortune’s worth of hardware. &lt;/p&gt;

</description>
      <category>devops</category>
      <category>googlecloud</category>
      <category>bash</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Using this one simple trick you can cut your GCP compute costs by as much as 80%!</title>
      <dc:creator>🍌🍌🍌</dc:creator>
      <pubDate>Wed, 12 Aug 2020 05:18:38 +0000</pubDate>
      <link>https://dev.to/0xbanana/using-this-one-simple-trick-you-can-cut-your-compute-costs-by-as-much-as-80-42gm</link>
      <guid>https://dev.to/0xbanana/using-this-one-simple-trick-you-can-cut-your-compute-costs-by-as-much-as-80-42gm</guid>
      <description>&lt;p&gt;Okay! Now that I got your attention let me tell you how you can lower your compute costs with minimal change to your existing infrastructure or deployment scripts.&lt;/p&gt;

&lt;h2&gt;
  
  
  Use Preemptible Instances!
&lt;/h2&gt;

&lt;p&gt;Preemptible instances are a type of Compute Engine resource that runs at a much lower price than normal instances; they are ephemeral and are terminated (preempted) after 24 hours, if not sooner.&lt;/p&gt;

&lt;p&gt;From the Google documentation:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Preemptible instances are excess Compute Engine capacity, so their availability varies with usage.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Preemptible instances offer massive savings compared to normal instances with the same machine and hardware specifications. Take a look at the N1 standard machine types pricing page and compare the Price and Preemptible price columns.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jkyP8Lr2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/YLZ5abW.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jkyP8Lr2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/YLZ5abW.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  EIGHTY SEVEN PERCENT !!!
&lt;/h2&gt;

&lt;p&gt;Now you’re probably thinking “&lt;em&gt;Where do I sign up? How do I move everything over to preemptible instances?&lt;/em&gt;”&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OzPqiV2R--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/KytnEFe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OzPqiV2R--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/KytnEFe.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Slow down! They’re not great for everything and shouldn’t be used in systems where downtime is unacceptable. Here are some situations where using preemptible instances has an advantage and could cut down on operating costs.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Scientific, medical, or engineering efforts that can tolerate loss and addition of computation nodes without disruption to the overall process.&lt;/li&gt;
&lt;li&gt;Batch image or video rendering workloads where processing power is needed on demand and jobs don’t exceed 24 hours.&lt;/li&gt;
&lt;li&gt;Machine learning modeling that can be distributed across nodes.&lt;/li&gt;
&lt;li&gt;Other applications that can restart from a saved state (file).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;On Google Cloud Platform there are three ways to utilize preemptible instances on your compute resources:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A single compute instance with the preemptible option set.&lt;/li&gt;
&lt;li&gt;Managed instance groups, where the configured Compute Engine instance template has the preemptible option set.&lt;/li&gt;
&lt;li&gt;GKE clusters, which can leverage preemptible instances as the underlying compute stack. &lt;/li&gt;
&lt;/ul&gt;
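&lt;p&gt;For the single-instance case, the only change from a normal create command is one flag; the instance and zone names below are illustrative, and the command is a sketch that assumes a configured GCP project.&lt;/p&gt;

```shell
# Creating a preemptible instance: identical to a normal create, plus --preemptible
gcloud compute instances create worker-1 \
  --zone=us-east1-b \
  --preemptible
```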

&lt;p&gt;This sounds like all positives, so let’s talk about the downside: being preempted, aka having your machine terminated. When it’s been determined that your instance is headed for destruction, you are given a 30-second window to execute any cleanup functions, synchronize data, and get out of there before the machine is forcefully deleted. You can accomplish this by configuring a shutdown script, which will automatically be executed when the system receives the preemption signal. As part of its execution, the script can trigger a shutdown of any services you want and copy files to a remote location, like a cloud bucket, for the next restart. Below are links to a great sample of a shutdown script and a Python script catching the event for a graceful exit.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/compute/docs/shutdownscript"&gt;Running shutdown scripts on Compute Engine &lt;/a&gt;&lt;br&gt;
&lt;a href="https://cloud.google.com/compute/docs/instances/create-start-preemptible-instance#handle_preemption"&gt;Sample Shutdown script for preemptible instances&lt;/a&gt;&lt;/p&gt;
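&lt;p&gt;As a quick sketch of the mechanism those samples rely on: from inside the instance, the metadata server exposes a &lt;code&gt;preempted&lt;/code&gt; value you can poll, which returns FALSE normally and TRUE once preemption begins. This only works on a running Compute Engine instance.&lt;/p&gt;

```shell
# Poll the metadata server's "preempted" endpoint from inside the instance
curl -s "http://metadata.google.internal/computeMetadata/v1/instance/preempted" \
  -H "Metadata-Flavor: Google"
```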

&lt;p&gt;Preemptible instances aren’t a panacea for your billing woes; they aren’t ideal for all workloads, and availability may become a concern. This is just one of many ways to cut your cloud costs, and the first step in figuring out which way to go is understanding what you’re trying to do.&lt;/p&gt;




&lt;p&gt;Enjoyed the post? Let me know! 💛🦄🔖&lt;/p&gt;

</description>
      <category>devops</category>
      <category>googlecloud</category>
      <category>sre</category>
    </item>
    <item>
      <title>I’m a certified Associate Cloud Engineer!</title>
      <dc:creator>🍌🍌🍌</dc:creator>
      <pubDate>Fri, 07 Aug 2020 19:56:06 +0000</pubDate>
      <link>https://dev.to/0xbanana/i-m-a-certified-associate-cloud-engineer-130f</link>
      <guid>https://dev.to/0xbanana/i-m-a-certified-associate-cloud-engineer-130f</guid>
      <description>&lt;h2&gt;
  
  
  It’s official everyone!
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;I passed the Google Cloud Platform Associate Cloud Engineer exam!&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I’m proud to have gotten this certification and I wanted to share with you what the test is like, my thoughts on my experience, and what my methods and tools were that helped me achieve success and pass on my first attempt.&lt;/p&gt;

&lt;h3&gt;
  
  
  About the exam.
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Exam format:&lt;/strong&gt; Multiple choice and multiple select, taken in person at a test center.&lt;br&gt;
&lt;strong&gt;Length:&lt;/strong&gt; 2 hours&lt;br&gt;
&lt;strong&gt;Registration fee:&lt;/strong&gt; $125 + taxes ≈ $137. There is also a discount for those from countries with lower purchasing power parity.&lt;br&gt;
&lt;strong&gt;Recommended experience:&lt;/strong&gt; 6+ months of hands-on experience with GCP.&lt;/p&gt;

&lt;h3&gt;
  
  
  My experience and thoughts
&lt;/h3&gt;

&lt;p&gt;The test and the studying were a fantastic primer into the world of managed services and cloud technology. The core concepts needed to succeed in “the cloud” are (more or less) the same as in traditional/enterprise security.&lt;br&gt;
Technical concepts like subnetting, CIDR ranges, and the principle of least privilege are things I learned throughout my career, and without a decent understanding of them the learning curve is going to be steep.&lt;/p&gt;

&lt;p&gt;I can honestly say that even with all the preparation I did, on the day of the exam I still felt nervous; I guess that means it was important to me. You can take the exam in person or digitally. I opted for an in-person exam because the next virtual exam was weeks away and I didn’t want to wait that long.&lt;/p&gt;

&lt;p&gt;Due to the pandemic we’re still facing in the United States, I didn’t expect there to be so many people taking exams! I was walked into a room with cubicles and workstations where many other people were taking various other exams, and I was pleased with the health and safety precautions taken by the testing center.&lt;/p&gt;

&lt;p&gt;I finished the exam in ~45 minutes, and when I was done I felt confident in my answers, but not necessarily that they were correct. I expected a grade once I clicked “submit exam”, but the next window was a feedback form (which, honestly, how can I give you feedback when I’m anxious about the score!?!). Then I was presented with the 4 characters that lifted so much weight off my shoulders: PASS. There was no indication of my score, what I got wrong, or what I got right, just a pass or fail grade; the result also isn’t “official” for 7-10 days while Google performs their validation. Minutes later an exam proctor walked me out and sent me on my way.&lt;/p&gt;

&lt;p&gt;Two days later I received an email saying my exam score was accepted and I received a digital certificate. I was also added to the Google Cloud certificate holder website which was a nice added surprise!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://googlecloudcertified.credential.net/profile/d1594704a60aaa147a8dd8af4a19168465bd2ee4"&gt;Jason Schorr - Google Cloud Credential Holders&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  What I used
&lt;/h3&gt;

&lt;p&gt;LinuxAcademy was a huge contributor to my exam success. The videos total 14 hours, with 7 hands-on labs for practical experience. At the end of the course there is a practice exam, which I took until I was getting 100% consistently; it’s not enough to know the right answer, you also have to understand why. I didn’t use the labs provided with LinuxAcademy, but hands-on practice is critical to success; I had lab access through another service, Qwiklabs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://linuxacademy.com/course/google-cloud-certified-professional-cloud-security-engineer/"&gt;Course: Google Cloud Certified Professional Cloud Security Engineer | Linux Academy&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Google offers practice exams for their certs, and I highly recommend going through them until you’re getting 100%; the wrong answers are highlighted and the correct answer is supplied along with the reasoning behind it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/certification/practice-exam/cloud-engineer"&gt;Practice exam offered by Google&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Earlier this year Google was offering free Qwiklabs training for anyone who signed up via their links. It cost me nothing, was extremely quick to get through, and allotted me 3 months of unlimited training. This hands-on training allowed for operational experience that was instrumental to exam success. There are questions on the exam where your operational command-line knowledge is tested.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.qwiklabs.com/"&gt;Qwiklabs provides real cloud environments that help developers and IT professionals learn cloud platforms&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Although the Google offer for 3 months of lab time has ended, keep an eye on the Medium article below; Sathish VJ updates it often with codes for free months, and his training material is also top notch!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/@sathishvj/qwiklabs-free-codes-gcp-and-aws-e40f3855ffdb"&gt;Watch this for updates to get for qwiklabs credits&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Grace’s flowcharts are a fantastic resource for really understanding which situations call for a specific service. Do you understand the difference between Bigtable, BigQuery, and Spanner? Do you know when you should use an SSL proxy vs. an HTTPS load balancer? I didn’t, and these flowcharts helped me make sense of what to use and when.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://grumpygrace.dev/posts/gcp-flowcharts/"&gt;Grace’s flowcharts&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, we have a collection of study notes. I don’t know who compiled them, but they’re verbose and cover everything you’ll need to know for the exam. I didn’t find these notes until the few days leading up to my exam, so they were really useful as a quick reference, with links to the official Google docs. I wouldn’t use this doc on its own, but it’s a great adjunct to your studies.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.google.com/document/d/1u6pXBiGMYj7ZLBN21x6jap11rG6gWk7n210hNnUzrkI/edit#heading=h.drjuxxqatwi0"&gt;Associate Cloud Engineer - Study Notes - Google Docs&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Now what?
&lt;/h3&gt;

&lt;p&gt;Now I start preparing for the next certification I want to get: the GCP Professional Cloud Security Engineer. Based on my skill set and experience, I think it’s the most logical step with the least amount of new material to learn. Diving deeper into the nuts and bolts of GCP is exciting, and I’m looking forward to gaining a more complete understanding of the tools and options available to us.&lt;/p&gt;

&lt;h3&gt;
  
  
  Hey reader!
&lt;/h3&gt;

&lt;p&gt;Do you have any questions on my experience?&lt;br&gt;
Did I cover everything?&lt;br&gt;
What do YOU want to know?&lt;/p&gt;




&lt;p&gt;Enjoyed the post? Let me know! 💛🦄🔖&lt;/p&gt;

</description>
      <category>devops</category>
      <category>googlecloud</category>
      <category>sre</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Social Media Accounts Shouldn't Be Static! 💃🏽</title>
      <dc:creator>🍌🍌🍌</dc:creator>
      <pubDate>Wed, 22 Jul 2020 02:52:42 +0000</pubDate>
      <link>https://dev.to/0xbanana/social-media-accounts-shouldn-t-be-static-5327</link>
      <guid>https://dev.to/0xbanana/social-media-accounts-shouldn-t-be-static-5327</guid>
      <description>&lt;h3&gt;
  
  
  Inspired by other posts across the Internet
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OXj0rgYh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/KEPOTIe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OXj0rgYh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/KEPOTIe.png" alt="This Video Has 15,608,459 Views - YouTube"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.youtube.com/watch?v=BxV14h0kFs0"&gt;This Video Has 15,608,459 Views - YouTube&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jzcBFtoJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/ArOsz0z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jzcBFtoJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/ArOsz0z.png" alt="This Post Has 2,233 Views, 154 Reactions And 23 Comments"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/wobsoriano/this-post-has-22-views-5ed6"&gt;This Post Has 2,233 Views, 154 Reactions And 23 Comments - DEV&lt;/a&gt;&lt;/p&gt;



&lt;p&gt;I decided to try to accomplish the same on Twitter with my username field and the number of followers I have.&lt;/p&gt;

&lt;p&gt;The process is pretty straightforward; let’s outline it:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;1. Connect to twitter API
2. Get my current number of followers
3. Update my username field
4. Repeat every 10 minutes
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;p&gt;Now that we’ve outlined the process, let’s break down the code and infrastructure.&lt;/p&gt;

&lt;p&gt;For this project I decided to use Python 3. I really like using it for quick scripts over Node.js because I don’t want to deal with asynchronous code; it’s nice to have code that runs one line after the next, with no interruptions.&lt;/p&gt;

&lt;p&gt;You will need Twitter API credentials for this project; if you don’t have them yet, you’ll have to apply for them at &lt;a href="https://developers.twitter.com"&gt;https://developers.twitter.com&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;To connect to the Twitter API I’ll be using Tweepy. It’s great, it works easily, and it has built-in support for the Twitter API’s rate limiting.&lt;/p&gt;

&lt;p&gt;For hosting and scheduling I like to use Heroku. Heroku allows you to run your code and have an exposed web service, capable of handling web requests and serving up content. In this project we won’t be handling any external requests or serving up data, but we still need to make sure our code plays well with Heroku, so we’ll add a web server; in this case Flask easily fills the gap.&lt;/p&gt;

&lt;p&gt;For our code to run on a schedule, every X minutes, we will enable a Heroku add-on for our project, &lt;code&gt;Heroku Scheduler&lt;/code&gt;, which once configured will launch our script on our defined schedule. &lt;/p&gt;

&lt;p&gt;Awesome, let’s dive into the code!&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import os
import tweepy
from flask import Flask
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;First up we have our &lt;code&gt;import&lt;/code&gt; statements; these allow us to use functionality provided by the standard library and third-party packages.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;os&lt;/code&gt; library gives us access to the underlying operating system; on Heroku it’s typically used to read configuration, such as API keys, from environment variables.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Tweepy&lt;/code&gt; is going to enable us to connect and communicate with the Twitter API.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Flask&lt;/code&gt; is a micro web framework enabling us to handle HTTP requests and many other features; for this project we’re using it to satisfy Heroku’s deployment needs.&lt;/p&gt;

&lt;p&gt;Next up is authentication&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Authenticate to Twitter
auth = tweepy.OAuthHandler("consumer_key", "consumer_secret")
auth.set_access_token("token_key", "token_secret")
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;We need two sets of keys to authenticate; both can be found on your app’s page in the Twitter developer portal, similar to the screenshot below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mSLWyEf5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/f5Cg4q7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mSLWyEf5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/f5Cg4q7.png" alt="Twitter Developers API Project Page"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you’ve added those keys, the next step is connecting to the API server and getting some data.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Create API object
api = tweepy.API(auth)

#get my user object
iam = api.me()
followers_count = iam.followers_count

&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;me()&lt;/code&gt; method of the &lt;code&gt;tweepy&lt;/code&gt; API object connects to the Twitter API and pulls down the currently authenticated user’s data. &lt;/p&gt;

&lt;p&gt;The user data object that’s returned can be accessed using the dot &lt;code&gt;.&lt;/code&gt; operator; in this case we want the &lt;code&gt;followers_count&lt;/code&gt; attribute.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wD7jJLIL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/VTS5SIb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wD7jJLIL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/VTS5SIb.png" alt="Micro Pretty Print Example"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you don’t know the attributes of an object you can use these builtins:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;dir(object)
vars(object)
object.__dict__
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Combine that with &lt;code&gt;pprint&lt;/code&gt; for some legibility:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;BONUS CODE SNIPPET&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import pprint
pp = pprint.PrettyPrinter(indent=4)
pp.pprint(vars(object))
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jVBM7NIl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/LIvHFNH.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jVBM7NIl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/LIvHFNH.png" alt="Sample of twitter API output"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that we’re done with that little detour, let’s get back on track. &lt;/p&gt;

&lt;p&gt;The next part of the code updates our username.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt; api.update_profile(name=“Jason Alvarez is {} new friends away from 10k".format(10000 - int(followers_count)))
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Tweepy provides us a function called &lt;code&gt;update_profile&lt;/code&gt; which will update our profile depending on the parameters supplied to it. In this case we’re passing the &lt;code&gt;name&lt;/code&gt; parameter and giving it the following string:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;"Jason Alvarez is {} new friends away from 10k".format(10000 - int(followers_count))&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Rather than insert just a number, I decided to use the space as a countdown to a specific target.&lt;/p&gt;




&lt;p&gt;Format takes a string with placeholders &lt;code&gt;{}&lt;/code&gt; and inserts the values supplied &lt;code&gt;(value1, value2…)&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;string.format(value1, value2…)&lt;/code&gt;&lt;/p&gt;
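
&lt;p&gt;For example, filling the placeholder with a made-up follower count (the numbers here are just for illustration):&lt;/p&gt;

```python
# Hypothetical value, just to show how format() fills the {} placeholder
followers_count = 9863
name = "Jason Alvarez is {} new friends away from 10k".format(10000 - followers_count)
print(name)  # Jason Alvarez is 137 new friends away from 10k
```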




&lt;p&gt;This same technique can be applied to any editable field in a profile and any data the API will accept as input; please refer to &lt;a href="https://developer.twitter.com/en/docs"&gt;the Twitter Developer docs&lt;/a&gt; for more details on valid and invalid characters. &lt;/p&gt;

&lt;p&gt;The last segment of code is needed for the script to run gracefully on Heroku’s infrastructure. While not specific to this project, let’s go over what it does and what it’s not doing.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;app = Flask(__name__)
app.run()
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Flask, as mentioned above, is a micro web framework; it’s normally used to create servers that handle HTTP(S) requests and other really neat things. In this case we need to start a server so that Heroku knows our deployment was successful. &lt;/p&gt;

&lt;p&gt;One of the big uses for Flask is creating your own API microservices, where you define your own HTTP routes. We aren’t using that feature here, and it’s outside the scope of this writeup, but I highly suggest you look into it!&lt;/p&gt;

&lt;h2&gt;
  
  
  OKAY THE CODE IS DONE!
&lt;/h2&gt;

&lt;p&gt;Great! We’ve finished writing our script and that means we’re all done and can deploy, right? ALMOST! &lt;/p&gt;

&lt;p&gt;To deploy to Heroku successfully we need two more files:&lt;br&gt;
    &lt;code&gt;requirements.txt&lt;/code&gt;&lt;br&gt;
    &lt;code&gt;Procfile&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;requirements.txt&lt;/code&gt; lists the libraries needed for our code to run successfully, and it isn’t something you need to create manually. &lt;/p&gt;

&lt;p&gt;To generate your &lt;code&gt;requirements.txt&lt;/code&gt;, run this command in your project directory: &lt;code&gt;pip3 freeze &amp;gt; requirements.txt&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;From Heroku’s documentation:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Heroku apps include a Procfile that specifies the commands that are executed by the app on startup.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Ours is going to be really simple and just launch the web server we declared in our script. &lt;/p&gt;

&lt;p&gt;Edit your &lt;code&gt;Procfile&lt;/code&gt; so it has the following contents&lt;br&gt;
&lt;code&gt;web: FLASK_APP=update_username.py flask run --port $PORT --host 0.0.0.0&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$PORT&lt;/code&gt; will be supplied to us by Heroku through the shell environment, and it changes between runs, so don’t try to set anything here. &lt;/p&gt;
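
&lt;p&gt;If you ever need the port inside Python itself, you can read it from the environment (a quick sketch; the local fallback of 5000 is just an assumption for development):&lt;/p&gt;

```python
import os

# Heroku injects PORT into the shell environment; fall back to 5000 locally
port = int(os.environ.get("PORT", 5000))
```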

&lt;p&gt;Great, if you’ve made it this far your directory should look like the following&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.
├── Procfile
├── requirements.txt
└── update_username.py
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;h3&gt;
  
  
  NOW WE CAN FINALLY DEPLOY!
&lt;/h3&gt;

&lt;p&gt;Create a new project on heroku.com and follow the onscreen instructions; it’s a lot like setting up a GitHub repo:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a project&lt;/li&gt;
&lt;li&gt;init&lt;/li&gt;
&lt;li&gt;add&lt;/li&gt;
&lt;li&gt;push&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;(Follow their instructions for more details)&lt;/p&gt;

&lt;p&gt;If your push and deploy were successful, you should see something like the following in your terminal&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;app[scheduler.5678]: * Running on http://0.0.0.0:48547/ (Press CTRL+C to quit)
heroku[scheduler.5678]: State changed from starting to up
heroku[scheduler.5678]: Cycling
heroku[scheduler.5678]: State changed from up to complete
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;At this point your Twitter username should be updated with the string you used in the script. In my case I want my username to stay roughly in sync with my follower count, so let’s set it up to run on a schedule.&lt;/p&gt;

&lt;p&gt;To accomplish this we’re going to use a Heroku project add-on called “Heroku Scheduler”. It’s free and shouldn’t bump up the cost of your dyno; add it to the project, then click it to configure the jobs.&lt;/p&gt;

&lt;p&gt;To stay as in sync as I can, I chose an “Every 10 minutes” job, and for the run command I used the same content that’s in the &lt;code&gt;Procfile&lt;/code&gt;: &lt;br&gt;
&lt;code&gt;FLASK_APP=update_username.py flask run --port $PORT --host 0.0.0.0&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Save the job and you’re all done.&lt;/p&gt;




&lt;p&gt;Let’s go over what we’ve accomplished.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;1. We have a script that will update our Twitter username
2. It communicates with Twitter API for data
3. Our code project is deployed on a managed service
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;That’s not bad for a fun little script! &lt;/p&gt;




&lt;p&gt;Enjoyed the post? Let me know! 💛🦄🔖&lt;/p&gt;

</description>
      <category>twitter</category>
      <category>api</category>
      <category>beginners</category>
      <category>python</category>
    </item>
    <item>
      <title>🚀TOP Beginner ReactJS Resources 🎊2020🎊🚀</title>
      <dc:creator>🍌🍌🍌</dc:creator>
      <pubDate>Fri, 10 Jan 2020 03:36:47 +0000</pubDate>
      <link>https://dev.to/0xbanana/top-beginner-reactjs-resources-2020-1nic</link>
      <guid>https://dev.to/0xbanana/top-beginner-reactjs-resources-2020-1nic</guid>
      <description>&lt;p&gt;React is a JavaScript User Interface library for creating interactive web applications.&lt;/p&gt;

&lt;p&gt;These are the resources I used to learn to write clean efficient code!&lt;/p&gt;

&lt;h3&gt;
  
  
  create-react-app
&lt;/h3&gt;

&lt;p&gt;Create React App works on macOS, Windows, and Linux. Create React apps with no build configuration.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6PuT8cT---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://github.com/facebook/create-react-app/blob/master/README.md" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6PuT8cT---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://github.com/facebook/create-react-app/blob/master/README.md" alt="https://github.com/facebook/create-react-app/blob/master/README.md"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  The Beginner's Guide to React by Kent C. Dodds
&lt;/h3&gt;

&lt;p&gt;This course is for React newbies and anyone looking to build a solid foundation. It’s designed to teach you everything you need to start building web applications in React right away.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://egghead.io/courses/the-beginner-s-guide-to-reactjs"&gt;https://egghead.io/courses/the-beginner-s-guide-to-reactjs&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  React Bits
&lt;/h3&gt;

&lt;p&gt;A free book that talks about design patterns/techniques used while developing with React. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://vasanthk.gitbooks.io/react-bits/"&gt;https://vasanthk.gitbooks.io/react-bits/&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  🚀Absolutely Awesome React Components &amp;amp; Libraries 🚀
&lt;/h3&gt;

&lt;p&gt;Curated List of React Components &amp;amp; Libraries.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/brillout/awesome-react-components"&gt;https://github.com/brillout/awesome-react-components&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;These are resources I go back to regularly, and they were instrumental in getting me from zero to writing fully functional and responsive web applications.&lt;/p&gt;

&lt;p&gt;What stuff did I miss?&lt;/p&gt;




&lt;p&gt;Enjoyed the post? Let me know! 💛🦄🔖&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>react</category>
      <category>webdev</category>
      <category>javascript</category>
    </item>
    <item>
      <title>Bucket Hunting 101 - Bounties, Glory, and Fun!</title>
      <dc:creator>🍌🍌🍌</dc:creator>
      <pubDate>Thu, 02 Jan 2020 00:37:20 +0000</pubDate>
      <link>https://dev.to/0xbanana/bucket-hunting-for-bounties-glory-and-fun-5g6f</link>
      <guid>https://dev.to/0xbanana/bucket-hunting-for-bounties-glory-and-fun-5g6f</guid>
      <description>&lt;p&gt;As the internet grows and services become increasingly “server-less” one of the big issues facing companies and startups are cloud security misconfigurations. Servers now can be spun up, torn down, and scaled with a few clicks, as a result security can be overlooked.&lt;/p&gt;

&lt;p&gt;Here are some common bucket uses:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Publicly accessible data  - Google LandSat &lt;/li&gt;
&lt;li&gt;Phone or Web App storage - Instagram, Homeroom, Cluster &lt;/li&gt;
&lt;li&gt;Websites - RottenTomatoes, IMDB&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Many tools exist for bucket hunting; the ones below are my favorite methods for finding open buckets.&lt;/p&gt;

&lt;h3&gt;
  
  
  Amazon S3 Buckets
&lt;/h3&gt;

&lt;p&gt;S3 was launched March 14, 2006 and is currently the largest datastore on the internet. The current URL format for all S3 buckets is &lt;br&gt;
    &lt;code&gt;https://&amp;lt;BucketName&amp;gt;.s3.amazonaws.com&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;It’s simple enough to write your own script to brute force the existence of a bucket with a GET request, but rather than reinvent the wheel let’s use vetted and weathered tools.&lt;/p&gt;
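
&lt;p&gt;For the curious, the do-it-yourself version is only a few lines (a sketch; treating 404 as “no such bucket” and 403 as “exists but private” is a heuristic, not a guarantee):&lt;/p&gt;

```python
import urllib.error
import urllib.request

def bucket_url(bucket_name):
    # Build the S3 URL in the format shown above
    return "https://{}.s3.amazonaws.com".format(bucket_name)

def bucket_status(bucket_name):
    # GET the bucket URL; 404 typically means no such bucket, 403 means it
    # exists but denies anonymous access, 200 means it is open for listing
    try:
        urllib.request.urlopen(bucket_url(bucket_name), timeout=10)
        return 200
    except urllib.error.HTTPError as err:
        return err.code
```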

&lt;p&gt;&lt;em&gt;S3Scanner&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;A simple and effective brute forcer that permutes a user-supplied wordlist, checks for the existence of buckets, and dumps their contents. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/sa7mon/S3Scanner" rel="noopener noreferrer"&gt;GitHub - sa7mon/S3Scanner: Scan for open AWS S3 buckets and dump the contents&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2FDh5jVAF.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2FDh5jVAF.png" alt="https://i.imgur.com/Dh5jVAF.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Bucket-Stream&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Find interesting Amazon S3 buckets by watching certificate transparency logs. This tool listens to various certificate transparency logs and attempts to find buckets from permutations of each certificate’s domain name.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/eth0izzle/bucket-stream" rel="noopener noreferrer"&gt;GitHub - eth0izzle/bucket-stream: Find interesting Amazon S3 Buckets by watching certificate transparency logs.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://camo.githubusercontent.com/375d21f8b8139e2df9cb9cb0040d03304b640b95/68747470733a2f2f692e696d6775722e636f6d2f5a466b495968442e6a7067" class="article-body-image-wrapper"&gt;&lt;img src="https://camo.githubusercontent.com/375d21f8b8139e2df9cb9cb0040d03304b640b95/68747470733a2f2f692e696d6775722e636f6d2f5a466b495968442e6a7067" alt="https://camo.githubusercontent.com/375d21f8b8139e2df9cb9cb0040d03304b640b95/68747470733a2f2f692e696d6775722e636f6d2f5a466b495968442e6a7067"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Google Cloud Buckets
&lt;/h3&gt;

&lt;p&gt;Google’s cloud service offerings grow daily. They offer free credits for new users and incentives for small businesses. The URL structure doesn’t follow the same pattern as S3, and since there are no certificates to listen for, more computationally expensive methods are needed.&lt;br&gt;
    &lt;code&gt;http://storage.cloud.google.com/&amp;lt;BucketName&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;GCPBucketBrute&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;A script to enumerate Google Storage buckets, determine what access you have to them, and determine if they can be privilege escalated.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/RhinoSecurityLabs/GCPBucketBrute/" rel="noopener noreferrer"&gt;GitHub - RhinoSecurityLabs/GCPBucketBrute: A script to enumerate Google Storage buckets, determine what access you have to them, and determine if they can be privilege escalated.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.sectechno.com%2Fwp-content%2Fuploads%2F2019%2F05%2Fgcpbucketbrute.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.sectechno.com%2Fwp-content%2Fuploads%2F2019%2F05%2Fgcpbucketbrute.png" alt="https://www.sectechno.com/wp-content/uploads/2019/05/gcpbucketbrute.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Microsoft Azure Buckets
&lt;/h3&gt;

&lt;p&gt;Not wanting to be left out of the race, Microsoft started its Azure cloud in early 2010. It offers as many services as its competitors, if not more, but its bucket URLs are harder to find and enumerate. Fortunately there are tools and documentation that can help us. The URL format is as follows:&lt;br&gt;
    &lt;code&gt;https://&amp;lt;ACCOUNT&amp;gt;.blob.core.windows.net/&amp;lt;CONTAINER&amp;gt;/&amp;lt;BLOB&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Brute forcing multiple parts of a string together is computationally complex, so we'll break it up into smaller parts. &lt;/p&gt;

&lt;p&gt;For brute forcing we are going to use gobuster to help us enumerate DNS and directory entries. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/OJ/gobuster/" rel="noopener noreferrer"&gt;GitHub - OJ/gobuster: Directory/File, DNS and VHost busting tool written in Go&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Starting from the smallest set of enumeration targets to the largest:&lt;/p&gt;

&lt;p&gt;Account &amp;lt; Container &amp;lt; Blob&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Brute force Account&lt;br&gt;
Use the gobuster DNS module&lt;br&gt;
&lt;code&gt;gobuster -m dns -u "blob.core.windows.net" -w &amp;lt;wordlist&amp;gt;&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Brute force Container&lt;br&gt;
Use the gobuster DIR module&lt;br&gt;
&lt;code&gt;gobuster -m dir -u "&amp;lt;ACCOUNT&amp;gt;.blob.core.windows.net" -e -s 200,204 -fw &amp;lt;wordlist&amp;gt;&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Finding BLOBs&lt;br&gt;
If listing is enabled you can use the same endpoint with some added query parameters.&lt;br&gt;
&lt;code&gt;https://&amp;lt;ACCOUNT&amp;gt;.blob.core.windows.net/&amp;lt;CONTAINER&amp;gt;?restype=container&amp;amp;comp=list&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Otherwise you can attempt to brute force filenames using gobuster. (Not recommended)&lt;/p&gt;
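
&lt;p&gt;To sanity-check a container you’ve found, it helps to build that listing URL programmatically (a small sketch; the account and container names are placeholders):&lt;/p&gt;

```python
from urllib.parse import urlencode

def container_listing_url(account, container):
    # Build the Azure Blob listing endpoint shown above; urlencode joins
    # the restype and comp query parameters for us
    query = urlencode({"restype": "container", "comp": "list"})
    return "https://{}.blob.core.windows.net/{}?{}".format(account, container, query)
```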

&lt;h3&gt;
  
  
  Digital Ocean Spaces
&lt;/h3&gt;

&lt;p&gt;A relative newcomer to the space, DigitalOcean has an “S3 compatible” system it calls Spaces, and with this compatibility come the same issues and enumeration methods as S3.&lt;br&gt;
    &lt;code&gt;https://&amp;lt;BUCKETNAME&amp;gt;.&amp;lt;DATACENTER&amp;gt;.digitaloceanspaces.com&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;DigitalOcean offers 5 global data centers, each of which affects the URL:&lt;/p&gt;

&lt;p&gt;sfo2 - &lt;code&gt;https://&amp;lt;BUCKETNAME&amp;gt;.sfo2.digitaloceanspaces.com&lt;/code&gt; &lt;br&gt;
nyc3 - &lt;code&gt;https://&amp;lt;BUCKETNAME&amp;gt;.nyc3.digitaloceanspaces.com&lt;/code&gt; &lt;br&gt;
ams3 - &lt;code&gt;https://&amp;lt;BUCKETNAME&amp;gt;.ams3.digitaloceanspaces.com&lt;/code&gt; &lt;br&gt;
sgp1 - &lt;code&gt;https://&amp;lt;BUCKETNAME&amp;gt;.sgp1.digitaloceanspaces.com&lt;/code&gt; &lt;br&gt;
fra1 - &lt;code&gt;https://&amp;lt;BUCKETNAME&amp;gt;.fra1.digitaloceanspaces.com&lt;/code&gt; &lt;/p&gt;

&lt;p&gt;Bucket names can contain &lt;code&gt;[a-z][0-9][-]&lt;/code&gt; and must be between 3 and 63 characters in length. &lt;/p&gt;
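
&lt;p&gt;Those rules are easy to encode as a quick filter for candidate names before wasting requests on them (a sketch based on the constraints above):&lt;/p&gt;

```python
import re

# Lowercase letters, digits, and hyphens, 3 to 63 characters total
SPACES_NAME = re.compile(r"^[a-z0-9-]{3,63}$")

def is_valid_space_name(name):
    # Returns True only for names matching the Spaces naming constraints
    return SPACES_NAME.match(name) is not None
```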

&lt;h3&gt;
  
  
  Spray &amp;amp; Pray
&lt;/h3&gt;

&lt;p&gt;Don’t care about targeting a specific platform? The tools below will provide you a plethora of data covering (maybe?) every cloud storage provider. Don’t expect them to work the same as, or as well as, the tools above, but they are great when targeting specific websites. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://gist.github.com/TweekFawkes/13440c60804e68b83914802ab43bd7a1" rel="noopener noreferrer"&gt;LolrusLove - Spider for Bucket Enumeration (AWS S3 Bucket, Azure Blob Storage, DigitalOcean Spaces, etc…) - Alpha v0.0.6 · GitHub&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/jordanpotti/CloudScraper" rel="noopener noreferrer"&gt;GitHub - jordanpotti/CloudScraper: CloudScraper: Tool to enumerate targets in search of cloud resources. S3 Buckets, Azure Blobs, Digital Ocean Storage Space.&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Visualization
&lt;/h3&gt;

&lt;p&gt;All of the tools listed above are meant to be run from the command line, since the power users of these services interact with them that way. Yet the overwhelming majority of the stored files need a GUI for content and context analysis. &lt;/p&gt;

&lt;p&gt;Bucket Miner brings you that capability: enter a bucket name, and if it is publicly accessible its contents will be presented to you for easy and intuitive viewing. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://miner.datadrifter.xyz" rel="noopener noreferrer"&gt;https://miner.datadrifter.xyz&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2FmqCPxU0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2FmqCPxU0.png" alt="https://i.imgur.com/mqCPxU0.png"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;These tools should get you started hunting for open buckets. Did I miss a tool you like or some new methods I didn’t talk about? Comment below and let’s chat! &lt;/p&gt;




&lt;p&gt;Enjoyed the post? Let me know! 💛🦄🔖&lt;/p&gt;

</description>
      <category>security</category>
      <category>azure</category>
      <category>aws</category>
      <category>google</category>
    </item>
    <item>
      <title>Easy and FREE ways to publish a website in 2020!🥳</title>
      <dc:creator>🍌🍌🍌</dc:creator>
      <pubDate>Sun, 22 Dec 2019 03:19:14 +0000</pubDate>
      <link>https://dev.to/0xbanana/easy-and-free-ways-to-publish-a-website-in-2020-44lo</link>
      <guid>https://dev.to/0xbanana/easy-and-free-ways-to-publish-a-website-in-2020-44lo</guid>
      <description>&lt;p&gt;There are too-many-to-count ways to publish a website in 2019 and there will be more in 2020. I've scoured the internet for free web hosting and deployment services. I've tried many and these are my personal picks for easy and FREE web hosting in 2020!&lt;/p&gt;

&lt;h4&gt;
  
  
  GitHub pages - &lt;a href="https://pages.github.com/"&gt;https://pages.github.com/&lt;/a&gt;
&lt;/h4&gt;

&lt;blockquote&gt;
&lt;p&gt;Websites for you and your projects. Hosted directly from your  GitHub repository . Just edit, push, and your changes are live.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Free site hosting, custom domain support and free SSL certificates to show you care about security! &lt;/p&gt;

&lt;h4&gt;
  
  
  surge.sh - &lt;a href="https://surge.sh"&gt;https://surge.sh&lt;/a&gt;
&lt;/h4&gt;

&lt;blockquote&gt;
&lt;p&gt;Simple, single-command web publishing. Publish HTML, CSS, and JS for free!&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Don’t like to leave the command line? Fear not! Surge.sh to the rescue. Easy publishing without leaving your terminal. &lt;/p&gt;

&lt;h4&gt;
  
  
  Zeit - &lt;a href="https://zeit.co"&gt;https://zeit.co&lt;/a&gt;
&lt;/h4&gt;

&lt;blockquote&gt;
&lt;p&gt;ZEIT Host your web projects with zero configuration, automatic SSL, and global CDN.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;ZEIT is a cloud platform for static sites and Serverless Functions. Easy, free, what more do you want?&lt;/p&gt;

&lt;h4&gt;
  
  
  Netlify - &lt;a href="https://netlify.com"&gt;https://netlify.com&lt;/a&gt;
&lt;/h4&gt;

&lt;blockquote&gt;
&lt;p&gt;Deploy modern static websites with Netlify. Get CDN, Continuous deployment, 1-click HTTPS, and all the services you need.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;I’m a fan of how simple it is to deploy a site with netlify. Connect to your GitHub repo or drop a folder onto their website; super simple!&lt;/p&gt;

&lt;h4&gt;
  
  
  AWS Amplify - &lt;a href="https://aws.amazon.com/amplify/console/"&gt;https://aws.amazon.com/amplify/console/&lt;/a&gt;
&lt;/h4&gt;

&lt;blockquote&gt;
&lt;p&gt;Hosting for fullstack serverless web apps with continuous deployment.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;A fantastic and free (for a year) service from Amazon. If you’re familiar with AWS products you’ll love Amplify; it integrates with the rest of the Amazon stack, so you can do it all from one browser tab!&lt;/p&gt;

&lt;h4&gt;
  
  
  Google Cloud Platform - &lt;a href="https://cloud.google.com"&gt;https://cloud.google.com&lt;/a&gt;
&lt;/h4&gt;

&lt;blockquote&gt;
&lt;p&gt;Build, Test &amp;amp; Deploy With Ease. Get $300 To Try Google Cloud Now. Deploy in Minutes.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Google Cloud Platform is overkill for small and simple projects, but if you need serverless services like databases or lambda functions, GCP is a solid choice.&lt;/p&gt;

&lt;p&gt;--&lt;br&gt;
Enjoyed the post? Let me know! 💛🦄🔖&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>beginners</category>
      <category>devops</category>
      <category>banana</category>
    </item>
    <item>
      <title>Building a url shortener for fun and no profit!</title>
      <dc:creator>🍌🍌🍌</dc:creator>
      <pubDate>Sat, 21 Dec 2019 22:47:01 +0000</pubDate>
      <link>https://dev.to/0xbanana/building-a-url-shortener-for-fun-and-no-profit-28j1</link>
      <guid>https://dev.to/0xbanana/building-a-url-shortener-for-fun-and-no-profit-28j1</guid>
      <description>&lt;p&gt;This was a fun full stack project weeknight project and I learned a lot while doing it and would love to share the process with you all!&lt;/p&gt;

&lt;p&gt;When starting any project, we should work through the SDLC to maximize our productivity and minimize wasted time.&lt;/p&gt;

&lt;h2&gt;
  
  
  Stage One - Planning
&lt;/h2&gt;

&lt;p&gt;Q: What are we trying to build?&lt;/p&gt;

&lt;p&gt;A: A URL Shortener.&lt;/p&gt;

&lt;p&gt;Q: Why?&lt;/p&gt;

&lt;p&gt;A: It's a fun project!&lt;/p&gt;

&lt;h2&gt;
  
  
  Stage Two - Analysis
&lt;/h2&gt;

&lt;p&gt;Q: Is this something people would use?&lt;/p&gt;

&lt;p&gt;A: Totally! There are many out there, but this one is mine.&lt;/p&gt;

&lt;p&gt;Q: Potential for abuse?&lt;/p&gt;

&lt;p&gt;A: Yes, must consider mitigations.&lt;/p&gt;

&lt;p&gt;Q: Am I worried about monetization?&lt;/p&gt;

&lt;p&gt;A: Not so much.&lt;/p&gt;

&lt;h2&gt;
  
  
  Stage Three - Design
&lt;/h2&gt;

&lt;p&gt;Let's think about what we want the user experience (UX) to be once someone lands on our page. I want this to be simple and easy.&lt;/p&gt;

&lt;h3&gt;
  
  
  Front End
&lt;/h3&gt;

&lt;p&gt;Going for a very simple Google-esque design, and using a design mockup tool, I settled on something like this.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2Fx6dfucu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2Fx6dfucu.png" alt="bnon.xyz mockup image"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Back End
&lt;/h3&gt;

&lt;p&gt;The backend is the workhorse of this application; it has TWO main jobs:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Shorten Links&lt;/li&gt;
&lt;li&gt;Unshorten Links
&lt;/li&gt;
&lt;/ol&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;shorten&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="c1"&gt;# Take a "long" value and return a "short" value
&lt;/span&gt;    &lt;span class="k"&gt;pass&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;unshorten&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="c1"&gt;# Take a "short" value and return its corresponding "long" value
&lt;/span&gt;    &lt;span class="k"&gt;pass&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;All of our data will be stored in a database; for our purposes I chose SQLite, although any database where every entry has a unique primary key will work fine for this implementation. If you want to use another method, you're on your own. For our short URLs we only want valid, non-special characters: uppercase and lowercase letters (a-z, A-Z) and numbers (0-9). This gives us a key space of 62 characters, or base62; our database IDs use just numbers, or base10.&lt;/p&gt;

&lt;p&gt;Cool! Heavy lifting done!&lt;/p&gt;

&lt;p&gt;Long to short =&amp;gt; base10 -&amp;gt; base62&lt;/p&gt;

&lt;p&gt;Short to long =&amp;gt; base62 -&amp;gt; base10&lt;br&gt;
&lt;/p&gt;
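
&lt;p&gt;Before reaching for a library, here’s what that base conversion looks like by hand (a minimal sketch; the alphabet ordering is arbitrary as long as you encode and decode with the same one):&lt;/p&gt;

```python
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

def to_base62(n):
    # Convert a non-negative base10 integer to a base62 string
    if n == 0:
        return ALPHABET[0]
    digits = []
    while n:
        n, remainder = divmod(n, 62)
        digits.append(ALPHABET[remainder])
    return "".join(reversed(digits))

def from_base62(s):
    # Convert a base62 string back to a base10 integer
    n = 0
    for ch in s:
        n = n * 62 + ALPHABET.index(ch)
    return n
```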

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;baseconvert&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;base&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;shorten&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="c1"&gt;# Take a "long" value and return a "short" value
&lt;/span&gt;    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;base&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;62&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;unshorten&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="c1"&gt;# Take a "short" value and return its corresponding "long" value
&lt;/span&gt;    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;base&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;62&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let’s add some API endpoints using Flask.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;flask&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Flask&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;sqlite3&lt;/span&gt;

&lt;span class="n"&gt;app&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Flask&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;__name__&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;db&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;sqlite3&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;connect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;file.db&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="nd"&gt;@app.route&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;/add&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;methods&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;POST&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]):&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;add_url&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="c1"&gt;# Add url to database &amp;amp; get ID
&lt;/span&gt;    &lt;span class="n"&gt;row_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;db&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="n"&gt;lastrowid&lt;/span&gt;
    &lt;span class="c1"&gt;# shorten
&lt;/span&gt;    &lt;span class="n"&gt;short_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;shorten&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;row_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="c1"&gt;# return to user
&lt;/span&gt;    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;short_id&lt;/span&gt;

&lt;span class="nd"&gt;@app.route&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;/&amp;lt;string:shortURL&amp;gt;&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;methods&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;GET&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]):&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_url&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;shortURL&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="c1"&gt;# lengthen row id value (shortURL)
&lt;/span&gt;    &lt;span class="n"&gt;row_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;unshorten&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;shortURL&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="c1"&gt;# get url from database
&lt;/span&gt;    &lt;span class="n"&gt;row&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;db&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;row_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="c1"&gt;# return to user
&lt;/span&gt;    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;row&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;url&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
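&lt;p&gt;For reference, here is a minimal self-contained sketch of what the shorten/unshorten helpers boil down to - a base62 codec over the row id (the alphabet order here is my assumption; any fixed order works as long as both functions share it):&lt;/p&gt;

```python
import string

# 0-9, a-z, A-Z: 62 symbols total
ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase

def shorten(row_id):
    """Encode a non-negative database row id as a base62 string."""
    if row_id == 0:
        return ALPHABET[0]
    out = []
    while row_id:
        row_id, rem = divmod(row_id, 62)
        out.append(ALPHABET[rem])
    # digits come out least-significant first, so reverse them
    return ''.join(reversed(out))

def unshorten(short_id):
    """Decode a base62 string back into the original row id."""
    row_id = 0
    for ch in short_id:
        row_id = row_id * 62 + ALPHABET.index(ch)
    return row_id
```

&lt;p&gt;Round-tripping a row id, e.g. unshorten(shorten(125)), gives back 125.&lt;/p&gt;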



&lt;p&gt;We've got our big pieces designed out and some pseudocode written; it's time to move to the next stage!&lt;/p&gt;

&lt;h2&gt;
  
  
  Stage Four - Implementation
&lt;/h2&gt;

&lt;p&gt;There is no public repo for my code but check out the running implementation here - &lt;a href="https://bnon.xyz" rel="noopener noreferrer"&gt;https://bnon.xyz&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The front end is hosted for free on netlify; the backend &amp;amp; database are hosted on a small Digital Ocean droplet.&lt;/p&gt;

&lt;p&gt;Security was kept in mind during the implementation of the project. Recaptcha is used to minimize automated shortlinking, and of course this is validated on the backend as well as the front end.&lt;/p&gt;
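&lt;p&gt;For the curious, backend validation of a Recaptcha token is a single POST to Google's siteverify endpoint. A rough sketch (the helper names are mine, and error handling is omitted):&lt;/p&gt;

```python
import json
import urllib.parse
import urllib.request

VERIFY_URL = 'https://www.google.com/recaptcha/api/siteverify'

def build_payload(secret, token):
    # form-encoded body siteverify expects: our secret key plus the
    # token the browser widget handed the user
    return urllib.parse.urlencode({'secret': secret, 'response': token}).encode()

def verify_recaptcha(secret, token):
    """Return True only if Google confirms the user's token is valid."""
    with urllib.request.urlopen(VERIFY_URL, data=build_payload(secret, token)) as resp:
        return json.load(resp).get('success', False)
```

&lt;p&gt;The /add endpoint just calls verify_recaptcha before touching the database and rejects the request if it returns False.&lt;/p&gt;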

&lt;p&gt;A feature I would like to add at a later date is VirusTotal link rating to mitigate any malicious actors using this service to deliver payloads or bypass network filtering devices.&lt;/p&gt;
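&lt;p&gt;That check would be small to wire up: VirusTotal's v3 API identifies a URL report by the unpadded URL-safe base64 of the URL itself. A sketch of building the lookup (function names are my own, and the actual request is left as a comment):&lt;/p&gt;

```python
import base64

def vt_url_id(url):
    # VirusTotal v3 addresses a URL report at /api/v3/urls/{id}, where
    # id is the unpadded URL-safe base64 of the URL string
    return base64.urlsafe_b64encode(url.encode()).decode().rstrip('=')

def vt_report_endpoint(url):
    # GET this endpoint with an 'x-apikey' header to fetch the
    # analysis stats before redirecting the user
    return 'https://www.virustotal.com/api/v3/urls/' + vt_url_id(url)
```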

&lt;h2&gt;
  
  
  Stage Five - Testing &amp;amp; Integration
&lt;/h2&gt;

&lt;p&gt;I spent an afternoon testing and securing this application. This was a really fun project, and I'm receptive to any and all bug or vulnerability reports; there's always something to learn and improve on.&lt;/p&gt;

&lt;h2&gt;
  
  
  Stage Six - Maintenance
&lt;/h2&gt;

&lt;p&gt;So far so good, I've knocked down all the items and bugs on my list. Did I miss any?&lt;/p&gt;

&lt;p&gt;--&lt;/p&gt;

&lt;p&gt;Enjoyed the post? Let me know! 💛🦄🔖&lt;/p&gt;

</description>
      <category>saas</category>
      <category>microservices</category>
      <category>beginners</category>
      <category>webdev</category>
    </item>
    <item>
      <title>#30DaysOfThreads - The Cyber Attack Lifecycle</title>
      <dc:creator>🍌🍌🍌</dc:creator>
      <pubDate>Fri, 13 Dec 2019 02:14:29 +0000</pubDate>
      <link>https://dev.to/0xbanana/30daysofthreads-the-cyber-attack-lifecycle-14i2</link>
      <guid>https://dev.to/0xbanana/30daysofthreads-the-cyber-attack-lifecycle-14i2</guid>
      <description>&lt;p&gt;The Cyber Attack Lifecycle describes the actions taken by an attacker from initial identification and recon to mission complete. This helps us understand and combat bad actors, ransomware, and others.&lt;/p&gt;

&lt;p&gt;Let’s break down the steps!&lt;/p&gt;

&lt;h3&gt;
  
  
  Initial Reconnaissance 🔎
&lt;/h3&gt;

&lt;p&gt;The intruder selects a target, researches it, and attempts to identify vulnerabilities in the target network. Some things attackers use and look for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Whois&lt;/li&gt;
&lt;li&gt;Target IP Ranges&lt;/li&gt;
&lt;li&gt;Web Properties, Domains &amp;amp; Subdomains&lt;/li&gt;
&lt;li&gt;Open Cloud Buckets&lt;/li&gt;
&lt;li&gt;Google dorking &lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Initial Compromise 📬
&lt;/h3&gt;

&lt;p&gt;The attacker compromises a vulnerable host. This may be a DMZ host, or something in a higher security group via an email phish. This is the first step into a network, and why security people always say:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Don't click email links!&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Don't open email attachments!&lt;/strong&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Establish Foothold 🧗🏼‍♀️
&lt;/h3&gt;

&lt;p&gt;A compromised system is good; one that you can access is even better. Initial access, or a foothold, is an attacker’s first step into your network. If there are network rules blocking various traffic, the attack may die here. &lt;/p&gt;

&lt;h3&gt;
  
  
  Escalate Privileges 📈
&lt;/h3&gt;

&lt;p&gt;Attackers often need more privileges on a system to get access to more data and permissions: for this, they escalate their privileges, often to an admin account. &lt;/p&gt;

&lt;h3&gt;
  
  
  Internal Recon 👀
&lt;/h3&gt;

&lt;p&gt;Where are we internally, what are we looking for, and how do we get there?&lt;br&gt;
Here we apply the OODA loop - a simple strategy to help find the way forward.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Observe - What do I see?&lt;/li&gt;
&lt;li&gt;Orient - Where am I?&lt;/li&gt;
&lt;li&gt;Decide - What do I need to do?&lt;/li&gt;
&lt;li&gt;Act - Do it!&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Move Laterally 👣
&lt;/h3&gt;

&lt;p&gt;Once they’re in a system, attackers can move laterally to other systems and accounts in order to gain more leverage: whether that’s higher permissions, more data, or greater access to systems. &lt;/p&gt;

&lt;h3&gt;
  
  
  Maintain Persistence 🏠
&lt;/h3&gt;

&lt;p&gt;Being able to return to networks again and again is one of an attacker’s main goals. They may not find what they’re looking for in the first compromise, and they will want to return. &lt;/p&gt;

&lt;h3&gt;
  
  
  Repeat (4-7) until (Mission) Complete 🔁 ✅
&lt;/h3&gt;

&lt;p&gt;Mission complete can be any number of things, anything your mind can think up from any spy or heist movie. Real data gets stolen every day. The current “average time to detect a breach” is 197 days.&lt;/p&gt;

&lt;p&gt;Stay safe out there!&lt;/p&gt;

&lt;p&gt;--&lt;br&gt;
Enjoyed the post? Let me know! 💛🦄🔖&lt;/p&gt;

</description>
      <category>security</category>
      <category>beginners</category>
      <category>privacy</category>
      <category>safety</category>
    </item>
    <item>
      <title>Starting off #30DaysOfThreads talking about the Software Development Lifecycle (SDLC)</title>
      <dc:creator>🍌🍌🍌</dc:creator>
      <pubDate>Thu, 12 Dec 2019 06:04:23 +0000</pubDate>
      <link>https://dev.to/0xbanana/starting-off-30daysofthreads-talking-about-the-software-development-lifecycle-sdlc-hd7</link>
      <guid>https://dev.to/0xbanana/starting-off-30daysofthreads-talking-about-the-software-development-lifecycle-sdlc-hd7</guid>
      <description>&lt;p&gt;Programming is just one part of software development and having a plan beats not having a plan.&lt;/p&gt;

&lt;h3&gt;
  
  
  Stage 1 - Planning
&lt;/h3&gt;

&lt;p&gt;We start with figuring out what we’re trying to do.&lt;/p&gt;

&lt;p&gt;Who will be using the app?&lt;br&gt;
How will it be used?&lt;br&gt;
What data will be collected?&lt;/p&gt;

&lt;p&gt;Planning is the most important phase of the lifecycle. Changes are easiest here &amp;amp; get exponentially more difficult as we go. &lt;/p&gt;

&lt;h3&gt;
  
  
  Stage 2 - Analysis
&lt;/h3&gt;

&lt;p&gt;It’s great to have an idea and plan a project, it’s better to know if people are interested and willing to pay for it.&lt;/p&gt;

&lt;p&gt;Poll your communities, post on forums, even set up a landing page with an email form to gather leads and measure interest. &lt;/p&gt;

&lt;h3&gt;
  
  
  Stage 3 - Design
&lt;/h3&gt;

&lt;p&gt;Design is how you envision your app to look and what your user experience will be like.&lt;/p&gt;

&lt;p&gt;You know what data you’ll be collecting and presenting so now’s the time to make that experience be as easy and enjoyable for the user as possible. &lt;/p&gt;

&lt;h3&gt;
  
  
  Stage 4 - Implementation
&lt;/h3&gt;

&lt;p&gt;DON’T START HERE! This is the part everyone can’t wait to jump into, actually making something.&lt;/p&gt;

&lt;p&gt;By this point you have a carefully considered, thoughtful, and peer-reviewed idea and design. Assemble your team, plan milestones, and get to work! &lt;/p&gt;

&lt;h3&gt;
  
  
  Stage 5 - Testing &amp;amp; Integration
&lt;/h3&gt;

&lt;p&gt;Your app is completed! Does it work the way you expect? This stage, testing, is commonly called QA (quality assurance). Bugs get sent back to developers to review and fix.&lt;/p&gt;

&lt;p&gt;You can find more information if you look up “unit testing”. &lt;/p&gt;
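&lt;p&gt;For a taste of what that looks like in Python, here is a tiny unit test - the slugify function under test is made up purely for illustration:&lt;/p&gt;

```python
import unittest

def slugify(title):
    """The function under test: lowercase a title and join its words with dashes."""
    return '-'.join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_spaces_become_dashes(self):
        self.assertEqual(slugify('Hello World'), 'hello-world')

    def test_already_clean(self):
        self.assertEqual(slugify('ok'), 'ok')
```

&lt;p&gt;Run it with python -m unittest and each failing assertion points at exactly which behavior broke.&lt;/p&gt;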

&lt;h3&gt;
  
  
  Stage 6 - Maintenance
&lt;/h3&gt;

&lt;p&gt;Now that your app has been released, there isn’t much to do other than make sure it continues to work as expected. Accept bug and vulnerability reports, and focus your energies on the other aspects surrounding your app: marketing, sales, user acquisition, etc. &lt;br&gt;
That’s the end of this thread! I hope you learned something!&lt;/p&gt;

&lt;p&gt;Comments and questions below!&lt;/p&gt;

&lt;p&gt;--&lt;br&gt;
Enjoyed the post? Let me know! 💛🦄🔖&lt;/p&gt;

</description>
      <category>startup</category>
      <category>beginners</category>
      <category>productivity</category>
    </item>
    <item>
      <title>New Site Made With HUGO!</title>
      <dc:creator>🍌🍌🍌</dc:creator>
      <pubDate>Wed, 11 Dec 2019 01:19:09 +0000</pubDate>
      <link>https://dev.to/0xbanana/new-site-made-with-hugo-3j10</link>
      <guid>https://dev.to/0xbanana/new-site-made-with-hugo-3j10</guid>
      <description>&lt;p&gt;Hugo is the world's fastest static website engine. It's written in Go and is used to generate all the HTML, JavaScript, and CSS used to display this site. Using a simple template syntax you can quickly create gorgeous websites with minimal code. In fact, every content page on this site is written in Markdown and then rendered by HUGO before being minifed for production.&lt;/p&gt;

&lt;p&gt;Feeling constantly behind the curve I spent the day reading and learning about Static Site Generators and what once was old is now new again. &lt;/p&gt;

&lt;p&gt;We've grown accustomed to dynamic sites and their many engaging features: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Comments&lt;/li&gt;
&lt;li&gt;Up/Down Votes&lt;/li&gt;
&lt;li&gt;Likes&lt;/li&gt;
&lt;li&gt;Retweets/Shares&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These features make sites fun, interactive, and increase the likelihood of return users; they also provide a wonderful attack surface for abuse from trolls and other bad actors.&lt;/p&gt;

&lt;p&gt;Since I don't need many interactive elements - just a place to write out my thoughts, projects, and experiences - trying out a static site generator seemed like a great idea. I also didn't want to reinvent the wheel, so I used a theme provided by the community and tweaked it to my liking.&lt;/p&gt;

&lt;p&gt;After following the HUGO quickstart, setting the rest up was a snap. I'm taking this as an opportunity to try out netlify as a hosting service. They have some interesting features and options; at the moment I push any site changes to git and netlify picks it up from there. &lt;/p&gt;

&lt;p&gt;Find my blog at 👉👉👉 &lt;a href="https://0xbanana.com"&gt;https://0xbanana.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Super quick, super easy!&lt;/p&gt;

&lt;p&gt;--&lt;br&gt;
Enjoyed the post? Let me know! 💛🦄🔖&lt;/p&gt;

</description>
      <category>hugo</category>
      <category>webdev</category>
      <category>beginners</category>
      <category>ux</category>
    </item>
  </channel>
</rss>
