<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Liz Benton</title>
    <description>The latest articles on DEV Community by Liz Benton (@e_liz_the_best).</description>
    <link>https://dev.to/e_liz_the_best</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1069258%2F6a5ae164-f1cf-4f67-875b-50f5f8afb2e7.png</url>
      <title>DEV Community: Liz Benton</title>
      <link>https://dev.to/e_liz_the_best</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/e_liz_the_best"/>
    <language>en</language>
    <item>
      <title>Building a professional website with Splunk integration on AWS (Part 1)</title>
      <dc:creator>Liz Benton</dc:creator>
      <pubDate>Sun, 02 Jul 2023 20:37:21 +0000</pubDate>
      <link>https://dev.to/e_liz_the_best/building-a-professional-website-with-splunk-integration-on-aws-part-1-3jd7</link>
      <guid>https://dev.to/e_liz_the_best/building-a-professional-website-with-splunk-integration-on-aws-part-1-3jd7</guid>
      <description>&lt;p&gt;Goal: To set up a Dashboard in Splunk where I can monitor network activity and alerts regarding the security of my professional website hosted on AWS.&lt;/p&gt;

&lt;p&gt;List of tools and AWS services and how they will be used:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Visual Studio Code - used to write the HTML/CSS for the website&lt;/li&gt;
&lt;li&gt;AWS S3 buckets - where the website's files and policies will be stored and served, and where CloudTrail logs will land for Splunk to pull from&lt;/li&gt;
&lt;li&gt;AWS Route 53 - used to handle Domain Name System (DNS) records for the site (my domain was originally purchased on Cloudflare)&lt;/li&gt;
&lt;li&gt;AWS CloudFront - used to serve the S3-hosted website over HTTPS, with a certificate and redirects covering both the www and non-www versions of the domain&lt;/li&gt;
&lt;li&gt;Splunk - installed on an AWS EC2 instance and configured to accept logs generated by AWS CloudTrail&lt;/li&gt;
&lt;li&gt;AWS CloudTrail - generates a log entry whenever an API call is made. Log files are in JSON format and are sent to an S3 bucket, where Splunk ingests them for analysis via an add-on.&lt;/li&gt;
&lt;/ul&gt;
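
&lt;p&gt;To preview what Splunk will be ingesting, here's a minimal Python sketch that pulls the key fields out of a CloudTrail-style JSON log. The record below is a simplified, hypothetical example; real CloudTrail files contain a "Records" array of much richer event objects.&lt;/p&gt;

```python
import json

# A simplified, hypothetical CloudTrail log file. Real files wrap
# much richer event objects in the same top-level "Records" array.
sample = json.loads("""
{
  "Records": [
    {
      "eventTime": "2023-07-02T20:37:21Z",
      "eventSource": "s3.amazonaws.com",
      "eventName": "GetObject",
      "awsRegion": "us-east-1"
    }
  ]
}
""")

# Pull out the fields a Splunk search would typically key on.
for record in sample["Records"]:
    print(record["eventTime"], record["eventSource"], record["eventName"])
```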

&lt;p&gt;Before beginning any project, it's always nice to have a simple diagram or idea jotted down or drawn somewhere so that we have a good grasp of what we will need to achieve our goal.&lt;/p&gt;

&lt;p&gt;Since I'm admittedly still learning, I didn't opt for anything fancy here; I just built a simple architecture diagram in draw.io, a free web app.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqvf8s6ufxe80i880u9vb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqvf8s6ufxe80i880u9vb.png" alt="Image description" width="759" height="793"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Isn't it pretty?&lt;/p&gt;

&lt;p&gt;I explained a bit above how each of these services will be used, so I hope it makes sense! We'll dive into each individual part as well.&lt;/p&gt;

&lt;p&gt;First, I needed to code my website! For this, I used Visual Studio Code, a free code editor from Microsoft. If you're looking for a solid program to build your next web application in, I highly recommend it.&lt;/p&gt;

&lt;p&gt;Since I had only ever used HTML before and never CSS, this was quite the learning process; it took me a few days and tons of googling. I even asked ChatGPT for help, but by the end I had taught myself divs and CSS well enough to build the last remaining parts all on my own, which felt like an amazing achievement in itself! &lt;/p&gt;

&lt;p&gt;Once my code was ready, I headed over to AWS, opened the management console, and navigated to S3, where I would need to upload all my files.&lt;/p&gt;
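
&lt;p&gt;One small thing to keep in mind when uploading: S3 serves each object with the Content-Type it was stored with, and the console usually sets this for you, but if you ever script the upload, getting the type right per file matters (otherwise browsers may download your pages instead of rendering them). A quick sketch with Python's standard mimetypes module, using hypothetical file names:&lt;/p&gt;

```python
import mimetypes

# Hypothetical website files to upload to the bucket.
files = ["index.html", "styles.css", "favicon.png"]

for name in files:
    content_type, _ = mimetypes.guess_type(name)
    # Fall back to a generic binary type if the name isn't recognized.
    print(name, content_type or "application/octet-stream")
```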

&lt;p&gt;Important note:&lt;br&gt;
For added security, at this point in the project I made sure not to use my root account. Instead, I created an IAM user, attached the rules and permissions needed for the services I would be using, and enabled MFA for login. MFA is very important: if anyone were to gain access to your account, they could abuse services, and you could be surprised by a very high bill! &lt;/p&gt;

&lt;p&gt;I would also encourage you to add a billing alert with a threshold of $5 or so, so that if your costs go above $5, you are alerted by email (or however else you prefer). It takes little to no time to set up and could save you the pain of paying hundreds, if not thousands, of dollars.&lt;/p&gt;
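
&lt;p&gt;The billing alert can be set up entirely in the console, but to make the idea concrete, here's a sketch of the parameters you'd hand to CloudWatch's PutMetricAlarm API (via boto3 or the CLI) for the standard EstimatedCharges billing metric. The alarm name and SNS topic ARN below are hypothetical placeholders, and billing metric alarms also require billing alerts to be enabled in the account's billing preferences.&lt;/p&gt;

```python
def billing_alarm_params(threshold_usd):
    """Build the kwargs for cloudwatch.put_metric_alarm(). The actual
    call would be made with boto3 in an account that has billing
    metrics enabled."""
    return {
        "AlarmName": "monthly-spend-alert",  # hypothetical name
        "Namespace": "AWS/Billing",
        "MetricName": "EstimatedCharges",
        "Dimensions": [{"Name": "Currency", "Value": "USD"}],
        "Statistic": "Maximum",
        "Period": 21600,  # evaluate every 6 hours
        "EvaluationPeriods": 1,
        "Threshold": threshold_usd,
        "ComparisonOperator": "GreaterThanThreshold",
        # Hypothetical SNS topic that emails you when the alarm fires:
        "AlarmActions": ["arn:aws:sns:us-east-1:123456789012:billing-alerts"],
    }

params = billing_alarm_params(5)
print(params["MetricName"], params["Threshold"])
```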

&lt;p&gt;I'm assuming that if you want to build your own website, you'll already have a domain name purchased. In some cases, a domain can be transferred to Route53, but in the event it cannot be transferred, you may need to purchase a new domain on Route53. Sadly, I had to go this route since the domain I had paid for on Cloudflare could not be transferred. It was only $13 to purchase a new domain on Route53, so not too bad, but something to keep in mind if you've already purchased a domain elsewhere.&lt;/p&gt;

&lt;p&gt;Once that is sorted, we'll create the bucket that stores the files that will make up our webpage.&lt;/p&gt;

&lt;p&gt;Creating a bucket is really simple: first, click on "Create bucket". You'll be presented with a form to fill out:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuli5qrqpneg0f4xb91yv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuli5qrqpneg0f4xb91yv.png" alt="Image description" width="800" height="726"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Service Overview: S3 (Simple Storage Service)&lt;/strong&gt;&lt;br&gt;
S3 is a really nice file storage service that costs little to nothing for storing files and data. There are many ways to utilize it, depending on the size and types of files being stored, as well as how frequently they need to be accessed. It's great for large datasets and for files kept purely for auditing or compliance purposes, but it can also be used for everyday, secure file storage, much the way some people use Google Drive. Today, we'll be using it to host the files for our website via the bucket's static website hosting option.&lt;/p&gt;

&lt;p&gt;Be sure to set your bucket to PUBLIC so that people will be able to view the contents of your website. It will give you a bunch of scary messages warning you that it's public, but don't worry; for a public website, that's exactly what we want!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpxkyhybom40lr5czk7h5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpxkyhybom40lr5czk7h5.png" alt="Image description" width="800" height="698"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;All other settings for your bucket can be customized if need be, but I chose to keep them all default. For example, if you want to enable versioning, so you can go back and retrieve older versions of files, that option is available to you. Just know that adding functionality to a service often increases its usage cost. I'm not sure whether versioning adds any cost, and I admittedly did not look into it since I won't be using it, but if you're planning on using S3 for development purposes, having access to every saved version of a file doesn't sound like a bad idea.&lt;/p&gt;

&lt;p&gt;Once your bucket is created, it will show up in a list. You can access the properties and other fun things about your bucket by clicking on its name.&lt;/p&gt;

&lt;p&gt;Next we need to add a policy to our bucket, so once your bucket is created, go ahead and click on the bucket name &amp;gt; Permissions &amp;gt; where it says "Bucket Policy" you'll need to click "Edit" and add your own policy.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4lpfnnppk0rr0hnhvuvj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4lpfnnppk0rr0hnhvuvj.png" alt="Image description" width="800" height="120"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I'll add the one I used here for convenience, but essentially it allows the public to be able to read/view the contents of your bucket, which we need for people to be able to view our website once it goes live!&lt;/p&gt;

&lt;p&gt;Bucket Policy:&lt;br&gt;
{&lt;br&gt;
    "Version": "2012-10-17",&lt;br&gt;
    "Statement": [&lt;br&gt;
        {&lt;br&gt;
            "Sid": "PublicReadGetObject",&lt;br&gt;
            "Effect": "Allow",&lt;br&gt;
            "Principal": "*",&lt;br&gt;
            "Action": "s3:GetObject",&lt;br&gt;
            "Resource": "arn:aws:s3:::[YOURBUCKETNAMEHERE]/*"&lt;br&gt;
        }&lt;br&gt;
    ]&lt;br&gt;
}&lt;/p&gt;

&lt;p&gt;The above is a JSON-format policy. Where it says "[YOURBUCKETNAMEHERE]", enter the name of the bucket that's hosting your site files; just remove the brackets (but keep the "/*" so that the policy applies to all files inside the bucket)!&lt;/p&gt;
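
&lt;p&gt;If you'd rather not hand-edit the JSON, the same substitution can be done in a few lines of Python. This is just a convenience sketch; the bucket name passed in at the bottom is a placeholder:&lt;/p&gt;

```python
import json

def public_read_policy(bucket_name):
    """Return the PublicReadGetObject bucket policy as a JSON string."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            # The /* suffix applies the rule to every object in the bucket.
            "Resource": f"arn:aws:s3:::{bucket_name}/*",
        }],
    }
    return json.dumps(policy, indent=4)

print(public_read_policy("my-website-bucket"))  # hypothetical bucket name
```

The printed output can be pasted straight into the bucket policy editor.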

&lt;p&gt;Save the new bucket policy and then return to the bucket overview at the top of the page. You should see a tab called "Properties" and at the very bottom of that tab, you should see a section where it says, "Static Website Hosting".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhyu3nlhlioaos55ytp8m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhyu3nlhlioaos55ytp8m.png" alt="Image description" width="800" height="71"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will need to click on "Edit" here as well and enable the static website hosting. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8xaom0jlsyrlcm6dl2rz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8xaom0jlsyrlcm6dl2rz.png" alt="Image description" width="800" height="555"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Toggle the enable option and then add the index.html file for your website in the field that requests it. You don't need to worry about adding an error document or redirect.&lt;/p&gt;

&lt;p&gt;After this, your files should be hosted live via S3 static hosting, so open up your favorite web browser and try the bucket website endpoint shown in the static hosting settings. &lt;/p&gt;

&lt;p&gt;The problem is, while the files are hosted, the address tied to them is the S3 bucket's own, unalterable endpoint. Obviously, we took the time to think of and purchase our own domain, and we want that to be the way our visitors access our site, so in order to configure this addressing, we will need our next service, Route53!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Service Overview: Route53 (DNS, or Domain Name System, management service)&lt;/strong&gt;&lt;br&gt;
Route53 likely takes its name from port 53, the well-known port used by DNS. Regardless, it makes managing the DNS records for your website a breeze! We'll also be using a service called Certificate Manager later to manage the TLS/SSL certificates that verify the website being displayed is our own and not someone else's. For now, we just need to prove that the domain we plan to point at our S3 bucket page, as a new address of sorts, is owned by us!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmo1hdjsidut7cu1p38mx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmo1hdjsidut7cu1p38mx.png" alt="Image description" width="711" height="263"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can get to Route53 by searching for it in the search bar at the top of the AWS management console. I would recommend opening a new tab for each service so that if you need to go back and forth to copy-paste, it makes life easier.&lt;/p&gt;

&lt;p&gt;Once there, you'll need to register a domain. Just follow the process; it's a simple step-by-step to either transfer an existing domain (if applicable; the search will let you know) or purchase a new one.&lt;/p&gt;

&lt;p&gt;Take the time to make your domain your own! Give it a name that is unique but not complex: something easily remembered, that fits on a business card, and whose purpose you can tell just by reading it. &lt;/p&gt;

&lt;p&gt;"io" is often used by those in the developer industry, but can be a bit more expensive to purchase. There's nothing wrong with starting off as a commercial, ".com" so don't feel bad if you're not ready or prepared to make that investment on your first webpage.&lt;/p&gt;

&lt;p&gt;Someday, when you've become a master of your class, you can even apply ".ninja"! I can only ever hope to achieve something so cool...but I'm working toward it!&lt;/p&gt;

&lt;p&gt;Once your Domain has been created (this may take a few minutes) you are ready to create some DNS records in Route53!&lt;/p&gt;

&lt;p&gt;Click on your domain and then click on "Create record". The process is pretty straightforward, and at the end you'll have two records by default, SOA and NS. These are basic requirements when it comes to DNS record management.&lt;/p&gt;

&lt;p&gt;We'll need to add a new "A" type record that points our domain at the S3 bucket. When creating the record, use the alias option and select your bucket's S3 website endpoint as the target (note that the bucket name must match the domain name for this to work). &lt;/p&gt;
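
&lt;p&gt;The console makes this a point-and-click step, but for illustration, here's how the same record could be expressed as a Route53 change batch (the shape accepted by the ChangeResourceRecordSets API). This is a sketch assuming a bucket in us-east-1; the domain, endpoint, and hosted zone ID below are illustrative, and the zone ID for S3 website endpoints is region-specific, so look yours up in the AWS endpoints table.&lt;/p&gt;

```python
def alias_record_change(domain, website_endpoint, s3_zone_id):
    """Build the ChangeBatch for route53.change_resource_record_sets():
    an alias A record pointing the domain at an S3 website endpoint."""
    return {
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": domain,
                "Type": "A",
                "AliasTarget": {
                    # Region-specific hosted zone ID for S3 website
                    # endpoints; look yours up in the AWS endpoints table.
                    "HostedZoneId": s3_zone_id,
                    "DNSName": website_endpoint,
                    "EvaluateTargetHealth": False,
                },
            },
        }]
    }

# Hypothetical values for a bucket in us-east-1:
batch = alias_record_change(
    "www.example.com",
    "s3-website-us-east-1.amazonaws.com",
    "Z3AQBSTGFYJSTF",
)
print(batch["Changes"][0]["ResourceRecordSet"]["Name"])
```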

&lt;p&gt;Once your domain is all set up, open your favorite web browser and test it out! Does it work? If it does, then you've successfully hosted a website on AWS using S3 and Route53!&lt;/p&gt;

&lt;p&gt;You may be noticing, however, that the site is not secure; you may even have been warned about this when visiting it for the first time, or needed to allow an exception. We obviously don't want that, and we want to keep ourselves and our visitors safe, so we will now go through the process of adding the HTTPS protocol (instead of HTTP) to our domain via two services: Certificate Manager and AWS CloudFront.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Service Overview: AWS CloudFront&lt;/strong&gt;&lt;br&gt;
CloudFront is a global content delivery network that utilizes edge locations within designated regions to make applications perform faster. It also increases security by serving web traffic over HTTPS rather than HTTP, which basically means that network traffic is encrypted and not easily read by prying eyes. This is crucial when handling usernames, passwords, and any other sensitive data. It also enables the use of a WAF, or Web Application Firewall. More on this in a moment.&lt;/p&gt;

&lt;p&gt;Before we can have something as awesome as HTTPS for our application, we'll need to request certificates from the Certificate Manager. This can be found by searching for it at the top of the management console.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftoqkrvfxd97h6ugtlv19.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftoqkrvfxd97h6ugtlv19.png" alt="Image description" width="785" height="608"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once there, you'll click on "Request certificate" and follow the instructions to create a "public" certificate. You will then be given the steps to validate the certificate through your Route53 configuration. The desired end result is a "CNAME" or "Canonical Name" record that will help us route the domain we chose to the CloudFront distribution.&lt;/p&gt;

&lt;p&gt;Once the certificate(s) have been created, we'll head on over to CloudFront and complete the configuration process.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft7pqen48mbo97l3wuh6v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft7pqen48mbo97l3wuh6v.png" alt="Image description" width="800" height="316"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click on "Create CloudFront distribution" to begin. Now, there are a LOT of bubbles and boxes here, but don't let it intimidate you. You've come this far, you can't lose now!&lt;/p&gt;

&lt;p&gt;I'll walk you through each section:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9e8sbr3dbrxpnfi9u9c8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9e8sbr3dbrxpnfi9u9c8.png" alt="Image description" width="800" height="734"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For this section, "Origin" it will try to auto-populate the domain based on the bucket you previously created, but DON'T USE THIS! Instead, manually type your FULL domain (&lt;a href="http://www.example.com" rel="noopener noreferrer"&gt;www.example.com&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;The rest of the fields should auto-populate after adding the origin. Unless you're expecting enough demand to tax your website's availability, I wouldn't worry about enabling Origin Shield, so feel free to say no.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff0fg2ums20x86fe63mpc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff0fg2ums20x86fe63mpc.png" alt="Image description" width="800" height="1172"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For this section, the only things we need to make sure we have selected are "HTTPS only" and "GET, HEAD" options. The rest can be set as default.&lt;/p&gt;

&lt;p&gt;Leave the "Function Association" alone and enable a WAF if you'd like the added layer of protection (just keep in mind, it will result in additional costs..I pay around $8/month to maintain mine).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhfpj85t25blqp286aiq8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhfpj85t25blqp286aiq8.png" alt="Image description" width="800" height="1502"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This section is very important. The price class can be chosen based on your own preferences, but making all global edge locations accessible will increase cost along with performance. Everything has a price, remember!&lt;/p&gt;

&lt;p&gt;Next, we'll click on "Add item" to add our domain as an alternate domain name (CNAME). Immediately after that, we'll want to select the certificate we created using the Certificate Manager. It should be visible to you in the dropdown menu, so just click to add it.&lt;/p&gt;

&lt;p&gt;For supported HTTP versions, either is fine, I just went with the default HTTP/2. Everything else was left on default.&lt;/p&gt;

&lt;p&gt;Once everything looks good, click on "Create distribution" and wait for it to go live. You should be able to see the status on the far right when the page loads and you can check the most recent status by clicking on the little refresh button on the page.&lt;/p&gt;

&lt;p&gt;Once the status is "Enabled" you are good to go! You should now be able to access your secure webpage. Give it a go! I hope you're feeling proud about what you've been able to accomplish. I know I did/do!&lt;/p&gt;

&lt;p&gt;You did it! You made your very own website and hosted it, securely, on AWS! It's in the cloud now for everyone to enjoy, so be sure to share your accomplishment with all your friends and family!&lt;/p&gt;

&lt;p&gt;If you'd like to see the site I built during this project, be sure to check it out over at Lizbuildsgames.com! And of course, I'd love to see any of your websites you've created, my dear readers!&lt;/p&gt;

&lt;p&gt;Stay tuned for part 2 of this project where we'll be adding Splunk log integration using AWS CloudTrail, SNS (Simple Notification Service), as well as SQS (Simple Queue Service).&lt;/p&gt;

&lt;p&gt;And of course, if you find yourself stuck or have any questions, comments or concerns, feel welcome to comment and I will respond as quickly as I can. (I am very busy, so apologies if you're left hanging for a bit--honestly, google is your best friend when developing anything imo--and it always feels amazing when you are able to troubleshoot things on your own!)&lt;/p&gt;

&lt;p&gt;Until next time~! :)&lt;/p&gt;

</description>
      <category>aws</category>
      <category>splunk</category>
      <category>webdev</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Splunk: Building a Secure Monitoring Solution (Part 2)</title>
      <dc:creator>Liz Benton</dc:creator>
      <pubDate>Thu, 08 Jun 2023 17:09:00 +0000</pubDate>
      <link>https://dev.to/e_liz_the_best/splunk-building-a-secure-monitoring-solution-part-2-208m</link>
      <guid>https://dev.to/e_liz_the_best/splunk-building-a-secure-monitoring-solution-part-2-208m</guid>
      <description>&lt;p&gt;Welcome to part 2 of building a secure monitoring solution in Splunk!&lt;/p&gt;

&lt;p&gt;In this part of the project, I'll be going over the analysis of the reports, alerts and dashboard visualizations before and after the addition of the attack logs.&lt;/p&gt;

&lt;p&gt;I'll be using the following formatting to make things more efficient, since this is a lot to dive into:&lt;/p&gt;

&lt;p&gt;Results before attack &amp;gt; Results after attack &amp;gt; Conclusion&lt;/p&gt;

&lt;p&gt;Let's begin!&lt;/p&gt;

&lt;p&gt;We'll start with our Windows Server and then move onto Apache.&lt;/p&gt;

&lt;p&gt;-Windows Server Reports-&lt;/p&gt;

&lt;p&gt;Severity Report Analysis&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq0mvhtp6pljprk9tgkav.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq0mvhtp6pljprk9tgkav.png" alt="Image description" width="800" height="155"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc2tp77wchps2gnbsojtq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc2tp77wchps2gnbsojtq.png" alt="Image description" width="800" height="155"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
The informational count dropped from 11k to 4k, while high-severity reports increased from 987 to 1k. Since informational server requests fell by roughly 7k, it could indicate that accounts which should be generating that activity are unable to, due to lockout or some other cause that should be investigated.&lt;/p&gt;

&lt;p&gt;Success and Failure of Windows activities Report Analysis&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0cgtwlworo8le8mlx380.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0cgtwlworo8le8mlx380.png" alt="Image description" width="800" height="162"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyzobmkgvqpcf7w2ohuc8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyzobmkgvqpcf7w2ohuc8.png" alt="Image description" width="800" height="162"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
Successful events decreased from 13,848 to 5,854, while failures decreased from 426 to only 93. The drop is most jarring for the successful events and may indicate abnormal activity on the accounts.&lt;/p&gt;

&lt;p&gt;-Windows Server Alerts-&lt;/p&gt;

&lt;p&gt;-Alert for Failed Windows Activities-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcrwv7dyawellm08cf63q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcrwv7dyawellm08cf63q.png" alt="Image description" width="800" height="120"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3uksx9jhavqi1q1up8t7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3uksx9jhavqi1q1up8t7.png" alt="Image description" width="800" height="116"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
At 8AM there was a very large number of failed activities, totaling 35 for the hour. &lt;br&gt;
Since the threshold we set was to create an alert for anything &amp;gt; 18, this alert would have picked up on this suspicious activity.&lt;br&gt;
Whether or not this was a false-positive remains to be seen.&lt;/p&gt;
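
&lt;p&gt;The alert logic itself is easy to reason about: count events per hour and flag any hour above the threshold. Here's a small Python sketch mirroring the "count &amp;gt; 18" condition, with hypothetical hourly counts in which 8AM matches the 35 failures described above:&lt;/p&gt;

```python
def triggered_hours(hourly_counts, threshold):
    """Return the hours whose event count exceeds the alert threshold,
    mirroring the Splunk alert's trigger condition."""
    return [hour for hour, count in hourly_counts.items() if count > threshold]

# Hypothetical failed-activity counts per hour.
counts = {"07:00": 4, "08:00": 35, "09:00": 9}
print(triggered_hours(counts, 18))
```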

&lt;p&gt;-Alert for successful logins-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpj99qwmjugqjypgc3qa1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpj99qwmjugqjypgc3qa1.png" alt="Image description" width="800" height="117"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0n9ujwofv8vkg3udjg52.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0n9ujwofv8vkg3udjg52.png" alt="Image description" width="800" height="118"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
My Alert was also triggered for suspicious activities taking place at 2AM, 9AM and 10AM. Anything &amp;gt; 26 would trigger an alert, so the successful logins from user_a equaling 91 counts at 2AM would have definitely triggered. The others would have also triggered the alert--70 logins, 67 of which were by user_k at 9AM, followed by 54 logins at 10AM, 52 of which were by user_k.&lt;br&gt;
This tells us that user_k may be compromised, and a security admin may need to step in to temporarily disable access to the account until they can isolate and remove any malicious entity.&lt;/p&gt;

&lt;p&gt;-Alert for account deletions-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzkrjiffrodn87xno90qy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzkrjiffrodn87xno90qy.png" alt="Image description" width="800" height="121"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq1tbp1df8wvr38pplzuw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq1tbp1df8wvr38pplzuw.png" alt="Image description" width="800" height="115"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
A large number of accounts appear to have been deleted at the same times we previously saw suspicious login activity from user_a and user_k.&lt;br&gt;
We can see 45 events of account deletion at 2AM, with 42 of them originating from user_a. We also see 75 events at 9AM, 72 of which were by user_k, followed by 44 events, 43 of which were also by user_k at 10AM.&lt;/p&gt;

&lt;p&gt;It would appear that action may need to be taken to temporarily cut off access from user_a and user_k to prevent any potential damage until the cause of this activity can be determined and eradicated.&lt;/p&gt;

&lt;p&gt;-Dashboard Analysis-&lt;/p&gt;

&lt;p&gt;Full-view before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhnmg29sl6mg6f1dpjnym.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhnmg29sl6mg6f1dpjnym.png" alt="Image description" width="800" height="319"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Full-view after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqb6ibb6sxqchvot0oxbs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqb6ibb6sxqchvot0oxbs.png" alt="Image description" width="800" height="319"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
As our alerts and reports already showed, user_a and user_k had the most hourly activity. The activities flagged by signature monitoring were attempts to reset account passwords and lock out other users.&lt;/p&gt;

&lt;p&gt;The hourly activity monitoring dashboards also paint a clear picture of what was happening around 2AM and 8AM.&lt;br&gt;
We can see an attack taking place between 12AM and 2AM locking users out of their accounts. Based on what we already know, this was done by user_a, or by someone who compromised that account to lock out other users.&lt;/p&gt;

&lt;p&gt;We can also see between 8AM and 10AM where user_a and user_k made a combined 761 attempts to reset account passwords.&lt;/p&gt;

&lt;p&gt;This data suggests a brute-force attack may have taken place between 12AM and 2AM: repeated failed login attempts locked users out as a security measure, eventually giving way to successful logins once the attackers cracked the passwords.&lt;/p&gt;

&lt;p&gt;The password resets may be in response to the mass account lockouts, since all of the affected accounts would require password resets before they could be used again.&lt;/p&gt;

&lt;p&gt;-Individual Dashboard panel comparisons-&lt;/p&gt;

&lt;p&gt;-User Analysis-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fele6wae12oqyr21sol8k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fele6wae12oqyr21sol8k.png" alt="Image description" width="527" height="284"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc85wqumdhvvjqdzh6zbj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc85wqumdhvvjqdzh6zbj.png" alt="Image description" width="525" height="291"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
Activity was dominated by user_a and user_k, both of which may have been compromised.&lt;/p&gt;

&lt;p&gt;-Source Domain-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3dbnshqaenopmpth9mwd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3dbnshqaenopmpth9mwd.png" alt="Image description" width="798" height="294"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F96l4wjqxlrenc2u6z3qd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F96l4wjqxlrenc2u6z3qd.png" alt="Image description" width="774" height="270"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
This chart confirms the activity by user_a and user_k during the hours summarized earlier.&lt;/p&gt;

&lt;p&gt;-Windows Activities (based on signatures)-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcf9bng1z2ff4sq1h7ew7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcf9bng1z2ff4sq1h7ew7.png" alt="Image description" width="611" height="336"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4099w3p5f0kq2qkze424.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4099w3p5f0kq2qkze424.png" alt="Image description" width="615" height="321"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
The change in this chart shows the activities carried out in response to the behavior of user_a and user_k: user accounts being locked out and attempts made to change account passwords.&lt;/p&gt;

&lt;p&gt;As you can see above, there is definitely some abnormal activity originating from user_a and user_k. &lt;/p&gt;

&lt;p&gt;Now we will analyze our Apache server log data...&lt;/p&gt;

&lt;p&gt;-Reports-&lt;/p&gt;

&lt;p&gt;-HTTP Methods Report-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdhjma8s1c2oxzj7areha.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdhjma8s1c2oxzj7areha.png" alt="Image description" width="800" height="169"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq7qavv6we07dibsirpcn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq7qavv6we07dibsirpcn.png" alt="Image description" width="800" height="169"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
GET requests fell from 30k to around 3k, which is a huge difference. POST requests did the opposite and increased from around 400 to well over 1000.&lt;/p&gt;
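&lt;p&gt;To make that comparison concrete, method counts like these can also be pulled straight from raw Apache access-log lines. A quick Python sketch (my own illustration, assuming the standard common/combined log format; the sample lines are made up):&lt;/p&gt;

```python
from collections import Counter

def method_counts(log_lines):
    """Count HTTP methods in Apache access-log lines. The request string
    ('GET /path HTTP/1.1') is the first double-quoted field."""
    counts = Counter()
    for line in log_lines:
        try:
            request = line.split('"')[1]       # text between first pair of quotes
            counts[request.split()[0]] += 1    # method is the first token
        except IndexError:
            continue                           # skip malformed lines
    return counts

sample = [
    '127.0.0.1 - - [01/Mar/2022:09:00:01] "GET /index.html HTTP/1.1" 200 512',
    '127.0.0.1 - - [01/Mar/2022:09:00:02] "POST /login.php HTTP/1.1" 200 128',
    '127.0.0.1 - - [01/Mar/2022:09:00:03] "POST /login.php HTTP/1.1" 401 128',
]
print(method_counts(sample))  # Counter({'POST': 2, 'GET': 1})
```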

&lt;p&gt;-Report Analysis for Referrer Domains-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe2ieuqhc527bc7wuphqf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe2ieuqhc527bc7wuphqf.png" alt="Image description" width="800" height="256"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9g04cueh169wzk6sdmvd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9g04cueh169wzk6sdmvd.png" alt="Image description" width="800" height="252"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
The numbers across all traffic decreased substantially, going from the high thousands to mere hundreds, which could indicate that systems are not working as intended or have been brought down or made inaccessible.&lt;/p&gt;

&lt;p&gt;-Report Analysis for HTTP Response Codes-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy6f3kwhshp3wl099tuh6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy6f3kwhshp3wl099tuh6.png" alt="Image description" width="800" height="223"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnjk18ozph96attonxdsh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnjk18ozph96attonxdsh.png" alt="Image description" width="800" height="211"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
Once again we see numbers falling drastically. This may be indicative of a Denial of Service attack, since these numbers are not within the normal expected range.&lt;/p&gt;

&lt;p&gt;-Alerts-&lt;/p&gt;

&lt;p&gt;-HTTP POST Alert-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fguese223qybscc2khmvn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fguese223qybscc2khmvn.png" alt="Image description" width="800" height="112"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpgohwrne6rp8h9ank1kh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpgohwrne6rp8h9ank1kh.png" alt="Image description" width="800" height="116"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
1,296 POST requests came in at 9AM. Our alert threshold was anything higher than 10, so this was definitely caught, though I also see other alerts with much lower counts.&lt;br&gt;
Considering the POST requests outside of 9AM were in the single digits, I may raise the threshold from 10 to something a bit higher. Anything greater than 100 within the span of 1 hour may be a better fit.&lt;/p&gt;
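&lt;p&gt;The revised rule is easy to express. A hedged Python sketch of the proposed greater-than-100-per-hour threshold (illustrative only, not the actual Splunk alert; toy data mirrors the 1,296-request spike at 9AM):&lt;/p&gt;

```python
from collections import Counter

POST_THRESHOLD = 100  # revised upward from 10, per the reasoning above

def post_alert_hours(events):
    """events: iterable of (hour, method) tuples. Returns the hours whose
    POST count exceeds POST_THRESHOLD, sorted."""
    posts = Counter(hour for hour, method in events if method == "POST")
    return sorted(h for h, n in posts.items() if n > POST_THRESHOLD)

events = ([("09:00", "POST")] * 1296 + [("10:00", "POST")] * 4 +
          [("09:00", "GET")] * 20)
print(post_alert_hours(events))  # ['09:00']
```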

&lt;p&gt;-Dashboard Analysis-&lt;/p&gt;

&lt;p&gt;-HTTP Methods per hour-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fysmll6iipqwtxwkyoeq1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fysmll6iipqwtxwkyoeq1.png" alt="Image description" width="800" height="137"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqz9p2fou5rfr3g587h30.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqz9p2fou5rfr3g587h30.png" alt="Image description" width="800" height="141"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
A high number of POST requests seems to have occurred around 8PM, while GET requests fell tremendously. This may be indicative of an attack meant to deny service, but we will need to correlate with other data points to confirm.&lt;/p&gt;

&lt;p&gt;-Geolocation Cluster Map Analysis-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fopsi4sth9k2o1hr0zsuw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fopsi4sth9k2o1hr0zsuw.png" alt="Image description" width="800" height="351"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpn1lftxwngzqf6zctbgq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpn1lftxwngzqf6zctbgq.png" alt="Image description" width="800" height="319"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl1aq2o2kwzo9rm8v3fpr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl1aq2o2kwzo9rm8v3fpr.png" alt="Image description" width="800" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fshkozldfuoofilfcg0ii.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fshkozldfuoofilfcg0ii.png" alt="Image description" width="800" height="352"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa85iyeu0vjnjyg23bw2u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa85iyeu0vjnjyg23bw2u.png" alt="Image description" width="800" height="437"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqrfukpr7nxp1utcnln69.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqrfukpr7nxp1utcnln69.png" alt="Image description" width="800" height="384"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
We can clearly see a noticeable uptick in requests coming from Ukraine and Sweden, as well as within the US, with the majority coming in from the East Coast.&lt;/p&gt;

&lt;p&gt;-Dashboard Analysis of URI data-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fre65jft0vmznyoy6lfty.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fre65jft0vmznyoy6lfty.png" alt="Image description" width="734" height="325"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F04qlk829gk3f9xwa2k9p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F04qlk829gk3f9xwa2k9p.png" alt="Image description" width="744" height="306"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
A malicious actor may be running an automated script against the PHP login page to attempt logins in bulk, possibly indicating a brute-force attack. We also see a large number of accesses to an archive file, which may be an attempt to exfiltrate log data or other important records.&lt;/p&gt;
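&lt;p&gt;Spotting targets like these comes down to ranking URIs by hit count. A small Python sketch of that idea (the URI names below are hypothetical stand-ins, not the project's actual paths):&lt;/p&gt;

```python
from collections import Counter

def top_uris(log_lines, n=3):
    """Rank URIs by hit count from Apache access-log lines, assuming the
    request string is the first double-quoted field."""
    uris = Counter()
    for line in log_lines:
        parts = line.split('"')
        if len(parts) > 1:
            fields = parts[1].split()
            if len(fields) > 1:
                uris[fields[1]] += 1  # fields: [method, uri, protocol]
    return uris.most_common(n)

# Hypothetical sample: repeated hits on a login page and an archive file.
sample = (
    ['10.0.0.5 - - [x] "POST /account_login.php HTTP/1.1" 401 64'] * 4
    + ['10.0.0.5 - - [x] "GET /logs_backup.tar.gz HTTP/1.1" 200 9000'] * 2
)
print(top_uris(sample))  # [('/account_login.php', 4), ('/logs_backup.tar.gz', 2)]
```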

&lt;p&gt;-HTTP POST Requests-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5vjoxy1s9a3846uut2ju.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5vjoxy1s9a3846uut2ju.png" alt="Image description" width="332" height="185"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkmxievs4ktpyrzwp37sd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkmxievs4ktpyrzwp37sd.png" alt="Image description" width="323" height="105"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
This POST method increase matches what we saw in our HTTP Method bar graph.&lt;/p&gt;

&lt;p&gt;-HTTP GET Requests-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvzldjk5kfb0wj9aejjm9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvzldjk5kfb0wj9aejjm9.png" alt="Image description" width="334" height="184"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzcdnoeayw4o6cixywezg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzcdnoeayw4o6cixywezg.png" alt="Image description" width="333" height="110"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
The number of GET requests decreased, which may indicate an attack limiting access to resources. We will need to correlate with other data to confirm.&lt;/p&gt;

&lt;p&gt;-User Agents Analysis-&lt;/p&gt;

&lt;p&gt;Results before attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz0xjmor09ex66zethqwp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz0xjmor09ex66zethqwp.png" alt="Image description" width="785" height="300"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Results after attack:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj2uirzdq2h0f5c758zyz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj2uirzdq2h0f5c758zyz.png" alt="Image description" width="781" height="293"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
A dominant user agent emerged, as illustrated in our pie chart: Mozilla/4.0, possibly exploiting a vulnerability in a downgraded browser version.&lt;/p&gt;

&lt;p&gt;If you made it to the end of this report, I appreciate you!&lt;/p&gt;

&lt;p&gt;This was a lot of work, but I was happy with what I was able to create and with the glimpse it gave me into what a day in the life of a SOC analyst might be like. That said, I know I still have a lot to learn!&lt;/p&gt;

&lt;p&gt;Stay tuned~!&lt;/p&gt;

</description>
      <category>splunk</category>
      <category>analytics</category>
      <category>cybersecurity</category>
      <category>project</category>
    </item>
    <item>
      <title>Splunk: Building a Secure Monitoring Solution (Part 1)</title>
      <dc:creator>Liz Benton</dc:creator>
      <pubDate>Thu, 08 Jun 2023 03:54:06 +0000</pubDate>
      <link>https://dev.to/e_liz_the_best/splunk-building-a-secure-monitoring-solution-part-1-gci</link>
      <guid>https://dev.to/e_liz_the_best/splunk-building-a-secure-monitoring-solution-part-1-gci</guid>
      <description>&lt;p&gt;During the last several weeks of my Cybersecurity boot camp, one of our final projects was to build a secure monitoring environment for a fictitious organization called VSI (Virtual Space Industries) using Splunk Enterprise, which for those who may not know, is a SIEM (Security Information and Event Manager). SIEMs are essential tools that companies can use to detect, analyze, and respond to potential threats against their organization.&lt;/p&gt;

&lt;p&gt;Since this was a big project with a lot of steps, I'll be breaking it up into 2 parts:&lt;/p&gt;

&lt;p&gt;Part 1:&lt;br&gt;
Creating Reports, Alerts, and Dashboards for Windows server log data as well as Apache webserver log data that can help point out any abnormal activity.&lt;/p&gt;

&lt;p&gt;Part 2:&lt;br&gt;
Checking whether the solutions created in Part 1 were effective against a fictitious attack by uploading the attack log data and seeing if our reports, alerts, and dashboards picked up anything that would have helped the organization take appropriate action as quickly as possible.&lt;/p&gt;

&lt;p&gt;Part 1&lt;br&gt;
I started by launching Splunk, which had been pre-installed on my Ubuntu VM. I logged into the application and uploaded the files I would use to create reports, alerts, and dashboards.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fefbw8o0lwguxl0tnuovt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fefbw8o0lwguxl0tnuovt.png" alt="Image description" width="800" height="586"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzi8movf9q3mjkpqgat5r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzi8movf9q3mjkpqgat5r.png" alt="Image description" width="496" height="408"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the logs were uploaded, I briefly took notice of and analyzed the following fields:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;signature&lt;/li&gt;
&lt;li&gt;signature_id&lt;/li&gt;
&lt;li&gt;user&lt;/li&gt;
&lt;li&gt;status&lt;/li&gt;
&lt;li&gt;severity&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgn3b0oenngpi0bt6863r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgn3b0oenngpi0bt6863r.png" alt="Image description" width="800" height="348"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Apologies that some images might be a bit small and difficult to read. Luckily, I also took some screenshots of the data inside each individual field as well:&lt;/p&gt;

&lt;p&gt;signature:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh8bjgkquuxki7f9spr93.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh8bjgkquuxki7f9spr93.png" alt="Image description" width="800" height="683"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;signature_id:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmg70ydob3o91x6squkhk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmg70ydob3o91x6squkhk.png" alt="Image description" width="800" height="708"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;user:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft7jyxitqi6gj9rb2f8uu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft7jyxitqi6gj9rb2f8uu.png" alt="Image description" width="800" height="657"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;status:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9fwmev1eps3kvrmt20cv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9fwmev1eps3kvrmt20cv.png" alt="Image description" width="800" height="484"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;severity:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6vcybeta54xl6jqvf0n3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6vcybeta54xl6jqvf0n3.png" alt="Image description" width="800" height="462"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;These are the main fields of interest we will use to create our reports, alerts, and dashboard. Let's start with the reports!&lt;/p&gt;

&lt;p&gt;Report 1: A report with a table of signatures and their associated signature_id values. This would allow VSI to see the ID number associated with each specific signature for a Windows activity.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv70j7susmgixuh8tlcwp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv70j7susmgixuh8tlcwp.png" alt="Image description" width="800" height="318"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Report 2: A report that displays severity levels and the count and percentage of each. This would allow VSI to quickly understand the severity levels of Windows logs being viewed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frq3doyxpsc5221i4ynii.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frq3doyxpsc5221i4ynii.png" alt="Image description" width="800" height="184"&gt;&lt;/a&gt;&lt;/p&gt;
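&lt;p&gt;Splunk's 'top' command conveniently returns both a count and a percent column, so a search along these lines would produce this table (the source name is an assumption):&lt;/p&gt;

```
source="windows_server_logs.csv"
| top limit=0 severity
```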

&lt;p&gt;Report 3: A report that provides a comparison between the success and failure of Windows activities. This would show VSI if there is any suspicious level of failed activities on their Windows server.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2y2l0yhutt1kw2h51kve.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2y2l0yhutt1kw2h51kve.png" alt="Image description" width="800" height="200"&gt;&lt;/a&gt;&lt;/p&gt;
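&lt;p&gt;As a sketch, assuming the outcome of each activity is captured in a 'status' field with values like success and failure, the search could be as simple as:&lt;/p&gt;

```
source="windows_server_logs.csv"
| stats count by status
```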

&lt;p&gt;Now that those were done, it was time to move on to creating the Alerts!&lt;/p&gt;

&lt;p&gt;Alerts would all trigger an email to be sent to the fictitious company at &lt;a href="mailto:SOC@VSI-company.com"&gt;SOC@VSI-company.com&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Alert 1: An alert that is triggered when a threshold for hourly failed Windows activities has been reached. This would help VSI see if any failed logins or any other activities occurred an excessive amount of times within an hour, which could be indicative of someone trying to do something they shouldn't be able to do, such as trying to login and failing repeatedly.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc8sj1sq452nm5crlthtu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc8sj1sq452nm5crlthtu.png" alt="Image description" width="800" height="217"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The threshold I chose for this alert was anything &amp;gt; 18 per hour.&lt;/p&gt;
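&lt;p&gt;As a sketch, the underlying search just counts the failed activities, while the threshold lives in the alert's trigger condition rather than in the SPL itself (field and source names are assumptions):&lt;/p&gt;

```
source="windows_server_logs.csv" status=failure
| stats count
```

&lt;p&gt;The alert would then be scheduled to run hourly over the previous hour, with a custom trigger condition of count &amp;gt; 18.&lt;/p&gt;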

&lt;p&gt;Alert 2: An alert that is triggered when a threshold has been reached for the number of successful logins per hour.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhfo0ifuc4csmdz2k6ye8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhfo0ifuc4csmdz2k6ye8.png" alt="Image description" width="800" height="224"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The threshold I chose for this alert was anything &amp;gt; 26 per hour.&lt;/p&gt;

&lt;p&gt;Alert 3: An alert that is triggered when a threshold is met for the count of account-deletion signatures, once again within an hourly window.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwczq9dv0nk96uhzgeu9v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwczq9dv0nk96uhzgeu9v.png" alt="Image description" width="800" height="192"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For this alert, I chose anything &amp;gt; 35 within one hour.&lt;/p&gt;

&lt;p&gt;Now for the fun part, creating Dashboards to monitor Windows Server Activity at a quick glance!&lt;/p&gt;

&lt;p&gt;I always have a lot of fun creating dashboards in Splunk, this project being no exception. I made:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A line chart that displays account deletion signatures over time within the span of 1h.&lt;/li&gt;
&lt;li&gt;A line chart that displays the different user field values over time.&lt;/li&gt;
&lt;li&gt;A pie chart of the different signatures based on Windows activities.&lt;/li&gt;
&lt;li&gt;Another pie chart showing the different users who are active.&lt;/li&gt;
&lt;li&gt;A final pie chart that tracks the source domain.&lt;/li&gt;
&lt;/ol&gt;
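&lt;p&gt;For reference, the line charts can be sketched with 'timechart' and the pie charts with simple 'stats' splits; the exact signature text and source name below are assumptions:&lt;/p&gt;

```
source="windows_server_logs.csv" signature="A user account was deleted"
| timechart span=1h count

source="windows_server_logs.csv"
| stats count by signature
```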

&lt;p&gt;Line charts 1 &amp;amp; 2:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F90um9adzdngufsrnufz5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F90um9adzdngufsrnufz5.png" alt="Image description" width="800" height="348"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Pie Charts:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7u4hoc3001sgnkivr95q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7u4hoc3001sgnkivr95q.png" alt="Image description" width="800" height="285"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F01othm94woqdb55263jr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F01othm94woqdb55263jr.png" alt="Image description" width="800" height="184"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Full-view Windows Server Monitoring Dashboard:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8kf56rf6gktbc848jxqa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8kf56rf6gktbc848jxqa.png" alt="Image description" width="800" height="316"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, we needed to repeat this process, but for the Apache log data. I went ahead and uploaded the log files and got to work on the reports first.&lt;/p&gt;

&lt;p&gt;This time, the important fields we wanted to pay special attention to were:&lt;br&gt;
o   method&lt;br&gt;
o   referrer_domain&lt;br&gt;
o   status&lt;br&gt;
o   clientip&lt;br&gt;
o   useragent&lt;/p&gt;

&lt;p&gt;Report 1: A report that shows a table of the different HTTP Methods (GET, POST, HEAD, etc.). This would show VSI the types of HTTP requests being made to the VSI webserver.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3nbwd94mg7dkuujmku61.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3nbwd94mg7dkuujmku61.png" alt="Image description" width="800" height="170"&gt;&lt;/a&gt;&lt;/p&gt;
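&lt;p&gt;A hedged sketch of the search behind this table, assuming the uploaded Apache log file was indexed under this source name:&lt;/p&gt;

```
source="apache_logs.txt"
| stats count by method
```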

&lt;p&gt;Report 2: A report that displays the top 10 domains that refer to VSI's website, to help VSI identify any suspicious referrers.&lt;/p&gt;

&lt;p&gt;Note: I found it kind of funny that 'referer' is spelled incorrectly in the extracted fields list (the HTTP specification itself famously misspelled 'referrer', and the field name stuck), but since the search has to match the field names in the data, I kept the misspelling where I had to.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frps75oivmin0vjfq4xx9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frps75oivmin0vjfq4xx9.png" alt="Image description" width="800" height="267"&gt;&lt;/a&gt;&lt;/p&gt;
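&lt;p&gt;The 'top' command handles the top-10 cut directly; here I use the 'referer_domain' spelling as it appears in the extracted fields, and the source name is an assumption:&lt;/p&gt;

```
source="apache_logs.txt"
| top limit=10 referer_domain
```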

&lt;p&gt;Report 3: A report that shows the count of each HTTP response code. This will help VSI to quickly gauge the overall health of their webserver and activities taking place on it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhlu4wn9v0okszq6y8nr2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhlu4wn9v0okszq6y8nr2.png" alt="Image description" width="800" height="233"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With that, it was time to create some Alerts!&lt;/p&gt;

&lt;p&gt;Alert 1: The project called for an alert that triggers whenever a connection is made from any IP address outside of the United States. I chose France for this example, which had an IP Address of 176.31.39.30 (Roubaix, France).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqbjpr880yg6qit8csnyd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqbjpr880yg6qit8csnyd.png" alt="Image description" width="800" height="201"&gt;&lt;/a&gt;&lt;/p&gt;
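&lt;p&gt;Splunk's built-in 'iplocation' command can enrich events with a Country field, so a search along these lines would surface any non-US connections (the source name is an assumption):&lt;/p&gt;

```
source="apache_logs.txt"
| iplocation clientip
| where Country != "United States"
| stats count by clientip, Country
```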

&lt;p&gt;Alert 2: An alert that triggers whenever a threshold is met for the count of HTTP POST methods within the span of 1 hour.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F91mvelmfyt1v0c4vist6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F91mvelmfyt1v0c4vist6.png" alt="Image description" width="800" height="191"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The threshold I chose for HTTP POST requests was anything &amp;gt; 10 within 1 hour.&lt;/p&gt;

&lt;p&gt;And now back to the fun of creating visuals for our Dashboard!&lt;/p&gt;

&lt;p&gt;HTTP GET Method requests per hour:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F759rltkxf7sxu31vc4gt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F759rltkxf7sxu31vc4gt.png" alt="Image description" width="595" height="328"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;HTTP POST Method requests per hour:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8s9she5tza69i8i7g7my.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8s9she5tza69i8i7g7my.png" alt="Image description" width="613" height="342"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;HTTP Methods by type per hour:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhcko34pwrw6sw9xwev87.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhcko34pwrw6sw9xwev87.png" alt="Image description" width="800" height="140"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Top Countries connecting to the server:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqao1l64ngohqwyu2f38e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqao1l64ngohqwyu2f38e.png" alt="Image description" width="800" height="351"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Top User agents:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm5mmb7zhfd7tvs477mcx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm5mmb7zhfd7tvs477mcx.png" alt="Image description" width="800" height="305"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Top URI:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdl6s2zkm3tp1u4thqv3f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdl6s2zkm3tp1u4thqv3f.png" alt="Image description" width="727" height="322"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Full-view Apache Server Monitoring Dashboard:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiw94emh678vw9vo7a93c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiw94emh678vw9vo7a93c.png" alt="Image description" width="800" height="347"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So there you have it! I created Reports, Alerts, and Monitoring Dashboards for VSI's Windows and Apache servers. &lt;/p&gt;

&lt;p&gt;In Part 2 we will see whether or not the solutions I made protected VSI.&lt;/p&gt;

&lt;p&gt;Part 2: &lt;a href="https://dev.to/r33keeper/splunk-building-a-secure-monitoring-solution-part-2-208m"&gt;https://dev.to/r33keeper/splunk-building-a-secure-monitoring-solution-part-2-208m&lt;/a&gt;&lt;/p&gt;

</description>
      <category>splunk</category>
      <category>siem</category>
      <category>cybersecurity</category>
      <category>analyst</category>
    </item>
    <item>
      <title>Project: AWS containerized webserver management using Docker and Ansible</title>
      <dc:creator>Liz Benton</dc:creator>
      <pubDate>Wed, 07 Jun 2023 00:32:35 +0000</pubDate>
      <link>https://dev.to/e_liz_the_best/project-aws-containerized-webserver-management-using-docker-and-ansible-gd0</link>
      <guid>https://dev.to/e_liz_the_best/project-aws-containerized-webserver-management-using-docker-and-ansible-gd0</guid>
      <description>&lt;p&gt;Goal: &lt;br&gt;
To run an Ansible Playbook on top of a Bastion Host Server to configure two EC2 instances into webservers on AWS. We want to use our host machine’s IP for the bastion host, and we want to make sure everything is connected securely over SSH. We will also secure our setup through the use of firewalls AKA security group policies.&lt;/p&gt;

&lt;p&gt;Terminology to be familiar with for this project:&lt;/p&gt;

&lt;p&gt;What is a Bastion Host?&lt;br&gt;
Also known as a ‘jump box’, a bastion host is a dedicated server that lets authorized users access a private network from an external network, such as the internet. Because of its exposure to potential attack, a bastion host must minimize the chances of penetration.&lt;/p&gt;

&lt;p&gt;What is Ansible?&lt;br&gt;
Ansible is an automation tool used to provision and configure the underlying infrastructure of an environment: virtualized hosts and hypervisors, network devices, and bare-metal servers.&lt;/p&gt;

&lt;p&gt;What is an Ansible Playbook?&lt;br&gt;
Ansible playbooks offer a repeatable, reusable, simple configuration management and multi-machine deployment system, well suited to deploying complex applications. If you need to execute a task more than once, write a playbook and put it under source control.&lt;/p&gt;
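&lt;p&gt;To make that concrete, a minimal hypothetical playbook that turns hosts in a 'webservers' inventory group into Apache webservers might look like this (the group name and package choice are assumptions; the modules are standard Ansible):&lt;/p&gt;

```yaml
---
- name: Configure webservers
  hosts: webservers        # assumed inventory group name
  become: true             # run tasks with sudo
  tasks:
    - name: Install the Apache webserver
      yum:
        name: httpd
        state: present

    - name: Start Apache and enable it on boot
      service:
        name: httpd
        state: started
        enabled: true
```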

&lt;p&gt;What is Docker &amp;amp; Docker Hub?&lt;br&gt;
Docker is a platform for building, shipping, and running applications inside containers.&lt;br&gt;
Docker Hub is the world's largest container image library, where you can create an account and access the repository.&lt;/p&gt;

&lt;p&gt;You can also be cool and create your own libraries and upload them to add to the repository for other people to use!&lt;/p&gt;

&lt;p&gt;NOTE:&lt;br&gt;
Before building any project, you want to plan out and understand what it is you’re trying to do and not only how to make it happen, but how to do it right. Security is job zero and it starts from the beginning of a project.&lt;/p&gt;

&lt;p&gt;How do we fulfill our goal?&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create our Bastion Host Server using AWS EC2 instance creation&lt;/li&gt;
&lt;li&gt;Connect our Host Machine to the Bastion Host Server using Gitbash and allow the Bastion Host to use our Host machine’s IP Address&lt;/li&gt;
&lt;li&gt;We will have a private key made for us and saved to our Host machine when we create our first instance on EC2. We will need to share the private key access with our Bastion Host in order to communicate. The CSP takes care of handling our public key for the pair.&lt;/li&gt;
&lt;li&gt;We will need to install Docker on our Bastion Host so that we can obtain a specific Ansible container from the repository as well as accessing our playbook.&lt;/li&gt;
&lt;li&gt;Once Docker is installed and we have pulled our Ansible container image, we need to share our private key with our Ansible container so that it can securely communicate with our two EC2 instances/VMs and configure them into webservers.&lt;/li&gt;
&lt;li&gt;We need to establish connection from our Ansible container to our two VMs and once that is done via ping, then we can run our Ansible Playbook to do whatever we need it to do (in this case, just to install a couple applications and convert our two instances into webservers).&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Now that we have a pretty clear idea about what we want to accomplish, it's always helpful to create a nice little diagram for visual reference. I used draw.io since I'm a noob, but you can use whatever you're familiar with.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxjjvgdzsujc7y1kxaspn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxjjvgdzsujc7y1kxaspn.png" alt="Image description" width="800" height="490"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We start by launching our Bastion Host using Amazon EC2, which is as easy as searching for 'EC2' from the management console and clicking 'Launch Instance'. During the process I also set up our first firewall/security group rule of allowing only my IP to be able to SSH into the instance over port 22 and create a new key pair for security.&lt;/p&gt;

&lt;p&gt;Once that's done, we add two more rules to the same security group:&lt;br&gt;
one allowing HTTP traffic over port 80 from anywhere (IPv4),&lt;br&gt;
and another allowing all ICMP IPv4 traffic from anywhere.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqhepe8oqutnggxidako6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqhepe8oqutnggxidako6.png" alt="Image description" width="800" height="250"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that our Bastion Host Server has been created and our security group is configured appropriately, we will click 'connect' to open up a page with connection options. We'll be using SSH, so we copy our SSH address info from the instance SSH tab and open Gitbash to connect to it from our Host Machine.&lt;/p&gt;

&lt;p&gt;CMD: ssh -i "key-name.pem" ec2-user@[public DNS copied from the SSH tab]&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3p6lcv82ecvokim6b375.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3p6lcv82ecvokim6b375.png" alt="Image description" width="800" height="436"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Pretty bird~&lt;/p&gt;

&lt;p&gt;Note: Make sure you are in the directory where your private key is stored on your host machine so that it can be accessed to establish the connection.&lt;/p&gt;

&lt;p&gt;Next, we’ll open a new terminal so that we maintain connection with our Bastion Host in one window and use our Host machine in the other.&lt;/p&gt;

&lt;p&gt;On our Host machine, we'll use 'secure copy' to copy our private key over to our Bastion Host server. While doing so, we will also specify the destination directory where the key will live in its new location.&lt;/p&gt;

&lt;p&gt;CMD: scp -i "key-name.pem" key-name.pem ec2-user@[copied instance address]:/home/ec2-user&lt;/p&gt;

&lt;p&gt;Note: Be sure not to forget the destination directory at the end of the command. You may name the directories however you'd like; this is just what I used.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwnaw3f2k7a09ry7nnoin.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwnaw3f2k7a09ry7nnoin.png" alt="Image description" width="800" height="193"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If this worked, when we go back to our connected Bastion Host window, we should see the key when we use ‘ls’ to search the directory.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9jx7ryyvzb17s778hct7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9jx7ryyvzb17s778hct7.png" alt="Image description" width="492" height="303"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Before we close our Host terminal, let's check our IP address using the ipconfig command and make a note of it. It should match the Private IPv4 Address shown on the instance home page.&lt;/p&gt;

&lt;p&gt;Returning to our Bastion Host, we will now install Docker and start it on our machine!&lt;/p&gt;

&lt;p&gt;Commands to update VM and install Docker package:&lt;br&gt;
CMD: sudo yum update&lt;br&gt;
CMD: sudo yum install docker -y&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feyw4vu12wvu6c3ciyx2d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feyw4vu12wvu6c3ciyx2d.png" alt="Image description" width="800" height="405"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once complete, we can start Docker:&lt;br&gt;
CMD: sudo service docker start&lt;/p&gt;

&lt;p&gt;Now that docker is installed and started, we can check the networks on it:&lt;/p&gt;

&lt;p&gt;CMD: sudo docker network ls&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxy1r1xmq7cit5wim9jxu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxy1r1xmq7cit5wim9jxu.png" alt="Image description" width="800" height="250"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Since we want Ansible to run inside of our host network and we want our Bastion Host to be running off of the same IP address as our machine we’re hosting the instance on, we should check the addresses associated with it as well.&lt;/p&gt;

&lt;p&gt;Once the networks match, it's time to pull our Ansible container from docker.&lt;/p&gt;

&lt;p&gt;CMD: sudo docker run -it --network host cyberxsecurity/ansible /bin/bash&lt;/p&gt;

&lt;p&gt;To break this command down: 'docker run' creates and starts a container from the image named at the end of the command, '-it' attaches an interactive terminal to it, '--network host' places the container directly on the host machine's network stack, and '/bin/bash' is the command to run inside the container, giving us a shell to work with Ansible.&lt;/p&gt;

&lt;p&gt;In other words, we're downloading a prebuilt Ansible container image that someone else published to Docker Hub (in this case 'cyberxsecurity/ansible') and running it on top of our Bastion Host within our designated network.&lt;/p&gt;

&lt;p&gt;Once the image is added on top of our Bastion Host within our designated network, we should automatically see that our username has changed to the root of the Ansible container. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqpi4gd345qpmt0ocb6h3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqpi4gd345qpmt0ocb6h3.png" alt="Image description" width="800" height="161"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It should also be followed by the network address we provided. Let’s check and see if they match!&lt;/p&gt;

&lt;p&gt;CMD: ifconfig&lt;/p&gt;

&lt;p&gt;In the terminal, the root should show the IP of your host machine. If it doesn't, you'll have to re-do the previous steps.&lt;/p&gt;

&lt;p&gt;Now that our Host machine and our Bastion Host machine are (hopefully) on the same network along with our Ansible container, if we were to add additional servers, they should all be able to communicate with each other over SSH using our security group firewall settings, right? At least, that's the idea!&lt;/p&gt;

&lt;p&gt;Before that can happen though, we need to copy our private key into our newly added Ansible container, so that it can communicate with our servers.&lt;/p&gt;

&lt;p&gt;Keeping our Ansible window open and connected, we'll open a new terminal, connect to the Bastion Host again, look up the container ID of the Ansible container we added, and then use that ID to copy our private key into the container.&lt;/p&gt;

&lt;p&gt;First, we'll open a new gitbash terminal to connect once again using SSH to our Bastion Host.&lt;/p&gt;

&lt;p&gt;We will then use a command to check our Container ID:&lt;br&gt;
CMD: sudo docker ps&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb68q5b3fe1dcxgqha2py.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb68q5b3fe1dcxgqha2py.png" alt="Image description" width="800" height="281"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we need to use a command to copy our key into our Ansible container (using the ID above). We will also define the path it gets copied to.&lt;/p&gt;

&lt;p&gt;CMD: sudo docker cp key-name.pem [container ID goes here]:/root/key-name.pem&lt;/p&gt;

&lt;p&gt;To check if the key made it to our container, we can return to our terminal window running Ansible and use ‘ls’.&lt;/p&gt;

&lt;p&gt;Now the fun part! Let’s create our two servers that we'll be configuring using our Ansible playbook!&lt;/p&gt;

&lt;p&gt;The two servers will be the same as our Bastion Host, with the same private key pair and security group, just different names.&lt;/p&gt;

&lt;p&gt;Note:&lt;br&gt;
If you want to save time, create a new instance, and then, while viewing your instances, check the box for the one you just created, click the 'Actions' dropdown, then 'Image and Templates', then 'Launch More Like This'. Just remember to change the instance's name and ensure it has the same key pair and security group, then create!&lt;/p&gt;

&lt;p&gt;You should have something like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fatmcgr62udvxxfhytumb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fatmcgr62udvxxfhytumb.png" alt="Image description" width="800" height="158"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that we have our new servers, we need to test their connectivity within our network and if our Ansible container can communicate with them.&lt;/p&gt;

&lt;p&gt;We’ll start with testing out our first server, aptly named, ‘Server1’.&lt;/p&gt;

&lt;p&gt;We’ll do this by copying the private IPv4 address from the instances page after checking the box for our instance.&lt;/p&gt;

&lt;p&gt;Then we will return to our Ansible machine and from root, we will ping our machine!&lt;/p&gt;

&lt;p&gt;CMD: ping [IPv4 address of 1st machine]&lt;/p&gt;

&lt;p&gt;You should know right away if it worked or not.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmqoq7iw46bdev8jw34h4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmqoq7iw46bdev8jw34h4.png" alt="Image description" width="800" height="320"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we just need to SSH from our Ansible root to 'Server1'.&lt;/p&gt;

&lt;p&gt;CMD: ssh -i key-name.pem ec2-user@[IPv4 private server address]&lt;/p&gt;

&lt;p&gt;Huh, it isn't connecting, is it? Any idea why that might be?&lt;/p&gt;

&lt;p&gt;If you guessed it had something to do with our security group (our de facto firewall) configuration, then you’d be correct!&lt;/p&gt;

&lt;p&gt;Let’s fix that!&lt;/p&gt;

&lt;p&gt;As of now, our inbound rules only allow SSH connections from our own IP address; anything else trying to connect to our network via SSH, including our Ansible container, will be denied.&lt;/p&gt;

&lt;p&gt;We need to add a new inbound rule that allows traffic traveling through SSH from our Ansible container. For that, we will need to find out the IP address tied to Ansible. We can see the IP right after the root, so just copy that. We'll also add '/32' to the end of it so the rule covers exactly that one address (a /32 CIDR block matches a single host).&lt;/p&gt;
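&lt;p&gt;If you prefer the AWS CLI over the console, the same rule can be added with authorize-security-group-ingress; in this sketch, the security group ID and the Ansible IP are placeholders for your own values:&lt;/p&gt;

```shell
# Allow SSH (TCP 22) from the single Ansible address only
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 22 \
  --cidr 10.0.1.25/32
```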

&lt;p&gt;Now we can try again!&lt;/p&gt;

&lt;p&gt;CMD: ssh -i key-name.pem ec2-user@[IPv4 private Server1 address]&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh6nv3skzp51b0fvp217i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh6nv3skzp51b0fvp217i.png" alt="Image description" width="800" height="295"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Success!&lt;/p&gt;

&lt;p&gt;Repeat this process until you are able to SSH into all servers you wish to connect.&lt;/p&gt;

&lt;p&gt;Everything Ansible does is through SSH, so the procedure of connecting to each individual server using SSH cannot be skipped. As demonstrated, a successful ping only proves ICMP traffic gets through; it doesn’t mean SSH (TCP port 22) will be allowed across the network.&lt;/p&gt;

&lt;p&gt;We were able to SSH successfully into both of our instances! So now we can use Ansible to configure our servers into webservers for hosting whatever we want!&lt;/p&gt;

&lt;p&gt;Let’s return to our Bastion Host and Ansible machine terminals.&lt;/p&gt;

&lt;p&gt;In the Ansible terminal, we’ll look at two important files that we need to configure. Remember to back out of the server using the 'exit' command to get back to the Ansible root before using the following command.&lt;/p&gt;

&lt;p&gt;CMD: ls /etc/ansible&lt;/p&gt;

&lt;p&gt;You should see a hosts file. We need to edit this file...&lt;/p&gt;

&lt;p&gt;CMD: nano /etc/ansible/hosts&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7w600rixdjxg08hjix2d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7w600rixdjxg08hjix2d.png" alt="Image description" width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let’s locate the header ‘[webservers]’, remove the ‘#’ in front of it so that the line isn’t ignored, and paste the two Private IPv4 addresses from our servers on their own lines below the header.&lt;/p&gt;

&lt;p&gt;Adding these addresses tells Ansible which machines will receive the configurations we add to Ansible’s configuration file. &lt;/p&gt;

&lt;p&gt;It should end up looking like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3n1p2o4js17f37n6akq5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3n1p2o4js17f37n6akq5.png" alt="Image description" width="800" height="387"&gt;&lt;/a&gt;&lt;/p&gt;
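&lt;p&gt;In plain text, the edited section of /etc/ansible/hosts is just the uncommented group header with one private IPv4 address per line (the addresses below are placeholders for your own):&lt;/p&gt;

```ini
[webservers]
10.0.1.25
10.0.1.26
```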

&lt;p&gt;Remember to save the changes made to the hosts and configuration files! You can do so multiple ways, but I like to use CTRL+X (exit), Y (yes) and then Enter.&lt;/p&gt;

&lt;p&gt;Next, we’ll try Ansible’s own ping module (a connectivity test over SSH, not ICMP) to make sure our settings are correct:&lt;/p&gt;

&lt;p&gt;CMD: ansible all -m ping --key-file=key-name.pem&lt;/p&gt;

&lt;p&gt;You should get an error, because we haven’t edited the configuration file. Let’s do that now.&lt;/p&gt;

&lt;p&gt;CMD: nano /etc/ansible/ansible.cfg&lt;/p&gt;

&lt;p&gt;We are going to locate ‘remote_user’, uncomment it, and change it from ‘root’ to our AWS username, ‘ec2-user’.&lt;/p&gt;

&lt;p&gt;Before:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj9571o6grabk64uv71l2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj9571o6grabk64uv71l2.png" alt="Image description" width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6mjwgnxmr291o52gcxt0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6mjwgnxmr291o52gcxt0.png" alt="Image description" width="800" height="421"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Save and exit.&lt;/p&gt;
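&lt;p&gt;In plain text, the relevant line of /etc/ansible/ansible.cfg ends up like this (setting private_key_file is an optional extra so the key doesn't have to be passed on every command):&lt;/p&gt;

```ini
[defaults]
remote_user = ec2-user
# Optional: saves passing the key flag on every run
private_key_file = /root/key-name.pem
```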

&lt;p&gt;Try to run the command again. If successful, you should now be able to create a Playbook and run it on your connected machines!&lt;/p&gt;

&lt;p&gt;Note:&lt;br&gt;
It’s important to know that the commands associated with playbooks change depending on the Operating System and the configurations you want to perform. Keep this in mind whenever working with Ansible Playbooks on different operating systems and cloud environments!&lt;/p&gt;

&lt;p&gt;Let’s use our playbook command to run our code that will convert our two instances into webservers!&lt;/p&gt;

&lt;p&gt;First, we’ll create a file called webservers.yml in the /etc/ansible directory:&lt;br&gt;
CMD: nano /etc/ansible/webservers.yml&lt;/p&gt;

&lt;p&gt;The following will be copied and pasted into our new yml file:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2ogrcxwgvbwiw4xfnlxv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2ogrcxwgvbwiw4xfnlxv.png" alt="Image description" width="800" height="280"&gt;&lt;/a&gt;&lt;/p&gt;
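&lt;p&gt;For reference, a minimal playbook in the spirit of the screenshot (assuming Amazon Linux targets, where Apache is the 'httpd' package; your exact tasks may differ) looks something like this:&lt;/p&gt;

```yaml
---
- name: Configure webservers
  hosts: webservers
  become: true
  tasks:
    - name: Install Apache
      yum:
        name: httpd
        state: latest

    - name: Start and enable Apache
      service:
        name: httpd
        state: started
        enabled: true
```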

&lt;p&gt;This is all we need to do here, so we can save the file, and now we are ready to convert our servers!&lt;/p&gt;

&lt;p&gt;Before that, let’s check and see the results of what we have done so far!&lt;/p&gt;

&lt;p&gt;Go ahead and copy-paste the PUBLIC IP addresses for each of your servers into your web browser to check and see if they are Apache webservers. &lt;/p&gt;

&lt;p&gt;They aren’t, right? But they are about to be with a single command that runs our playbook!&lt;/p&gt;

&lt;p&gt;CMD: ansible-playbook /etc/ansible/webservers.yml --key-file=key-name.pem -u ec2-user&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdfoq3a8xo7e6f5g3eywk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdfoq3a8xo7e6f5g3eywk.png" alt="Image description" width="800" height="1092"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Congratulations! You have just used Ansible to configure two EC2 instance servers into webservers to host a website or other fun stuff on AWS! Do with it what you will! Or delete all your resources and do it all over again for practice!&lt;/p&gt;

&lt;p&gt;NOTE: If you don't plan on using them right away, I would suggest deleting your resources so that you do not incur an unwanted bill. :)&lt;/p&gt;

</description>
      <category>aws</category>
      <category>ansible</category>
      <category>docker</category>
      <category>project</category>
    </item>
    <item>
      <title>Cloud Resume Challenge - Parts 2 &amp; 3: Building the Front End (HTML/CSS)</title>
      <dc:creator>Liz Benton</dc:creator>
      <pubDate>Thu, 27 Apr 2023 23:29:56 +0000</pubDate>
      <link>https://dev.to/e_liz_the_best/cloud-resume-challenge-parts-2-3-building-the-front-end-htmlcss-4f7p</link>
      <guid>https://dev.to/e_liz_the_best/cloud-resume-challenge-parts-2-3-building-the-front-end-htmlcss-4f7p</guid>
      <description>&lt;p&gt;I'm probably spending way too much time on this part of the challenge, but I see this website as an extension of myself, so I felt very strongly that it needed to come from me. I could have easily gone and purchased someone else's website code layout as a base and gotten this step over and done with and on to the actual challenge (the back end), but I wanted to see if I could build my resume site with my own two hands! &lt;/p&gt;

&lt;p&gt;I had ChatGPT to help too, of course, so I confidently dove into the 2nd part of this challenge.&lt;/p&gt;

&lt;p&gt;First, I carefully wrote out what I envisioned on a notepad and gave ChatGPT step-by-step instructions that were as clear and as literal as I could make them. I've studied a bit of HTML and CSS on Mimo before and written a bit of code back in middle school on Neopets, so I could at least speak the language, even if I hadn't attempted building my own webpage before.&lt;/p&gt;

&lt;p&gt;After a bit of back and forth with Chat, I had something that looked a bit like what I wanted, but it needed some major tweaking. For one, my navigation bar was obstructing the rest of the page's contents, and not everything was lining up the same way or with the same text stylings, since each section had been built in separate batches instead of from one cohesive design. No longer in love with my initial concept, I ended up scrapping the first attempt entirely and starting over. By the end of our second back-and-forth, I once again encountered navigation bar, text, sizing, and section alignment issues.&lt;/p&gt;

&lt;p&gt;After hours of studying my code, getting frustrated, going out to dinner, coming back with a fresh mind and learning about padding, margins and divs, things finally started clicking. The feeling I got when I figured out the solution and saw everything was positioned exactly how I had imagined was beyond happiness. I couldn't help but smile and bask in what I had been able to create from nothing!&lt;/p&gt;

&lt;p&gt;It took one failed iteration and more time than I wanted, but the results were worth every second! I'm excited to create something like this again someday and already have some ideas, but that needs to be put on the backburner for now.&lt;/p&gt;

&lt;p&gt;I spent at least 3 ~ 4 days working on these steps, on and off. On the 3rd day, I was so close to calling for help, to reaching out to one of my web developer friends for an easy way out, but I didn't! I wanted to prove to myself that I could figure it out, and I did! &lt;/p&gt;

&lt;p&gt;It's also amazing to me how such small changes can impact the entire structure in such large ways. There were many times where I would erase something to test its impact, only to find multiple sections were now broken. At the end of this project, I saw it all like a delicate house of cards. It took me a while, but this one was built to endure, just like its creator.&lt;/p&gt;

&lt;p&gt;The most satisfying moment of all for me was when I realized what I liked on the page and started erasing and writing HTML and CSS to get the same results for each section using div tags WITHOUT needing to consult ChatGPT! I was ACTUALLY coding my own site! :D That was the greatest reward. &lt;/p&gt;

&lt;p&gt;This is only the beginning of this project, but I'm already so proud of myself!&lt;/p&gt;

&lt;p&gt;Now let's see what else I can do~&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>html</category>
      <category>css</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Cloud Resume Challenge - Part 1: Earning my AWS Cloud Practitioner Certification</title>
      <dc:creator>Liz Benton</dc:creator>
      <pubDate>Thu, 27 Apr 2023 22:57:53 +0000</pubDate>
      <link>https://dev.to/e_liz_the_best/cloud-resume-challenge-part-1-earning-my-aws-cloud-practitioner-certification-1kn1</link>
      <guid>https://dev.to/e_liz_the_best/cloud-resume-challenge-part-1-earning-my-aws-cloud-practitioner-certification-1kn1</guid>
      <description>&lt;p&gt;Back around October of 2022, I did something that I could never have seen myself doing, which was to sign up for the UCI Continuing Education Division's Cybersecurity Boot Camp.&lt;/p&gt;

&lt;p&gt;Before the class began I did a lot of research in the different areas of Cybersecurity and learned a lot of new terminology, like "pentesting" and "white hat".&lt;/p&gt;

&lt;p&gt;Seeing some sources citing the need for coding in some of the roles, and not wanting to be intimidated by this, I started studying Python on the Mimo app and got my entry-level PCEP certificate in November.&lt;/p&gt;

&lt;p&gt;Wanting to have a good idea of what I was getting myself into, I reached out to some friends who were already in the industry, who strongly urged me to focus on Cloud.&lt;/p&gt;

&lt;p&gt;While doing more research, I came across the Cloud Resume Challenge by Forrest Brazeal, and it piqued my interest. Though I completely forgot about it once my course began, I did make a mental note that the first step was to obtain the AWS CCP.&lt;/p&gt;

&lt;p&gt;In December, I studied as much as I could for about 2 months between my course work and my full-time job using the extremely fun and helpful &lt;em&gt;AWS Skill Builder Cloud Practitioner Essentials Course&lt;/em&gt; and was able to secure the CCP last month on March 16th!&lt;/p&gt;

&lt;p&gt;I remember being incredibly nervous, but confident on the day of the exam and was smiling ear-to-ear when I walked back to my car, victorious. &lt;/p&gt;

&lt;p&gt;(&lt;em&gt;If you ever need a confidence boost, I highly recommend passing a certification exam!&lt;/em&gt;)&lt;/p&gt;

&lt;p&gt;I'm currently studying for my Security+ exam now and look forward to graduating in June!&lt;/p&gt;

&lt;p&gt;After doing more research, specifically into Cloud roles, I followed a little activity I found on YouTube by a channel called "Open Up The Cloud" and started feeling more and more that Cloud Engineering with a strong Cybersecurity foundation was the path for me.&lt;/p&gt;

&lt;p&gt;Highly recommend the video (and the channel in general) if you too are struggling with choosing a role in Cloud. It really helped me wrap my head around things and gave me some much-needed insight and perspective: &lt;a href="https://youtu.be/E0haz6mymxY" rel="noopener noreferrer"&gt;https://youtu.be/E0haz6mymxY&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;While looking through my saved bookmarks a couple weeks ago, I once again stumbled upon the Cloud Resume Project and decided to try and complete it before graduation. &lt;/p&gt;

&lt;p&gt;After all, I have the first step done, and how cool would it be to have my own little happy home on the internet where I can display all the things I've learned and all the certificates I've earned at the end of the program?!&lt;/p&gt;

&lt;p&gt;The steps for the challenge are as follows (AWS Version):&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;del&gt;Earn the AWS Cloud Practitioner Certification and add it to the website.&lt;/del&gt;&lt;/li&gt;
&lt;li&gt;Write your resume in HTML.&lt;/li&gt;
&lt;li&gt;Add CSS to add styling to your web-page.&lt;/li&gt;
&lt;li&gt;Add the resources of the website to an S3 bucket on AWS and deploy it as a static website.&lt;/li&gt;
&lt;li&gt;Utilize Amazon CloudFront to ensure you are using HTTPS instead of HTTP so that data is transferred securely between the webserver and the user's browser.&lt;/li&gt;
&lt;li&gt;Obtain a domain from any provider and use Amazon Route 53 to set up the DNS registry.&lt;/li&gt;
&lt;li&gt;Use JavaScript to add a visitor counter to your website.&lt;/li&gt;
&lt;li&gt;In order for the visitor counter to function, we will be using a database. I have chosen to use DynamoDB.&lt;/li&gt;
&lt;li&gt;Using AWS API Gateway and Lambda, we will communicate with DynamoDB using an API that accepts requests from the web app and communicates them to the database.&lt;/li&gt;
&lt;li&gt;Our Lambda function will require either more JavaScript or Python, but honestly I would rather do this entire thing in Python, so I will use Python and its boto3 library in AWS.&lt;/li&gt;
&lt;li&gt;Write some good Python tests...guess I will learn what that means! lol&lt;/li&gt;
&lt;li&gt;Learning Infrastructure as Code--the challenge asks that we do not configure our API resources (DynamoDB, API Gateway, and the Lambda function) manually in our AWS Management Console. Instead we will define them in AWS SAM (Serverless Application Model) and deploy them using the CLI (Command-Line Interface).&lt;/li&gt;
&lt;li&gt;Creating a GitHub Repo for our backend code.&lt;/li&gt;
&lt;li&gt;Setting up Github Actions in order to start working with CI/CD (Continuous Integration and Deployment) for the back end of the project. It should work to where when you push updates to SAM or in Python, the Python tests get run. If the tests pass, SAM should get the packages from the repo and deploy to AWS.&lt;/li&gt;
&lt;li&gt;Creating a second GitHub Repo for the front end code so that when new code gets pushed, the S3 bucket's front end contents get updated.
*There's a note in this step: "You may need to invalidate your AWS CloudFront cache in the code. And DO NOT commit AWS credentials to source control. Bad hats will find them and use them against you!"&lt;/li&gt;
&lt;li&gt;Blog Post--the step mentions to wait until the end to write a short blog post describing what you learned while working on the project. But I want to write about my experience, frustrations, obstacles, etc. for each major step in the challenge and then wrap up with a summary blog post.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Anyway...&lt;/p&gt;

&lt;p&gt;Challenge Accepted! Let's do this! &amp;gt;:) &lt;/p&gt;

&lt;p&gt;If you would also like to take the challenge, it can be found here: &lt;a href="https://cloudresumechallenge.dev/docs/the-challenge/" rel="noopener noreferrer"&gt;https://cloudresumechallenge.dev/docs/the-challenge/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>cloud</category>
      <category>resume</category>
      <category>challenge</category>
    </item>
    <item>
      <title>My Hack The Box Cyber Apocalypse 2023 CTF Experience</title>
      <dc:creator>Liz Benton</dc:creator>
      <pubDate>Tue, 25 Apr 2023 18:23:57 +0000</pubDate>
      <link>https://dev.to/e_liz_the_best/my-hackthebox-cyber-apocalypse-2023-ctf-experience-4nmk</link>
      <guid>https://dev.to/e_liz_the_best/my-hackthebox-cyber-apocalypse-2023-ctf-experience-4nmk</guid>
      <description>&lt;p&gt;Hello, World!&lt;/p&gt;

&lt;p&gt;Between March 18th ~ March 23rd 2023, I formed a team composed of myself and a few of my classmates to participate in the HTB Cyber Apocalypse 2023 Capture the Flag competition: our first ever CTF, in fact (aside from one member)! &lt;/p&gt;

&lt;p&gt;Together, we formed “DR34M_CH@$ERS”! &lt;/p&gt;

&lt;p&gt;Our motto:&lt;br&gt;
&lt;em&gt;“All our dreams can come true if we have the courage to pursue them.”&lt;/em&gt; – Walt Disney&lt;/p&gt;

&lt;p&gt;As Team Captain, I was up bright and early at 6AM PST and was able to secure our first flag! Which honestly wasn’t too challenging, since you just needed to join the HackTheBox discord channel.&lt;/p&gt;

&lt;p&gt;It wasn’t exactly a requirement, per se, but it did add points to the grand total, so that has to count for something, right?&lt;/p&gt;

&lt;p&gt;I then skipped over the beginner challenges and got a bit more adventurous by diving right into the first Hardware challenge. It took me a couple hours to figure out what was being asked of me, and even a support ticket to ensure that things were working as intended, (read up on that funny story in my Critical Flight Walkthrough) but before I knew it, I had found a 2nd flag for us!&lt;/p&gt;

&lt;p&gt;After that, I helped one of my teammates in the ‘Pwn’ category solve the ‘Questionnaire’ and then went on to solve ‘Getting Started’.&lt;/p&gt;

&lt;p&gt;Feeling comfortable in the ‘Web’ category, I solved ‘Trapped Source’ with little difficulty for my 5th and final flag of this competition--half the total amount we obtained by the end.&lt;/p&gt;

&lt;p&gt;I spent most of my time trying to solve the Web challenge, “Passman” and the Blockchain challenge, “Navigating the Unknown”. In the end, I was unable to break through and complete those challenges, but I still plan on reading up on them so that I have the knowledge needed to conquer them next time!&lt;/p&gt;

&lt;p&gt;Closing thoughts:&lt;br&gt;
This competition was really fun, but more than anything, it really gave me a much wider perspective of how little I know about the world of Cybersecurity and technology. I expected mostly hacking challenges, but ‘pwn’ was only a single category! Sure, you used Linux and VMs with hacking tools for a lot of the challenges, but a lot of them involved just knowing different programming languages, or areas of study. They had a Hardware category, and even one for Machine Learning!&lt;/p&gt;

&lt;p&gt;I think it’s fantastic that there was such a broad range of knowledge to dive into and get more familiar with. I enjoy learning in general, so this was very enjoyable, even if I felt a bit lost for the most part and unsure of what to do next.&lt;/p&gt;

&lt;p&gt;That eagerness to know the ‘how’ and ‘why’ drives me ever onward in my pursuit of knowledge!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foa4kci3shmgr5m8auulb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foa4kci3shmgr5m8auulb.png" alt="DR34M_CH@$ERS Certificate" width="800" height="566"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Final Stats:&lt;br&gt;
Total Players: 12,543&lt;br&gt;
Total Teams: 6,482&lt;br&gt;
Our Team: 8/20&lt;/p&gt;

&lt;p&gt;Team Ranking  | Solved Challenges | Total Points&lt;br&gt;
2,070 / 6,483 | 10/74             | 2,725 / 23,125 &lt;/p&gt;

&lt;p&gt;Overall, I am SO proud of our team and of myself! Not too bad for our very first Capture the Flag!&lt;/p&gt;

&lt;p&gt;I also quite enjoyed being Team Captain, making sure my team stayed confident, well-rested, and hydrated throughout the competition.&lt;/p&gt;

&lt;p&gt;My Collection of Captured Flags:&lt;br&gt;
&lt;strong&gt;HTB{l3t_th3_tr3asur3_hunt1ng_b3g1n!}&lt;br&gt;
HTB{533_7h3_1nn32_w02k1n95_0f_313c720n1c5#$@}&lt;br&gt;
HTB{th30ry_bef0r3_4cti0n}&lt;br&gt;
HTB{b0f_s33m5_3z_r1ght?}&lt;br&gt;
HTB{V13w_50urc3_c4n_b3_u53ful!!!}&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I know the above is just text, but they all mean so much to me. Like little mini, serotonin-infused, digital trophies that I can look at whenever I feel like I can’t do something. Haha! Proof that with a little grit and persistence, you can get to the bottom of anything! :)&lt;/p&gt;

</description>
      <category>ctf</category>
      <category>hackthebox</category>
      <category>beginners</category>
      <category>learning</category>
    </item>
  </channel>
</rss>
