<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Nathan Ferguson</title>
    <description>The latest articles on DEV Community by Nathan Ferguson (@tangerinetrain).</description>
    <link>https://dev.to/tangerinetrain</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1178134%2F19d92779-4cbb-4c64-9fef-a555fd541b68.png</url>
      <title>DEV Community: Nathan Ferguson</title>
      <link>https://dev.to/tangerinetrain</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/tangerinetrain"/>
    <language>en</language>
    <item>
      <title>Creating a GitHub Action to Update S3</title>
      <dc:creator>Nathan Ferguson</dc:creator>
      <pubDate>Tue, 19 Aug 2025 22:51:22 +0000</pubDate>
      <link>https://dev.to/tangerinetrain/creating-a-github-action-to-update-s3-2bdf</link>
      <guid>https://dev.to/tangerinetrain/creating-a-github-action-to-update-s3-2bdf</guid>
      <description>&lt;p&gt;I set out the next challenge for myself: to upload my index.html page containing my resume to a GitHub repository, and have that automatically update the S3 bucket.&lt;/p&gt;

&lt;p&gt;I watched a few videos on LinkedIn Learning and got the overall gist of it, and dove in! This is the YAML workflow file I came up with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# .github/workflows/deploy-to-s3.yml
name: Deploy to S3

on:
  push:
    branches: [ main ]  # or whatever your default branch is
  pull_request:
    branches: [ main ]

jobs:
  deploy:
    runs-on: ubuntu-latest

    steps:
    - name: Checkout code
      uses: actions/checkout@v4

    - name: Configure AWS credentials
      uses: aws-actions/configure-aws-credentials@v4
      with:
        aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
        aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        aws-region: us-east-1  # Change to your bucket's region

    - name: Upload specific file to S3
      run: |
        aws s3 cp ./website/index.html s3://nathanferguson.me/index.html  # S3 destinations use the s3:// URI scheme

    # Optional: Invalidate CloudFront cache if you're using it
    - name: Invalidate CloudFront
      run: |
        aws cloudfront create-invalidation --distribution-id E1IU0KCJ892E9E --paths "/*"
      # Only run this step if you have CloudFront
      if: false  # Change to true if you want to use this
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;At a high level, this action is set up so that every time I push to the repository, GitHub Actions spins up a Linux runner that pulls the repo contents and uploads the index.html file back up to the S3 bucket. Even better, it can also invalidate the CloudFront cache so the updates are visible on the site!&lt;/p&gt;

&lt;p&gt;A few bumps I ran into:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Access tokens

&lt;ul&gt;
&lt;li&gt;In order for the GitHub Action to authenticate with AWS without storing my access keys publicly, you can store them as repository secrets. I was initially trying to use the SSO user I had created previously, but the action kept failing to authenticate. It turns out SSO users only receive temporary, session-based credentials that expire. I needed a traditional IAM user, which can have long-lived access keys. Once I switched to keys from an IAM user, it worked!&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;Honestly, this was all smoother than expected, and fun to see it actually working. It's definitely a time saver; now I can update my resume with just a few clicks.&lt;/p&gt;

</description>
      <category>githubactions</category>
      <category>aws</category>
      <category>git</category>
    </item>
    <item>
      <title>PATHing out</title>
      <dc:creator>Nathan Ferguson</dc:creator>
      <pubDate>Thu, 14 Aug 2025 20:26:16 +0000</pubDate>
      <link>https://dev.to/tangerinetrain/pathing-out-550f</link>
      <guid>https://dev.to/tangerinetrain/pathing-out-550f</guid>
      <description>&lt;p&gt;So I'm still getting back into things when it comes to the CLI and, more importantly, Terraform. I'm gonna break down the steps I took to get this working.&lt;/p&gt;

&lt;p&gt;Previously I installed Terraform on my machine using Homebrew:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Install Homebrew if you don't have it
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

# Install Terraform
brew install terraform

# Verify installation
terraform version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This worked fine; however, when I later went to use the &lt;code&gt;terraform&lt;/code&gt; command, it gave me the dreaded:&lt;br&gt;
&lt;code&gt;command not found: terraform&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Checking further, it turned out the directory Homebrew installs to wasn't in my $PATH. I discovered this is what happens when you try to use a command like &lt;code&gt;terraform&lt;/code&gt;:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Your shell looks at the PATH variable&lt;/li&gt;
&lt;li&gt;It searches each directory in PATH (from left to right)&lt;/li&gt;
&lt;li&gt;It looks for a file named terraform that's executable&lt;/li&gt;
&lt;li&gt;Once found, it runs that program&lt;/li&gt;
&lt;li&gt;If not found in any PATH directory → "command not found" error&lt;/li&gt;
&lt;/ol&gt;
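&lt;p&gt;The lookup above can be sketched in a few lines of Python (using the standard library's &lt;code&gt;shutil.which&lt;/code&gt;, which follows the same left-to-right PATH scan the shell does):&lt;/p&gt;

```python
import shutil

def find_command(name, path):
    """Return the first executable named `name` on the given PATH string, or None."""
    # shutil.which walks the directories left to right, just like the shell,
    # and checks each one for an executable file with the given name.
    return shutil.which(name, path=path)

# A PATH without /opt/homebrew/bin will not find a Homebrew-installed terraform:
print(find_command("terraform", "/usr/bin:/bin"))  # likely None
# Standard tools in the searched directories are found:
print(find_command("sh", "/usr/bin:/bin"))
```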

&lt;p&gt;I then used the following to verify that terraform was indeed installed:&lt;br&gt;
&lt;code&gt;find / -name terraform 2&amp;gt;/dev/null&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;I was able to then verify it was at &lt;code&gt;/opt/homebrew/bin/terraform&lt;/code&gt;. The issue is that /opt/homebrew/bin isn't in my PATH.&lt;/p&gt;

&lt;p&gt;I needed to add this to PATH, which I did via this command: &lt;br&gt;
&lt;code&gt;echo 'export PATH="/opt/homebrew/bin:$PATH"' &amp;gt;&amp;gt; ~/.zshrc&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;I then tried to restart using&lt;br&gt;
&lt;code&gt;source ~/.zshrc&lt;/code&gt;, but I was getting another error! This time it was &lt;code&gt;command not found: compdef&lt;/code&gt;. After some additional research, I discovered this happens when there's an issue with the zsh completion system, or when completion scripts are loaded before zsh is fully initialized.&lt;/p&gt;

&lt;p&gt;To fix it, I ran these commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Create a backup first
cp ~/.zshrc ~/.zshrc.backup

# Add the lines to the beginning
echo -e "autoload -Uz compinit\ncompinit\n$(cat ~/.zshrc)" &amp;gt; ~/.zshrc.tmp
mv ~/.zshrc.tmp ~/.zshrc

# Reload
source ~/.zshrc
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I was then able to use &lt;code&gt;source ~/.zshrc&lt;/code&gt;, which reloads the shell configuration.&lt;/p&gt;

&lt;p&gt;After all this, with the Terraform location successfully added to PATH, I was able to confirm Terraform was working:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;terraform version&lt;br&gt;
Terraform v1.12.2&lt;br&gt;
on darwin_arm64&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;Next steps are to start playing around with Terraform!&lt;/p&gt;

</description>
      <category>terraform</category>
      <category>shell</category>
      <category>aws</category>
    </item>
    <item>
      <title>Logging in with SSO via AWS CLI</title>
      <dc:creator>Nathan Ferguson</dc:creator>
      <pubDate>Thu, 14 Aug 2025 19:42:08 +0000</pubDate>
      <link>https://dev.to/tangerinetrain/logging-in-with-sso-via-aws-cli-1b0</link>
      <guid>https://dev.to/tangerinetrain/logging-in-with-sso-via-aws-cli-1b0</guid>
      <description>&lt;p&gt;This is mostly a reminder for myself on how to use the service account user I set up in AWS to access resources via the AWS CLI.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Set up your user according to the steps listed here: &lt;a href="https://docs.aws.amazon.com/singlesignon/latest/userguide/quick-start-default-idc.html" rel="noopener noreferrer"&gt;Configure user access with the default IAM Identity Center directory&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Once you've got the user, make sure to save the unique AWS access portal URL.&lt;/li&gt;
&lt;li&gt;Use that URL to log in.&lt;/li&gt;
&lt;li&gt;Under the &lt;strong&gt;AWS Access Portal&lt;/strong&gt; &amp;gt; &lt;strong&gt;AWS Accounts&lt;/strong&gt;, find the user, and select &lt;strong&gt;Access Keys&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;From here, copy the &lt;strong&gt;SSO Start URL&lt;/strong&gt;, and the &lt;strong&gt;SSO Region&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Use &lt;code&gt;aws configure sso&lt;/code&gt;, and set the profile name and details from above.&lt;/li&gt;
&lt;/ol&gt;
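&lt;p&gt;For reference, a sketch of what the saved profile ends up looking like in &lt;code&gt;~/.aws/config&lt;/code&gt; (all values here are hypothetical placeholders):&lt;/p&gt;

```ini
# ~/.aws/config -- written by `aws configure sso` (example values only)
[profile my-sso-profile]
sso_start_url = https://my-sso-portal.awsapps.com/start
sso_region = us-east-1
sso_account_id = 123456789012
sso_role_name = MyPermissionSet
region = us-east-1
output = json
```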

&lt;p&gt;Once your profile is saved, all the login details for that will be saved to your machine. &lt;/p&gt;

&lt;p&gt;To login with this profile, use the command &lt;code&gt;aws sso login --profile your-profile-name&lt;/code&gt;. It should open a browser page where it will authenticate you and give you a successful message.&lt;/p&gt;

&lt;p&gt;You can also use &lt;code&gt;aws sts get-caller-identity&lt;/code&gt; to then verify that you're logged in.&lt;/p&gt;

&lt;p&gt;The session duration will depend on the expiration set in the permission set within the AWS console.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cli</category>
      <category>sso</category>
      <category>iam</category>
    </item>
    <item>
      <title>API calls and Testing</title>
      <dc:creator>Nathan Ferguson</dc:creator>
      <pubDate>Tue, 12 Aug 2025 22:45:50 +0000</pubDate>
      <link>https://dev.to/tangerinetrain/api-calls-and-testing-2def</link>
      <guid>https://dev.to/tangerinetrain/api-calls-and-testing-2def</guid>
      <description>&lt;p&gt;After some more work, I have results!&lt;/p&gt;

&lt;p&gt;You should be able to check now at the bottom of my page for a little Visitor Counter!&lt;/p&gt;

&lt;p&gt;I added in JavaScript to my page like so:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;script&amp;gt;
    const API_URL = "https://******.execute-api.us-east-1.amazonaws.com/default/visitorcount";

    async function fetchAndShowCount() {
      try {
        const res = await fetch(API_URL, { method: "POST" });
        if (!res.ok) throw new Error(`HTTP ${res.status}`);
        const data = await res.json(); // expects {"count": 123}
        document.getElementById("visitor-count").textContent = `Visitor count: ${data.count}`;
      } catch (err) {
        console.error("Failed to fetch visitor count:", err);
        document.getElementById("visitor-count").textContent = "Visitor count: Error";
      }
    }

    window.addEventListener("DOMContentLoaded", fetchAndShowCount);
  &amp;lt;/script&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Essentially, I grabbed the public URL of my API Gateway endpoint that triggers the function and set it as a variable. Then &lt;code&gt;fetchAndShowCount&lt;/code&gt; makes a POST request to that URL, checks that the response is OK, and parses the JSON body. From there, it can display the count returned by the API.&lt;/p&gt;

&lt;p&gt;To actually call the function on the page, the event listener waits for the page to load and then fires, so the visitor count increments on every page load.&lt;/p&gt;

&lt;p&gt;I'd definitely love to make this to only count unique visitors, possibly by day, but that's for the future!&lt;/p&gt;

&lt;h2&gt;
  Testing
&lt;/h2&gt;

&lt;p&gt;I am completely new to writing tests for my code, so I asked for a little help (thanks chatgpt!). I was able to come up with two tests like so:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# test_lambda_handler.py
import json
import pytest
from unittest.mock import patch, MagicMock
import lambda_function  # replace with your filename (without .py)

@pytest.fixture
def mock_table():
    """Fixture to mock DynamoDB table."""
    mock_table = MagicMock()
    return mock_table

@pytest.fixture
def mock_boto3_resource(mock_table):
    """Fixture to patch boto3.resource."""
    with patch("lambda_function.dynamo") as mock_dynamo:
        mock_dynamo.Table.return_value = mock_table
        yield mock_dynamo

def test_lambda_handler_success(mock_boto3_resource, mock_table):
    # Arrange
    mock_table.update_item.return_value = {
        "Attributes": {"visitorCount": 42}
    }
    event = {}
    context = {}

    # Act
    response = lambda_function.lambda_handler(event, context)

    # Assert
    mock_table.update_item.assert_called_once_with(
        Key=lambda_function.ITEM_KEY,
        UpdateExpression="SET visitorCount = if_not_exists(visitorCount, :zero) + :inc",
        ExpressionAttributeValues={":zero": 0, ":inc": 1},
        ReturnValues="UPDATED_NEW"
    )
    assert response["statusCode"] == 200
    body = json.loads(response["body"])
    assert body["count"] == 42

def test_lambda_handler_failure(mock_boto3_resource, mock_table):
    # Arrange: simulate DynamoDB throwing exception
    mock_table.update_item.side_effect = Exception("DB is down")
    event = {}
    context = {}

    # Act
    response = lambda_function.lambda_handler(event, context)

    # Assert
    assert response["statusCode"] == 500
    body = json.loads(response["body"])
    assert body["error"] == "Internal Server Error"
    assert "DB is down" in body["message"]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will check two things:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Confirms that the response is 200 OK and that the body contains a count of 42&lt;/li&gt;
&lt;li&gt;Simulates DynamoDB throwing an error and confirms the 500 response conveys the error message&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I'd still like to do some additional finagling with tests but this seemed like a good place to start!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Next steps&lt;/strong&gt;&lt;br&gt;
I may work on some more testing, but I'm excited to get some hands on with Automation and CI! &lt;/p&gt;

</description>
    </item>
    <item>
      <title>Lambda functions!</title>
      <dc:creator>Nathan Ferguson</dc:creator>
      <pubDate>Tue, 12 Aug 2025 19:16:40 +0000</pubDate>
      <link>https://dev.to/tangerinetrain/lambda-functions-3387</link>
      <guid>https://dev.to/tangerinetrain/lambda-functions-3387</guid>
      <description>&lt;p&gt;I was able to get my Lambda function working! This is my current code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json
import os
import boto3

dynamo = boto3.resource("dynamodb")
TABLE_NAME = os.environ.get("TABLE_NAME", "nathanferguson-visitorcounter")
ITEM_KEY = {"id": "1"}  # keep this here if id is always "1"

def lambda_handler(event, context):
    table = dynamo.Table(TABLE_NAME)
    try:
        # Atomic increment (single call)
        resp = table.update_item(
            Key=ITEM_KEY,
            UpdateExpression="SET visitorCount = if_not_exists(visitorCount, :zero) + :inc",
            ExpressionAttributeValues={":zero": 0, ":inc": 1},
            ReturnValues="UPDATED_NEW"
        )

        new_count = int(resp["Attributes"]["visitorCount"])

        return {
            "statusCode": 200,
            "headers": {
                "Access-Control-Allow-Origin": "*",
                "Access-Control-Allow-Methods": "OPTIONS,POST,GET",
                "Access-Control-Allow-Headers": "Content-Type",
                "Content-Type": "application/json"
            },
            "body": json.dumps({"count": new_count})
        }

    except Exception as e:
        # log for CloudWatch
        print("Error updating visitor count:", str(e))
        return {
            "statusCode": 500,
            "headers": {"Access-Control-Allow-Origin": "*", "Content-Type": "application/json"},
            "body": json.dumps({"error": "Internal Server Error", "message": str(e)})
        }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With this code, the function interacts with my DynamoDB table to increment the current value by 1. To make it easier to invoke, I added an API Gateway trigger to the function. So now I can access it by using that URL, pretty cool!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Issues&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Returning the correct number

&lt;ul&gt;
&lt;li&gt;When I initially returned the count from the function, it was just a plain number. Down the line, that caused issues reading the number and displaying it in the visitor count section of my page. So I updated the function to return JSON, which I was able to use more easily in the HTML page&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
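&lt;p&gt;A quick sketch of the difference (the numbers are just illustrative):&lt;/p&gt;

```python
import json

# Before: returning a bare number leaves the front end guessing at the format.
plain_body = "42"

# After: returning JSON gives the page a named field to read.
json_body = json.dumps({"count": 42})

# The page's fetch handler can now do the equivalent of data.count:
data = json.loads(json_body)
print(data["count"])  # 42
```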

&lt;p&gt;I also ended up creating a very simple test for the function within the AWS console, basically just passing an empty event ("{}"), since the function only needed to run.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Coming up next&lt;/strong&gt;&lt;br&gt;
I need to figure out how to call that counter on page load, as well as displaying it within the site itself. Onward!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Back again!</title>
      <dc:creator>Nathan Ferguson</dc:creator>
      <pubDate>Tue, 12 Aug 2025 15:55:32 +0000</pubDate>
      <link>https://dev.to/tangerinetrain/back-again-5528</link>
      <guid>https://dev.to/tangerinetrain/back-again-5528</guid>
      <description>&lt;p&gt;Well, after a recent job change, I am back at this again!&lt;/p&gt;

&lt;p&gt;I am picking back up on the Cloud Resume Challenge once again. I had done about half of it, and it was working! But it needs more work for sure.&lt;/p&gt;

&lt;p&gt;So I am back at it again, and will use this blog to continue documenting my progress!&lt;/p&gt;

&lt;p&gt;So what's next?&lt;/p&gt;

&lt;p&gt;I am now working on adding in the JavaScript counter with DynamoDB!&lt;/p&gt;

&lt;p&gt;I am going to first work on my Lambda function, which will interact with the DynamoDB table, and will add a new post once I'm finished. Wish me luck!&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>dynamodb</category>
      <category>lambda</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Houston, we have a website</title>
      <dc:creator>Nathan Ferguson</dc:creator>
      <pubDate>Sun, 08 Oct 2023 16:45:48 +0000</pubDate>
      <link>https://dev.to/tangerinetrain/houston-we-have-a-website-3ch0</link>
      <guid>https://dev.to/tangerinetrain/houston-we-have-a-website-3ch0</guid>
      <description>&lt;p&gt;We are making progress!&lt;/p&gt;

&lt;p&gt;After making several attempts and getting a hold removed from my AWS account for Route53, we have secured a domain! I had some issues even after the hold was removed, because payments kept erroring. Another learning curve: I realized my default payment method was an expired card. I then added another payment method but forgot to set it as the default! Once I finally removed the expired card and set the new one as default, we were in the clear.&lt;/p&gt;

&lt;p&gt;I found a super helpful tutorial from AWS on how to set up my newly created resume HTML file and host it on S3. I also performed my first alias record creation!&lt;/p&gt;

&lt;p&gt;Now for the final product...&lt;br&gt;
nathanferguson.me&lt;/p&gt;

&lt;p&gt;Next steps!&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;HTTPS&lt;/li&gt;
&lt;li&gt;A visitor counter with JavaScript/DynamoDB&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For anyone interested, here's the tutorial I followed to get it created: &lt;a href="https://docs.aws.amazon.com/AmazonS3/latest/userguide/website-hosting-custom-domain-walkthrough.html"&gt;https://docs.aws.amazon.com/AmazonS3/latest/userguide/website-hosting-custom-domain-walkthrough.html&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>HTML writing and DNS woes</title>
      <dc:creator>Nathan Ferguson</dc:creator>
      <pubDate>Fri, 06 Oct 2023 18:55:21 +0000</pubDate>
      <link>https://dev.to/tangerinetrain/html-writing-and-dns-woes-22hc</link>
      <guid>https://dev.to/tangerinetrain/html-writing-and-dns-woes-22hc</guid>
      <description>&lt;p&gt;Small update here!&lt;/p&gt;

&lt;p&gt;I've finished converting my resume into an HTML page formatted with CSS, and I like it so far. Will likely end up adding some more detail/formatting as it goes on but it's fine for a v1.0.&lt;/p&gt;

&lt;p&gt;I'm working on getting this page hosted on my new website, but unfortunately I'm running into issues on the Route53 side. For some reason my request to register a new domain wasn't able to be completed, so I am stuck waiting on AWS support at this point (free tier unfortunately). &lt;/p&gt;

&lt;p&gt;I am hoping this gets resolved soon so I can keep going!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Starting my journey to the cloud!</title>
      <dc:creator>Nathan Ferguson</dc:creator>
      <pubDate>Thu, 05 Oct 2023 21:29:39 +0000</pubDate>
      <link>https://dev.to/tangerinetrain/starting-my-journey-to-the-cloud-4794</link>
      <guid>https://dev.to/tangerinetrain/starting-my-journey-to-the-cloud-4794</guid>
      <description>&lt;p&gt;Hi all!&lt;/p&gt;

&lt;p&gt;I am starting this blog to document my journey to learning and practicing cloud technologies. &lt;/p&gt;

&lt;p&gt;I want to get some more hands on projects under my belt, so I'm going to start with the Cloud Resume Challenge (&lt;a href="https://cloudresumechallenge.dev/docs/the-challenge/aws/#2-html"&gt;https://cloudresumechallenge.dev/docs/the-challenge/aws/#2-html&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;I have two certifications so far that I've earned while working:&lt;br&gt;
Microsoft Certified: Azure Fundamentals and AWS Solutions Architect - Associate. I am studying to re-certify my AWS certification, since I earned it in 2017 and it has expired since then.&lt;/p&gt;

&lt;p&gt;I'm going to do my best to keep this updated with my progress and things I've learned, so stay tuned!&lt;/p&gt;

</description>
      <category>career</category>
      <category>cloud</category>
      <category>aws</category>
    </item>
  </channel>
</rss>
