<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Ebenezer Lamptey</title>
    <description>The latest articles on DEV Community by Ebenezer Lamptey (@nlankwei5).</description>
    <link>https://dev.to/nlankwei5</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1682576%2F7e421f84-c7fe-4534-a962-77502f3f84ec.jpg</url>
      <title>DEV Community: Ebenezer Lamptey</title>
      <link>https://dev.to/nlankwei5</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/nlankwei5"/>
    <language>en</language>
    <item>
      <title>I Built a Real-Time Stock Price Tracker with Django, Redis and WebSockets</title>
      <dc:creator>Ebenezer Lamptey</dc:creator>
      <pubDate>Sat, 14 Feb 2026 22:46:44 +0000</pubDate>
      <link>https://dev.to/nlankwei5/i-built-a-real-time-stock-price-tracker-with-django-redis-and-websockets-1l95</link>
      <guid>https://dev.to/nlankwei5/i-built-a-real-time-stock-price-tracker-with-django-redis-and-websockets-1l95</guid>
      <description>&lt;p&gt;I wanted to have a a niche in backend engineering and i was drawn to real-time systems. I wanted understand how real-time systems actually work under the hood not just use them, but build one myself. So I built a stock price tracker that fetches live prices every 60 seconds, calculates SMAs, detects crossover alerts, and pushes everything to connected clients over WebSocket.&lt;/p&gt;

&lt;p&gt;Here's what I learned.&lt;/p&gt;




&lt;h2&gt;
  
  
  What it does
&lt;/h2&gt;

&lt;p&gt;Every 60 seconds:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Fetches live prices for 15 stocks from Finnhub API&lt;/li&gt;
&lt;li&gt;Saves them to the database&lt;/li&gt;
&lt;li&gt;Caches the last 5 prices per stock in Redis&lt;/li&gt;
&lt;li&gt;Calculates a 5-period SMA from the cache&lt;/li&gt;
&lt;li&gt;Detects bullish/bearish crossover alerts&lt;/li&gt;
&lt;li&gt;Broadcasts everything to connected WebSocket clients in one message&lt;/li&gt;
&lt;/ul&gt;
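
&lt;p&gt;That cycle is driven by a scheduled Celery task. As a rough sketch, the Beat schedule entry could look like this (the entry name and task path here are illustrative, not the project's actual ones):&lt;/p&gt;

```python
# Hypothetical Celery Beat schedule driving the 60-second cycle.
# "stocks.tasks.fetch_and_broadcast" is an illustrative task path.
CELERY_BEAT_SCHEDULE = {
    "fetch-stock-prices": {
        "task": "stocks.tasks.fetch_and_broadcast",
        "schedule": 60.0,  # run every 60 seconds
    },
}
```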




&lt;h2&gt;
  
  
  The stack
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Django + DRF&lt;/li&gt;
&lt;li&gt;Celery + Celery Beat (task scheduling)&lt;/li&gt;
&lt;li&gt;Redis (caching + Channels backend)&lt;/li&gt;
&lt;li&gt;Django Channels (WebSocket)&lt;/li&gt;
&lt;li&gt;Uvicorn (ASGI server)&lt;/li&gt;
&lt;li&gt;Finnhub API&lt;/li&gt;
&lt;li&gt;SQLite for the database (I was having issues with PostgreSQL on my Mac)&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  The part that clicked for me: Redis is more than a broker, and implementing a Redis List showed me why DSA matters
&lt;/h2&gt;

&lt;p&gt;I had used Redis quite a few times in other projects, mostly as the message broker for Celery background tasks, but this project exposed me to its other capabilities. I had watched plenty of videos saying Redis is for caching, but I had never implemented a cache myself.&lt;/p&gt;

&lt;p&gt;In this project Redis is doing three different jobs:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Celery broker&lt;/strong&gt; — passes tasks between Celery Beat and the worker&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Price cache&lt;/strong&gt; — stores the last 5 prices per stock as a Redis List&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Channel Layer backend&lt;/strong&gt; — lets Celery talk to Django Channels to broadcast WebSocket messages&lt;/p&gt;

&lt;p&gt;Same Redis instance, three completely different purposes. That was a lightbulb moment.&lt;/p&gt;




&lt;h2&gt;
  
  
  How the caching works
&lt;/h2&gt;

&lt;p&gt;Each stock has a Redis List that holds its last 5 prices. On every update I use two commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;RPUSH stock:AAPL:prices 255.78   → add new price to the end
LTRIM stock:AAPL:prices -5 -1    → remove anything older than last 5
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's it. The list always has a maximum of 5 items and the oldest price falls off automatically.&lt;/p&gt;

&lt;p&gt;I also used Redis pipelines to batch these commands together. Instead of making separate round-trips to Redis for each stock, I queue all the commands and execute them in one go. With 15 stocks that's the difference between 60 round-trips and 2.&lt;/p&gt;
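
&lt;p&gt;A rough sketch of that batching, assuming a redis-py-style pipeline. A stub stands in for the real client so the snippet runs on its own; the function and class names are mine, not the project's:&lt;/p&gt;

```python
# Sketch of batching per-stock cache updates with a Redis pipeline.
# In the real project `pipe` would come from redis.Redis().pipeline();
# this stub just records the queued commands so the example is self-contained.
class StubPipeline:
    def __init__(self):
        self.commands = []

    def rpush(self, key, value):
        self.commands.append(("RPUSH", key, value))

    def ltrim(self, key, start, stop):
        self.commands.append(("LTRIM", key, start, stop))

    def execute(self):
        # A real pipeline sends every queued command in one round-trip here.
        return self.commands

def cache_prices(pipe, prices, window=5):
    # Queue RPUSH + LTRIM for every ticker, then flush them all at once.
    for ticker, price in prices.items():
        pipe.rpush(f"stock:{ticker}:prices", price)
        pipe.ltrim(f"stock:{ticker}:prices", -window, -1)
    return pipe.execute()

queued = cache_prices(StubPipeline(), {"AAPL": 255.78, "MSFT": 401.32})
print(len(queued))  # 4 commands queued, one round-trip to send them
```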




&lt;h2&gt;
  
  
  How the SMA and alerts work
&lt;/h2&gt;

&lt;p&gt;Once 5 prices are cached, calculating SMA is just:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;sma&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;last_5_prices&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For alerts I compare the previous price against the previous SMA, and the current price against the current SMA. If the price crossed the line between those two readings, that's a crossover:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Bullish: price was below SMA, now above
&lt;/span&gt;&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;previous_price&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;previous_sma&lt;/span&gt; &lt;span class="ow"&gt;and&lt;/span&gt; &lt;span class="n"&gt;current_price&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;current_sma&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;alert&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;bullish&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

&lt;span class="c1"&gt;# Bearish: price was above SMA, now below
&lt;/span&gt;&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;previous_price&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;previous_sma&lt;/span&gt; &lt;span class="ow"&gt;and&lt;/span&gt; &lt;span class="n"&gt;current_price&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;current_sma&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;alert&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;bearish&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You need the previous values to detect a crossing. That's why caching the previous price and SMA matters: it gives you something to compare against on the next cycle.&lt;/p&gt;
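
&lt;p&gt;A minimal sketch of that caching, with a plain dict standing in for Redis (the function name is mine, not the project's):&lt;/p&gt;

```python
# Keep the previous (price, sma) per ticker so the next cycle has something
# to compare against. The real project caches this in Redis; a dict stands in.
previous = {}

def swap_previous(ticker, price, sma):
    prev = previous.get(ticker)       # None on the very first reading
    previous[ticker] = (price, sma)   # cache the current values for next time
    return prev

first = swap_previous("AAPL", 255.78, 254.32)
second = swap_previous("AAPL", 256.10, 254.90)
print(first, second)  # None (255.78, 254.32)
```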




&lt;h2&gt;
  
  
  How real-time broadcasting works
&lt;/h2&gt;

&lt;p&gt;This was the trickiest part to understand. Celery and Django are separate processes. How does a background task push a message to a connected WebSocket client?&lt;/p&gt;

&lt;p&gt;The answer is the Channel Layer.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Celery task finishes processing
        ↓
Publishes message to Channel Layer (Redis)
        ↓
Django Channels picks it up
        ↓
Pushes to all connected WebSocket clients
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Redis is the bridge between the two processes. Celery writes to it, Channels reads from it.&lt;/p&gt;
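
&lt;p&gt;In code, the hand-off looks roughly like this. With Django Channels the real call would be &lt;code&gt;async_to_sync(channel_layer.group_send)(...)&lt;/code&gt; against a Redis-backed layer; here an in-memory stub replaces it so the sketch is self-contained, and the group and field names are illustrative:&lt;/p&gt;

```python
import asyncio

# Stub for the Channel Layer: records what a Celery task publishes.
# The real layer is Redis-backed, shared by the Celery and Channels processes.
class StubChannelLayer:
    def __init__(self):
        self.sent = []

    async def group_send(self, group, message):
        self.sent.append((group, message))

def broadcast_stock_update(layer, stocks):
    message = {"type": "stock_update", "stocks": stocks}
    # Celery tasks are synchronous, so the async group_send has to be driven
    # to completion here (Channels ships async_to_sync for exactly this).
    asyncio.run(layer.group_send("stocks", message))

layer = StubChannelLayer()
broadcast_stock_update(layer, [{"ticker": "AAPL", "price": 255.78}])
print(layer.sent[0][0])  # stocks
```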

&lt;p&gt;The WebSocket message looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"stock_update"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"timestamp"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2026-02-14T21:38:22+00:00"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"stocks"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"ticker"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"AAPL"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;255.78&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"sma"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;254.32&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"alert"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"ticker"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"MSFT"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;401.32&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"sma"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;399.80&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"alert"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"bullish"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"ticker"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"TSLA"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;417.44&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"sma"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;419.10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"alert"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"bearish"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;All 15 stocks in one message, every 60 seconds, automatically.&lt;/p&gt;




&lt;h2&gt;
  
  
  Running two servers in development
&lt;/h2&gt;

&lt;p&gt;One thing that tripped me up was that Django's &lt;code&gt;runserver&lt;/code&gt; doesn't support WebSockets, and that was a struggle to figure out. It's a WSGI server, which is request/response only; WebSockets need a persistent connection, so you need an ASGI server.&lt;/p&gt;

&lt;p&gt;I run both during development:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# For the DRF browsable API&lt;/span&gt;
python manage.py runserver        &lt;span class="c"&gt;# port 8000&lt;/span&gt;

&lt;span class="c"&gt;# For WebSocket connections&lt;/span&gt;
uvicorn core.asgi:application &lt;span class="nt"&gt;--port&lt;/span&gt; 8001
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;REST endpoints on 8000, WebSocket on 8001.&lt;/p&gt;




&lt;h2&gt;
  
  
  What I actually learned
&lt;/h2&gt;

&lt;p&gt;Going in I knew Django and had used Redis a little. Coming out I understand:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How background task scheduling works with Celery Beat&lt;/li&gt;
&lt;li&gt;How Redis Lists work and why they're perfect for rolling windows of data&lt;/li&gt;
&lt;li&gt;What Redis pipelines are and why batching matters&lt;/li&gt;
&lt;li&gt;The difference between WSGI and ASGI&lt;/li&gt;
&lt;li&gt;How Django Channels uses a Channel Layer to bridge async and sync code&lt;/li&gt;
&lt;li&gt;How to structure a real-time data pipeline end to end&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Building this made real-time systems a lot less mysterious. It's not magic — it's just a producer, a channel, and a consumer.&lt;/p&gt;




&lt;h2&gt;
  
  
  Source code
&lt;/h2&gt;

&lt;p&gt;GitHub: &lt;a href="https://github.com/nlankwei5/Stock-Price-Tracker-API" rel="noopener noreferrer"&gt;Stock-Price-Tracker-API&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Future plans
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Simple frontend to show how it works&lt;/li&gt;
&lt;li&gt;Automatically skipping API calls while the market is closed&lt;/li&gt;
&lt;li&gt;Migrate database to PostgreSQL&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>django</category>
      <category>python</category>
      <category>redis</category>
      <category>showdev</category>
    </item>
    <item>
      <title>Building and Dockerizing a Django To-Do List API – My First Real Backend Project!</title>
      <dc:creator>Ebenezer Lamptey</dc:creator>
      <pubDate>Fri, 27 Jun 2025 13:49:39 +0000</pubDate>
      <link>https://dev.to/nlankwei5/building-and-dockerizing-a-django-to-do-list-api-my-first-real-backend-project-1k4a</link>
      <guid>https://dev.to/nlankwei5/building-and-dockerizing-a-django-to-do-list-api-my-first-real-backend-project-1k4a</guid>
      <description>&lt;p&gt;Creating my first full Django backend project from scratch was both exciting and eye-opening. I built a To-Do List API, added PostgreSQL, containerized the entire application with Docker. &lt;/p&gt;

&lt;p&gt;This blog post documents that journey: lessons learned, hurdles faced, and how I overcame them. If you're a beginner backend developer or looking to dip your toes into Docker and Django, I hope this post inspires and helps you!&lt;/p&gt;

&lt;h2&gt;
  
  
  Project Goal: Build a Simple Yet Scalable To-Do List API
&lt;/h2&gt;

&lt;p&gt;The idea was to build a lightweight RESTful To-Do List API using Django and Django REST Framework (DRF). Features included:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;User registration and login with access/refresh tokens&lt;/li&gt;
&lt;li&gt;Secure authentication with JWT&lt;/li&gt;
&lt;li&gt;Users can only access and manage their own tasks&lt;/li&gt;
&lt;li&gt;CRUD operations on tasks&lt;/li&gt;
&lt;li&gt;Mark tasks as completed (Boolean toggle)&lt;/li&gt;
&lt;li&gt;Search, filter, and order support&lt;/li&gt;
&lt;li&gt;Switched from SQLite to PostgreSQL&lt;/li&gt;
&lt;li&gt;Docker support for consistent setup&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Note: Deployment is a planned next step. This project has not yet been deployed.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step-by-Step Journey
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Building the API with Django + DRF
&lt;/h3&gt;

&lt;p&gt;I started with Django and Django REST Framework, setting up views, serializers, and models to handle To-Do objects on a per-user basis. I also wired up the URLs and registered the model in admin.py, which powers Django's built-in admin page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdw6lj9fnmadx91w21n2k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdw6lj9fnmadx91w21n2k.png" alt="Code image" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj5wsysb0bnlise93usz5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj5wsysb0bnlise93usz5.png" alt="Code image" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Integrating JWT Authentication&lt;br&gt;
Using SimpleJWT, I added token-based authentication, allowing users to securely interact with their own tasks.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Swapping SQLite for PostgreSQL&lt;br&gt;
SQLite was fine initially, but PostgreSQL better reflects real-world apps, especially when containerizing. Before making the switch, I moved keys and other sensitive settings into a .env file so that secrets never get committed to the repository.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
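
&lt;p&gt;For example, the database settings in settings.py can read everything sensitive from the environment (the variable names and defaults here are illustrative, not my actual config):&lt;/p&gt;

```python
import os

# Sketch of env-driven database settings for settings.py, so credentials
# live in a .env file (loaded into the environment) instead of the repo.
# Variable names and defaults are illustrative.
DATABASE_SETTINGS = {
    "ENGINE": "django.db.backends.postgresql",
    "NAME": os.environ.get("DB_NAME", "todo"),
    "USER": os.environ.get("DB_USER", "todo_user"),
    "PASSWORD": os.environ.get("DB_PASSWORD", ""),
    "HOST": os.environ.get("DB_HOST", "localhost"),
    "PORT": os.environ.get("DB_PORT", "5432"),
}
```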

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa8kvpz4b855io9wgwgtb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa8kvpz4b855io9wgwgtb.png" alt="Settings.py image" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Containerizing with Docker
I wrote a Dockerfile, .dockerignore, and docker-compose.yml to package both the Django app and PostgreSQL database.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Key learnings:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Why we don't use virtual environments in containers&lt;/li&gt;
&lt;li&gt;Setting up environment variables properly&lt;/li&gt;
&lt;li&gt;Troubleshooting docker-compose issues&lt;/li&gt;
&lt;/ul&gt;

&lt;ol&gt;
&lt;li&gt;Testing Locally
Using:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker-compose up &lt;span class="nt"&gt;--build&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I ran into an issue where the database wasn't migrated, so I had to run the following commands to make the API work:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker &lt;span class="nb"&gt;exec&lt;/span&gt; &lt;span class="nt"&gt;-it&lt;/span&gt; django-docker python /TDL/manage.py migrate
docker &lt;span class="nb"&gt;exec&lt;/span&gt; &lt;span class="nt"&gt;-it&lt;/span&gt; django-docker python /TDL/manage.py createsuperuser
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Running &lt;code&gt;createsuperuser&lt;/code&gt; gave me admin access. &lt;/p&gt;

&lt;p&gt;I tested the API endpoints through the container and ensured they worked with the PostgreSQL service.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3273624m82b3balcpzse.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3273624m82b3balcpzse.png" alt="Docker in terminal" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What’s Next?
&lt;/h2&gt;

&lt;p&gt;I explored deployment on Render using a Docker-based service. Though I didn’t complete deployment (the free tier runs out soon), I now understand what it takes to deploy to production.&lt;/p&gt;

&lt;p&gt;Planning to finalize and deploy this soon.&lt;/p&gt;

&lt;h2&gt;
  
  
  Reflection
&lt;/h2&gt;

&lt;p&gt;This journey wasn’t just about building an API. It was about preparing to scale, containerize, and potentially deploy. I hit some roadblocks with .env handling, database permissions, and YAML formatting, but those were key moments in learning real backend + DevOps skills.&lt;/p&gt;

&lt;p&gt;You can access the entire project on my GitHub:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/nlankwei5/To-Do-List-Api" rel="noopener noreferrer"&gt;https://github.com/nlankwei5/To-Do-List-Api&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Feel free to comment, like, and share, and let me know how to make this better. &lt;/p&gt;

</description>
    </item>
    <item>
      <title>I Pivoted from DevOps to Backend Software Engineering and Why It Was the Right Move for Me</title>
      <dc:creator>Ebenezer Lamptey</dc:creator>
      <pubDate>Fri, 20 Jun 2025 13:35:35 +0000</pubDate>
      <link>https://dev.to/nlankwei5/i-pivoted-from-devops-to-backend-engineering-and-why-it-was-the-right-move-for-me-2hlp</link>
      <guid>https://dev.to/nlankwei5/i-pivoted-from-devops-to-backend-engineering-and-why-it-was-the-right-move-for-me-2hlp</guid>
      <description>&lt;p&gt;Over the past few months, I’ve been exploring my path in software engineering. I originally set my sights on DevOps; the automation, the CI/CD pipelines, the cloud-native infrastructure... it all sounded exciting.&lt;/p&gt;

&lt;p&gt;But I quickly realized something:&lt;br&gt;
Entry-level DevOps roles are few and far between. Most job listings demand years of experience, deep systems knowledge, and cloud certifications: things that take time to build.&lt;/p&gt;

&lt;p&gt;So I made a strategic shift.&lt;/p&gt;

&lt;p&gt;I pivoted into backend development, specifically with Django and Django REST Framework. It wasn’t a complete departure; in fact, many of the skills overlap:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Writing efficient, testable code&lt;/li&gt;
&lt;li&gt;Thinking in systems and logic&lt;/li&gt;
&lt;li&gt;Managing environments and APIs&lt;/li&gt;
&lt;li&gt;Designing clean, scalable backend architecture&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;More importantly, I found that backend engineering plays to one of my natural strengths:&lt;br&gt;
I love logic. Solving problems, modeling real-world workflows in code, and making APIs that actually serve users: that's what excites me.&lt;/p&gt;

&lt;p&gt;Now I’m building a full-featured ToDo List API with authentication, permissions, validation, pagination, Docker support, and (soon) automated tests and deployment. I’ll even be sharing my journey soon.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Automate Your Amazon EC2 Instance Setup with LAMP Stack Using Bash</title>
      <dc:creator>Ebenezer Lamptey</dc:creator>
      <pubDate>Tue, 24 Dec 2024 20:08:04 +0000</pubDate>
      <link>https://dev.to/nlankwei5/automate-your-amazon-ec2-instance-setup-with-lamp-stack-using-bash-2cin</link>
      <guid>https://dev.to/nlankwei5/automate-your-amazon-ec2-instance-setup-with-lamp-stack-using-bash-2cin</guid>
      <description>&lt;h3&gt;
  
  
  Introduction
&lt;/h3&gt;

&lt;p&gt;Setting up an Amazon EC2 instance manually can be time-consuming, especially when configuring a LAMP stack (Linux, Apache, MySQL, PHP). This guide introduces a Bash script that simplifies the process, automating tasks such as AWS CLI installation, EC2 instance configuration, and LAMP stack setup. Whether you're new to AWS or a seasoned cloud engineer, this script can save you significant time and effort.&lt;/p&gt;

&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;p&gt;Before running the script, ensure you have the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;AWS account&lt;/strong&gt;: Permissions to launch EC2 instances and configure security groups.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Key pair&lt;/strong&gt;: Created in the AWS region you plan to use.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;System requirements&lt;/strong&gt;: A Debian-based Linux system with &lt;code&gt;apt-get&lt;/code&gt; package manager.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Features
&lt;/h3&gt;

&lt;p&gt;The script provides the following functionalities:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;AWS CLI Check &amp;amp; Installation&lt;/strong&gt;: Verifies if the AWS CLI is installed and installs it if missing.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Instance Launch&lt;/strong&gt;: Automates EC2 instance creation with user-specified parameters.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security Group Configuration&lt;/strong&gt;: Sets up security rules to allow SSH access from your current IP.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;LAMP Stack Installation&lt;/strong&gt;: Installs and configures Apache, MySQL, and PHP on the instance.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Usage
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Clone or download the script.&lt;/li&gt;
&lt;li&gt;Make it executable:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;   &lt;span class="nb"&gt;chmod&lt;/span&gt; +x setup_ec2.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Run the script:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;   ./setup_ec2.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Follow the prompts to input:

&lt;ul&gt;
&lt;li&gt;AMI Image ID&lt;/li&gt;
&lt;li&gt;Key Pair Name&lt;/li&gt;
&lt;li&gt;Instance Type (e.g., t2.micro)&lt;/li&gt;
&lt;li&gt;AWS Region&lt;/li&gt;
&lt;li&gt;Security Group ID&lt;/li&gt;
&lt;li&gt;Path to your private SSH key&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Script Workflow
&lt;/h3&gt;

&lt;p&gt;The script performs the following steps:&lt;/p&gt;

&lt;h4&gt;
  
  
  1. &lt;strong&gt;AWS CLI Check &amp;amp; Installation&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;If the AWS CLI is not found on your system, the script installs it using &lt;code&gt;apt-get&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo &lt;/span&gt;apt-get update
&lt;span class="nb"&gt;sudo &lt;/span&gt;apt &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-y&lt;/span&gt; awscli
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  2. &lt;strong&gt;AWS CLI Configuration&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The script configures the AWS CLI using your access credentials:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws configure
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  3. &lt;strong&gt;EC2 Instance Launch&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Prompts you to enter required parameters, then launches an EC2 instance:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws ec2 run-instances &lt;span class="nt"&gt;--image-id&lt;/span&gt; &lt;span class="nv"&gt;$image_id&lt;/span&gt; &lt;span class="nt"&gt;--key-name&lt;/span&gt; &lt;span class="nv"&gt;$key_name&lt;/span&gt; &lt;span class="nt"&gt;--instance-type&lt;/span&gt; &lt;span class="nv"&gt;$instance_type&lt;/span&gt; &lt;span class="nt"&gt;--region&lt;/span&gt; &lt;span class="nv"&gt;$region&lt;/span&gt; &lt;span class="nt"&gt;--query&lt;/span&gt; &lt;span class="s1"&gt;'Instances[0].InstanceId'&lt;/span&gt; &lt;span class="nt"&gt;--output&lt;/span&gt; text
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  4. &lt;strong&gt;Security Group Configuration&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Sets up SSH access by authorizing your current IP:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws ec2 authorize-security-group-ingress &lt;span class="nt"&gt;--group-id&lt;/span&gt; &lt;span class="nv"&gt;$security_group&lt;/span&gt; &lt;span class="nt"&gt;--protocol&lt;/span&gt; tcp &lt;span class="nt"&gt;--port&lt;/span&gt; 22 &lt;span class="nt"&gt;--cidr&lt;/span&gt; &lt;span class="si"&gt;$(&lt;/span&gt;curl &lt;span class="nt"&gt;-s&lt;/span&gt; https://checkip.amazonaws.com&lt;span class="si"&gt;)&lt;/span&gt;/32 &lt;span class="nt"&gt;--region&lt;/span&gt; &lt;span class="nv"&gt;$region&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  5. &lt;strong&gt;Connect and Install LAMP Stack&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Uses SSH to log in to the EC2 instance and install the LAMP stack:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ssh &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="nv"&gt;$PRIVATE_KEY_PATH&lt;/span&gt; ubuntu@&lt;span class="nv"&gt;$public_ip&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt;&lt;span class="no"&gt;EOF&lt;/span&gt;&lt;span class="sh"&gt;
    sudo apt update &amp;amp;&amp;amp; sudo apt -y upgrade
    sudo apt install -y apache2 mysql-server php libapache2-mod-php php-mysql
    sudo systemctl enable apache2
    sudo systemctl restart apache2
&lt;/span&gt;&lt;span class="no"&gt;EOF
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Example Output
&lt;/h3&gt;

&lt;p&gt;An example of what to expect when running the script:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;2024-11-09 12:00:00 - AWS CLI not found. Installing...
2024-11-09 12:01:00 - Configuring AWS CLI
2024-11-09 12:02:00 - EC2 instance i-1234567890abcdef0 launched successfully.
2024-11-09 12:03:00 - EC2 instance public IP: 18.217.123.456
2024-11-09 12:04:00 - Connecting to EC2 instance via SSH...
2024-11-09 12:05:00 - Setting up LAMP server
2024-11-09 12:10:00 - LAMP setup completed on EC2 instance i-1234567890abcdef0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Troubleshooting
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;AWS CLI Installation Fails&lt;/strong&gt;: Ensure your system is Debian-based and supports &lt;code&gt;apt-get&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Instance Launch Fails&lt;/strong&gt;: Double-check your AWS credentials, region, and instance type.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SSH Connection Issues&lt;/strong&gt;: Verify the private key path and ensure security group rules are configured correctly.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;This Bash script streamlines the process of setting up an EC2 instance with a LAMP stack, automating repetitive tasks and reducing the potential for errors. By following the steps outlined in this guide, you can deploy a fully functional LAMP server in minutes. Try it out, and feel free to share feedback or suggestions for improvement!&lt;/p&gt;

&lt;p&gt;See the code here: &lt;a href="https://github.com/nlankwei5/Creating-Ubuntu-VM-in-AWS" rel="noopener noreferrer"&gt;Automate Your Amazon EC2 Instance Setup with LAMP Stack Using Bash&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>[Boost]</title>
      <dc:creator>Ebenezer Lamptey</dc:creator>
      <pubDate>Thu, 05 Dec 2024 10:59:25 +0000</pubDate>
      <link>https://dev.to/nlankwei5/-1d75</link>
      <guid>https://dev.to/nlankwei5/-1d75</guid>
      <description>&lt;div class="ltag__link"&gt;
  &lt;a href="/nlankwei5" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1682576%2F7e421f84-c7fe-4534-a962-77502f3f84ec.jpg" alt="nlankwei5"&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="/nlankwei5/comprehensive-bash-system-monitoring-script-technical-breakdown-2jko" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;Comprehensive Bash System Monitoring Script: Technical Breakdown&lt;/h2&gt;
      &lt;h3&gt;Ebenezer Lamptey ・ Dec 4&lt;/h3&gt;
      &lt;div class="ltag__link__taglist"&gt;
        &lt;span class="ltag__link__tag"&gt;#linux&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#bash&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#ubuntu&lt;/span&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;


</description>
    </item>
    <item>
      <title>Comprehensive Bash System Monitoring Script: Technical Breakdown</title>
      <dc:creator>Ebenezer Lamptey</dc:creator>
      <pubDate>Wed, 04 Dec 2024 16:45:14 +0000</pubDate>
      <link>https://dev.to/nlankwei5/comprehensive-bash-system-monitoring-script-technical-breakdown-2jko</link>
      <guid>https://dev.to/nlankwei5/comprehensive-bash-system-monitoring-script-technical-breakdown-2jko</guid>
      <description>&lt;p&gt;In today's fast-paced IT environments, keeping track of system performance is crucial. This bash script provides an elegant, automated solution for monitoring critical system metrics and sending timely alerts when resources are under strain.&lt;/p&gt;

&lt;h3&gt;
  
  
  Understanding the Script's Architecture
&lt;/h3&gt;

&lt;p&gt;The script is designed to monitor three primary system health indicators:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;CPU Usage&lt;/li&gt;
&lt;li&gt;Memory Consumption&lt;/li&gt;
&lt;li&gt;Disk Space Utilization&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Key Functionalities
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Logging Mechanism
&lt;/h3&gt;

&lt;p&gt;The script includes a robust logging function that timestamps every event. All system metrics and alerts are recorded in /var/log/sysmonitor.log, providing a comprehensive audit trail for system administrators.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;logfile=/var/log/sysmonitor.log

log(){
    local message=$1
    echo "$message - $(date +'%F %T')" &amp;gt;&amp;gt; "$logfile"
}
# Logging function with timestamp
# Uses local variable for message
# Appends to log file with current time

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Metric Collection
&lt;/h3&gt;

&lt;p&gt;Each metric collection function leverages standard Linux command-line tools:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;top&lt;/strong&gt; for CPU usage&lt;br&gt;
&lt;strong&gt;free&lt;/strong&gt; for memory consumption&lt;br&gt;
&lt;strong&gt;df&lt;/strong&gt; for disk space utilization&lt;/p&gt;

&lt;p&gt;CPU Usage Metrics&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CPU_metrics(){    
    local cpu_usage=$(top -bn1 | grep "CPU(s)" | awk '{print $1 + $2 + $3}')
    echo "$cpu_usage"
    log "Cpu usage is $cpu_usage" 
}

# Uses `top` command to retrieve CPU usage
# Adds first three CPU percentage columns
# Logs the result
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Memory Usage Metrics&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Memory_metrics(){     
    local Mem_usage=$(free -m | awk '/Mem:/ {print ($3 / $2) * 100.00}')
    echo "$Mem_usage"
    log "Cpu usage is $Mem_usage" 
}
# Uses `free` command to calculate memory usage
# Calculates percentage of used memory
# Converts to percentage with decimal precision

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Disk Usage Metrics&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;disk_usage_metrics(){     
    local disk_usage=$(df -h --total | grep ^total | awk '{print $5}' | sed 's/%//')
    echo "$disk_usage"
    log "Disk Usage is $disk_usage" 
}
# Uses `df` command to get total disk usage
# Removes percentage symbol for calculation
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Metric Processing&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;dskusage=$(printf "%.0f" "$(disk_usage_metrics)")
memusage=$(printf "%.0f" "$(Memory_metrics)")
cpuUsage=$(printf "%.0f" "$(CPU_metrics)")
# Converts floating-point metrics to integers
# Uses printf to round to nearest whole number
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Threshold-Based Alerting
&lt;/h2&gt;

&lt;p&gt;The script defines critical thresholds for each metric:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Disk Usage: 75%&lt;/li&gt;
&lt;li&gt;Memory Usage: 85%&lt;/li&gt;
&lt;li&gt;CPU Usage: 90%&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When any metric exceeds its predefined threshold, the script:&lt;/p&gt;

&lt;p&gt;Logs the event in the system log&lt;br&gt;
Sends an email alert to the specified address&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CPU_threshold=90
Mem_threshold=85
Disk_threshold=75

if [ "$dskusage" -gt "$Disk_threshold" ]; then
    echo "Disk limit exceeded!!!" 1&amp;gt;&amp;gt;"$logfile"
    mail -s "Disk Limit Alert" email@address.com &amp;lt;&amp;lt;&amp;lt; "Disk usage has exceeded the limit. Kindly review."
fi
# Checks if disk usage exceeds 75%
# Logs the event
# Sends email alert
# Similar logic for memory and CPU thresholds

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Performance Automation
&lt;/h2&gt;

&lt;p&gt;The script is scheduled using a crontab entry to run every 4 hours, ensuring continuous, hands-off system monitoring.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cron_job="0 */4 * * */workspace/Automated-System-Monitoring-/Monitor.sh 1&amp;gt;&amp;gt;$logfile"
(crontab -l | grep -v -F "$cron_job"; echo "$cron_job") | crontab -
# Schedules script to run every 4 hours
# Adds to crontab without duplicating existing entries
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Implementation Tips
&lt;/h2&gt;

&lt;p&gt;To use this script:&lt;/p&gt;

&lt;p&gt;Replace &lt;a href="mailto:email@address.com"&gt;email@address.com&lt;/a&gt; with your actual email&lt;br&gt;
Ensure necessary permissions are set (chmod +x Monitor.sh)&lt;br&gt;
Verify mail utility is configured on your system&lt;/p&gt;
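Those setup steps can be sketched as a small shell session. The package name assumes a Debian/Ubuntu system and the script path is hypothetical:

```shell
#!/bin/bash
# Hypothetical setup sketch for the monitoring script (Debian/Ubuntu assumed).
script="Monitor.sh"   # hypothetical path to the monitoring script

# Make the script executable if it exists
if [ -f "$script" ]; then
    chmod +x "$script"
fi

# Verify the mail utility exists before relying on email alerts
if command -v mail >/dev/null 2>&1; then
    mail_status="mail utility found"
else
    mail_status="mail missing; try: sudo apt install mailutils"
fi
echo "$mail_status"
```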

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Automated system monitoring is essential for maintaining infrastructure health. This bash script provides a lightweight, effective solution for tracking and alerting on critical system resources.&lt;/p&gt;

&lt;h2&gt;
  
  
  Sample Use Cases
&lt;/h2&gt;

&lt;p&gt;DevOps teams monitoring server performance&lt;br&gt;
Small to medium enterprise infrastructure management&lt;br&gt;
Personal server and workstation health tracking&lt;/p&gt;

&lt;p&gt;By proactively monitoring system metrics, you can prevent performance bottlenecks and potential system failures before they impact your operations. You can share ideas for improving this script in the comments, and feel free to start a discussion. You can access the code on my GitHub: &lt;a href="https://github.com/nlankwei5/Automated-System-Monitoring-" rel="noopener noreferrer"&gt;Automated Monitoring Script&lt;/a&gt;&lt;/p&gt;

</description>
      <category>linux</category>
      <category>bash</category>
      <category>ubuntu</category>
    </item>
    <item>
      <title>Saving Time: Batch QR Code Generation with Bash</title>
      <dc:creator>Ebenezer Lamptey</dc:creator>
      <pubDate>Mon, 25 Nov 2024 18:14:48 +0000</pubDate>
      <link>https://dev.to/nlankwei5/saving-time-batch-qr-code-generation-with-bash-4bcm</link>
      <guid>https://dev.to/nlankwei5/saving-time-batch-qr-code-generation-with-bash-4bcm</guid>
      <description>&lt;p&gt;Ever found yourself manually creating QR codes one by one? As a choir marketing committee member, I was frustrated with online tools that either:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Limit free users to one QR code&lt;/li&gt;
&lt;li&gt;Charge for bulk generation&lt;/li&gt;
&lt;li&gt;Waste precious time&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Frustration
&lt;/h2&gt;

&lt;p&gt;Our choir needed QR codes for multiple YouTube links to print on marketing t-shirts. Existing solutions? Painfully slow and expensive.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Bash Solution
&lt;/h3&gt;

&lt;p&gt;I wrote a simple script to automate QR code generation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;bash

# Check if input file is provided
if [ -z "$1" ]; then
    echo "Usage: $0 &amp;lt;input_file&amp;gt;"
    exit 1
fi

input_file="$1"

# Create a directory to store QR codes
mkdir -p qr_codes

# Generate QR codes for each URL in the input file
while IFS= read -r url; do
    filename="qr_codes/$(basename "$url").png"
    qrencode -o "$filename" "$url"
    echo "Generated QR code for $url"
done &amp;lt; "$input_file"

echo "QR codes saved in 'qr_codes' directory."

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Key Benefits
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Free&lt;/li&gt;
&lt;li&gt;Fast&lt;/li&gt;
&lt;li&gt;Customizable&lt;/li&gt;
&lt;li&gt;No subscription needed&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  How It Works
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Create a text file with URLs&lt;/li&gt;
&lt;li&gt;Run the script
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;./qr_code.sh &amp;lt;text file&amp;gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Get QR codes in seconds&lt;/p&gt;

&lt;h2&gt;
  
  
  Pro Tips
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Use with qrencode&lt;/li&gt;
&lt;li&gt;Works for any links, not just YouTube&lt;/li&gt;
&lt;li&gt;Easily scriptable and modifiable&lt;/li&gt;
&lt;/ul&gt;
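As one concrete customization, qrencode's `-s` (module size) and `-l` (error correction level) flags help when codes will be printed on fabric. A guarded sketch, where the URL and output filename are hypothetical:

```shell
#!/bin/bash
# Hypothetical example of customizing qrencode output for printing.
url="https://example.com/my-video"   # hypothetical link
out="qr_codes/print_ready.png"

if command -v qrencode >/dev/null 2>&1; then
    mkdir -p qr_codes
    # -s: module size in pixels; -l H: highest error-correction level,
    # useful when codes are printed and may be distorted or smudged
    qrencode -s 10 -l H -o "$out" "$url"
    result="generated $out"
else
    result="qrencode not installed; try: sudo apt install qrencode"
fi
echo "$result"
```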

&lt;p&gt;You can see the full script on my GitHub: &lt;br&gt;
&lt;a href="https://github.com/nlankwei5/Create-Qr-code/blob/main/qr_code.sh" rel="noopener noreferrer"&gt;https://github.com/nlankwei5/Create-Qr-code/blob/main/qr_code.sh&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Feel free to try it out and leave comments with suggestions for improvement.&lt;br&gt;
No more manual, repetitive work. 🚀&lt;/p&gt;

</description>
      <category>bash</category>
      <category>linux</category>
      <category>tutorial</category>
      <category>learning</category>
    </item>
    <item>
      <title>Building Your Own S3 Cloud Uploader CLI with Bash</title>
      <dc:creator>Ebenezer Lamptey</dc:creator>
      <pubDate>Wed, 24 Jul 2024 15:32:06 +0000</pubDate>
      <link>https://dev.to/nlankwei5/building-your-own-s3-cloud-uploader-cli-with-bash-2jek</link>
      <guid>https://dev.to/nlankwei5/building-your-own-s3-cloud-uploader-cli-with-bash-2jek</guid>
      <description>&lt;p&gt;Hey everyone! Today, I'm super excited to walk you through building your very own cloud uploader CLI using Bash. We'll cover everything from setting up your AWS S3 bucket to scripting the whole thing in Bash. I had an opportunity to also review a lot of peoples work and made mine a little bit different. &lt;/p&gt;

&lt;p&gt;This project was completed and uploaded to this GitHub repository: &lt;a href="https://github.com/nlankwei5/Cloud-Uploader-CLI" rel="noopener noreferrer"&gt;Cloud Uploader CLI&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;Before we dive into the script, let’s talk a bit about Bash. Bash (Bourne Again Shell) is a Unix shell and command language. It’s widely available on various operating systems, especially on Linux. Shell scripting allows you to automate tasks by writing a series of commands in a script file. It’s super powerful for automating system administration tasks, managing file systems, and even developing simple CLI tools.&lt;/p&gt;

&lt;h3&gt;
  
  
  Getting Started with AWS S3
&lt;/h3&gt;

&lt;p&gt;First things first, let's set up an S3 bucket on AWS. If you don't have an AWS account yet, go ahead and create one on the AWS website. Once you're all set, follow these steps to create a bucket:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Log into the AWS Management Console: Head over to the S3 section.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmqpoip8b4ollefxoxvl4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmqpoip8b4ollefxoxvl4.png" alt="Image description" width="800" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a new bucket: Click on the "Create bucket" button. Give your bucket a unique name and choose a region that’s closest to you.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyre0p8g7vn1c0j0127ue.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyre0p8g7vn1c0j0127ue.png" alt="Image description" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Bucket settings: Type the preferred name of the bucket making sure it is unique. You can pretty much go with the default settings for now. Just click through and create the bucket.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs5bllo74acriuzufhlng.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs5bllo74acriuzufhlng.png" alt="Image description" width="800" height="456"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fadgwcjg1bvcb239l069l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fadgwcjg1bvcb239l069l.png" alt="Image description" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Getting Your AWS Access Keys
&lt;/h3&gt;

&lt;p&gt;Now that we have our S3 bucket, we need to get our AWS access keys to allow our Bash script to interact with AWS services. Here’s how:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Navigate to IAM (Identity and Access Management): From the AWS Management Console, go to the IAM section.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fumcuk72hmyvb69lhafwq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fumcuk72hmyvb69lhafwq.png" alt="Image description" width="800" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a new user: Click the orange "Create user" button at the top right, then choose a username.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fms88qrwe6ufzr8n7xoaj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fms88qrwe6ufzr8n7xoaj.png" alt="Image description" width="800" height="457"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft6d7p9oo70bcf7honeqb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft6d7p9oo70bcf7honeqb.png" alt="Image description" width="800" height="459"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Set permissions: Select Attach policy directly and in the search box, search for s3. Select the policy "AmazonS3FullAccess".&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fumj0foun44c6vgep4vfd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fumj0foun44c6vgep4vfd.png" alt="Image description" width="800" height="456"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2ms3xx9849sb1c72h3g0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2ms3xx9849sb1c72h3g0.png" alt="Image description" width="800" height="457"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Afterwards, click on the user you created on the Users page and select the Security credentials tab. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl6nuvjp7rz0ipeslks56.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl6nuvjp7rz0ipeslks56.png" alt="Image description" width="800" height="437"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Scroll down to the Access keys section and create a new access key. &lt;/li&gt;
&lt;li&gt;Download the .csv file containing your access key ID and secret access key. Keep these credentials secure. DON'T SHARE!!!&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6byb5hmrmloqcmgka3tb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6byb5hmrmloqcmgka3tb.png" alt="Image description" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5st457dm1aa0cimxbpv1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5st457dm1aa0cimxbpv1.png" alt="Image description" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Cloud Uploader Script
&lt;/h2&gt;

&lt;p&gt;Alright, let’s get to the fun part – writing our Bash script! &lt;br&gt;
Make sure you have the AWS CLI installed and configured on your system. You can install it on Linux terminal with the following commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt-get update

sudo apt-get install awscli

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After installing the AWS CLI, you need to connect your terminal to your AWS account. You can do that as follows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Run the AWS configure command
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws configure

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Enter your credentials
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;AWS Access Key ID [None]: YOUR_ACCESS_KEY_ID
AWS Secret Access Key [None]: YOUR_SECRET_ACCESS_KEY
Default region name [None]: YOUR_DEFAULT_REGION
Default output format [None]: json

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace YOUR_ACCESS_KEY_ID, YOUR_SECRET_ACCESS_KEY, and YOUR_DEFAULT_REGION with your actual AWS credentials and preferred region.&lt;/p&gt;

&lt;p&gt;That’s it. We are now connected to AWS and can move on to writing our Bash script. &lt;br&gt;
Also install pv. (You will understand its use soon.)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt-get install pv 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;First, create a Bash script file ending in .sh, e.g. clouduploader.sh.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Log Function: Logs events and errors to clouduploader.log for troubleshooting.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;logfile="clouduploader.log"

log() {
    local message="$1"
    echo "$(date +'%Y-%m-%d %H:%M:%S') - $message" &amp;gt;&amp;gt; "$logfile"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;AWS Configuration Check and Confirmation: Ensures AWS CLI is correctly configured and also has access to your S3 bucket.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Read AWS credentials and configuration from environment variables or use defaults
AWS_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID}"
AWS_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY}"
AWS_DEFAULT_REGION="${AWS_DEFAULT_REGION}"
S3_BUCKET="${S3_BUCKET}"

# Function to check S3 configuration
check_s3_configuration() {
    echo "Checking S3 configuration..."
    if aws s3 ls "s3://$S3_BUCKET" &amp;amp;&amp;gt; /dev/null; then
        echo "S3 configuration is set correctly. The bucket $S3_BUCKET is accessible."
    else
        echo "Error: S3 configuration is incorrect or the bucket $S3_BUCKET is not accessible."
        exit 1
    fi
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Argument check: Checks if the arguments passed are exactly 2 ie; the file path and destination path
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;if [ $# -ne 2 ]; then
    echo "Please provide exactly two parameters: the file path and the destination path."
    exit 1
fi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Assign the first and second arguments to the file path and destination path respectively, and extract the filename from the file path
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Assign the first argument and second argument to FILEPATH and S3_DESTINATION respectively
FILEPATH="$1"
S3_DESTINATION="$2"

# Extract the filename from FILEPATH
file_name=$(basename "$FILEPATH")
echo "Filename: $file_name"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;File Check: function to check if the file exists locally using -f
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Function to check if the file exists locally
file_check() {
    if [ -f "$FILEPATH" ]; then
        echo "$file_name exists."
    else
        echo "Error: $file_name does not exist."
        exit 1
    fi
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;File Upload: Uses the AWS CLI to upload the file to the specified S3 destination with a progress bar. Note: pv (pipe viewer), installed earlier, is a command-line tool on Linux that lets us monitor data flowing through a pipe.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Function to upload file to S3 with progress bar using pv
upload_to_s3() {
    local local_file="$1"
    local s3_destination="$2"

    # Upload the file to S3 using AWS CLI and capture output/error
    if pv "$local_file" | aws s3 cp - "$s3_destination"; then
        log "File upload successful: $local_file -&amp;gt; $s3_destination"
        echo "File upload successful: $local_file -&amp;gt; $s3_destination"
    else
        log "Error: Failed to upload file $local_file to $s3_destination"
        echo "Error: Failed to upload file $local_file to $s3_destination"
        exit 1
    fi
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;File Synchronization: Offers options to manage existing files in S3 (overwrite, skip, rename).
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;file_sync() {
    if aws s3 ls "$S3_DESTINATION$file_name"; then
        read -p "File already exists in S3. Overwrite, skip, or rename? [o/s/r]: " choice
        case $choice in
            o)
                upload_to_s3 "$FILEPATH" "$S3_DESTINATION"
                ;;
            s)
                echo "Skipping upload."
                ;;
            r)
                upload_to_s3 "$FILEPATH" "$S3_DESTINATION-renamed"
                ;;
            *)
                echo "Invalid choice. Skipping upload."
                ;;
        esac
    else
        upload_to_s3 "$FILEPATH" "$S3_DESTINATION"
    fi
}

# Perform file synchronization if needed
file_sync
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Finally a function to generate a shareable link
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;presigned_url=$(aws s3 presign "$S3_DESTINATION$file_name" --expires-in 3600)
echo "Shareable link for the uploaded file: $presigned_url"

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Running the Script
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Save the script to the created file, e.g., clouduploader.sh.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Make the script executable:&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;chmod +x clouduploader.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Run the script with the file path and S3 destination as arguments:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;./clouduploader.sh /path/to/your/file s3://your-bucket-name/destination/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;"/path/to/your/file" is your filepath and "s3://your-bucket-name/destination/" is your destination path&lt;/p&gt;

&lt;h4&gt;
  
  
  Advanced Features
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Progress Bar: The script uses pv to show a progress bar during file upload.&lt;/li&gt;
&lt;li&gt;Shareable Link: After a successful upload, the script generates a presigned URL for easy sharing.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Troubleshooting
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Permissions: Ensure your IAM user has the necessary permissions to access and upload to the S3 bucket.&lt;/li&gt;
&lt;li&gt;AWS CLI Configuration: Double-check your AWS CLI configuration and credentials.&lt;/li&gt;
&lt;li&gt;Log File: Check clouduploader.log for detailed logs and error messages.&lt;/li&gt;
&lt;/ul&gt;
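A quick way to narrow down which of these is the problem is to test credentials and bucket access separately. This sketch assumes the AWS CLI is installed and the bucket name is hypothetical:

```shell
#!/bin/bash
# Hypothetical diagnostic sketch: check CLI, credentials, then bucket access.
bucket="your-bucket-name"   # hypothetical bucket name

if ! command -v aws >/dev/null 2>&1; then
    diag="aws CLI not installed"
elif ! aws sts get-caller-identity >/dev/null 2>&1; then
    diag="credentials not configured; run: aws configure"
elif ! aws s3 ls "s3://$bucket" >/dev/null 2>&1; then
    diag="credentials OK, but bucket $bucket is not accessible (check IAM policy)"
else
    diag="credentials and bucket access OK"
fi
echo "$diag"
```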

&lt;h3&gt;
  
  
  Wrapping Up!
&lt;/h3&gt;

&lt;p&gt;You've now got your very own cloud uploader CLI that can upload files to your AWS S3 bucket. This is just the tip of the iceberg! There’s plenty more you can do to enhance this script, so think about adding more features. By following this process you will be able to upload your files to S3 seamlessly from your command line. &lt;/p&gt;

&lt;p&gt;Bash is incredibly powerful, and mastering it can really up your automation game. Whether it's for system administration, file management, or building custom tools, Bash scripting opens up a world of possibilities. For those keen to dive deeper, check out all bash resources online.&lt;/p&gt;

&lt;p&gt;Happy scripting, and keep exploring!&lt;/p&gt;

&lt;p&gt;Enjoy trying it out and I am open for any feedback and ideas. &lt;/p&gt;

</description>
      <category>linux</category>
      <category>bash</category>
      <category>aws</category>
      <category>cli</category>
    </item>
    <item>
      <title>New Journey</title>
      <dc:creator>Ebenezer Lamptey</dc:creator>
      <pubDate>Wed, 26 Jun 2024 13:44:33 +0000</pubDate>
      <link>https://dev.to/nlankwei5/new-journey-oj7</link>
      <guid>https://dev.to/nlankwei5/new-journey-oj7</guid>
      <description>&lt;p&gt;Hi everyone, I'm new here and I am a computer engineering graduate from Ghana Communications Technology University, Accra-Ghana in West Africa. I have always been curious abut the cloud and networking. The path has been a bit hazy for me for the past two years. I got completely confused. Finally a have a guide and a mentor to help out with my journey. I am restarting now hopefully I become a cloud engineer in the next six months. I'll be posting what i learn and share a few concerns and project I am undertaking. Your feedback is welcomed. &lt;/p&gt;

</description>
      <category>cloudcomputing</category>
      <category>aws</category>
      <category>beginners</category>
    </item>
  </channel>
</rss>
