<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Mohammed J Hossain</title>
    <description>The latest articles on DEV Community by Mohammed J Hossain (@mjhossain).</description>
    <link>https://dev.to/mjhossain</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F411882%2F403a5ef4-5473-44f1-985a-3a873ecbab59.JPG</url>
      <title>DEV Community: Mohammed J Hossain</title>
      <link>https://dev.to/mjhossain</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/mjhossain"/>
    <language>en</language>
    <item>
      <title>My Mistake: Learning too fast</title>
      <dc:creator>Mohammed J Hossain</dc:creator>
      <pubDate>Tue, 30 Jan 2024 21:19:57 +0000</pubDate>
      <link>https://dev.to/mjhossain/my-mistake-learning-too-fast-3cd6</link>
      <guid>https://dev.to/mjhossain/my-mistake-learning-too-fast-3cd6</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;Disclaimer 1: The most significant impact I wish for this article to have is on myself.&lt;br&gt;
Disclaimer 2: The purpose of this article is to encourage a thoughtful slowdown for a beneficial reason.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;Backstory&lt;/h2&gt;

&lt;p&gt;I always wanted to be knowledgeable in many different areas and have the qualities of a jack of all trades. I have a lot of respect for people who are very good at one thing; they drive progress in their fields and have contributed greatly to the achievements we enjoy today.&lt;/p&gt;

&lt;p&gt;But knowing that we only have one life motivates me to strive to be a jack of all trades. My drive to discover and acquire as much knowledge as I can stems from the fact that death is inevitable. At first, this method worked for me, but as I moved into my late 20s, I realized that learning at a slower pace could make me more effective. I was anxious to learn as much as I could, as quickly as I could, because I believed that the more I learned, the more I could explore. Reasonable as it sounds, this way of thinking left me understanding many things only on the surface, never reaching a level of comprehension that I found satisfactory, which is a subjective and personal standard.&lt;/p&gt;

&lt;p&gt;So what, in my opinion, makes this "satisfactory level"? It means studying to the point where I can confidently answer questions and mentor others on the subject, giving concise explanations and laying the foundation for the way they learn.&lt;/p&gt;

&lt;h2&gt;The Mistake&lt;/h2&gt;

&lt;p&gt;From the three paragraphs above, you probably already guessed the mistake I am about to talk about: learning too fast. I didn't believe there was such a thing as "learning too fast" until recently, when I turned around to see how I got to where I am. Don't get me wrong, I am very happy with where I am, but it is important to acknowledge one's mistakes and better oneself.&lt;/p&gt;

&lt;p&gt;I just think that if I had slowed down and taken my time learning each topic, or in my case, each programming language, technology, or concept, then I would have achieved more than I have so far. I failed to tell myself that &lt;strong&gt;it's not a sprint but a marathon&lt;/strong&gt;. Going forward, I want to set goals that slow me down when it comes to learning, and I will not fall into the trap of FOMO (fear of missing out). &lt;/p&gt;

&lt;p&gt;Next time you feel like you're not learning fast enough just tell yourself that it's okay, it's okay to take your time to learn. &lt;strong&gt;It's okay to slow down at times.&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>learning</category>
      <category>computerscience</category>
      <category>beginners</category>
      <category>programming</category>
    </item>
    <item>
      <title>A Developer’s Tale: Crafting a CI/CD Pipeline for My Containerized React Portfolio Using GitHub Actions</title>
      <dc:creator>Mohammed J Hossain</dc:creator>
      <pubDate>Sun, 28 Jan 2024 03:28:18 +0000</pubDate>
      <link>https://dev.to/mjhossain/a-developers-tale-crafting-a-cicd-pipeline-for-my-containerized-react-portfolio-using-github-actions-5bjm</link>
      <guid>https://dev.to/mjhossain/a-developers-tale-crafting-a-cicd-pipeline-for-my-containerized-react-portfolio-using-github-actions-5bjm</guid>
      <description>&lt;p&gt;CI/CD was something I never truly understood, even after reading, listening to, and watching various resources on it — until the day I needed to automate my website. I have a static (for now) portfolio website, &lt;a href="http://www.mointech.dev"&gt;www.mointech.dev&lt;/a&gt;, where I make occasional updates, particularly to my attached resume.&lt;/p&gt;

&lt;p&gt;The process used to be:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Make a change.&lt;/li&gt;
&lt;li&gt;Push the code change to GitHub.&lt;/li&gt;
&lt;li&gt;Build the React app.&lt;/li&gt;
&lt;li&gt;Create a Docker image.&lt;/li&gt;
&lt;li&gt;Push it to DockerHub.&lt;/li&gt;
&lt;li&gt;Pull the image on my Raspberry Pi hosting the website.&lt;/li&gt;
&lt;li&gt;Run the container.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This was too many steps for such small changes. Upon further exploration, I came across “CI/CD”, the same mystery I’d been trying to unravel. This time, however, I had a practical application for it, so I delved in deeply, using my best tool: ChatGPT. It outlined the entire process and provided a crucial GitHub Actions YAML file. While deciphering this YAML file, I began to grasp a significant portion of the CI/CD methodology… sometimes in IT, it takes a while to understand a simple concept.&lt;/p&gt;

&lt;p&gt;Before anyone says, &lt;em&gt;“evErYOne cAn do IT uSIng chAtgpT,”&lt;/em&gt; I want to clarify: I used ChatGPT as a tool, exactly how it’s meant to be used. It provided a clarified version of what I had already envisioned. Now that we’ve got that out of the way, let’s continue.&lt;/p&gt;

&lt;p&gt;My goal was simple:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Push changes to the GitHub repo.&lt;/li&gt;
&lt;li&gt;Trigger a CI/CD pipeline in GitHub Actions to create a Docker image.&lt;/li&gt;
&lt;li&gt;Push the Docker image to DockerHub.&lt;/li&gt;
&lt;li&gt;SSH into my web server.&lt;/li&gt;
&lt;li&gt;Pull the Docker image from DockerHub.&lt;/li&gt;
&lt;li&gt;Deploy my portfolio container.&lt;/li&gt;
&lt;/ol&gt;
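&lt;p&gt;To make that plan concrete, here is roughly the shape such a workflow takes. Everything below (the image tag, secret names, and action versions) is a hypothetical sketch of the steps above, not the exact file I ended up with:&lt;/p&gt;

```yaml
# .github/workflows/deploy.yml (hypothetical sketch)
name: build-and-deploy
on:
  push:
    branches: [main]

jobs:
  build-and-push:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4
      # Log in so the built image can be pushed to DockerHub
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      # Build the image and push it in one step
      - uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: mjhossain/portfolio:latest   # placeholder tag

  deploy:
    needs: build-and-push
    runs-on: ubuntu-22.04
    steps:
      # SSH into the web server, pull the new image, restart the container
      - uses: appleboy/ssh-action@v1.0.0
        with:
          host: ${{ secrets.SSH_HOST }}
          username: ${{ secrets.SSH_USER }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          script: |
            docker pull mjhossain/portfolio:latest
            docker rm -f portfolio || true
            docker run -d --name portfolio -p 80:80 mjhossain/portfolio:latest
```

&lt;p&gt;The build half of this sketch is exactly where Issue #3 below eventually bit me.&lt;/p&gt;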

&lt;p&gt;This “simple” goal turned out to be akin to level 1000 of the snake game on the old Nokia 3310 (yeah, I was born in 1997 and played that).&lt;/p&gt;

&lt;p&gt;I faced a lot… I mean A LOT of errors when I tried it. I’ll mention some notable ones and how I approached and solved each issue. In the end, it took me 52 failed GitHub Actions runs to get it right. Buckle up for this ride.&lt;/p&gt;

&lt;h2&gt;Issue #1: Container Does Not Reflect Change&lt;/h2&gt;

&lt;p&gt;This issue had nothing to do with GitHub Actions and more with testing the Docker image, the container, and the connection to Docker Hub. The problem was that every time I changed the code and rebuilt the image, the change wasn’t reflected in the browser. After almost two hours of testing, trying, and cursing, the solution was Ctrl + Shift + R. Yes, you read that right! It was my browser cache that didn’t want to update. This was a wholesome error, to be honest.&lt;/p&gt;

&lt;h2&gt;Issue #2: GitHub Shenanigans&lt;/h2&gt;

&lt;p&gt;This issue was a cluster of many problems. It was my first time using GitHub Actions in a project, so it took me a while to get comfortable with setting secrets, environment variables, SSH keys, and other minor things. One notable issue was that, after building the Docker image in a two-stage job, the Actions VM couldn’t push it because the build output had mysteriously disappeared. Later, I realized it was my own doing: in a cleanup step meant to be ‘rm -rf ./dist/*’, I had accidentally removed the dist directory itself instead of just its contents. Most of the issues I faced with Actions were due to my lack of experience with it.&lt;/p&gt;

&lt;h2&gt;Issue #3: OS Architecture Mismatch&lt;/h2&gt;

&lt;p&gt;This was the most challenging issue, the final boss of this whole adventure, and quite unique compared to the others. For context, my Raspberry Pi (web server) runs Ubuntu 22.04 LTS on the arm64v8 architecture, and the GitHub Actions runner also used Ubuntu 22.04, but on a different CPU architecture: GitHub’s hosted runners are x86_64. The result was that the Docker image built on the GitHub Actions runner wouldn’t run as a container on my RPi. After two days of grappling with this, I decided to ditch the idea of building the image on GitHub’s platform, and I’ll explain how I achieved my goal without tearing my hair out.&lt;br&gt;
When you can’t climb a wall, break that sh*t!&lt;/p&gt;
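&lt;p&gt;For reference, the mismatch is easy to confirm, and a QEMU-backed cross-build is one common workaround that I did not end up taking (the image tag below is a placeholder):&lt;/p&gt;

```shell
#!/bin/sh
# Print the machine architecture on each box; a mismatch such as
# x86_64 (hosted runner) vs aarch64 (the Pi) is the root cause.
ARCH="$(uname -m)"
echo "this machine is: $ARCH"

# One common fix is cross-building for the target platform with buildx
# (needs QEMU emulation set up on the runner); shown but not executed here:
#   docker buildx build --platform linux/arm64 -t mjhossain/portfolio:latest --push .
```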

&lt;p&gt;I tried, and when I say I tried, I really, really tried to solve this platform mismatch issue. I spent hours on it, attempting to do the whole “CI/CD” in the GitHub Actions way. Then, I realized my goal wasn’t to use a specific tool to automate my web app deployment, but to automate my web app deployment, no matter how I achieved it.&lt;/p&gt;

&lt;p&gt;I wrote a bash script to create, push, pull, and deploy the Docker image/container. Things were a bit different, as I needed the git repo on my RPi as well, but in the end, I still used GitHub Actions as a tool to kick-start the automation. Every time I pushed changes to my main branch, I SSH-ed into my web server using GitHub Actions and ran my script that handled all the other steps, consisting of:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Pulling the GitHub Repo.&lt;/li&gt;
&lt;li&gt;Creating the Docker Image.&lt;/li&gt;
&lt;li&gt;Pushing it to DockerHub (no longer needed, but I kept it).&lt;/li&gt;
&lt;li&gt;Deploying the web app Container.&lt;/li&gt;
&lt;/ol&gt;
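&lt;p&gt;A minimal sketch of that script, with placeholder names: the image tag, repo path, and port are assumptions, and it defaults to printing each step instead of running it so the plan can be inspected without touching Docker.&lt;/p&gt;

```shell
#!/bin/sh
# deploy.sh -- hypothetical sketch of the script described above.
set -e

IMAGE="mjhossain/portfolio:latest"       # placeholder DockerHub tag
REPO_DIR="${REPO_DIR:-$HOME/portfolio}"  # placeholder repo location
DRY_RUN="${DRY_RUN:-1}"                  # set DRY_RUN=0 to actually deploy
PLAN=""

run() {
  # Record each step; echo it in dry-run mode, execute it otherwise.
  PLAN="$PLAN$* ; "
  if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi
}

run git -C "$REPO_DIR" pull origin main               # 1. pull the repo
run docker build -t "$IMAGE" "$REPO_DIR"              # 2. build natively (arm64 on the Pi)
run docker push "$IMAGE"                              # 3. optional DockerHub push
run docker rm -f portfolio || true                    #    drop any old container
run docker run -d --name portfolio -p 80:80 "$IMAGE"  # 4. deploy the new one
```

&lt;p&gt;Because the build now happens on the Pi itself, the image is natively arm64 and the architecture mismatch disappears.&lt;/p&gt;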

&lt;p&gt;It was one heck of an adventure, and I learned a lot. I didn’t need to do all this just for a static portfolio website, but I wanted to, just for fun. And what is life without having fun with complex stuff? ;)&lt;/p&gt;

&lt;p&gt;I hope you enjoyed reading about my automation journey. Stay tuned for more!&lt;/p&gt;

</description>
      <category>cicd</category>
      <category>react</category>
      <category>docker</category>
      <category>githubactions</category>
    </item>
    <item>
      <title>From Dream to Server Rack: Unveiling the Epic Journey of Crafting My Homelab Server</title>
      <dc:creator>Mohammed J Hossain</dc:creator>
      <pubDate>Sun, 28 Jan 2024 03:23:29 +0000</pubDate>
      <link>https://dev.to/mjhossain/from-dream-to-server-rack-unveiling-the-epic-journey-of-crafting-my-homelab-server-1hif</link>
      <guid>https://dev.to/mjhossain/from-dream-to-server-rack-unveiling-the-epic-journey-of-crafting-my-homelab-server-1hif</guid>
      <description>&lt;p&gt;As a kid, even the word “server” would give me goosebumps; I never truly understood what a server is. A cold room full of tall black boxes? A giant skyscraper-shaped computer? A huge parking lot-sized room full of computers? The answer I finally got is that it could be as simple as a credit card-sized Raspberry Pi or an arena-sized infrastructure.&lt;/p&gt;

&lt;p&gt;I can’t afford a building’s worth of computers, but a Raspberry Pi? Pi! Maybe! This got me thinking, and for over a year, I pondered what I could do with a server. Host my website? I got Hostinger.&lt;/p&gt;

&lt;p&gt;Then I started digging into why anyone would ever want to have their own server, and I got overstimulated, fascinated, and bamboozled by what people did with their servers or the magic word: HOMELAB. Now I wanted to buy an RPi or maybe two, or three, or four, and a mini PC too, why not?&lt;/p&gt;

&lt;p&gt;After a week of reckless buying, here I am with a Mini PC and 4 Raspberry Pis. NOW WHAT? What do I create? Also, where do I put them all… well, add a 12U server rack as well.&lt;/p&gt;

&lt;p&gt;Here are some of the notable services I am running in my homelab.&lt;/p&gt;

&lt;h2&gt;Automated Media Server — RPi #1&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;Disclaimer: The following section provides information on creating a media platform based on torrents. It is important to note that downloading copyrighted content without proper authorization from the content owner may be illegal in many jurisdictions. This article does not condone or promote any form of piracy or copyright infringement.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The first project I worked on was a fully automated media server using Jellyfin and the *arr suite of apps. Undertaking a project like this taught me the basics of the RPi and its OSes. For this project, though, I wanted something minimal with a clean interface, so I used CasaOS as the platform, which made it very simple to configure and maintain the Docker containers related to the media server.&lt;/p&gt;

&lt;p&gt;It wasn’t the easiest project, as I bumped into lots of issues with storage mounts, user and group permissions in the Linux file system, and plenty of Docker-related problems. Nonetheless, this was one of my favorite RPi projects, and it made managing my media, at home and away, very easy.&lt;/p&gt;

&lt;p&gt;One RPi down, 3 more to go!&lt;/p&gt;

&lt;h2&gt;Proxmox Server — Mini PC&lt;/h2&gt;

&lt;p&gt;After completing the media server, I had 3 more RPis and a Mini PC. I wanted the Mini PC to have a more significant role and basically be the center of my homelab, and then I came across Proxmox, my favorite hypervisor. Over the years, I had tinkered with Type-2 hypervisors such as VirtualBox, Parallels, and VMware, but having a virtualization platform where I can access the host’s desktop/terminal from a web browser is absolutely mind-blowing.&lt;/p&gt;

&lt;p&gt;Setting up Proxmox was fairly simple, and for a tech geek the user interface should feel like home (cluttered). Remember when I said I was hosting my website on Hostinger… well, goodbye Hostinger. I HAVE MY OWN CLOUD NOW!! Yeah, I was ecstatic when I realized I had all the Infinity Stones in the form of PROXMOX! Basically, I created multiple web servers, NAS servers, a cybersecurity homelab, and a sandbox machine, all on my Mini PC with Proxmox… yeah, the Mini PC (32 GB RAM) ain’t so mini anymore.&lt;/p&gt;

&lt;p&gt;My future endeavor is to get another Mini PC and set up a small-business suite (Microsoft AD, MDM, DHCP, and DNS) for testing purposes.&lt;/p&gt;

&lt;h2&gt;NAS Servers — RPi &amp;amp; Proxmox&lt;/h2&gt;

&lt;p&gt;One of the driving forces behind buying 4 RPis was to create multiple NAS servers. I run a wedding videography business where I have to store a lot of files, and they are huge. The issue I was facing was transferring files between my photographers/videographers and my editors. There are services like Google Drive, Dropbox, OneDrive, and many more, but their cost is hard to justify at this scale. Hence, I invested in a good HDD and an RPi to give me my own cloud drive, with access over the internet for whomever I choose. In the end, I used one RPi running OpenMediaVault and two Linux VMs in Proxmox running TrueNAS and another OMV instance. In total, 3 NAS servers in my homelab give me access to my data across all my devices from anywhere.&lt;/p&gt;

&lt;h2&gt;Home VPN Tunnel &amp;amp; Reverse Proxy — RPi&lt;/h2&gt;

&lt;p&gt;You might’ve noticed phrases like “access over the internet” in the sections above; they were a hint at this one. If you know a little basic networking, you’ll know the difference between the public internet and a private network: all of our devices can browse the internet, but the services running on our machines at home are not normally reachable from outside. This is why you can stream a video to your TV when you’re at home, but you cannot do the same from elsewhere.&lt;/p&gt;

&lt;p&gt;I needed a way to access my media server, NAS servers, and Proxmox from outside, and that’s where a VPN comes into play. Normally we use commercial VPNs to bypass location-based restrictions and other trivial things, but a VPN can also be used to tunnel into your homelab and reach all of its services from outside.&lt;/p&gt;

&lt;p&gt;So, I used one of my RPis to run a WireGuard VPN alongside other core services, such as a reverse proxy with NGINX Proxy Manager for secure access to everything behind it.&lt;/p&gt;

&lt;h2&gt;ntfy.sh &amp;amp; Uptime Kuma — RPi&lt;/h2&gt;

&lt;p&gt;Running all the servers and containers mentioned above (and the ones I didn’t mention) involves a lot of monitoring and maintenance, and I have neither the time nor the desire to check on each service every now and then. So, to automate all of this, I use Uptime Kuma to monitor all the running servers and containers, and ntfy to send a notification whenever any of the services goes down.&lt;/p&gt;
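&lt;p&gt;Publishing to ntfy is just an HTTP POST of the message body to a topic URL, which is what makes it so easy to wire into Uptime Kuma or a cron job. A sketch with a placeholder topic name, shown rather than sent:&lt;/p&gt;

```shell
#!/bin/sh
NTFY_URL="https://ntfy.sh/my-homelab-alerts"   # placeholder topic
echo "would POST an alert to: $NTFY_URL"

# The real call (needs network), e.g. fired by a monitoring check:
#   curl -d "Jellyfin is down" "$NTFY_URL"
```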

&lt;h2&gt;Future Plan for my Homelab&lt;/h2&gt;

&lt;p&gt;The future of my homelab is definitely bright, lol. I plan to run my own DNS and DHCP servers in the near future, along with a software-based firewall. Ultimately, I want my homelab to be technologically fun and a place to learn complicated DevOps processes and concepts.&lt;/p&gt;

&lt;p&gt;Thank you for reading through it all, and keep in touch, as I plan to publish guides on the installation processes and general server updates as I move forward!&lt;/p&gt;

</description>
      <category>homelab</category>
      <category>virtualization</category>
      <category>raspberrypi</category>
      <category>homenetwork</category>
    </item>
  </channel>
</rss>
