<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Dharamraj Yadav</title>
    <description>The latest articles on DEV Community by Dharamraj Yadav (@dharam_in).</description>
    <link>https://dev.to/dharam_in</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3783401%2Fce99eeb0-0e47-4ecb-8645-a966077c4605.png</url>
      <title>DEV Community: Dharamraj Yadav</title>
      <link>https://dev.to/dharam_in</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/dharam_in"/>
    <language>en</language>
    <item>
      <title>My First CI/CD Pipeline with YAML and Docker on AWS</title>
      <dc:creator>Dharamraj Yadav</dc:creator>
      <pubDate>Tue, 17 Mar 2026 13:41:32 +0000</pubDate>
      <link>https://dev.to/dharam_in/my-first-cicd-pipeline-with-yaml-and-docker-on-aws-1glh</link>
      <guid>https://dev.to/dharam_in/my-first-cicd-pipeline-with-yaml-and-docker-on-aws-1glh</guid>
      <description>&lt;p&gt;Hello everyone! Today I want to share how I built my first CI/CD pipeline using YAML and Docker on AWS. First, I forked a React project on GitHub, because I didn’t want to spend time scaffolding a new React app from scratch. Then I cloned the repository and added a &lt;code&gt;.dockerignore&lt;/code&gt; file and a &lt;code&gt;Dockerfile&lt;/code&gt; in the project’s root directory; you can see the code in the screenshots:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frngm3wsoruez4kkezexn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frngm3wsoruez4kkezexn.png" alt=" " width="800" height="139"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After that, I created an EC2 instance to run my project and installed Docker on it. The Docker daemon runs as root, so the default &lt;code&gt;ubuntu&lt;/code&gt; user can’t talk to it directly; I added the user to the &lt;code&gt;docker&lt;/code&gt; group so it can run Docker commands without &lt;code&gt;sudo&lt;/code&gt;. Look at the screenshot I attached:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxy7nhm41zcuw6jqoni99.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxy7nhm41zcuw6jqoni99.png" alt=" " width="800" height="337"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After that, I worked on the YAML for CI/CD. I created a &lt;code&gt;.github&lt;/code&gt; folder in the root directory of my project, then a &lt;code&gt;workflows&lt;/code&gt; folder inside it, and in that folder I created a &lt;code&gt;deploy.yml&lt;/code&gt; file. In this file, I first set the workflow name, then added an &lt;code&gt;on&lt;/code&gt; section so the action triggers whenever I push code to the main branch, and finally defined my jobs. The job runs on an Ubuntu runner and checks out the code on that machine. It then copies the code to my EC2 machine, SSHes in, cds into the project, stops the running Docker container, builds a new image, and runs it. When I first pushed the code, my project still wasn’t reachable: by default, an EC2 security group blocks inbound traffic on most ports, so I added an inbound rule so everyone can access it.&lt;/p&gt;

&lt;p&gt;Look at the YAML code screenshot and running project screenshot:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxg76s8q2sdmg23mwcdgo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxg76s8q2sdmg23mwcdgo.png" alt=" " width="800" height="337"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw8wu9g3vupjj0yaq3gbt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw8wu9g3vupjj0yaq3gbt.png" alt=" " width="800" height="437"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>docker</category>
      <category>actionshackathon</category>
      <category>aws</category>
      <category>cicd</category>
    </item>
    <item>
      <title>Learning Docker: Dockerfiles, .dockerignore, and Port Mapping 🐳</title>
      <dc:creator>Dharamraj Yadav</dc:creator>
      <pubDate>Sat, 07 Mar 2026 10:56:14 +0000</pubDate>
      <link>https://dev.to/dharam_in/learning-docker-dockerfiles-dockerignore-and-port-mapping-1g2n</link>
      <guid>https://dev.to/dharam_in/learning-docker-dockerfiles-dockerignore-and-port-mapping-1g2n</guid>
      <description>&lt;p&gt;Hey everyone! Here’s a quick update on what I learned last week. I finally got hands-on with two essential Docker files: the &lt;code&gt;Dockerfile&lt;/code&gt; and &lt;code&gt;.dockerignore&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;First up, let's talk about the &lt;code&gt;.dockerignore&lt;/code&gt; file. If you've used Git, this works exactly like &lt;code&gt;.gitignore&lt;/code&gt;. When we build an image, this file prevents Docker from copying over unnecessary files and folders, which helps keep our final image clean and lightweight.&lt;br&gt;
Here is my setup:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffs6pwfi6e1bjhn77c9cl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffs6pwfi6e1bjhn77c9cl.png" alt=" " width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, I created the &lt;code&gt;Dockerfile&lt;/code&gt;. This is basically the instruction manual for your project. We write down all the necessary commands in here, and Docker uses this file as the blueprint to build our custom image.&lt;br&gt;
Take a look at the code:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F28rlk7gs5kprpii59p7g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F28rlk7gs5kprpii59p7g.png" alt=" " width="800" height="295"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After building the image, it was time to run it. This is where I learned how &lt;strong&gt;port mapping&lt;/strong&gt; works. Port mapping is simply telling the Docker container, "Hey, forward your internal project port to this specific port on my local system."&lt;/p&gt;

&lt;p&gt;In my setup below, I mapped the ports so my project runs on port &lt;code&gt;5173&lt;/code&gt; inside the container, but I can access it in my local browser on port &lt;code&gt;3200&lt;/code&gt;.&lt;br&gt;
Check out the terminal and browser screenshots:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm714g8jdt4k2qsyyjjmn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm714g8jdt4k2qsyyjjmn.png" alt=" " width="800" height="95"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgkqicpp4ifivogmop9m4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgkqicpp4ifivogmop9m4.png" alt=" " width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>docker</category>
      <category>webdev</category>
      <category>programming</category>
      <category>containers</category>
    </item>
    <item>
      <title>Getting Started with Docker: Images, Containers, Volumes, and Networks 🐳</title>
      <dc:creator>Dharamraj Yadav</dc:creator>
      <pubDate>Tue, 24 Feb 2026 11:15:43 +0000</pubDate>
      <link>https://dev.to/dharam_in/getting-started-with-docker-images-containers-volumes-and-networks-2gcn</link>
      <guid>https://dev.to/dharam_in/getting-started-with-docker-images-containers-volumes-and-networks-2gcn</guid>
      <description>&lt;p&gt;Hello everyone! As part of my journey transitioning to DevSecOps, today I dove into Docker.&lt;/p&gt;

&lt;p&gt;In simple terms, Docker is a software platform that lets you build, test, and deploy your applications quickly. Today I learned about four core concepts of Docker, and I want to share my understanding with you all.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Docker Images (The Blueprint)&lt;/strong&gt;&lt;br&gt;
An image is basically the blueprint for a container. We create this blueprint by writing instructions in a file called a &lt;strong&gt;&lt;em&gt;Dockerfile&lt;/em&gt;&lt;/strong&gt;. An image contains everything your application needs to run: the code, runtime, libraries, and environment variables.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Docker Container (The Running App)&lt;/strong&gt;&lt;br&gt;
If an image is a blueprint, then a container is the actual building. A container is simply the running instance of a Docker Image. You can start, stop, or delete a container anytime without affecting the original image.&lt;/p&gt;
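
&lt;p&gt;The basic lifecycle commands look like this (the image and container names are just examples):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;docker run -d --name web nginx   # start a container from an image
docker stop web                  # stop it
docker start web                 # start it again
docker rm -f web                 # delete the container (the nginx image stays)
&lt;/code&gt;&lt;/pre&gt;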

&lt;p&gt;&lt;strong&gt;3. Docker Volumes (Permanent Storage)&lt;/strong&gt;&lt;br&gt;
By default, data inside a container is temporary. This is where volumes come in. Volumes provide permanent storage for your container's data by saving it directly to the host's disk.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Real-World Example:&lt;/strong&gt; Suppose we host a database (like MongoDB) inside a container, and we have 1,000 users registered. If we delete the container, all the user data is lost. To prevent this, we use a volume to safely store that database data permanently on the host machine.&lt;/p&gt;
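
&lt;p&gt;For the MongoDB example above, a named volume would look like this (the volume and container names are just examples):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;docker volume create mongo-data
# /data/db is where MongoDB stores its data inside the container
docker run -d --name mongo -v mongo-data:/data/db mongo
# even after `docker rm -f mongo`, the data survives in the volume
&lt;/code&gt;&lt;/pre&gt;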

&lt;p&gt;&lt;strong&gt;4. Docker Networks (The Communication Bridge)&lt;/strong&gt;&lt;br&gt;
When we create and run multiple containers at the same time, they are completely isolated from each other by default for security.&lt;br&gt;
In real applications, however, they need to communicate. For example, if I have a Node.js backend running in one container and a database in another container, they need to talk to each other. We use Docker networks to create a connection between these isolated containers.&lt;/p&gt;
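
&lt;p&gt;For the Node.js + database example above, the commands would look roughly like this (the names are just examples; containers on the same user-defined network can reach each other by container name):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;docker network create app-net
docker run -d --name mongo --network app-net mongo
docker run -d --name backend --network app-net my-node-app
# inside `backend`, the database is reachable at mongodb://mongo:27017
&lt;/code&gt;&lt;/pre&gt;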

&lt;p&gt;&lt;strong&gt;My Journey Continues&lt;/strong&gt;&lt;br&gt;
Understanding these four concepts completely changed how I look at deployments. No more “&lt;em&gt;it works on my machine&lt;/em&gt;”.&lt;/p&gt;

</description>
      <category>docker</category>
      <category>containers</category>
      <category>beginners</category>
      <category>devops</category>
    </item>
    <item>
      <title>Hello World: My Journey from MERN Stack to DevSecOps 🚀</title>
      <dc:creator>Dharamraj Yadav</dc:creator>
      <pubDate>Sat, 21 Feb 2026 04:13:32 +0000</pubDate>
      <link>https://dev.to/dharam_in/hello-world-my-journey-from-mern-stack-to-devsecops-127m</link>
      <guid>https://dev.to/dharam_in/hello-world-my-journey-from-mern-stack-to-devsecops-127m</guid>
      <description>&lt;p&gt;Hello everyone!&lt;/p&gt;

&lt;p&gt;My name is Dharamraj Yadav, and this is my first post on this platform. I am currently working as a MERN Stack Developer, where I build web applications and craft APIs using Node.js. I have gained solid experience working across the MERN stack.&lt;/p&gt;

&lt;p&gt;Recently, I’ve been observing my seniors handle server management, specifically focusing on reducing AWS infrastructure costs. Watching them work over the last three months really inspired me. I realized that with a bit of effort, I have the confidence to learn these new concepts myself.&lt;/p&gt;

&lt;p&gt;So, I started using AI tools like Gemini to deeply understand how servers, deployments, and cloud infrastructure actually work.&lt;/p&gt;

&lt;p&gt;Now, I have officially decided to start my #LearnInPublic journey—transitioning from a MERN Stack Developer to DevSecOps. ⚙️🛡️&lt;/p&gt;

&lt;p&gt;Please connect with me to follow my journey, and stay tuned for more updates and learnings from the tech field!&lt;/p&gt;

&lt;p&gt;#DevSecOps #MERN #AWS #LearnInPublic #DevOpsJourney&lt;/p&gt;

</description>
      <category>devsecops</category>
      <category>devops</category>
      <category>aws</category>
      <category>mern</category>
    </item>
  </channel>
</rss>
