<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Christiana Otoboh</title>
    <description>The latest articles on DEV Community by Christiana Otoboh (@christiana_otoboh).</description>
    <link>https://dev.to/christiana_otoboh</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2892910%2Fecd81cb9-1b18-4498-b90e-fc3ac1ab9acd.jpg</url>
      <title>DEV Community: Christiana Otoboh</title>
      <link>https://dev.to/christiana_otoboh</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/christiana_otoboh"/>
    <language>en</language>
    <item>
      <title>Deploying a Containerized WordPress App on AWS with Docker, EBS &amp; S3 Backups</title>
      <dc:creator>Christiana Otoboh</dc:creator>
      <pubDate>Fri, 20 Mar 2026 10:07:14 +0000</pubDate>
      <link>https://dev.to/christiana_otoboh/deploying-a-containerized-wordpress-app-on-aws-with-docker-ebs-s3-backups-38if</link>
      <guid>https://dev.to/christiana_otoboh/deploying-a-containerized-wordpress-app-on-aws-with-docker-ebs-s3-backups-38if</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In this project, I deployed a containerized WordPress application on an AWS EC2 instance using Docker. The setup includes a MySQL database, persistent storage with EBS, and automated backups to S3.&lt;br&gt;
The goal wasn't just to get WordPress running; it was to understand how real-world deployments handle data persistence, networking, and automation.&lt;/p&gt;
&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Before following along, make sure you have the following in place:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An active AWS account&lt;/li&gt;
&lt;li&gt;Basic familiarity with the Linux command line&lt;/li&gt;
&lt;li&gt;A key pair created in AWS (needed to SSH into your EC2 instance)&lt;/li&gt;
&lt;li&gt;Basic understanding of what Docker is (you don't need to be an expert)&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; Everything in this project is done on a &lt;strong&gt;free-tier eligible&lt;/strong&gt; EC2 instance.&lt;br&gt;
Just be mindful to stop or terminate resources when you're done to avoid unexpected charges.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h2&gt;
  
  
  Project Overview
&lt;/h2&gt;

&lt;p&gt;Here’s what I built:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A Linux EC2 instance hosted on AWS&lt;/li&gt;
&lt;li&gt;Docker installed and configured using a Bash script&lt;/li&gt;
&lt;li&gt;WordPress and MySQL running as Docker containers&lt;/li&gt;
&lt;li&gt;Persistent storage using an attached EBS volume&lt;/li&gt;
&lt;li&gt;Automated MySQL backups uploaded to an S3 bucket&lt;/li&gt;
&lt;li&gt;Secure access configured via Security Groups&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;
  
  
  ⚙️ Step 1: Provisioning the EC2 Instance and EBS Volume via the AWS Console
&lt;/h3&gt;

&lt;p&gt;With the architecture in mind, the first step was setting up the core infrastructure.&lt;br&gt;
Launching the EC2 Instance&lt;br&gt;
&lt;em&gt;(Not covered in detail here — the process is fairly straightforward. The default configuration suffices, with one exception: make sure to configure the security group as follows.)&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Port 22 → for SSH access&lt;/li&gt;
&lt;li&gt;Port 80 → for web access&lt;/li&gt;
&lt;/ul&gt;
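&lt;p&gt;If you prefer the CLI, the same two rules can be added with &lt;code&gt;aws ec2 authorize-security-group-ingress&lt;/code&gt;. A sketch; the group ID below is a hypothetical placeholder for your own security group:&lt;/p&gt;

```shell
#!/bin/bash
# Open ports 22 (SSH) and 80 (HTTP) on an existing security group.
# SG_ID is a hypothetical placeholder; substitute your own group ID.
SG_ID="sg-0123456789abcdef0"

open_web_and_ssh() {
  for port in 22 80; do
    aws ec2 authorize-security-group-ingress \
      --group-id "$SG_ID" \
      --protocol tcp \
      --port "$port" \
      --cidr 0.0.0.0/0
  done
}
```

&lt;p&gt;In practice, restrict the SSH rule to your own IP (a /32 CIDR) rather than 0.0.0.0/0.&lt;/p&gt;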

&lt;p&gt;Creating the EBS Volume&lt;br&gt;
&lt;em&gt;(Also not covered in detail, but equally straightforward — just ensure the volume is provisioned in the same region as your EC2 instance.)&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Attaching the EBS Volume to the EC2 Instance&lt;br&gt;
Once the volume is created, attach it to the instance by following these steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Select the EC2 instance in the AWS Console.&lt;/li&gt;
&lt;li&gt;Click Actions, then navigate to Storage → Attach Volume.&lt;/li&gt;
&lt;li&gt;From the dropdown, select the volume you just created.&lt;/li&gt;
&lt;li&gt;Choose a device name from the dropdown — any available option works.&lt;/li&gt;
&lt;li&gt;Click Attach Volume to confirm.&lt;/li&gt;
&lt;/ol&gt;
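&lt;p&gt;The same attachment can be scripted with the AWS CLI. A sketch; both IDs below are hypothetical placeholders for your own volume and instance:&lt;/p&gt;

```shell
#!/bin/bash
# Attach an existing EBS volume to an EC2 instance.
# VOLUME_ID and INSTANCE_ID are hypothetical placeholders.
VOLUME_ID="vol-0123456789abcdef0"
INSTANCE_ID="i-0123456789abcdef0"

attach_ebs_volume() {
  aws ec2 attach-volume \
    --volume-id "$VOLUME_ID" \
    --instance-id "$INSTANCE_ID" \
    --device /dev/sdf   # console device name; Nitro instances expose it as /dev/nvme1n1
}
```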

&lt;p&gt;I then connected to the instance via SSH and handled everything else from the terminal.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fntcnvewaccze64el3kox.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fntcnvewaccze64el3kox.png" alt=" " width="800" height="167"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  🐳 Step 2: Installing Docker &amp;amp; Setting up Configurations
&lt;/h3&gt;

&lt;p&gt;Using the Bash script below, I automated the entire environment setup. Specifically, the script:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Installed Docker and its plugins, including Docker Compose&lt;/li&gt;
&lt;li&gt;Started and enabled the Docker service&lt;/li&gt;
&lt;li&gt;Added the ubuntu user to the docker group, allowing Docker commands to run without sudo&lt;/li&gt;
&lt;li&gt;Mounted the EBS volume to the filesystem&lt;/li&gt;
&lt;li&gt;Used a bind mount so MySQL writes data directly to the EBS volume, ensuring data survives container restarts and is not tied to the container lifecycle&lt;/li&gt;
&lt;li&gt;Installed the AWS CLI in preparation for the S3 backup process&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The goal of this script was to handle everything in one pass: Docker installation, EBS volume mounting, and AWS CLI setup, so the instance is fully ready before any containers are launched.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;
&lt;span class="nb"&gt;set&lt;/span&gt; &lt;span class="nt"&gt;-e&lt;/span&gt;

&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"============================================================"&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Provisioning script is now running"&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"============================================================"&lt;/span&gt;

&lt;span class="c"&gt;# Status of docker on the server&lt;/span&gt;

&lt;span class="k"&gt;if  &lt;/span&gt;&lt;span class="nb"&gt;command&lt;/span&gt; &lt;span class="nt"&gt;-v&lt;/span&gt; docker &amp;amp;&amp;gt;/dev/null &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="nb"&gt;sudo &lt;/span&gt;systemctl is-active &lt;span class="nt"&gt;--quiet&lt;/span&gt; docker&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
  &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"docker &lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;docker &lt;span class="nt"&gt;--version&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt; is installed and running. Skip docker installation"&lt;/span&gt;
&lt;span class="k"&gt;else
  &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Docker is not installed, installing docker ................"&lt;/span&gt;

  &lt;span class="c"&gt;#Update packages&lt;/span&gt;
    &lt;span class="nb"&gt;sudo &lt;/span&gt;apt update
    &lt;span class="nb"&gt;sudo &lt;/span&gt;apt &lt;span class="nb"&gt;install &lt;/span&gt;ca-certificates curl gnupg lsb-release &lt;span class="nt"&gt;-y&lt;/span&gt;

    &lt;span class="c"&gt;#Add Docker's official GPG key&lt;/span&gt;
    &lt;span class="nb"&gt;sudo install&lt;/span&gt; &lt;span class="nt"&gt;-m&lt;/span&gt; 0755 &lt;span class="nt"&gt;-d&lt;/span&gt; /etc/apt/keyrings
    curl &lt;span class="nt"&gt;-fsSL&lt;/span&gt; https://download.docker.com/linux/ubuntu/gpg | &lt;span class="nb"&gt;sudo &lt;/span&gt;gpg &lt;span class="nt"&gt;--dearmor&lt;/span&gt; &lt;span class="nt"&gt;-o&lt;/span&gt; /etc/apt/keyrings/docker.gpg
    &lt;span class="nb"&gt;sudo chmod &lt;/span&gt;a+r /etc/apt/keyrings/docker.gpg

    &lt;span class="c"&gt;# Set up the Docker apt repository&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
      &lt;span class="s2"&gt;"deb [arch=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;dpkg &lt;span class="nt"&gt;--print-architecture&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt; signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu &lt;/span&gt;&lt;span class="se"&gt;\&lt;/span&gt;&lt;span class="s2"&gt;
      &lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;lsb_release &lt;span class="nt"&gt;-cs&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt; stable"&lt;/span&gt; | &lt;span class="nb"&gt;sudo tee&lt;/span&gt; /etc/apt/sources.list.d/docker.list &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; /dev/null

    &lt;span class="c"&gt;#Install the Docker Engine packages&lt;/span&gt;
    &lt;span class="nb"&gt;sudo &lt;/span&gt;apt update
    &lt;span class="nb"&gt;sudo &lt;/span&gt;apt-get &lt;span class="nb"&gt;install &lt;/span&gt;docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin &lt;span class="nt"&gt;-y&lt;/span&gt;

    &lt;span class="c"&gt;# Add dcoker user to docker group (so no sudo needed)&lt;/span&gt;
    &lt;span class="nb"&gt;sudo &lt;/span&gt;usermod &lt;span class="nt"&gt;-aG&lt;/span&gt; docker ubuntu

      &lt;span class="c"&gt;# Enable and start Docker&lt;/span&gt;
    &lt;span class="nb"&gt;sudo &lt;/span&gt;systemctl &lt;span class="nb"&gt;enable &lt;/span&gt;docker
    &lt;span class="nb"&gt;sudo &lt;/span&gt;systemctl start docker
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"docker &lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;docker &lt;span class="nt"&gt;-v&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt; has been installed......"&lt;/span&gt;
&lt;span class="k"&gt;fi&lt;/span&gt;



&lt;span class="c"&gt;# Install aws cli&lt;/span&gt;

&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"============================================"&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Installing aws cli"&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"============================================"&lt;/span&gt;

&lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="nb"&gt;command&lt;/span&gt; &lt;span class="nt"&gt;-v&lt;/span&gt; aws &amp;amp;&amp;gt;/dev/null&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
  &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"aws CLI &lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;aws &lt;span class="nt"&gt;--version&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt; is installed"&lt;/span&gt;
&lt;span class="k"&gt;else&lt;/span&gt;

  &lt;span class="c"&gt;# Download the official AWS CLI v2 installer. I am using Ubuntu&lt;/span&gt;
  curl &lt;span class="s2"&gt;"https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip"&lt;/span&gt; &lt;span class="nt"&gt;-o&lt;/span&gt; &lt;span class="s2"&gt;"/tmp/awscliv2.zip"&lt;/span&gt;

  &lt;span class="c"&gt;# Unzip it&lt;/span&gt;
  &lt;span class="nb"&gt;sudo &lt;/span&gt;apt &lt;span class="nb"&gt;install &lt;/span&gt;unzip &lt;span class="nt"&gt;-y&lt;/span&gt;
  unzip /tmp/awscliv2.zip &lt;span class="nt"&gt;-d&lt;/span&gt; /tmp

  &lt;span class="c"&gt;# Run the installer&lt;/span&gt;
  &lt;span class="nb"&gt;sudo&lt;/span&gt; /tmp/aws/install

  &lt;span class="c"&gt;# Verify installation&lt;/span&gt;
  aws &lt;span class="nt"&gt;--version&lt;/span&gt;

  &lt;span class="c"&gt;# Cleanup&lt;/span&gt;
  &lt;span class="nb"&gt;rm&lt;/span&gt; &lt;span class="nt"&gt;-rf&lt;/span&gt; /tmp/awscliv2.zip /tmp/aws
&lt;span class="k"&gt;fi&lt;/span&gt;

&lt;span class="c"&gt;# Format the EBS volume and make it a filesystem &lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"==========================================="&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Mounting EBS Volume"&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"==========================================="&lt;/span&gt;

&lt;span class="k"&gt;if &lt;/span&gt;mountpoint &lt;span class="nt"&gt;-q&lt;/span&gt; /mnt/ebs&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
  &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"EBS Volume is mounted"&lt;/span&gt;
&lt;span class="k"&gt;else
  &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"EBS Volume is not mounted.... Mounting EBS Volume"&lt;/span&gt;

  &lt;span class="c"&gt;#Format the EBS volume&lt;/span&gt;
  &lt;span class="nb"&gt;sudo &lt;/span&gt;mkfs &lt;span class="nt"&gt;-t&lt;/span&gt; ext4 /dev/nvme1n1

  &lt;span class="c"&gt;#Create a mount point where the EBS Volume will appear&lt;/span&gt;
  &lt;span class="nb"&gt;sudo mkdir&lt;/span&gt; &lt;span class="nt"&gt;-p&lt;/span&gt; /mnt/ebs

  &lt;span class="c"&gt;#Mount the volume to the mount point (Folder), add the fstan to make the mount permanent (survives reboots)&lt;/span&gt;
  &lt;span class="nb"&gt;sudo &lt;/span&gt;mount /dev/nvme1n1 /mnt/ebs
  &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s1"&gt;'/dev/nvme1n1 /mnt/ebs ext4 defaults,nofail 0 2'&lt;/span&gt; | &lt;span class="nb"&gt;sudo tee&lt;/span&gt; &lt;span class="nt"&gt;-a&lt;/span&gt; /etc/fstab
&lt;span class="k"&gt;fi&lt;/span&gt;



&lt;span class="c"&gt;# Create a folder on the mounted EBS volume for mysql to write to&lt;/span&gt;
&lt;span class="nb"&gt;sudo mkdir&lt;/span&gt; &lt;span class="nt"&gt;-p&lt;/span&gt; /mnt/ebs/mysql-data

&lt;span class="c"&gt;# mysql runs as a user with UID:999,  grant it ownership to the volume directory so that it can write into it&lt;/span&gt;
&lt;span class="nb"&gt;sudo chown&lt;/span&gt; &lt;span class="nt"&gt;-R&lt;/span&gt; 999:999 /mnt/ebs/mysql-data

&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Provisoning Script completed!"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Running the Script
&lt;/h4&gt;

&lt;p&gt;Before executing, make the script executable by running:&lt;br&gt;
 &lt;code&gt;chmod +x scriptname.sh&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Then execute it:&lt;br&gt;
&lt;code&gt;./scriptname.sh&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7h235imd1jlprqsxd4oj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7h235imd1jlprqsxd4oj.png" alt=" " width="782" height="237"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the script has finished running, verify that everything was set up correctly. These checks are not strictly necessary if the script runs successfully, but they are worth doing, especially if you are just starting out, as they help build confidence before moving on.&lt;/p&gt;
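&lt;p&gt;The individual checks below can also be wrapped in one small script that reports the status of each item instead of aborting on the first failure. A sketch (the &lt;code&gt;/mnt/ebs&lt;/code&gt; path matches the provisioning script above):&lt;/p&gt;

```shell
#!/bin/bash
# Post-provisioning sanity checks: reports the status of each item
# rather than exiting on the first failure.

post_provision_check() {
  for tool in docker aws; do
    if command -v "$tool" >/dev/null; then
      echo "OK: $tool is installed ($("$tool" --version 2>/dev/null | head -n1))"
    else
      echo "MISSING: $tool is not installed"
    fi
  done

  if mountpoint -q /mnt/ebs; then
    echo "OK: /mnt/ebs is mounted"
  else
    echo "MISSING: /mnt/ebs is not mounted"
  fi
}

post_provision_check
```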

&lt;p&gt;&lt;strong&gt;Confirm Docker is installed:&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;docker --version&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F94h8blr23m489aassvz0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F94h8blr23m489aassvz0.png" alt=" " width="800" height="57"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Next, confirm the Docker daemon is accessible:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;docker ps&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;On first run, &lt;code&gt;docker ps&lt;/code&gt; might return the following error:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This happens because the new docker group membership only takes effect on a fresh login. Log out of your server and SSH back in, then run &lt;code&gt;docker ps&lt;/code&gt; again; the error should be resolved.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmenh1gedqvwz9jsnf8p4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmenh1gedqvwz9jsnf8p4.png" alt=" " width="717" height="146"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Confirm the EBS volume is mounted:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F02ib5gt7rgdjxj275fys.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F02ib5gt7rgdjxj275fys.png" alt=" " width="603" height="292"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Confirm the AWS CLI is installed:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhpk88vn1hsxl06p7bbxs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhpk88vn1hsxl06p7bbxs.png" alt=" " width="733" height="166"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  ▶️ Step 3: Getting the Containers Running
&lt;/h3&gt;

&lt;p&gt;I defined the services using a Docker Compose file to run:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A WordPress container (frontend)&lt;/li&gt;
&lt;li&gt;A MySQL container (database)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Both containers communicate over Docker's default network. I also ensured that the MySQL volume was mapped to the mounted EBS volume, so all database data is written to persistent storage.&lt;/p&gt;

&lt;p&gt;For security reasons, I stored the database credentials in a &lt;code&gt;.env&lt;/code&gt; file and referenced them as variables inside the Docker Compose file.&lt;br&gt;
&lt;/p&gt;
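&lt;p&gt;The &lt;code&gt;.env&lt;/code&gt; file sits next to the Compose file. A minimal sketch; the variable names match those referenced in the Compose file below, but the values here are placeholders, so substitute your own strong passwords:&lt;/p&gt;

```shell
# .env: placeholder values, replace them with your own secrets.
# Keep this file out of version control (add it to .gitignore).
MYSQL_ROOT_PASSWORD=change-me-root
MYSQL_DATABASE=wordpress
MYSQL_USER=wp_user
MYSQL_PASSWORD=change-me-user
```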

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;services&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;db&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;mysql:8.0&lt;/span&gt;
    &lt;span class="na"&gt;restart&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;unless-stopped&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;MYSQL_ROOT_PASSWORD&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${MYSQL_ROOT_PASSWORD}&lt;/span&gt;
      &lt;span class="na"&gt;MYSQL_DATABASE&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${MYSQL_DATABASE}&lt;/span&gt;
      &lt;span class="na"&gt;MYSQL_USER&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${MYSQL_USER}&lt;/span&gt;
      &lt;span class="na"&gt;MYSQL_PASSWORD&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${MYSQL_PASSWORD}&lt;/span&gt;
    &lt;span class="na"&gt;volumes&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;/mnt/ebs/mysql-data:/var/lib/mysql&lt;/span&gt;
  &lt;span class="na"&gt;wordpress&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;depends_on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;db&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;wordpress:php8.2-apache&lt;/span&gt; 
    &lt;span class="na"&gt;ports&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;80:80"&lt;/span&gt;

    &lt;span class="na"&gt;restart&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;unless-stopped&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;WORDPRESS_DB_HOST&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;db:3306&lt;/span&gt;                                                  
      &lt;span class="na"&gt;WORDPRESS_DB_USER&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${MYSQL_USER}&lt;/span&gt;                                       
      &lt;span class="na"&gt;WORDPRESS_DB_PASSWORD&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${MYSQL_PASSWORD}&lt;/span&gt;                                      
      &lt;span class="na"&gt;WORDPRESS_DB_NAME&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${MYSQL_DATABASE}&lt;/span&gt;
&lt;span class="na"&gt;networks&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;default&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;{}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I then started the containers by running:&lt;br&gt;
&lt;code&gt;docker compose up -d&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;To confirm the containers were created and running:&lt;br&gt;
&lt;code&gt;docker ps&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmnq0n0smwnpa6mev1vf8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmnq0n0smwnpa6mev1vf8.png" alt=" " width="800" height="85"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If successful, your WordPress application should now be accessible via the EC2 instance's public IP address. Make sure the URL starts with http and not https: only port 80 was configured, not port 443, so using https will make the site unreachable.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhjmycdyrmly4gqnh31a7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhjmycdyrmly4gqnh31a7.png" alt=" " width="800" height="325"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  ☁️ Step 4: Automating Backups to S3
&lt;/h3&gt;

&lt;p&gt;To improve reliability, I implemented a backup strategy that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Exports the MySQL database using mysqldump&lt;/li&gt;
&lt;li&gt;Compresses the backup file&lt;/li&gt;
&lt;li&gt;Uploads it to an S3 bucket using the AWS CLI&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This ensures data can be restored even if the instance fails.&lt;/p&gt;

&lt;p&gt;Prerequisites: S3 Bucket &amp;amp; IAM Role&lt;br&gt;
Before creating the backup script, I set up two things:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Created an S3 bucket to store the backups.&lt;/li&gt;
&lt;li&gt;Created an IAM role for authentication, which I attached to the EC2 instance. This allows the instance to upload to S3 without needing to hardcode credentials — I find this the easiest and most secure approach.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;When creating the IAM role, make sure:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Trusted entity type is set to AWS Service&lt;/li&gt;
&lt;li&gt;Use case is set to EC2&lt;/li&gt;
&lt;li&gt;The role is granted full S3 access or a custom S3 bucket policy — without this, the upload will fail.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To attach the IAM role to the instance, select the EC2 instance, go to Actions → Security → Modify IAM Role, select the role you created from the dropdown, and save.&lt;/p&gt;
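&lt;p&gt;For reference, the console steps above map to a handful of CLI calls. A sketch, assuming a hypothetical role name; the &lt;code&gt;AmazonS3FullAccess&lt;/code&gt; managed policy corresponds to the "full S3 access" option, and scoping it down to one bucket is the better choice for production:&lt;/p&gt;

```shell
#!/bin/bash
# Create an EC2-assumable role with S3 access and wrap it in an
# instance profile. ROLE_NAME is a hypothetical placeholder.
ROLE_NAME="wordpress-backup-role"

create_backup_role() {
  # Trust policy: allow the EC2 service to assume this role
  aws iam create-role \
    --role-name "$ROLE_NAME" \
    --assume-role-policy-document '{
      "Version": "2012-10-17",
      "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole"
      }]
    }'

  # Grant S3 access (scope this to one bucket in production)
  aws iam attach-role-policy \
    --role-name "$ROLE_NAME" \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess

  # EC2 attaches roles via an instance profile
  aws iam create-instance-profile --instance-profile-name "$ROLE_NAME"
  aws iam add-role-to-instance-profile \
    --instance-profile-name "$ROLE_NAME" \
    --role-name "$ROLE_NAME"
}
```

&lt;p&gt;Attaching the resulting profile to the instance then works exactly as described above via the console.&lt;/p&gt;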

&lt;p&gt;&lt;strong&gt;The Backup Script&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;

&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"========================================"&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt;        &lt;span class="s2"&gt;"Starting backup.sh script"&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"========================================"&lt;/span&gt;

&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Setting up environment variables"&lt;/span&gt;

&lt;span class="nb"&gt;set&lt;/span&gt; &lt;span class="nt"&gt;-e&lt;/span&gt;        &lt;span class="c"&gt;# stop if anything fails&lt;/span&gt;
&lt;span class="nb"&gt;set&lt;/span&gt; &lt;span class="nt"&gt;-a&lt;/span&gt;        &lt;span class="c"&gt;# start exporting variables&lt;/span&gt;
&lt;span class="nb"&gt;source&lt;/span&gt; .env   &lt;span class="c"&gt;# load variables from .env&lt;/span&gt;
&lt;span class="nb"&gt;set&lt;/span&gt; +a        &lt;span class="c"&gt;# stop exporting&lt;/span&gt;

&lt;span class="nv"&gt;DB_CONTAINER&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"ubuntu-db-1"&lt;/span&gt;
&lt;span class="nv"&gt;MYSQL_USER&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;MYSQL_USER&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;                &lt;span class="c"&gt;# mysql DB user&lt;/span&gt;
&lt;span class="nv"&gt;MYSQL_DATABASE&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;MYSQL_DATABASE&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;        &lt;span class="c"&gt;# mysql DB&lt;/span&gt;
&lt;span class="nv"&gt;MYSQL_PASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;MYSQL_PASSWORD&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;        &lt;span class="c"&gt;# mysql Password&lt;/span&gt;
&lt;span class="nv"&gt;S3_BUCKET&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"s3://sca-wordpress-demo"&lt;/span&gt;     &lt;span class="c"&gt;# s3 bucket name&lt;/span&gt;

&lt;span class="c"&gt;# Create local Repository first to store the backup before uploading to s3&lt;/span&gt;
&lt;span class="nv"&gt;TIMESTAMP&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;date&lt;/span&gt; +%Y%m%d_%H%M%S&lt;span class="si"&gt;)&lt;/span&gt;
&lt;span class="nv"&gt;BACKUP_FILE&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"mysql-backup-&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;TIMESTAMP&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.sql.gz"&lt;/span&gt; 
&lt;span class="nv"&gt;TEMP_DIR&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"/tmp"&lt;/span&gt;
&lt;span class="nv"&gt;FULL_BACKUP_PATH&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$TEMP_DIR&lt;/span&gt;&lt;span class="s2"&gt;/&lt;/span&gt;&lt;span class="nv"&gt;$BACKUP_FILE&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;



&lt;span class="c"&gt;# Run the docker exec to run the mysql dump&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"================================================="&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Taking the mysql backup locally"&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"================================================"&lt;/span&gt;

&lt;span class="k"&gt;if 
&lt;/span&gt;docker &lt;span class="nb"&gt;exec&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$DB_CONTAINER&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; sh &lt;span class="nt"&gt;-c&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
 &lt;span class="s2"&gt;"exec mysqldump &lt;/span&gt;&lt;span class="se"&gt;\&lt;/span&gt;&lt;span class="s2"&gt;
 --single-transaction --set-gtid-purged=OFF  --no-tablespaces  &lt;/span&gt;&lt;span class="se"&gt;\&lt;/span&gt;&lt;span class="s2"&gt;
 -u&lt;/span&gt;&lt;span class="nv"&gt;$MYSQL_USER&lt;/span&gt;&lt;span class="s2"&gt; -p&lt;/span&gt;&lt;span class="nv"&gt;$MYSQL_PASSWORD&lt;/span&gt;&lt;span class="s2"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$MYSQL_DATABASE&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; | &lt;span class="nb"&gt;gzip&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$FULL_BACKUP_PATH&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
 &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Dump taken successfully"&lt;/span&gt;
&lt;span class="k"&gt;else
&lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Dump not successful"&lt;/span&gt;
&lt;span class="nb"&gt;exit &lt;/span&gt;1
&lt;span class="k"&gt;fi&lt;/span&gt;


&lt;span class="c"&gt;# Upload to s3 bucket&lt;/span&gt;
&lt;span class="c"&gt;# I created an IAM role with an s3 bucket access, attached it to the EC2 Instance. This takes care of verification&lt;/span&gt;

&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"================================================"&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt;                &lt;span class="s2"&gt;"Uploading to s3 bucket"&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"================================================"&lt;/span&gt;

&lt;span class="k"&gt;if &lt;/span&gt;aws s3 &lt;span class="nb"&gt;cp&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$FULL_BACKUP_PATH&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$S3_BUCKET&lt;/span&gt;&lt;span class="s2"&gt;/"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
&lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Upload to s3 was completed succesfully"&lt;/span&gt;

&lt;span class="c"&gt;#Confirm if the backup file is in s3 bucket&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;aws s3 &lt;span class="nb"&gt;ls&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$S3_BUCKET&lt;/span&gt;&lt;span class="s2"&gt;/&lt;/span&gt;&lt;span class="nv"&gt;$BACKUP_FILE&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &amp;amp;&amp;gt;/dev/null&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
        &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$BACKUP_FILE&lt;/span&gt;&lt;span class="s2"&gt; is in &lt;/span&gt;&lt;span class="nv"&gt;$S3_BUCKET&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
    &lt;span class="k"&gt;else
        &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Upload seemed to succeed but file not found in bucket"&lt;/span&gt;
        &lt;span class="nb"&gt;exit &lt;/span&gt;1
    &lt;span class="k"&gt;fi    
else 
  &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Upload FAILED"&lt;/span&gt;
    &lt;span class="nb"&gt;exit &lt;/span&gt;1
&lt;span class="k"&gt;fi


&lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"=================================================="&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;" Upload completed"&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"=================================================="&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Note: Replace the S3 bucket name with your own bucket name before running the script.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Make the script executable:&lt;br&gt;
&lt;code&gt;chmod +x backup.sh&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Then run it:&lt;br&gt;
&lt;code&gt;./backup.sh&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fms9lxddawt5wa28do9fa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fms9lxddawt5wa28do9fa.png" alt=" " width="800" height="343"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Confirm your backup was uploaded successfully:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fafrobhk8mx8cipivskjv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fafrobhk8mx8cipivskjv.png" alt=" " width="800" height="285"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  📌 Key Takeaways
&lt;/h3&gt;

&lt;p&gt;From this project, I learned:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How WordPress and MySQL communicate across containers using Docker's default network&lt;/li&gt;
&lt;li&gt;Why databases need persistent storage and how a bind mount to an EBS volume keeps data alive beyond the container lifecycle&lt;/li&gt;
&lt;li&gt;How to automate an entire server setup (Docker installation, volume mounting, and CLI configuration) with a single Bash script&lt;/li&gt;
&lt;li&gt;How to implement a real backup strategy using &lt;code&gt;mysqldump&lt;/code&gt;, compression, 
and S3 uploads, authenticated securely via an IAM role&lt;/li&gt;
&lt;li&gt;How Security Groups act as a firewall, and why only exposing the ports 
you actually need matters&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;This project gave me hands-on experience with deploying a real-world application using cloud and container technologies. More importantly, it helped me understand the why behind key concepts like persistence, networking, and security.&lt;/p&gt;

&lt;p&gt;If you're learning DevOps or Cloud Engineering, I highly recommend building something like this; the lessons stick much better when things break and you fix them yourself.&lt;/p&gt;

&lt;p&gt;💬 Feel free to share feedback or suggestions.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>docker</category>
      <category>tutorial</category>
      <category>wordpress</category>
    </item>
    <item>
      <title>How I Fixed the “Large Files Detected” Error When Pushing a Terraform Project to GitHub</title>
      <dc:creator>Christiana Otoboh</dc:creator>
      <pubDate>Sat, 22 Nov 2025 18:44:09 +0000</pubDate>
      <link>https://dev.to/christiana_otoboh/how-i-fixed-the-large-files-detected-error-when-pushing-a-terraform-project-to-github-3581</link>
      <guid>https://dev.to/christiana_otoboh/how-i-fixed-the-large-files-detected-error-when-pushing-a-terraform-project-to-github-3581</guid>
      <description>&lt;p&gt;When working with Terraform, you may run into this GitHub error:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fisqdho4m8fp4mn7ca8tc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fisqdho4m8fp4mn7ca8tc.png" alt=" " width="800" height="196"&gt;&lt;/a&gt;&lt;br&gt;
This message means GitHub is blocking your push because one or more files in your repository exceed its 100MB upload limit.&lt;/p&gt;

&lt;p&gt;This error is common among Terraform beginners and even experienced engineers because of how Terraform organizes provider binaries. GitHub rejects any file larger than 100MB, and Terraform provider binaries typically weigh between 200MB and 500MB.&lt;/p&gt;

&lt;p&gt;In this article, I’ll walk you through exactly why it happens, how I fixed it, and how you can prevent it from ever happening again.&lt;/p&gt;
&lt;h2&gt;
  
  
  Why This Happens
&lt;/h2&gt;

&lt;p&gt;Terraform downloads provider binaries into:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;.terraform/&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;If you run:&lt;br&gt;
&lt;code&gt;git add .&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;before adding .terraform to .gitignore, Git starts tracking these massive files.&lt;/p&gt;

&lt;p&gt;Even if you delete the folder later, the large files remain in your Git history, and GitHub scans the entire history during a push.&lt;br&gt;
If any file is over 100MB, GitHub blocks the push.&lt;/p&gt;
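&lt;p&gt;You can see this for yourself in a throwaway repository: even after a file is deleted, its blob is still reachable from an old commit. A minimal demo (the repo and file name are made up purely for illustration):&lt;/p&gt;

```shell
#!/bin/bash
# Hedged demo: a deleted file still lives in Git history.
set -e
repo="$(mktemp -d)"
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

# Commit a stand-in for a large provider binary, then delete it
mkdir .terraform
echo "pretend this is a 500MB provider binary" > .terraform/fake-provider
git add .
git commit -qm "add provider binary"
git rm -r -q .terraform
git commit -qm "remove provider binary"

# The file is gone from the worktree, but its blob is still in history
git rev-list --objects --all | grep fake-provider
```

&lt;p&gt;This is exactly why ignoring the folder after the fact is not enough; the history itself has to be rewritten.&lt;/p&gt;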
&lt;h2&gt;
  
  
  Step 1: Add .terraform to .gitignore
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh6icyp2uwpebgfaxxp2g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh6icyp2uwpebgfaxxp2g.png" alt=" " width="472" height="262"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git add .gitignore
git commit -m "Ignore Terraform files"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 2: Remove .terraform From Git History
&lt;/h2&gt;

&lt;p&gt;Even after ignoring the folder, GitHub still rejects the push because the massive files are stored in old commits. To fix this, we use a dedicated tool: &lt;br&gt;
&lt;strong&gt;git-filter-repo&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;git filter-repo is a powerful and versatile tool for rewriting Git repository history. It is a Python script designed as a faster, more capable, and more user-friendly alternative to git filter-branch.&lt;/p&gt;

&lt;p&gt;To install git-filter-repo on my machine (WSL on Windows), I used the following commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt update 
sudo apt install git-filter-repo   
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To confirm the installation was successful, run:&lt;br&gt;
&lt;code&gt;git filter-repo --version&lt;/code&gt;&lt;br&gt;
It should print a version number like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyt9f4azv93fy2557gz4o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyt9f4azv93fy2557gz4o.png" alt=" " width="800" height="33"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then clean your history:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;git filter-repo --force --path .terraform/ --invert-paths&lt;br&gt;
&lt;/code&gt;&lt;br&gt;
This removes all .terraform files from all past commits.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 3: Force Push the Cleaned Repo
&lt;/h2&gt;

&lt;p&gt;Because the commit history has changed, you need to force-push to upload the cleaned repository to GitHub without the large files:&lt;br&gt;
&lt;code&gt;git push --force&lt;br&gt;
&lt;/code&gt;&lt;br&gt;
This takes care of the large file error. &lt;/p&gt;

&lt;h2&gt;
  
  
  Terraform Git Best Practices
&lt;/h2&gt;

&lt;p&gt;Never commit:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;.terraform/&lt;/li&gt;
&lt;li&gt;terraform.tfstate&lt;/li&gt;
&lt;li&gt;terraform.tfstate.backup&lt;/li&gt;
&lt;li&gt;provider binaries&lt;/li&gt;
&lt;/ul&gt;
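&lt;p&gt;A minimal .gitignore covering these might look like the following (the exact pattern set is a suggestion; adjust it to your project):&lt;/p&gt;

```plaintext
# Local Terraform working directory (provider binaries, modules)
.terraform/

# State files can contain secrets and get large
*.tfstate
*.tfstate.*

# Crash logs
crash.log
```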

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;This error is very common for Terraform beginners and even experienced engineers.&lt;br&gt;
The fix is straightforward:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Ignore .terraform&lt;/li&gt;
&lt;li&gt;Clean the Git history with git filter-repo&lt;/li&gt;
&lt;li&gt;Force-push the cleaned repository&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Taking the time to clean your repo ensures you maintain a lightweight, secure, and professional Terraform project.&lt;/p&gt;

</description>
      <category>terraform</category>
      <category>beginners</category>
      <category>github</category>
      <category>devops</category>
    </item>
    <item>
      <title>Automating Backups to S3 with Bash, Crontab &amp; AWS CLI as a Beginner</title>
      <dc:creator>Christiana Otoboh</dc:creator>
      <pubDate>Mon, 07 Jul 2025 09:46:06 +0000</pubDate>
      <link>https://dev.to/christiana_otoboh/automating-backups-to-s3-with-bash-crontab-aws-cli-as-a-beginner-13mm</link>
      <guid>https://dev.to/christiana_otoboh/automating-backups-to-s3-with-bash-crontab-aws-cli-as-a-beginner-13mm</guid>
      <description>&lt;p&gt;As part of a learning assignment, I was tasked with creating an automated backup system. &lt;br&gt;
The goal was to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Backup a specific local directory on my Linux machine.&lt;/li&gt;
&lt;li&gt;Compress the backup as a .tar.gz file.&lt;/li&gt;
&lt;li&gt;Upload it to an Amazon S3 bucket.&lt;/li&gt;
&lt;li&gt;Log each step with timestamps.&lt;/li&gt;
&lt;li&gt;Automate the process with a cron job.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;While this sounds straightforward for an experienced DevOps engineer, I had zero knowledge of Linux scripting or AWS CLI when I started. What followed was a rollercoaster of trial, error, and growth. This post documents my journey, what I learned, the final working solution, and how I overcame some tough beginner mistakes.&lt;/p&gt;
&lt;h2&gt;
  
  
  My Learning Roadmap
&lt;/h2&gt;
&lt;h3&gt;
  
  
  1. Learning Bash Scripting
&lt;/h3&gt;

&lt;p&gt;I started with learning how to write a basic bash script. I needed it to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Define a source and destination path.&lt;/li&gt;
&lt;li&gt;Use tar to compress the files.&lt;/li&gt;
&lt;li&gt;Log each step with a timestamp.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;
  
  
  2. Understanding S3 Uploads
&lt;/h3&gt;

&lt;p&gt;Initially, I installed the AWS CLI without configuring it. Later, I stumbled upon s3fs (a way to mount an S3 bucket as a local file system), but I got stuck for hours trying to make it work.&lt;/p&gt;

&lt;p&gt;Thankfully, I reached out to a senior developer who advised me to use aws s3 cp instead, which was simpler and more appropriate for my use case.&lt;/p&gt;
&lt;h3&gt;
  
  
  3. Configuring AWS CLI
&lt;/h3&gt;

&lt;p&gt;After some dependency issues related to Python (which AWS CLI uses), I was able to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create an IAM user with programmatic access.&lt;/li&gt;
&lt;li&gt;Configure my AWS CLI with the IAM credentials (aws configure).&lt;/li&gt;
&lt;li&gt;Successfully test uploads using aws s3 cp.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;
  
  
  4. Automating with Crontab
&lt;/h3&gt;

&lt;p&gt;Finally, I set up a cron job to run my script every day at 7:00 AM. I used crontab -e and added:&lt;br&gt;
&lt;em&gt;&lt;code&gt;0 7 * * * /home/christiana/backup.sh&lt;/code&gt;&lt;/em&gt;&lt;/p&gt;
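&lt;p&gt;For reference, the five fields before the command are minute, hour, day of month, month, and day of week, so this entry reads as "at minute 0 of hour 7, every day":&lt;/p&gt;

```plaintext
# minute  hour  day-of-month  month  day-of-week  command
  0       7     *             *     *             /home/christiana/backup.sh
```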
&lt;h3&gt;
  
  
  The Final Working Script
&lt;/h3&gt;

&lt;p&gt;Here’s the working version of my backup script:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/bin/bash

SOURCE=/home/christiana/documents
DEST=/home/christiana/backupfolder
S3_BUCKET=s3://demosample-backupbucket/Backup

NOW=$(date +"%Y-%m-%d_%H-%M-%S")
BACKUP_NAME="backup_$NOW.tar.gz"
BACKUP_PATH="$DEST/$BACKUP_NAME"
LOG="$DEST/backup_log.txt"

echo "[$NOW] Starting backup..." &amp;gt;&amp;gt; "$LOG"

# Create compressed backup
tar -czvf "$BACKUP_PATH" "$SOURCE"
if [ $? -eq 0 ]; then
  echo "[$NOW] Backup created: $BACKUP_NAME" &amp;gt;&amp;gt; "$LOG"
else
  echo "[$NOW] Backup FAILED" &amp;gt;&amp;gt; "$LOG"
  exit 1
fi 

# Upload to S3
aws s3 cp "$BACKUP_PATH" "$S3_BUCKET/"
if [ $? -eq 0 ]; then
  echo "[$NOW] Upload to S3 Successful" &amp;gt;&amp;gt; "$LOG"
else  
  echo "[$NOW] Upload to S3 FAILED" &amp;gt;&amp;gt; "$LOG"
  exit 1
fi 

echo "[$NOW] Backup completed" &amp;gt;&amp;gt; "$LOG"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
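&lt;p&gt;One extension worth considering: without cleanup, the local backup folder grows forever. A hedged sketch of a retention step (here run against a temp directory seeded with fake old and new files, since the real DEST is machine-specific):&lt;/p&gt;

```shell
#!/bin/bash
# Hedged sketch: delete local backups older than 7 days.
# DEST here is a temp dir with one "old" and one "new" fake backup;
# in the real script it would be /home/christiana/backupfolder.
DEST="$(mktemp -d)"
touch -d "10 days ago" "$DEST/backup_old.tar.gz"
touch "$DEST/backup_new.tar.gz"

# -mtime +7 matches files last modified more than 7 days ago
find "$DEST" -name "backup_*.tar.gz" -mtime +7 -delete

ls "$DEST"   # only backup_new.tar.gz should remain
```

&lt;p&gt;A line like the find command above could be appended to the script after the upload succeeds.&lt;/p&gt;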



&lt;h3&gt;
  
  
  Troubleshooting and Lessons Learned
&lt;/h3&gt;

&lt;p&gt;Here are some common problems I faced and how I fixed them:&lt;/p&gt;

&lt;p&gt;1. AWS CLI Installed but Not Working&lt;br&gt;
Problem: I installed the AWS CLI but forgot to configure it.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Solution: Ran aws configure and entered my IAM credentials.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;2. Python Dependency Errors&lt;br&gt;
Problem: AWS CLI wasn't working due to missing Python dependencies.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Solution: I installed the required version of Python and ensured it was available in my PATH.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;3. Using s3fs Instead of aws s3 cp&lt;br&gt;
Problem: I wasted hours trying to mount S3 as a drive with s3fs.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Solution: I learned it’s better to use aws s3 cp for one-off uploads. It's simpler, faster, and has fewer moving parts.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;4. Crontab Didn't Run My Script&lt;br&gt;
Problem: Cron job wasn’t running the script.&lt;/p&gt;

&lt;p&gt;Fixes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Ensured the script had execute permission (chmod +x backup.sh).&lt;/li&gt;
&lt;li&gt;Used absolute paths in the script (cron doesn't know your environment).&lt;/li&gt;
&lt;li&gt;Added full paths for tar, aws, etc., or sourced my environment manually.&lt;/li&gt;
&lt;/ul&gt;
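&lt;p&gt;The last two fixes can be baked into the script itself. A minimal sketch (the PATH value is a typical Linux default, not gospel; adjust for your machine):&lt;/p&gt;

```shell
#!/bin/bash
# Hedged sketch: cron runs with a minimal environment, so pin PATH
# and resolve tools to absolute paths at the top of the script.
PATH=/usr/local/bin:/usr/bin:/bin
export PATH

# command -v prints the absolute path the script will actually use
TAR="$(command -v tar)"
echo "tar resolves to: $TAR"
```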

&lt;h3&gt;
  
  
  Final Thoughts
&lt;/h3&gt;

&lt;p&gt;This small project taught me a lot:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How to write bash scripts and use tar/gzip for compression.&lt;/li&gt;
&lt;li&gt;The power and simplicity of AWS CLI.&lt;/li&gt;
&lt;li&gt;The importance of reaching out when you're stuck.&lt;/li&gt;
&lt;li&gt;How to schedule recurring tasks in Linux using cron.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I started as a complete beginner and ended with a working automation system that runs every morning at 7 AM, backing up my documents to the cloud.&lt;/p&gt;

&lt;p&gt;If you’re starting your DevOps or scripting journey, don't be afraid to struggle: just document everything, ask questions, and keep going. You'll figure it out, just like I did.&lt;/p&gt;

&lt;p&gt;✅ Tools Used:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Bash&lt;/li&gt;
&lt;li&gt;AWS CLI&lt;/li&gt;
&lt;li&gt;IAM User with S3 permissions&lt;/li&gt;
&lt;li&gt;tar and gzip&lt;/li&gt;
&lt;li&gt;crontab&lt;/li&gt;
&lt;/ol&gt;

</description>
    </item>
  </channel>
</rss>
