Three months ago, I was wasting 2-3 hours every week on repetitive tasks. Setting up environments. Backing up files. Renaming batches of images. Running the same 6 commands every time I started a new feature.
I kept telling myself: "I should automate this."
But automation felt like this mysterious DevOps thing that required learning Docker, Kubernetes, CI/CD pipelines, and sacrificing a weekend to figure it all out.
Then I discovered something: Most of my boring tasks could be automated with simple bash scripts—15-20 lines of code that take 10 minutes to write.
No DevOps experience needed. No complex tools. Just bash, which you already have installed.
Let me show you the scripts that gave me back those hours.
Table of Contents
- Why Bash Scripts (And Not Fancy Tools)
- Your First Script: The 5-Minute Setup
- Script 1: Project Setup Automation
- Script 2: Smart Backup System
- Script 3: Batch File Renaming
- Script 4: Git Workflow Automation
- Script 5: Environment Cleanup
- Script 6: Port Killer (Because Something is Always Using Port 3000)
- Making Scripts Globally Available
- Bash Scripting Essentials You Need to Know
- Debugging Your Scripts
- Real-World Impact: Time Saved
- Your Script Library: Getting Started
- Next Steps: Progressive Automation
- Key Takeaways
Why Bash Scripts (And Not Fancy Tools)
Before we dive in, you might be thinking: "Why not use Python? Or Node.js? Or some automation platform?"
Here's why bash:
- Already installed - macOS and Linux ship with bash. On Windows, use WSL or Git Bash.
- No dependencies - No npm install, no pip, no virtual environments.
- Perfect for system tasks - File operations, process management, running commands.
- Instant execution - No compilation, no startup time.
- Universal - Works on servers, containers, local machines.
Python is great. Node.js is powerful. But for quick automation of repetitive tasks? Bash is unbeatable.
Your First Script: The 5-Minute Setup
Let's start with the absolute basics. Create a file called hello.sh:
#!/bin/bash
echo "Hello from your first script!"
echo "Current directory: $(pwd)"
echo "Current user: $(whoami)"
echo "Today's date: $(date)"
Making Scripts Executable
chmod +x hello.sh
That's it. The +x flag makes the file executable.
Running Your Script
./hello.sh
Output:
Hello from your first script!
Current directory: /Users/john/projects
Current user: john
Today's date: Wed Dec 11 14:23:45 PST 2024
What just happened:
- `#!/bin/bash` tells the system to use bash to run this script
- `echo` prints to the terminal
- `$(command)` runs a command and inserts its output
- `chmod +x` made it executable
- `./` runs the script from the current directory
You just automated your first task. Let's get practical.
Script 1: Project Setup Automation
The Manual Way (Before)
Every time you start a new project, you do this:
mkdir my-new-project
cd my-new-project
git init
echo "# My New Project" > README.md
echo "node_modules/" > .gitignore
echo ".env" >> .gitignore
echo ".DS_Store" >> .gitignore
mkdir src
touch src/index.js
npm init -y
npm install express dotenv
git add .
git commit -m "Initial commit"
code .
That's 14 commands. Every. Single. Time.
The Automated Way (After)
Create newproject.sh:
#!/bin/bash
# Check if project name was provided
if [ -z "$1" ]; then
echo "Usage: ./newproject.sh <project-name>"
exit 1
fi
PROJECT_NAME=$1
# Create project structure
echo "📁 Creating project: $PROJECT_NAME"
mkdir "$PROJECT_NAME"
cd "$PROJECT_NAME"
# Initialize git
echo "🔧 Setting up Git..."
git init
# Create README
echo "📝 Creating README..."
echo "# $PROJECT_NAME" > README.md
echo "" >> README.md
echo "## Getting Started" >> README.md
echo "" >> README.md
echo "\`\`\`bash" >> README.md
echo "npm install" >> README.md
echo "npm start" >> README.md
echo "\`\`\`" >> README.md
# Create .gitignore
echo "🚫 Creating .gitignore..."
cat > .gitignore << EOF
node_modules/
.env
.DS_Store
dist/
build/
*.log
.vscode/
coverage/
EOF
# Create project structure
echo "📂 Creating directories..."
mkdir -p src/{components,utils,config}
mkdir -p tests
# Create initial files
echo "📄 Creating initial files..."
cat > src/index.js << EOF
console.log('Hello from $PROJECT_NAME!');
EOF
cat > .env.example << EOF
PORT=3000
NODE_ENV=development
EOF
# Initialize npm
echo "📦 Initializing npm..."
npm init -y
# Install common dependencies
echo "⬇️ Installing dependencies..."
npm install express dotenv
echo "⬇️ Installing dev dependencies..."
npm install -D nodemon
# Update package.json scripts
echo "🔧 Configuring npm scripts..."
npm pkg set scripts.start="node src/index.js"
npm pkg set scripts.dev="nodemon src/index.js"
npm pkg set scripts.test="echo \"Error: no test specified\" && exit 1"
# Initial commit
echo "💾 Creating initial commit..."
git add .
git commit -m "Initial commit: project setup"
# Open in VS Code
echo "🚀 Opening in VS Code..."
code .
echo ""
echo "✅ Project $PROJECT_NAME created successfully!"
echo ""
echo "To get started:"
echo " cd $PROJECT_NAME"
echo " npm run dev"
How It Works
Usage:
./newproject.sh my-awesome-app
What it does:
- ✅ Creates project directory
- ✅ Initializes git repository
- ✅ Creates structured README with getting started instructions
- ✅ Sets up comprehensive .gitignore
- ✅ Creates organized folder structure (src, tests, components)
- ✅ Generates starter files with boilerplate code
- ✅ Initializes npm with package.json
- ✅ Installs express, dotenv, and nodemon
- ✅ Configures npm scripts for start/dev
- ✅ Makes initial git commit
- ✅ Opens project in VS Code
Time saved: 5 minutes per project × 20 projects/year = 100 minutes saved
New concepts introduced:
- `$1` - First command-line argument
- `[ -z "$1" ]` - Check if variable is empty
- `cat > file << EOF` - Create multi-line files (here document)
- `mkdir -p` - Create nested directories
- `npm pkg set` - Modify package.json programmatically
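Two of these building blocks are worth trying in isolation before reusing them: the default-argument idiom and the here-document. A minimal sketch (the file name and greeting are just illustrative):

```bash
#!/bin/bash
# ${1:-world}: use the first argument, or fall back to "world" when none is given
NAME="${1:-world}"

# Here-document: every line up to the closing EOF is written verbatim,
# with $NAME (and any $(...)) expanded on the way
cat > greeting.txt << EOF
Hello, $NAME!
EOF

cat greeting.txt
```

Quote the delimiter (`<< 'EOF'`) when you want `$` signs to stay literal, as in config templates.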
Script 2: Smart Backup System
Why This Matters
You're working on a feature. You want to try something risky. You need a quick backup—not a full git commit, just a safety net.
Manually copying files is tedious. This script does it intelligently.
The Script
Create backup.sh:
#!/bin/bash
# Configuration
BACKUP_DIR="$HOME/backups"
TIMESTAMP=$(date +"%Y%m%d_%H%M%S")
PROJECT_NAME=$(basename "$(pwd)")
BACKUP_NAME="${PROJECT_NAME}_${TIMESTAMP}"
BACKUP_PATH="$BACKUP_DIR/$BACKUP_NAME"
# Colors for output
GREEN='\033[0;32m'
BLUE='\033[0;34m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m' # No Color
echo -e "${BLUE}🗄️ Backup Script${NC}"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Check if we're in a git repo
if [ -d ".git" ]; then
echo -e "${YELLOW}📍 Git repository detected${NC}"
# Check for uncommitted changes
if [[ -n $(git status -s) ]]; then
echo -e "${YELLOW}⚠️ You have uncommitted changes${NC}"
git status -s
echo ""
fi
fi
# Create the backup
echo -e "${BLUE}📦 Creating backup...${NC}"
echo "Source: $(pwd)"
echo "Destination: $BACKUP_PATH"
# Use rsync for smart copying (only changed files)
rsync -av \
--exclude='node_modules' \
--exclude='.git' \
--exclude='dist' \
--exclude='build' \
--exclude='*.log' \
--exclude='.DS_Store' \
--exclude='coverage' \
. "$BACKUP_PATH"
if [ $? -eq 0 ]; then
echo -e "${GREEN}✅ Backup created successfully!${NC}"
echo ""
echo "Backup location: $BACKUP_PATH"
# Calculate backup size
SIZE=$(du -sh "$BACKUP_PATH" | cut -f1)
echo "Backup size: $SIZE"
# List recent backups
echo ""
echo -e "${BLUE}Recent backups:${NC}"
ls -lth "$BACKUP_DIR" | grep "$PROJECT_NAME" | head -5
# Cleanup old backups (keep last 10)
echo ""
echo -e "${YELLOW}🧹 Cleaning up old backups...${NC}"
cd "$BACKUP_DIR"
ls -t | grep "$PROJECT_NAME" | tail -n +11 | xargs -I {} rm -rf {}
echo "Kept last 10 backups"
else
echo -e "${RED}❌ Backup failed!${NC}"
exit 1
fi
echo ""
echo -e "${GREEN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
echo -e "${GREEN}Done!${NC}"
Breaking It Down
Key features:
- Smart naming - Uses a timestamp: myproject_20241211_142335
- Excludes bloat - Skips node_modules, .git, build folders
- Uses rsync - Only copies changed files (fast!)
- Colored output - Green/blue/yellow for better UX
- Git awareness - Warns about uncommitted changes
- Auto cleanup - Keeps only last 10 backups
- Shows size - You know how much space you're using
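The timestamped name is just two command substitutions glued together; here is the pattern on its own:

```bash
#!/bin/bash
# basename strips the leading path: /Users/john/projects/myapp -> myapp
PROJECT_NAME=$(basename "$(pwd)")

# %Y%m%d_%H%M%S sorts chronologically when listed alphabetically
TIMESTAMP=$(date +"%Y%m%d_%H%M%S")

BACKUP_NAME="${PROJECT_NAME}_${TIMESTAMP}"
echo "$BACKUP_NAME"
```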
Usage:
./backup.sh
Output:
🗄️ Backup Script
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
📍 Git repository detected
⚠️ You have uncommitted changes
M src/index.js
?? temp.txt
📦 Creating backup...
Source: /Users/john/projects/myapp
Destination: /Users/john/backups/myapp_20241211_142335
✅ Backup created successfully!
Backup location: /Users/john/backups/myapp_20241211_142335
Backup size: 145M
Recent backups:
drwxr-xr-x myapp_20241211_142335
drwxr-xr-x myapp_20241211_093022
drwxr-xr-x myapp_20241210_165544
🧹 Cleaning up old backups...
Kept last 10 backups
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Done!
Scheduling Automatic Backups
Want backups every day at 6 PM? Use cron:
crontab -e
Add this line:
0 18 * * * cd /path/to/your/project && /path/to/backup.sh
Now your project backs itself up daily.
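One caveat: cron jobs run silently, so failures are easy to miss. A variant of the same crontab line that keeps a log (the paths are placeholders, as above):

```bash
# Redirect stdout and stderr to a log file so you can inspect failures later
0 18 * * * cd /path/to/your/project && /path/to/backup.sh >> $HOME/backup.log 2>&1
```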
Script 3: Batch File Renaming
The Problem
You have 47 screenshots named:
Screenshot 2024-12-11 at 10.23.45 AM.png
Screenshot 2024-12-11 at 10.24.12 AM.png
Screenshot 2024-12-11 at 10.25.33 AM.png
...
You need them named:
feature-demo-01.png
feature-demo-02.png
feature-demo-03.png
...
Doing this manually? 20 minutes of mind-numbing work.
The Solution
Create rename-files.sh:
#!/bin/bash
# Check arguments
if [ $# -lt 2 ]; then
echo "Usage: ./rename-files.sh <pattern> <new-prefix>"
echo "Example: ./rename-files.sh '*.png' 'screenshot'"
exit 1
fi
PATTERN=$1
PREFIX=$2
COUNTER=1
echo "🔄 Renaming files matching: $PATTERN"
echo "New prefix: $PREFIX"
echo ""
# Preview mode first
echo "Preview (no changes will be made):"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
for file in $PATTERN; do
if [ -f "$file" ]; then
extension="${file##*.}"
new_name=$(printf "%s-%02d.%s" "$PREFIX" "$COUNTER" "$extension")
echo "$file → $new_name"
COUNTER=$((COUNTER + 1))
fi
done
echo ""
echo "Total files to rename: $((COUNTER - 1))"
echo ""
# Ask for confirmation
read -p "Proceed with renaming? (y/n): " -n 1 -r
echo ""
if [[ $REPLY =~ ^[Yy]$ ]]; then
COUNTER=1
for file in $PATTERN; do
if [ -f "$file" ]; then
extension="${file##*.}"
new_name=$(printf "%s-%02d.%s" "$PREFIX" "$COUNTER" "$extension")
mv "$file" "$new_name"
echo "✅ Renamed: $new_name"
COUNTER=$((COUNTER + 1))
fi
done
echo ""
echo "🎉 Done! Renamed $((COUNTER - 1)) files"
else
echo "❌ Cancelled"
fi
Usage:
./rename-files.sh "Screenshot*.png" "feature-demo"
Output:
🔄 Renaming files matching: Screenshot*.png
New prefix: feature-demo
Preview (no changes will be made):
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Screenshot 2024-12-11 at 10.23.45 AM.png → feature-demo-01.png
Screenshot 2024-12-11 at 10.24.12 AM.png → feature-demo-02.png
Screenshot 2024-12-11 at 10.25.33 AM.png → feature-demo-03.png
Total files to rename: 3
Proceed with renaming? (y/n): y
✅ Renamed: feature-demo-01.png
✅ Renamed: feature-demo-02.png
✅ Renamed: feature-demo-03.png
🎉 Done! Renamed 3 files
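The zero-padded numbers come from printf's `%02d` format specifier. Tried in isolation (the names here are arbitrary):

```bash
#!/bin/bash
# %s inserts strings as-is; %02d pads the number to two digits with leading zeros
new_name=$(printf "%s-%02d.%s" "feature-demo" 7 "png")
echo "$new_name"   # feature-demo-07.png

# Bump the width to %03d if you expect more than 99 files
printf "%s-%03d.%s\n" "feature-demo" 7 "png"   # feature-demo-007.png
```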
Advanced: Pattern-Based Renaming
Want to replace spaces with dashes?
#!/bin/bash
for file in *\ *; do
if [ -f "$file" ]; then
new_name="${file// /-}"
mv "$file" "$new_name"
echo "Renamed: $file → $new_name"
fi
done
Before:
my vacation photo.jpg
final draft version 2.docx
After:
my-vacation-photo.jpg
final-draft-version-2.docx
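`${file// /-}` is bash parameter expansion: `//` replaces every match, while a single `/` replaces only the first. A few related forms, using an illustrative sample string:

```bash
#!/bin/bash
file="my vacation photo.jpg"

echo "${file/ /-}"    # replace first space only: my-vacation photo.jpg
echo "${file// /-}"   # replace all spaces:       my-vacation-photo.jpg
echo "${file%.jpg}"   # strip a suffix:           my vacation photo
echo "${file##*.}"    # keep the extension only:  jpg
```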
Script 4: Git Workflow Automation
The Tedious Workflow
Your typical feature branch workflow:
git checkout main
git pull origin main
git checkout -b feature/user-auth
# ... make changes ...
git add .
git commit -m "Add user authentication"
git push origin feature/user-auth
# ... create PR manually on GitHub ...
Every. Single. Time.
The One-Command Solution
Create gitflow.sh:
#!/bin/bash
# Colors
GREEN='\033[0;32m'
BLUE='\033[0;34m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m'
# Function to print colored output
print_step() {
echo -e "${BLUE}▶ $1${NC}"
}
print_success() {
echo -e "${GREEN}✓ $1${NC}"
}
print_error() {
echo -e "${RED}✗ $1${NC}"
}
# Check if we're in a git repo
if [ ! -d ".git" ]; then
print_error "Not a git repository!"
exit 1
fi
# Get current branch
CURRENT_BRANCH=$(git rev-parse --abbrev-ref HEAD)
echo -e "${BLUE}╔════════════════════════════════════╗${NC}"
echo -e "${BLUE}║ Git Workflow Automation Tool ║${NC}"
echo -e "${BLUE}╚════════════════════════════════════╝${NC}"
echo ""
# Check for uncommitted changes
if [[ -n $(git status -s) ]]; then
print_error "You have uncommitted changes!"
git status -s
echo ""
read -p "Commit these changes? (y/n): " -n 1 -r
echo ""
if [[ $REPLY =~ ^[Yy]$ ]]; then
read -p "Commit message: " COMMIT_MSG
git add .
git commit -m "$COMMIT_MSG"
print_success "Changes committed"
else
print_error "Please commit or stash your changes first"
exit 1
fi
fi
# Sync with main
print_step "Syncing with main branch..."
git checkout main
git pull origin main
if [ $? -ne 0 ]; then
print_error "Failed to sync with main"
exit 1
fi
print_success "Synced with main"
# Get feature name
echo ""
read -p "Feature branch name (e.g., user-auth): " FEATURE_NAME
if [ -z "$FEATURE_NAME" ]; then
print_error "Feature name cannot be empty"
exit 1
fi
BRANCH_NAME="feature/$FEATURE_NAME"
# Create and checkout feature branch
print_step "Creating branch: $BRANCH_NAME"
git checkout -b "$BRANCH_NAME"
if [ $? -ne 0 ]; then
# Branch might already exist
print_step "Branch exists, checking out..."
git checkout "$BRANCH_NAME"
fi
print_success "On branch: $BRANCH_NAME"
echo ""
echo -e "${GREEN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
echo -e "${GREEN}Ready to work on $BRANCH_NAME${NC}"
echo -e "${GREEN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
echo ""
echo "When done, run:"
echo " git add ."
echo " git commit -m 'your message'"
echo " git push origin $BRANCH_NAME"
Usage:
./gitflow.sh
Interactive output:
╔════════════════════════════════════╗
║ Git Workflow Automation Tool ║
╚════════════════════════════════════╝
▶ Syncing with main branch...
✓ Synced with main
Feature branch name (e.g., user-auth): user-authentication
▶ Creating branch: feature/user-authentication
✓ On branch: feature/user-authentication
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Ready to work on feature/user-authentication
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
When done, run:
git add .
git commit -m 'your message'
git push origin feature/user-authentication
Safety Features
The script includes:
- ✅ Checks if you're in a git repo
- ✅ Warns about uncommitted changes
- ✅ Offers to commit changes for you
- ✅ Syncs with main before creating branch
- ✅ Handles existing branch names gracefully
- ✅ Provides clear next steps
Time saved: 2 minutes per feature × 30 features/month = 60 minutes saved
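One refinement worth considering: `[ -d ".git" ]` only works from the repository root. `git rev-parse` works from any subdirectory, so a check like this is more robust (a sketch, not part of the script above):

```bash
#!/bin/bash
# Succeeds if the current directory is anywhere inside a git work tree
in_git_repo() {
    git rev-parse --is-inside-work-tree > /dev/null 2>&1
}

if in_git_repo; then
    echo "Inside a git repository"
else
    echo "Not a git repository"
fi
```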
Script 5: Environment Cleanup
The Bloat Problem
Your node_modules folders are everywhere. Your Docker images are piling up. Your disk space is disappearing.
Time for a cleanup.
The Cleanup Script
Create cleanup.sh:
#!/bin/bash
# Colors
BLUE='\033[0;34m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m'
# Function to get directory size
get_size() {
du -sh "$1" 2>/dev/null | cut -f1
}
# Function to add thousands separators (GNU sed syntax; may not work with BSD sed on macOS)
format_number() {
echo "$1" | sed ':a;s/\B[0-9]\{3\}\>/,&/;ta'
}
echo -e "${BLUE}╔═══════════════════════════════════════╗${NC}"
echo -e "${BLUE}║ Development Environment Cleanup ║${NC}"
echo -e "${BLUE}╚═══════════════════════════════════════╝${NC}"
echo ""
# 1. Find and clean node_modules
echo -e "${YELLOW}📦 Scanning for node_modules directories...${NC}"
NODE_MODULES_DIRS=$(find . -name "node_modules" -type d -prune 2>/dev/null)
NODE_MODULES_COUNT=$(echo "$NODE_MODULES_DIRS" | grep -c "node_modules")
if [ "$NODE_MODULES_COUNT" -gt 0 ]; then
echo "Found $NODE_MODULES_COUNT node_modules directories:"
echo ""
TOTAL_SIZE=0
while IFS= read -r dir; do
if [ -n "$dir" ]; then
SIZE=$(get_size "$dir")
echo " 📁 $dir ($SIZE)"
fi
done <<< "$NODE_MODULES_DIRS"
echo ""
read -p "Delete all node_modules folders? (y/n): " -n 1 -r
echo ""
if [[ $REPLY =~ ^[Yy]$ ]]; then
while IFS= read -r dir; do
if [ -n "$dir" ]; then
rm -rf "$dir"
echo -e "${GREEN}✓ Deleted: $dir${NC}"
fi
done <<< "$NODE_MODULES_DIRS"
echo ""
echo -e "${GREEN}✓ Cleaned up node_modules${NC}"
fi
else
echo "No node_modules directories found"
fi
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo ""
# 2. Clean npm cache
echo -e "${YELLOW}🗑️ Checking npm cache...${NC}"
NPM_CACHE_SIZE=$(get_size ~/.npm)
echo "Current npm cache size: $NPM_CACHE_SIZE"
echo ""
read -p "Clear npm cache? (y/n): " -n 1 -r
echo ""
if [[ $REPLY =~ ^[Yy]$ ]]; then
npm cache clean --force
echo -e "${GREEN}✓ npm cache cleared${NC}"
fi
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo ""
# 3. Clean Docker (if installed)
if command -v docker &> /dev/null; then
echo -e "${YELLOW}🐳 Checking Docker...${NC}"
# Count images and containers
IMAGE_COUNT=$(docker images -q | wc -l | tr -d ' ')
CONTAINER_COUNT=$(docker ps -a -q | wc -l | tr -d ' ')
VOLUME_COUNT=$(docker volume ls -q | wc -l | tr -d ' ')
echo "Docker images: $(format_number $IMAGE_COUNT)"
echo "Docker containers: $(format_number $CONTAINER_COUNT)"
echo "Docker volumes: $(format_number $VOLUME_COUNT)"
echo ""
# Show disk usage
docker system df
echo ""
read -p "Run Docker system prune? (removes unused data) (y/n): " -n 1 -r
echo ""
if [[ $REPLY =~ ^[Yy]$ ]]; then
docker system prune -af --volumes
echo -e "${GREEN}✓ Docker cleaned${NC}"
fi
else
echo -e "${YELLOW}Docker not installed, skipping...${NC}"
fi
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo ""
# 4. Clean build artifacts
echo -e "${YELLOW}🏗️ Scanning for build artifacts...${NC}"
BUILD_DIRS=$(find . -type d \( -name "dist" -o -name "build" -o -name ".next" -o -name "out" \) -prune 2>/dev/null)
BUILD_COUNT=$(echo "$BUILD_DIRS" | grep -c -E "dist|build|\.next|out")
if [ "$BUILD_COUNT" -gt 0 ]; then
echo "Found $BUILD_COUNT build directories:"
echo ""
while IFS= read -r dir; do
if [ -n "$dir" ]; then
SIZE=$(get_size "$dir")
echo " 📁 $dir ($SIZE)"
fi
done <<< "$BUILD_DIRS"
echo ""
read -p "Delete build artifacts? (y/n): " -n 1 -r
echo ""
if [[ $REPLY =~ ^[Yy]$ ]]; then
while IFS= read -r dir; do
if [ -n "$dir" ]; then
rm -rf "$dir"
echo -e "${GREEN}✓ Deleted: $dir${NC}"
fi
done <<< "$BUILD_DIRS"
echo -e "${GREEN}✓ Build artifacts cleaned${NC}"
fi
else
echo "No build artifacts found"
fi
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo ""
# 5. Clean log files
echo -e "${YELLOW}📝 Scanning for log files...${NC}"
LOG_FILES=$(find . -type f -name "*.log" 2>/dev/null)
LOG_COUNT=$(echo "$LOG_FILES" | grep -c ".log")
if [ "$LOG_COUNT" -gt 0 ]; then
echo "Found $LOG_COUNT log files"
echo ""
read -p "Delete all .log files? (y/n): " -n 1 -r
echo ""
if [[ $REPLY =~ ^[Yy]$ ]]; then
while IFS= read -r file; do
if [ -n "$file" ]; then
rm "$file"
echo -e "${GREEN}✓ Deleted: $file${NC}"
fi
done <<< "$LOG_FILES"
echo -e "${GREEN}✓ Log files cleaned${NC}"
fi
else
echo "No log files found"
fi
echo ""
echo -e "${GREEN}╔═══════════════════════════════════════╗${NC}"
echo -e "${GREEN}║ Cleanup Complete! ║${NC}"
echo -e "${GREEN}╚═══════════════════════════════════════╝${NC}"
echo ""
# Show disk space freed
echo "Check your available disk space:"
df -h .
What Gets Cleaned
This script intelligently finds and removes:
- node_modules/ - Often 200MB+ per project
- npm cache - Can grow to several GB
- Docker images/containers - Unused ones pile up fast
- Build artifacts - dist/, build/, .next/, out/
- Log files - *.log files everywhere
Safety features:
- ✅ Shows what it found before deleting
- ✅ Asks for confirmation at each step
- ✅ Shows sizes so you can decide
- ✅ Never deletes without permission
Usage:
./cleanup.sh
Real impact: Recovered 47GB on my machine last week.
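One caveat with the `while read` loops above: plain `find` output breaks on paths containing newlines. A more defensive pattern uses null-delimited output (a sketch of the idea, not a drop-in replacement):

```bash
#!/bin/bash
# -print0 separates matches with NUL bytes, the one character that can never
# appear in a path; xargs -0 splits on NUL, so names with spaces or newlines
# survive intact
find . -type d -name "node_modules" -prune -print0 | xargs -0 du -sh

# The same idea works for deletion (left commented out on purpose):
# find . -type d -name "node_modules" -prune -print0 | xargs -0 rm -rf
```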
Script 6: Port Killer (Because Something is Always Using Port 3000)
The Frustration
You try to start your dev server:
npm start
Error:
Error: listen EADDRINUSE: address already in use :::3000
Then you go hunting through Activity Monitor or lsof commands you half-remember.
The Fix
Create killport.sh:
#!/bin/bash
# Check if port number provided
if [ -z "$1" ]; then
echo "Usage: ./killport.sh <port-number>"
echo "Example: ./killport.sh 3000"
exit 1
fi
PORT=$1
echo "🔍 Checking port $PORT..."
echo ""
# Find process using the port
if [[ "$OSTYPE" == "darwin"* ]]; then
# macOS
PID=$(lsof -ti:$PORT)
else
# Linux
PID=$(fuser $PORT/tcp 2>/dev/null | awk '{print $1}')
fi
if [ -z "$PID" ]; then
echo "✓ Port $PORT is free"
exit 0
fi
# Get process details (same ps syntax works on macOS and Linux)
PROCESS_NAME=$(ps -p $PID -o comm=)
echo "Found process using port $PORT:"
echo " PID: $PID"
echo " Process: $PROCESS_NAME"
echo ""
# Show full process info
ps -p $PID -o pid,ppid,%cpu,%mem,command
echo ""
read -p "Kill this process? (y/n): " -n 1 -r
echo ""
if [[ $REPLY =~ ^[Yy]$ ]]; then
kill -9 $PID
if [ $? -eq 0 ]; then
echo "✓ Process killed"
echo "✓ Port $PORT is now free"
else
echo "✗ Failed to kill process (try with sudo)"
fi
else
echo "Cancelled"
fi
Usage:
./killport.sh 3000
Output:
🔍 Checking port 3000...
Found process using port 3000:
PID: 12345
Process: node
PID PPID %CPU %MEM COMMAND
12345 98765 0.0 0.5 node /Users/john/app/server.js
Kill this process? (y/n): y
✓ Process killed
✓ Port 3000 is now free
Cross-Platform Version
Want it to work everywhere? Enhanced version:
#!/bin/bash
PORT=$1
# Detect OS
if [[ "$OSTYPE" == "linux-gnu"* ]]; then
# Linux
fuser -k $PORT/tcp
elif [[ "$OSTYPE" == "darwin"* ]]; then
# macOS
lsof -ti:$PORT | xargs kill -9
elif [[ "$OSTYPE" == "msys" || "$OSTYPE" == "cygwin" ]]; then
# Windows (Git Bash / Cygwin) — WSL reports linux-gnu, so it takes the branch above
netstat -ano | findstr ":$PORT" | awk '{print $5}' | sort -u | xargs -I {} taskkill //PID {} //F
fi
echo "✓ Port $PORT cleared"
Time saved: 30 seconds × 100 times/year = 50 minutes
Making Scripts Globally Available
Right now, you need to type ./script.sh from the script's directory. Let's make them available everywhere.
Method 1: Add to PATH
Create a bin directory:
mkdir -p ~/bin
Move your scripts there:
mv newproject.sh ~/bin/newproject
mv backup.sh ~/bin/backup
mv killport.sh ~/bin/killport
# etc...
Add to your PATH in ~/.bashrc or ~/.zshrc:
export PATH="$HOME/bin:$PATH"
Reload your shell:
source ~/.bashrc # or source ~/.zshrc
Now you can run from anywhere:
newproject my-app
backup
killport 3000
Method 2: Create Aliases
Add to ~/.bashrc or ~/.zshrc:
alias newproject='~/scripts/newproject.sh'
alias backup='~/scripts/backup.sh'
alias killport='~/scripts/killport.sh'
alias cleanup='~/scripts/cleanup.sh'
alias gitflow='~/scripts/gitflow.sh'
Method 3: Symlinks
Create symbolic links in /usr/local/bin:
sudo ln -s ~/scripts/newproject.sh /usr/local/bin/newproject
sudo ln -s ~/scripts/backup.sh /usr/local/bin/backup
sudo ln -s ~/scripts/killport.sh /usr/local/bin/killport
Now these commands work system-wide.
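Whichever method you pick, it's worth confirming what the shell actually resolves (`ls` stands in for one of your script names here):

```bash
#!/bin/bash
# command -v prints the path the shell will use for a name, or nothing if unknown
command -v ls

# type is more verbose: it also reports aliases and shell functions
type ls
```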
Bash Scripting Essentials You Need to Know
Let's cover the fundamentals you'll use in 99% of scripts.
Variables
# Simple assignment
NAME="John"
AGE=30
# Use variables
echo "Hello, $NAME"
echo "You are $AGE years old"
# Command output as variable
CURRENT_DIR=$(pwd)
TODAY=$(date +%Y-%m-%d)
# Environment variables
echo $HOME
echo $USER
echo $PATH
User Input
# Read input
read -p "Enter your name: " USERNAME
echo "Hello, $USERNAME"
# Read with default
read -p "Port [3000]: " PORT
PORT=${PORT:-3000} # Use 3000 if empty
# Yes/No prompt
read -p "Continue? (y/n): " -n 1 -r
if [[ $REPLY =~ ^[Yy]$ ]]; then
echo "Continuing..."
fi
Conditionals
# Check if file exists
if [ -f "package.json" ]; then
echo "Node.js project detected"
fi
# Check if directory exists
if [ -d "node_modules" ]; then
echo "Dependencies installed"
fi
# Check if variable is empty
if [ -z "$VAR" ]; then
echo "Variable is empty"
fi
# Check if variable is not empty
if [ -n "$VAR" ]; then
echo "Variable has a value"
fi
# Multiple conditions
if [ -f "package.json" ] && [ -d "src" ]; then
echo "Valid project structure"
fi
# Comparison
if [ $AGE -gt 18 ]; then
echo "Adult"
fi
# String comparison
if [ "$NAME" = "John" ]; then
echo "Hello John"
fi
Loops
# Loop through files
for file in *.txt; do
echo "Processing: $file"
done
# Loop through numbers
for i in {1..10}; do
echo "Number: $i"
done
# While loop
COUNTER=0
while [ $COUNTER -lt 5 ]; do
echo "Count: $COUNTER"
COUNTER=$((COUNTER + 1))
done
# Read file line by line
while IFS= read -r line; do
echo "Line: $line"
done < file.txt
Functions
# Define function
greet() {
local name=$1
echo "Hello, $name!"
}
# Call function
greet "John"
# Function with return value
add() {
local result=$(($1 + $2))
echo $result
}
sum=$(add 5 10)
echo "Sum: $sum"
# Function with multiple parameters
create_file() {
local filename=$1
local content=$2
echo "$content" > "$filename"
echo "Created: $filename"
}
create_file "test.txt" "Hello World"
Exit Codes
# Exit with success
exit 0
# Exit with error
exit 1
# Check exit code of last command
if [ $? -eq 0 ]; then
echo "Command succeeded"
else
echo "Command failed"
fi
# Exit on any error
set -e
# Exit on undefined variable
set -u
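These options are often combined into a "strict mode" preamble, paired with a trap that cleans up whether the script succeeds or dies halfway. A minimal sketch (note that set -e has well-known edge cases, so treat it as a safety net, not a guarantee):

```bash
#!/bin/bash
# Strict mode: exit on errors (-e), on unset variables (-u),
# and when any stage of a pipeline fails (pipefail)
set -euo pipefail

# Create scratch space and guarantee it is removed on exit,
# even if a command fails partway through
TMP_DIR=$(mktemp -d)
trap 'rm -rf "$TMP_DIR"' EXIT

echo "scratch space: $TMP_DIR"
```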
Debugging Your Scripts
Enable debug mode:
#!/bin/bash
set -x # Print each command before executing
Or run with debug flag:
bash -x script.sh
Output with debug:
+ PORT=3000
+ echo 'Starting on port 3000'
Starting on port 3000
+ npm start
Check syntax without running:
bash -n script.sh
Common Errors and Fixes
Error: Permission denied
# Fix
chmod +x script.sh
Error: Command not found
# Check if command exists first
if command -v docker &> /dev/null; then
docker --version
else
echo "Docker not installed"
fi
Error: unbound variable (when running with set -u)
# Use parameter expansion with default
PORT=${PORT:-3000} # Use 3000 if PORT not set
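`${PORT:-3000}` belongs to a small family of default-handling expansions (the variable names below are arbitrary):

```bash
#!/bin/bash
unset PORT DB_HOST

# :- uses the default without assigning it; PORT stays unset afterwards
echo "${PORT:-3000}"

# := assigns the default; the : (no-op) command discards the expanded value
: "${DB_HOST:=localhost}"
echo "$DB_HOST"

# :? aborts the script with a message when the variable is unset or empty
# echo "${API_KEY:?API_KEY must be set}"
```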
Error: No such file or directory
# Check before operating
if [ -f "$FILE" ]; then
cat "$FILE"
else
echo "File not found: $FILE"
fi
Real-World Impact: Time Saved
Let me show you the actual time I've saved with these scripts:
| Script | Task | Manual Time | Automated Time | Uses/Month | Monthly Savings |
|---|---|---|---|---|---|
| newproject.sh | Project setup | 5 min | 30 sec | 8 | 36 min |
| backup.sh | Backup project | 3 min | 10 sec | 20 | 50 min |
| gitflow.sh | Branch workflow | 2 min | 20 sec | 25 | 42 min |
| killport.sh | Kill port process | 1 min | 5 sec | 30 | 28 min |
| cleanup.sh | Environment cleanup | 15 min | 2 min | 2 | 26 min |
| rename-files.sh | Batch rename | 10 min | 1 min | 4 | 36 min |
Total monthly savings: 218 minutes (3.6 hours)
Annual savings: 2,616 minutes (43.6 hours)
That's more than a full work week every year.
Your Script Library: Getting Started
Here's how to build your own automation library:
1. Create a scripts directory:
mkdir -p ~/scripts
cd ~/scripts
2. Start with one painful task:
- What do you do repeatedly?
- What takes 5+ minutes each time?
- What makes you think "there has to be a better way"?
3. Write a simple script:
#!/bin/bash
# Start simple - automate just one thing
echo "Starting task..."
# Your commands here
echo "Done!"
4. Test it thoroughly:
chmod +x script.sh
./script.sh
5. Refine over time:
- Add error checking
- Add user prompts
- Add color output
- Add help text
6. Make it global:
# Add to PATH
echo 'export PATH="$HOME/scripts:$PATH"' >> ~/.bashrc
source ~/.bashrc
7. Document it:
Create a README in your scripts folder:
# My Automation Scripts
## newproject
Creates a new project with git, npm, and standard structure
Usage: `newproject <project-name>`
## backup
Backs up current directory with timestamp
Usage: `backup`
## killport
Kills process using specified port
Usage: `killport <port>`
Next Steps: Progressive Automation
Start small. Build momentum. Here's your roadmap:
Week 1: The Basics
- Install your scripts in ~/scripts
- Make them executable
- Run them manually when needed
Week 2: Make Them Convenient
- Add to PATH or create aliases
- Use them daily until they become habit
Week 3: Add Intelligence
- Add error checking
- Add confirmation prompts
- Add colored output
Week 4: Share and Expand
- Push to GitHub
- Get feedback from team
- Automate one more task
Month 2: Advanced Automation
- Learn cron for scheduling
- Create git hooks
- Chain scripts together
Month 3: Team Automation
- Share scripts with team
- Create onboarding automation
- Build deployment scripts
Key Takeaways
- Start simple - A 10-line script is still automation
- Solve real pain - Automate what frustrates you
- Test thoroughly - Use confirmation prompts at first
- Build incrementally - Don't try to create the perfect script
- Document usage - Add help text and examples
- Share with team - Your automation can help everyone
- Iterate constantly - Scripts evolve with your needs
Remember: Every minute you spend writing automation saves you hours in the future.
The best time to start automating was yesterday. The second best time is right now.
Your Turn
Pick one task you did manually this week that you'll do again. Write a script for it.
Start with 10 lines. No fancy features. Just automate the basic flow.
Then run it tomorrow instead of doing it manually.
That's how you start.
Have you automated something recently? Share your scripts in the comments—I'd love to see what you're building!
And if this helped you save time, share it with a developer who's still doing everything manually. Let's spread the automation mindset.
Resources
- Bash Guide for Beginners
- ShellCheck - Lint your bash scripts
- Bash Scripting Cheatsheet
- My Scripts Repository - Fork and customize
Pro tip: Install shellcheck to catch errors:
brew install shellcheck # macOS
sudo apt install shellcheck # Linux
# Check your script
shellcheck script.sh
Happy automating! 🚀