DEV Community

Prashik besekar

5 Python Scripts Every AWS Developer Should Have

Save these. Run these. Thank me later.


Why I Wrote This

When I started learning AWS, I spent hours doing repetitive tasks manually.

Checking instance status in the console. Downloading files from S3 one by one. Manually checking my bill every week. Logging into EC2 just to check if it was running.

Then I discovered something — everything I was doing manually could be automated with Python and boto3 in minutes.

These are the 5 scripts I wish someone had given me when I started. Each one solves a real problem. Each one saves real time. And every AWS developer — beginner or experienced — should have these in their toolkit. 🐍


Script 1 — EC2 Instance Manager

The problem: Forgetting to stop EC2 instances and getting unexpected bills.

The solution: A complete EC2 manager that lets you start, stop, and check all your instances from the terminal in seconds.

import boto3
import sys
from datetime import datetime

def get_ec2_client(region='ap-south-1'):
    return boto3.client('ec2', region_name=region)

def list_all_instances():
    ec2 = get_ec2_client()
    response = ec2.describe_instances()

    print("\n" + "="*60)
    print(f"{'ID':<22} {'Type':<12} {'State':<12} {'IP':<16} {'Name'}")
    print("="*60)

    for reservation in response['Reservations']:
        for instance in reservation['Instances']:
            instance_id = instance['InstanceId']
            instance_type = instance['InstanceType']
            state = instance['State']['Name']
            public_ip = instance.get('PublicIpAddress', 'No IP')

            name = 'No Name'
            if instance.get('Tags'):
                for tag in instance['Tags']:
                    if tag['Key'] == 'Name':
                        name = tag['Value']

            # Color-code by state
            state_display = f"🟢 {state}" if state == 'running' else f"🔴 {state}"
            print(f"{instance_id:<22} {instance_type:<12} {state_display:<12} {public_ip:<16} {name}")

    print("="*60 + "\n")

def start_instance(instance_id):
    ec2 = get_ec2_client()
    print(f"⏳ Starting {instance_id}...")
    ec2.start_instances(InstanceIds=[instance_id])

    waiter = ec2.get_waiter('instance_running')
    waiter.wait(InstanceIds=[instance_id])

    response = ec2.describe_instances(InstanceIds=[instance_id])
    ip = response['Reservations'][0]['Instances'][0].get('PublicIpAddress', 'No IP')
    print(f"✅ Instance started! Public IP: {ip}")

def stop_instance(instance_id):
    ec2 = get_ec2_client()
    print(f"⏳ Stopping {instance_id}...")
    ec2.stop_instances(InstanceIds=[instance_id])

    waiter = ec2.get_waiter('instance_stopped')
    waiter.wait(InstanceIds=[instance_id])
    print("✅ Instance stopped! Compute charges stop now (attached EBS volumes still bill for storage).")

def main():
    if len(sys.argv) < 2:
        print("Usage:")
        print("  python ec2_manager.py list")
        print("  python ec2_manager.py start <instance-id>")
        print("  python ec2_manager.py stop <instance-id>")
        return

    command = sys.argv[1].lower()

    if command == 'list':
        list_all_instances()
    elif command == 'start' and len(sys.argv) == 3:
        start_instance(sys.argv[2])
    elif command == 'stop' and len(sys.argv) == 3:
        stop_instance(sys.argv[2])
    else:
        print("Invalid command. Use: list, start, or stop")

if __name__ == '__main__':
    main()

How to use:

python ec2_manager.py list                          # See all instances
python ec2_manager.py start i-0abc123def456789     # Start instance
python ec2_manager.py stop i-0abc123def456789      # Stop instance

Why this saves you: Never forget a running instance again. Check everything in two seconds from the terminal. 💰
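One extension I find handy (not part of the script above, so treat it as a sketch): tag throwaway instances with `AutoStop=true` and stop them all in one command. The tag key is just a convention I made up; the lookup logic lives in a plain function so you can test it without touching AWS.

```python
def ids_with_tag(response, key='AutoStop', value='true'):
    """Collect instance IDs whose tags include key=value from a describe_instances response."""
    ids = []
    for reservation in response.get('Reservations', []):
        for instance in reservation.get('Instances', []):
            tags = {t['Key']: t['Value'] for t in instance.get('Tags', [])}
            if tags.get(key) == value:
                ids.append(instance['InstanceId'])
    return ids

def stop_tagged_instances(region='ap-south-1'):
    import boto3  # imported here so ids_with_tag() stays dependency-free and testable
    ec2 = boto3.client('ec2', region_name=region)
    response = ec2.describe_instances(
        Filters=[{'Name': 'instance-state-name', 'Values': ['running']}]
    )
    ids = ids_with_tag(response)
    if ids:
        ec2.stop_instances(InstanceIds=ids)
        print(f"⏳ Stopping: {', '.join(ids)}")
    else:
        print("Nothing tagged AutoStop=true is running.")

# Usage: stop_tagged_instances() (or wire it into the main() dispatcher above)
```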


Script 2 — S3 File Manager

The problem: Uploading, downloading, and managing S3 files through the console is slow and tedious.

The solution: A Python script that handles all S3 operations from your terminal.

import boto3
import os
import sys
from pathlib import Path

def get_s3_client():
    return boto3.client('s3', region_name='ap-south-1')

def list_buckets():
    s3 = get_s3_client()
    response = s3.list_buckets()

    print("\n📦 Your S3 Buckets:")
    print("-" * 40)
    for bucket in response['Buckets']:
        print(f"{bucket['Name']} (created: {bucket['CreationDate'].strftime('%Y-%m-%d')})")
    print()

def list_files(bucket_name, prefix=''):
    s3 = get_s3_client()

    try:
        response = s3.list_objects_v2(Bucket=bucket_name, Prefix=prefix)

        if 'Contents' not in response:
            print(f"📭 No files found in '{bucket_name}/{prefix}'")
            return

        print(f"\n📁 Files in {bucket_name}/{prefix}:")
        print("-" * 50)

        total_size = 0
        for obj in response['Contents']:
            size_kb = obj['Size'] / 1024
            total_size += obj['Size']
            print(f"{obj['Key']} ({size_kb:.1f} KB) — {obj['LastModified'].strftime('%Y-%m-%d')}")

        print(f"\n  Total: {len(response['Contents'])} files ({total_size/1024:.1f} KB)\n")

    except Exception as e:
        print(f"❌ Error: {e}")

def upload_file(local_path, bucket_name, s3_key=None):
    s3 = get_s3_client()

    if not os.path.exists(local_path):
        print(f"❌ File not found: {local_path}")
        return

    if s3_key is None:
        s3_key = Path(local_path).name

    file_size = os.path.getsize(local_path) / 1024
    print(f"⏳ Uploading {local_path} ({file_size:.1f} KB) to s3://{bucket_name}/{s3_key}...")

    try:
        s3.upload_file(local_path, bucket_name, s3_key)
        print(f"✅ Upload successful!")
        print(f"   URL: https://{bucket_name}.s3.ap-south-1.amazonaws.com/{s3_key}")
    except Exception as e:
        print(f"❌ Upload failed: {e}")

def download_file(bucket_name, s3_key, local_path=None):
    s3 = get_s3_client()

    if local_path is None:
        local_path = Path(s3_key).name

    print(f"⏳ Downloading s3://{bucket_name}/{s3_key} to {local_path}...")

    try:
        s3.download_file(bucket_name, s3_key, local_path)
        file_size = os.path.getsize(local_path) / 1024
        print(f"✅ Download successful! ({file_size:.1f} KB saved to {local_path})")
    except Exception as e:
        print(f"❌ Download failed: {e}")

def delete_file(bucket_name, s3_key):
    s3 = get_s3_client()

    confirm = input(f"⚠️  Delete s3://{bucket_name}/{s3_key}? (yes/no): ")
    if confirm.lower() != 'yes':
        print("Cancelled.")
        return

    try:
        s3.delete_object(Bucket=bucket_name, Key=s3_key)
        print(f"✅ File deleted successfully!")
    except Exception as e:
        print(f"❌ Delete failed: {e}")

# Quick usage
if __name__ == '__main__':
    # List all buckets
    list_buckets()

    # Example usage — replace with your bucket name
    BUCKET = 'your-bucket-name'

    # List files
    list_files(BUCKET)

    # Upload a file
    # upload_file('myfile.txt', BUCKET)

    # Download a file
    # download_file(BUCKET, 'myfile.txt', 'downloaded_file.txt')

Why this saves you: No more clicking through S3 console. Upload, download, delete from terminal in seconds. ⚡
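One caveat worth knowing: `list_objects_v2` returns at most 1,000 keys per call, so `list_files` above silently truncates larger buckets. boto3's paginator fixes that. Here's a minimal sketch, with the page-flattening logic kept AWS-free so it's easy to test:

```python
def collect_keys(pages):
    """Flatten an iterable of list_objects_v2 pages into one list of keys."""
    keys = []
    for page in pages:
        keys.extend(obj['Key'] for obj in page.get('Contents', []))
    return keys

def list_all_keys(bucket_name, prefix=''):
    import boto3  # local import keeps collect_keys() testable without AWS
    s3 = boto3.client('s3', region_name='ap-south-1')
    paginator = s3.get_paginator('list_objects_v2')
    return collect_keys(paginator.paginate(Bucket=bucket_name, Prefix=prefix))

# Usage: print(len(list_all_keys('your-bucket-name')))
```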


Script 3 — AWS Cost Tracker

The problem: AWS bills surprise you at the end of the month.

The solution: Check your costs any time in seconds — broken down by service.

import boto3
from datetime import datetime, timedelta

def get_aws_costs(days=30):
    # Cost Explorer is only in us-east-1
    ce = boto3.client('ce', region_name='us-east-1')

    end_date = datetime.now().strftime('%Y-%m-%d')
    start_date = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')

    print(f"\n💰 AWS Cost Report")
    print(f"Period: {start_date} to {end_date}")
    print("="*50)

    try:
        response = ce.get_cost_and_usage(
            TimePeriod={'Start': start_date, 'End': end_date},
            Granularity='MONTHLY',
            Metrics=['UnblendedCost'],
            GroupBy=[{'Type': 'DIMENSION', 'Key': 'SERVICE'}]
        )

        total_cost = 0
        service_costs = []

        for result in response['ResultsByTime']:
            for group in result['Groups']:
                service = group['Keys'][0]
                cost = float(group['Metrics']['UnblendedCost']['Amount'])
                if cost > 0.001:
                    service_costs.append((service, cost))
                    total_cost += cost

        # Sort by cost — highest first
        service_costs.sort(key=lambda x: x[1], reverse=True)

        for service, cost in service_costs:
                inr = cost * 83  # rough USD to INR conversion; update the rate for accuracy
                bar = "█" * int(cost / total_cost * 20)
                print(f"{service[:35]:<35} ${cost:.4f} (₹{inr:.2f}) {bar}")

        print("="*50)
        print(f"{'TOTAL':<35} ${total_cost:.4f} (₹{total_cost*83:.2f})")

        # Warning if cost is high
        if total_cost > 5:
            print(f"\n⚠️  WARNING: Your bill is above $5!")
            print("Check for forgotten resources — NAT Gateways, Load Balancers, EC2 instances")
        else:
            print(f"\n✅ Cost is within safe range")

    except Exception as e:
        print(f"❌ Error fetching costs: {e}")
        print("Make sure Cost Explorer is enabled in your AWS account")

def check_free_tier_usage():
    print("\n📊 Checking key services usage...")

    ec2 = boto3.client('ec2', region_name='ap-south-1')
    s3 = boto3.client('s3', region_name='ap-south-1')

    # Count running instances
    response = ec2.describe_instances(
        Filters=[{'Name': 'instance-state-name', 'Values': ['running']}]
    )
    running_count = sum(len(r['Instances']) for r in response['Reservations'])
    print(f"  EC2 Running Instances: {running_count}")

    if running_count > 0:
        print(f"  ⚠️  Remember: Free tier only covers 750 hours/month for t2.micro")

    # Count S3 buckets
    buckets = s3.list_buckets()['Buckets']
    print(f"  S3 Buckets: {len(buckets)}")
    print()

if __name__ == '__main__':
    get_aws_costs(days=30)
    check_free_tier_usage()

How to use:

python cost_tracker.py

Why this saves you: Know exactly what you're spending before the bill arrives. No more surprises. 💸
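If you'd rather see a day-by-day trend than one monthly total, the same API accepts `Granularity='DAILY'`; with no `GroupBy`, each period carries a `Total` instead of `Groups`. A sketch, with the response parsing separated out so it can be checked offline:

```python
def daily_totals(results_by_time):
    """Turn Cost Explorer ResultsByTime entries into (start_date, cost) pairs."""
    return [
        (r['TimePeriod']['Start'], float(r['Total']['UnblendedCost']['Amount']))
        for r in results_by_time
    ]

def print_daily_costs(days=7):
    import boto3  # local import keeps daily_totals() testable without AWS
    from datetime import datetime, timedelta
    ce = boto3.client('ce', region_name='us-east-1')  # Cost Explorer lives in us-east-1
    end = datetime.now().strftime('%Y-%m-%d')
    start = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
    response = ce.get_cost_and_usage(
        TimePeriod={'Start': start, 'End': end},
        Granularity='DAILY',
        Metrics=['UnblendedCost'],
    )
    for date, cost in daily_totals(response['ResultsByTime']):
        print(f"{date}  ${cost:.4f}")
```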


Script 4 — Automatic S3 Backup System

The problem: Important files on your EC2 or local machine could be lost anytime.

The solution: Automatic daily backup to S3 with timestamps.

import boto3
from datetime import datetime
from pathlib import Path

def backup_to_s3(source_path, bucket_name, backup_prefix='backups'):
    s3 = boto3.client('s3', region_name='ap-south-1')

    timestamp = datetime.now().strftime('%Y-%m-%d_%H-%M-%S')

    print(f"\n🔄 Starting backup at {timestamp}")
    print(f"Source: {source_path}")
    print(f"Destination: s3://{bucket_name}/{backup_prefix}/")
    print("-"*50)

    backed_up = 0
    failed = 0
    total_size = 0

    source = Path(source_path)

    if source.is_file():
        files = [source]
    else:
        files = list(source.rglob('*'))
        files = [f for f in files if f.is_file()]

    for file_path in files:
        try:
            if source.is_dir():
                relative_path = file_path.relative_to(source)
                s3_key = f"{backup_prefix}/{timestamp}/{relative_path}"
            else:
                s3_key = f"{backup_prefix}/{timestamp}/{file_path.name}"

            file_size = file_path.stat().st_size
            total_size += file_size

            s3.upload_file(str(file_path), bucket_name, s3_key)
            print(f"  ✅ {file_path.name} ({file_size/1024:.1f} KB)")
            backed_up += 1

        except Exception as e:
            print(f"  ❌ Failed: {file_path.name}: {e}")
            failed += 1

    print("-"*50)
    print(f"✅ Backup complete!")
    print(f"   Files backed up: {backed_up}")
    print(f"   Failed: {failed}")
    print(f"   Total size: {total_size/1024:.1f} KB")
    print(f"   S3 location: s3://{bucket_name}/{backup_prefix}/{timestamp}/")

def list_backups(bucket_name, backup_prefix='backups'):
    s3 = boto3.client('s3', region_name='ap-south-1')

    response = s3.list_objects_v2(
        Bucket=bucket_name,
        Prefix=f'{backup_prefix}/',  # trailing slash so only the timestamped subfolders come back
        Delimiter='/'
    )

    print(f"\n📦 Available backups in s3://{bucket_name}/{backup_prefix}/:")

    if 'CommonPrefixes' not in response:
        print("  No backups found")
        return

    for prefix in response['CommonPrefixes']:
        backup_name = prefix['Prefix'].rstrip('/').split('/')[-1]
        print(f"  {backup_name}")

if __name__ == '__main__':
    # Configuration — change these
    BUCKET_NAME = 'your-backup-bucket'
    SOURCE_PATH = '/path/to/your/files'  # File or folder to backup

    # Run backup
    backup_to_s3(SOURCE_PATH, BUCKET_NAME)

    # List all backups
    list_backups(BUCKET_NAME)

    # To automate daily — add to crontab:
    # 0 2 * * * /usr/bin/python3 /home/ubuntu/backup.py
    # This runs every day at 2am automatically

Why this saves you: Never lose important files. Automated backups running while you sleep. 🛡️
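Timestamped backups pile up, so at some point you'll want retention. Here's a sketch of a "keep the newest N" policy, assuming the same `backups/<timestamp>/` layout the script creates. The selection logic is a plain function you can check against fake data first, since the delete loop is irreversible:

```python
def stale_prefixes(prefixes, keep=7):
    """Return backup folder prefixes beyond the newest `keep`.

    Timestamps like 2024-01-05_02-00-00 sort correctly as plain strings.
    """
    return sorted(prefixes, reverse=True)[keep:]

def prune_backups(bucket_name, backup_prefix='backups', keep=7):
    import boto3  # local import keeps stale_prefixes() testable without AWS
    s3 = boto3.client('s3', region_name='ap-south-1')
    response = s3.list_objects_v2(
        Bucket=bucket_name, Prefix=f'{backup_prefix}/', Delimiter='/'
    )
    prefixes = [p['Prefix'] for p in response.get('CommonPrefixes', [])]
    for old in stale_prefixes(prefixes, keep):
        # Delete every object under the stale backup folder
        listing = s3.list_objects_v2(Bucket=bucket_name, Prefix=old)
        for obj in listing.get('Contents', []):
            s3.delete_object(Bucket=bucket_name, Key=obj['Key'])
        print(f"🗑 Removed old backup: {old}")
```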


Script 5 — AWS Security Checker

The problem: Security misconfigurations in AWS can expose your entire infrastructure.

The solution: A script that automatically scans your AWS account for common security issues.

import boto3

def check_aws_security():
    print("\n🔐 AWS Security Check")
    print("="*50)

    issues_found = 0
    checks_passed = 0

    ec2 = boto3.client('ec2', region_name='ap-south-1')
    iam = boto3.client('iam')

    # Check 1 — Security Groups with open SSH
    print("\n📋 Checking Security Groups...")
    response = ec2.describe_security_groups()

    for sg in response['SecurityGroups']:
        for rule in sg.get('IpPermissions', []):
            # Catch any rule whose port range includes 22, not just FromPort == 22
            from_port = rule.get('FromPort', 0)
            to_port = rule.get('ToPort', 65535)
            if from_port <= 22 <= to_port:
                for ip_range in rule.get('IpRanges', []):
                    if ip_range.get('CidrIp') == '0.0.0.0/0':
                        print(f"  ⚠️  {sg['GroupName']} — SSH open to ALL (0.0.0.0/0)")
                        print(f"     Fix: Restrict SSH to your IP only")
                        issues_found += 1

    if issues_found == 0:
        print("  ✅ No security groups with open SSH found")
        checks_passed += 1

    # Check 2 — IAM Users without MFA
    print("\n📋 Checking IAM Users for MFA...")
    users = iam.list_users()['Users']

    mfa_issues = 0
    for user in users:
        mfa_devices = iam.list_mfa_devices(UserName=user['UserName'])['MFADevices']
        if not mfa_devices:
            print(f"  ⚠️  User '{user['UserName']}' has NO MFA enabled")
            mfa_issues += 1
            issues_found += 1

    if mfa_issues == 0:
        print("  ✅ All IAM users have MFA enabled")
        checks_passed += 1

    # Check 3 — Old Access Keys
    print("\n📋 Checking Access Key Age...")
    from datetime import datetime, timezone

    old_key_issues = 0
    for user in users:
        keys = iam.list_access_keys(UserName=user['UserName'])['AccessKeyMetadata']
        for key in keys:
            if key['Status'] == 'Active':
                age = (datetime.now(timezone.utc) - key['CreateDate']).days
                if age > 90:
                    print(f"  ⚠️  {user['UserName']} — Access key is {age} days old")
                    print(f"     Fix: Rotate keys every 90 days")
                    old_key_issues += 1
                    issues_found += 1

    if old_key_issues == 0:
        print("  ✅ All access keys are less than 90 days old")
        checks_passed += 1

    # Check 4 — Public S3 Buckets
    print("\n📋 Checking S3 Bucket Public Access...")
    s3 = boto3.client('s3')
    buckets = s3.list_buckets()['Buckets']

    public_bucket_issues = 0
    for bucket in buckets:
        try:
            acl = s3.get_bucket_acl(Bucket=bucket['Name'])
            for grant in acl['Grants']:
                if grant['Grantee'].get('URI', '') == 'http://acs.amazonaws.com/groups/global/AllUsers':
                    print(f"  ⚠️  Bucket '{bucket['Name']}' is PUBLIC")
                    print(f"     Fix: Remove public access unless intentional")
                    public_bucket_issues += 1
                    issues_found += 1
        except Exception:
            # Some buckets reject ACL reads (cross-region, or ACLs disabled); skip them
            pass

    if public_bucket_issues == 0:
        print("  ✅ No unexpected public S3 buckets found")
        checks_passed += 1

    # Final Report
    print("\n" + "="*50)
    print(f"🔐 Security Check Complete")
    print(f"   ✅ Checks passed: {checks_passed}")
    print(f"   ⚠️  Issues found: {issues_found}")

    if issues_found == 0:
        print("\n🎉 Your AWS account looks secure!")
    else:
        print(f"\n⚠️  Fix {issues_found} issue(s) to improve your security")
    print()

if __name__ == '__main__':
    check_aws_security()

Why this saves you: Catch security issues before attackers do. Run this weekly. 🔒
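If you extend the security checker, the rule-inspection logic is the part worth unit-testing, since a false negative defeats the purpose. One approach is to factor it into a pure function that takes the rule dict boto3 returns, so you can feed it hand-written fixtures (the generalization to arbitrary ports is my addition, not from the script above):

```python
def rule_opens_port_to_world(rule, port=22):
    """True if a security-group rule exposes `port` to 0.0.0.0/0.

    Handles port ranges and rules with no ports (IpProtocol '-1' means all traffic).
    """
    from_port = rule.get('FromPort', 0)
    to_port = rule.get('ToPort', 65535)
    if not (from_port <= port <= to_port):
        return False
    return any(r.get('CidrIp') == '0.0.0.0/0' for r in rule.get('IpRanges', []))

# Drop-in use inside the security-group loop:
#   if rule_opens_port_to_world(rule):
#       print(f"⚠️  {sg['GroupName']} exposes SSH to the world")
```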


How to Set Up All 5 Scripts

# 1. Install dependencies
pip install boto3

# 2. Configure AWS credentials
aws configure

# 3. Create a folder for your scripts
mkdir aws-python-scripts
cd aws-python-scripts

# 4. Save each script as:
# ec2_manager.py
# s3_manager.py
# cost_tracker.py
# backup.py
# security_checker.py

# 5. Run any script
python ec2_manager.py list
python cost_tracker.py
python security_checker.py

Put These on GitHub Right Now

Create a repo called aws-python-scripts and push all 5 scripts.

This becomes a real portfolio project showing:

  • Python proficiency ✅
  • AWS knowledge ✅
  • Security awareness ✅
  • Automation skills ✅
  • Clean code practices ✅

Mention it in every job application. Link it in your resume. 💪


Quick Reference

| Script | What it does | Run with |
| --- | --- | --- |
| `ec2_manager.py` | Manage EC2 instances | `python ec2_manager.py list` |
| `s3_manager.py` | Manage S3 files | `python s3_manager.py` |
| `cost_tracker.py` | Track AWS costs | `python cost_tracker.py` |
| `backup.py` | Back up files to S3 | `python backup.py` |
| `security_checker.py` | Check security issues | `python security_checker.py` |

Final Thoughts

These 5 scripts represent something important.

The difference between a developer who uses AWS and a developer who masters AWS is automation.

Anyone can click buttons in the AWS console. But the developer who automates repetitive tasks, monitors costs, secures infrastructure, and backs up data automatically — that's the developer companies want to hire.

Save these scripts. Customize them. Put them on GitHub. Use them in interviews.

They're yours now. 💪


Follow LearnWithPrashik for more practical AWS and Python content.

Connect with me:
LinkedIn: linkedin.com/in/prashik-besekar
GitHub: github.com/prashikBesekar
