Muhammad Zubair Bin Akbar
4 Practical Boto3 Scripts for S3 Every DevOps Engineer Should Know

Working with AWS S3 through the console is fine until you need automation, repeatability, and control. That’s where Boto3 comes in. In this post, we’ll walk through four practical Python scripts to manage S3 efficiently.

1. List All S3 Buckets with Creation Dates

A simple script to get visibility into your S3 environment.

import boto3
s3 = boto3.client('s3')
response = s3.list_buckets()
print("S3 Buckets:\n")
for bucket in response['Buckets']:
    print(f"Name: {bucket['Name']} | Created On: {bucket['CreationDate']}")

Why this matters:

Useful for audits, inventory tracking, or quick checks across accounts.

2. Upload a File to S3 with Error Handling

Uploading files is common, but handling failures properly is what makes a script production-ready.

import boto3
from botocore.exceptions import NoCredentialsError, ClientError

s3 = boto3.client('s3')
file_name = "test.txt"
bucket_name = "your-bucket-name"
object_name = "uploads/test.txt"

try:
    s3.upload_file(file_name, bucket_name, object_name)
    print("File uploaded successfully.")
except FileNotFoundError:
    # FileNotFoundError is a Python built-in, not a botocore exception,
    # so it is caught directly rather than imported.
    print("The file was not found.")
except NoCredentialsError:
    print("Credentials not available.")
except ClientError as e:
    print(f"AWS Error: {e}")

Why this matters:

Prevents silent failures and gives clear debugging output.

3. Download Files from S3 with Progress Tracking

For large files, progress tracking makes a big difference.

import boto3

s3 = boto3.client('s3')
bucket_name = "your-bucket-name"
object_name = "large-file.zip"
file_name = "downloaded.zip"

transferred = 0

def progress_callback(bytes_chunk):
    # Boto3 calls this with the bytes moved since the last call,
    # so keep a running total rather than printing the chunk size.
    global transferred
    transferred += bytes_chunk
    print(f"Transferred: {transferred} bytes")

s3.download_file(
    bucket_name,
    object_name,
    file_name,
    Callback=progress_callback
)
print("Download complete.")

Why this matters:

Gives visibility into long-running downloads, which is especially useful in automation pipelines.

4. Create and Delete S3 Buckets Programmatically

Automating bucket lifecycle management is useful in testing and dynamic environments.

import boto3
from botocore.exceptions import ClientError
s3 = boto3.client('s3')
bucket_name = "my-unique-bucket-name-12345"
# Create Bucket
try:
    # LocationConstraint must match the client's region. For us-east-1,
    # omit CreateBucketConfiguration entirely.
    s3.create_bucket(
        Bucket=bucket_name,
        CreateBucketConfiguration={
            'LocationConstraint': 'eu-west-1'
        }
    )
    print("Bucket created successfully.")
    print("Bucket created successfully.")
except ClientError as e:
    print(f"Error creating bucket: {e}")
# Delete Bucket
try:
    s3.delete_bucket(Bucket=bucket_name)
    print("Bucket deleted successfully.")
except ClientError as e:
    print(f"Error deleting bucket: {e}")

Note:

S3 will only delete an empty bucket; remove all objects (and object versions, if versioning is enabled) first, or the call fails with a BucketNotEmpty error.

Final Thoughts

These four scripts cover the most common S3 operations:

  • Visibility (listing buckets)
  • Data movement (upload/download)
  • Resource lifecycle (create/delete)

They’re simple, but extremely useful when building automation around AWS.

As you scale, you can extend these with:

  • Logging
  • Retry mechanisms
  • Parallel uploads/downloads

This is the kind of practical automation that saves time in real environments.
