SOM4N

Posted on

AWS S3 Simplified: Automate Operations Without CLI on Remote Server

Creating a Helper Script for AWS S3 Operations on Remote Servers Without AWS CLI

As cloud computing becomes the backbone of modern infrastructure, efficient access to AWS services like S3 matters more than ever. But imagine you are working on a remote UNIX server where the AWS CLI is not installed, and you need to publish files to an S3 bucket. This post walks you through creating a helper script that solves this problem by using IAM for secure access and automatically obtaining AWS credentials.

The Problem

You are working on a remote UNIX server that needs to:

  • Publish files to an AWS S3 bucket.
  • Read from and write to S3.

The server does not have the AWS CLI installed, and managing credentials manually is error-prone and inefficient. You need a more robust solution that can:

  • Obtain AWS credentials securely.
  • Automate file uploads and downloads.
  • Eliminate the dependency on the AWS CLI.

Solution Overview

The solution includes:

  • Using an IAM user with appropriate S3 permissions.
  • A helper script that retrieves the Access Key ID and Secret Access Key from AWS.
  • Performing S3 operations with these credentials.
  • Automating key rotation every 30 days.

Step-by-Step Implementation

  1. IAM Configuration

Create an IAM user or role with the necessary permissions to access your S3 bucket. Below is an example of an IAM policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}

Replace your-bucket-name with the name of your S3 bucket.
Attach this policy to your IAM user or role.
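
The deployment commands below assume a CloudFormation template named template.yaml that creates the IAM user, its access key, and exports both values as stack outputs named S3AccessKeyId and S3SecretAccessKey. A minimal sketch (resource names are illustrative; exporting the secret key as a stack output is convenient for this walkthrough, but storing it in AWS Secrets Manager is safer):

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: IAM user and access key for S3 publishing (sketch)

Resources:
  S3PublisherUser:
    Type: AWS::IAM::User
    Properties:
      Policies:
        - PolicyName: S3PublishPolicy
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - s3:PutObject
                  - s3:GetObject
                Resource: arn:aws:s3:::your-bucket-name/*

  S3PublisherAccessKey:
    Type: AWS::IAM::AccessKey
    Properties:
      UserName: !Ref S3PublisherUser

Outputs:
  S3AccessKeyId:
    # Ref on AWS::IAM::AccessKey returns the Access Key ID
    Value: !Ref S3PublisherAccessKey
    Export:
      Name: S3AccessKeyId
  S3SecretAccessKey:
    Value: !GetAtt S3PublisherAccessKey.SecretAccessKey
    Export:
      Name: S3SecretAccessKey
```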

Deploy the Template:
Define the IAM resources above in a CloudFormation template (for example, template.yaml) that creates the user, its access key, and exports both values as stack outputs. Then deploy the stack using the AWS Management Console or the AWS CLI. For example:

aws cloudformation deploy --template-file template.yaml --stack-name S3AccessStack

Retrieve the Credentials:
After the stack is created, you can retrieve the exported outputs:

aws cloudformation describe-stacks --stack-name S3AccessStack \
--query "Stacks[0].Outputs[?ExportName=='S3AccessKeyId'].OutputValue" --output text

Similarly, retrieve the Secret Access Key:
aws cloudformation describe-stacks --stack-name S3AccessStack \
--query "Stacks[0].Outputs[?ExportName=='S3SecretAccessKey'].OutputValue" --output text

  2. Writing the Helper Script

The script achieves the following:

  • Retrieves AWS credentials from a secure source (e.g., AWS Secrets Manager or a pre-configured file).
  • Automates S3 operations like file upload.
  • Rotates keys every 30 days to enhance security.
#!/bin/bash

# File containing AWS credentials
CREDENTIALS_FILE="/path/to/credentials_file"
S3_BUCKET="your-bucket-name"

# Function to load credentials from file
load_credentials() {
  if [ ! -f "$CREDENTIALS_FILE" ]; then
    echo "Credentials file not found: $CREDENTIALS_FILE"
    exit 1
  fi

  ACCESS_KEY_ID=$(grep 'AccessKeyId' "$CREDENTIALS_FILE" | awk -F '=' '{print $2}')
  SECRET_ACCESS_KEY=$(grep 'SecretAccessKey' "$CREDENTIALS_FILE" | awk -F '=' '{print $2}')
}

# Function to update credentials
# Note: this function uses the AWS CLI, so run it from a machine that has
# the CLI configured, then copy the credentials file to the remote server.
update_credentials() {
  echo "Updating credentials..."
  ACCESS_KEY_ID=$(aws cloudformation describe-stacks --stack-name S3AccessStack \
    --query "Stacks[0].Outputs[?ExportName=='S3AccessKeyId'].OutputValue" --output text)

  SECRET_ACCESS_KEY=$(aws cloudformation describe-stacks --stack-name S3AccessStack \
    --query "Stacks[0].Outputs[?ExportName=='S3SecretAccessKey'].OutputValue" --output text)

  printf 'AccessKeyId=%s\nSecretAccessKey=%s\n' "$ACCESS_KEY_ID" "$SECRET_ACCESS_KEY" > "$CREDENTIALS_FILE"
  chmod 600 "$CREDENTIALS_FILE"
  echo "Credentials updated successfully."
}

# Function to upload a file to S3 with a Signature V2 signed PUT request.
# The secret key is never sent over the wire; it signs the request instead.
# Note: Signature V2 is accepted only in older regions; newer regions
# require Signature V4.
upload_to_s3() {
  local file=$1
  if [ ! -f "$file" ]; then
    echo "File does not exist: $file"
    exit 1
  fi

  local object_name date_header string_to_sign signature
  object_name=$(basename "$file")
  # S3 expects an RFC 1123 date, not ISO 8601
  date_header=$(date -u '+%a, %d %b %Y %H:%M:%S GMT')
  # Canonical string: VERB \n Content-MD5 \n Content-Type \n Date \n resource
  string_to_sign=$(printf 'PUT\n\napplication/octet-stream\n%s\n/%s/%s' \
    "$date_header" "$S3_BUCKET" "$object_name")
  signature=$(printf '%s' "$string_to_sign" \
    | openssl dgst -sha1 -hmac "$SECRET_ACCESS_KEY" -binary | base64)

  curl -f -X PUT -T "$file" \
    -H "Date: $date_header" \
    -H "Content-Type: application/octet-stream" \
    -H "Authorization: AWS $ACCESS_KEY_ID:$signature" \
    "https://$S3_BUCKET.s3.amazonaws.com/$object_name" || exit 1

  echo "File uploaded successfully: $file"
}

# Main execution
if [ "$1" == "update-credentials" ]; then
  update_credentials
  exit 0
fi

if [ -z "$1" ]; then
  echo "Usage: $0 <file-to-upload> | update-credentials"
  exit 1
fi

load_credentials
upload_to_s3 "$1"

Save this script as aws_helper.sh and grant it execute permission (chmod +x aws_helper.sh).
Run ./aws_helper.sh update-credentials every 30 days to rotate the keys and refresh the credentials file.
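
To schedule the 30-day rotation instead of running it by hand, a cron entry can trigger the script. A sketch, assuming the script lives at /opt/aws_helper.sh (hypothetical path):

```
# Add via crontab -e
# Run at 02:00 on the 1st of every month (roughly every 30 days)
0 2 1 * * /opt/aws_helper.sh update-credentials >> /var/log/aws_helper.log 2>&1
```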

How This Script Helps

  • Eliminates AWS CLI dependency: the script uses curl for S3 operations, so it works in environments where the AWS CLI is not installed.
  • Improves security: automates key rotation and manages credentials in one place.
  • Automation: enables seamless, unattended S3 operations, reducing manual errors.
  • Customizable: can be extended with additional S3 operations, such as deleting or listing files.
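
For reference, S3's Signature V2 scheme never sends the secret key over the wire; each request is signed with an HMAC-SHA1 of a canonical string. A minimal standalone sketch (all key, date, bucket, and file values below are hypothetical):

```shell
#!/bin/sh
# Build a Signature V2 Authorization header for a PUT request.
ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
date_header="Tue, 27 Mar 2007 19:36:42 +0000"
resource="/your-bucket-name/file.txt"

# Canonical string: VERB \n Content-MD5 \n Content-Type \n Date \n resource
string_to_sign=$(printf 'PUT\n\ntext/plain\n%s\n%s' "$date_header" "$resource")

# HMAC-SHA1 with the secret key, base64-encoded (28 characters)
signature=$(printf '%s' "$string_to_sign" \
  | openssl dgst -sha1 -hmac "$SECRET_ACCESS_KEY" -binary | base64)

auth_header="AWS $ACCESS_KEY_ID:$signature"
echo "$auth_header"
```

The canonical string must match the headers curl actually sends (method, content type, date, and resource path), or S3 rejects the request with a SignatureDoesNotMatch error.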

Extending the Script

For larger-scale automation, consider integrating this script with:

  • AWS SDKs: for more complex logic.
  • AWS CloudFormation: to manage infrastructure as code.
  • AWS Secrets Manager: to securely manage credentials.

Refer to the AWS documentation for creating and managing your AWS resources programmatically.

Conclusion

This helper script provides a lightweight and efficient solution for performing AWS S3 operations on remote servers without AWS CLI. By leveraging IAM, automating credential retrieval, and rotating keys, it enhances security and reliability. Try it out and adapt it to fit your specific needs!
