Accidental exposure of sensitive data in public Amazon S3 buckets is still a major security risk. It's easy to misconfigure permissions, and attackers actively scan for these vulnerabilities. Let's look at how to find these buckets using the AWS CLI, AWS SDK for Python (Boto3), and a few other helpful techniques.
## Why Public S3 Buckets Are Dangerous
Publicly accessible S3 buckets can expose sensitive data like:
- Personally Identifiable Information (PII)
- API keys and credentials
- Proprietary code or data
- Internal documents
Attackers can quickly find these buckets using automated tools, leading to data breaches, compliance violations, and reputational damage. Regularly auditing your S3 bucket permissions is crucial.
## Methods for Finding Public Buckets
Here are several methods you can use to find public S3 buckets in your AWS environment:
### 1. AWS CLI
The AWS Command Line Interface (CLI) is a powerful tool for interacting with AWS services. You can use it to list your buckets and check their permissions.
List All Buckets:
```shell
aws s3 ls
```
This command lists all S3 buckets in your account. However, it doesn't show their permissions. To check permissions, you need to use the `get-bucket-policy` and `get-bucket-acl` commands.
Check Bucket Policy:
```shell
aws s3api get-bucket-policy --bucket <bucket-name>
```
If the bucket is public, the policy will likely contain statements with "Principal": "*" or "Principal": {"AWS": "*"} and "Effect": "Allow" for actions like "s3:GetObject".
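For reference, a policy that makes every object in a bucket world-readable typically looks like this (the bucket name is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-bucket/*"
    }
  ]
}
```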
Check Bucket ACL:
```shell
aws s3api get-bucket-acl --bucket <bucket-name>
```
Look for `Grant` elements whose `Grantee` is a `Group` with a URI ending in `global/AllUsers` (shown as "Everyone" in the console) or `global/AuthenticatedUsers` ("Any authenticated AWS user"), combined with permissions like `READ` or `WRITE`.
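As an illustration, a public-read grant in the `get-bucket-acl` output looks roughly like this:

```json
{
  "Grants": [
    {
      "Grantee": {
        "Type": "Group",
        "URI": "http://acs.amazonaws.com/groups/global/AllUsers"
      },
      "Permission": "READ"
    }
  ]
}
```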
Scripting with AWS CLI and jq:
To automate this process, you can use jq to parse the JSON output and filter for public buckets:
```shell
aws s3 ls | awk '{print $3}' | while read -r bucket; do
  # --query Policy --output text unwraps the policy JSON from the API response
  policy=$(aws s3api get-bucket-policy --bucket "$bucket" --query Policy --output text 2>/dev/null)
  acl=$(aws s3api get-bucket-acl --bucket "$bucket" 2>/dev/null)
  if [[ -n "$policy" ]]; then
    # Flag Allow statements whose principal is "*" or {"AWS": "*"}
    if echo "$policy" | jq -e '.Statement[] | select(.Effect == "Allow" and (.Principal == "*" or ((.Principal | type) == "object" and .Principal.AWS == "*")))' >/dev/null; then
      echo "Bucket $bucket is PUBLIC (Policy)"
    fi
  fi
  if [[ -n "$acl" ]]; then
    # Flag grants to the AllUsers or AuthenticatedUsers groups
    if echo "$acl" | jq -e '.Grants[] | select((.Grantee.URI // "") | test("AllUsers|AuthenticatedUsers"))' >/dev/null; then
      echo "Bucket $bucket is PUBLIC (ACL)"
    fi
  fi
done
```
This script iterates through your buckets, retrieves their policies and ACLs, and flags those that appear to be public based on the presence of broad "Allow" statements or public ACL grants.
### 2. Boto3 (AWS SDK for Python)
Boto3 is the AWS SDK for Python. It provides a more programmatic way to interact with AWS services.
Install Boto3:
```shell
pip install boto3
```
Python Script to Check Bucket Policies and ACLs:
```python
import json

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')


def check_bucket_permissions():
    buckets = s3.list_buckets()['Buckets']
    for bucket in buckets:
        bucket_name = bucket['Name']
        try:
            policy = s3.get_bucket_policy(Bucket=bucket_name)['Policy']
            for statement in json.loads(policy).get('Statement', []):
                principal = statement.get('Principal')
                if statement.get('Effect') == 'Allow' and \
                        principal in ('*', {'AWS': '*'}):
                    print(f"Bucket {bucket_name} is PUBLIC (Policy)")
                    break
        except ClientError:
            pass  # Bucket has no policy, or access was denied

        try:
            acl = s3.get_bucket_acl(Bucket=bucket_name)
            for grant in acl['Grants']:
                uri = grant.get('Grantee', {}).get('URI', '')
                if uri.endswith(('/AllUsers', '/AuthenticatedUsers')) and \
                        grant['Permission'] in ('READ', 'WRITE'):
                    print(f"Bucket {bucket_name} is PUBLIC (ACL)")
                    break
        except ClientError:
            pass  # Access to the ACL was denied


if __name__ == "__main__":
    check_bucket_permissions()
```
This script uses Boto3 to list buckets, retrieve their policies and ACLs, and print out any buckets that have public permissions.
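The public/not-public decision can also be factored into a small pure function, which makes the logic easy to unit-test without touching AWS. This is a sketch; `is_public_statement` is a helper name introduced here, not part of Boto3:

```python
def is_public_statement(statement):
    """Return True if a bucket-policy statement grants access to everyone.

    A statement is treated as public when it is an Allow whose Principal
    is "*" or {"AWS": "*"} (or a list of principals containing "*").
    """
    if statement.get('Effect') != 'Allow':
        return False
    principal = statement.get('Principal')
    if principal == '*':
        return True
    if isinstance(principal, dict):
        aws = principal.get('AWS')
        if aws == '*' or (isinstance(aws, list) and '*' in aws):
            return True
    return False


# The classic public-read statement is flagged...
public = {"Effect": "Allow", "Principal": "*", "Action": "s3:GetObject",
          "Resource": "arn:aws:s3:::example-bucket/*"}
# ...while an account-scoped principal is not.
private = {"Effect": "Allow",
           "Principal": {"AWS": "arn:aws:iam::123456789012:root"},
           "Action": "s3:GetObject",
           "Resource": "arn:aws:s3:::example-bucket/*"}
print(is_public_statement(public))   # True
print(is_public_statement(private))  # False
```

Note that S3 can also do this evaluation for you: `s3.get_bucket_policy_status(Bucket=name)['PolicyStatus']['IsPublic']` returns AWS's own verdict on whether a bucket policy is public, which is more robust than pattern-matching statements yourself.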
### 3. AWS Trusted Advisor
AWS Trusted Advisor provides recommendations for optimizing your AWS infrastructure, including security checks. It has a check for "Amazon S3 Bucket Permissions" that identifies buckets with open access permissions. While it doesn't provide the detailed insights of the CLI or SDK methods, it's a quick way to get a high-level overview.
### 4. AWS Config
AWS Config allows you to track the configuration of your AWS resources over time and evaluate them against desired configurations. You can create custom rules to check for public S3 buckets.
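You don't necessarily need a custom rule: AWS Config ships managed rules for exactly this check (`S3_BUCKET_PUBLIC_READ_PROHIBITED` and `S3_BUCKET_PUBLIC_WRITE_PROHIBITED`). A rule definition for the read check, which you would pass to `aws configservice put-config-rule`, might look like this:

```json
{
  "ConfigRuleName": "s3-bucket-public-read-prohibited",
  "Source": {
    "Owner": "AWS",
    "SourceIdentifier": "S3_BUCKET_PUBLIC_READ_PROHIBITED"
  },
  "Scope": {
    "ComplianceResourceTypes": ["AWS::S3::Bucket"]
  }
}
```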
## Practical Takeaways
- Automate: Regularly run the CLI scripts or Python scripts using Boto3 to check for public buckets. Integrate these checks into your CI/CD pipelines or scheduled tasks.
- Least Privilege: Always grant the least privileges necessary. Avoid using `"*"` in your bucket policies.
- Regular Audits: Conduct regular audits of your S3 bucket permissions.
- Monitoring and Alerting: Set up monitoring and alerting to detect and respond to changes in bucket permissions.
- Bucket Policies vs. ACLs: Understand the difference between bucket policies and ACLs, and use bucket policies as the preferred method for controlling access.
- Consider using S3 Access Points: S3 Access Points simplify managing data access at scale for shared datasets by creating unique access points with specific permissions.
## Bonus: Open-Source Tool
As an alternative to scripting, the open-source tool `nuvu-scan` can automatically discover cloud assets and detect security risks like public S3 buckets. You can install it via `pip install nuvu-scan`.
By actively searching for and remediating public S3 buckets, you can significantly reduce your risk of data breaches and maintain a more secure cloud environment.