Table of Contents
- Introduction
- Prerequisites
- Architecture Overview
- Step-by-Step Implementation
- Bash Script Breakdown
- Testing and Validation
- Monitoring and Troubleshooting
Introduction
Maintaining the security posture of your AWS infrastructure is not a one-time task; it requires continuous monitoring and assessment. One of the critical security concerns organizations face is the detection and management of outdated Amazon Machine Images (AMIs) running on EC2 instances. Outdated AMIs may contain unpatched vulnerabilities, deprecated software, and security misconfigurations that expose your infrastructure to risk.
Prowler is an open-source security tool that automates AWS security assessment against industry standards including CIS Benchmarks, AWS Foundational Security Best Practices, and other compliance frameworks. By combining Prowler with EC2 cron jobs and Amazon S3 storage, you can implement a fully automated, cost-effective security scanning solution that runs on a schedule without requiring external infrastructure or significant operational overhead.
This comprehensive guide demonstrates how to configure Prowler on an EC2 instance to automatically scan for outdated AMIs, generate detailed security reports in multiple formats, and securely store those reports in Amazon S3 for compliance auditing and analysis.
Prerequisites
Before implementing this solution, ensure you have the following:
AWS Account Requirements
- An active AWS account with appropriate permissions
- EC2 instance with Ubuntu 22.04 LTS or later (t3.small or larger recommended for adequate performance)
- Amazon S3 bucket for storing Prowler reports
- IAM role attached to the EC2 instance with the following permissions:
- ec2:DescribeInstances
- ec2:DescribeImages
- ec2:DescribeImageAttribute
- s3:PutObject
- s3:ListBucket
- logs:CreateLogGroup (optional, for CloudWatch logs)
Local System Requirements
- SSH access to the EC2 instance
- Basic understanding of Linux bash scripting
- Familiarity with AWS CLI operations
- Text editor (vi, nano, or similar)
AWS CLI Configuration
Ensure AWS CLI v2 is installed and configured with appropriate credentials:
bash
aws --version
aws sts get-caller-identity # Verify credentials
Architecture Overview
Solution Components
The complete automation solution consists of the following components:
- EC2 Instance: Hosts the Prowler binary and bash automation script
- Prowler: Open-source security scanning tool
- Cron Scheduler: Native Linux scheduling mechanism
- Bash Script: Orchestrates scan execution, report generation, and S3 upload
- Amazon S3: Centralized repository for security reports
- CloudWatch Logs: Optional centralized logging for audit trails
Data Flow
Cron Timer
↓
Bash Script (prowler-scan.sh)
↓
Prerequisite Validation
↓
Prowler Scan (ec2_instance_with_outdated_ami check)
↓
Report Generation (JSON, CSV, HTML)
↓
S3 Upload with Timestamp Organization
↓
Logging to /var/log/prowler-scan.log
Step-by-Step Implementation
Phase 1: EC2 Instance Preparation
Step 1.1: Create IAM Role and Policy
Create an IAM policy document named prowler-scan-policy.json:
json
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "ProwlerEC2Permissions",
"Effect": "Allow",
"Action": [
"ec2:DescribeInstances",
"ec2:DescribeImages",
"ec2:DescribeImageAttribute",
"ec2:DescribeInstanceAttribute",
"ec2:DescribeInstanceStatus"
],
"Resource": "*"
},
{
"Sid": "ProwlerS3Permissions",
"Effect": "Allow",
"Action": [
"s3:PutObject",
"s3:ListBucket",
"s3:GetBucketVersioning"
],
"Resource": [
"<paste S3 arn here>",
"<paste S3 arn here>/*"
]
}
]
}
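A stray comma or unbalanced brace in the policy document will make the later `put-role-policy` call fail with a MalformedPolicyDocument error, so it is worth validating the JSON first. Python's built-in `json.tool` works well for this; the snippet below uses a minimal stand-in policy file under /tmp purely for illustration.

```shell
# Validate the policy JSON before handing it to the AWS CLI.
# A minimal stand-in document is used here for illustration;
# point the command at your real prowler-scan-policy.json.
cat > /tmp/prowler-scan-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": []
}
EOF
python3 -m json.tool /tmp/prowler-scan-policy.json > /dev/null \
  && echo "policy JSON is syntactically valid"
```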
Note: s3:ListBucket is evaluated against the bucket ARN itself, while s3:PutObject requires the object-level ARN (the bucket ARN followed by /*), so list both as resources.
Create the IAM role via AWS CLI:
bash
# Create the role
aws iam create-role \
--role-name ProwlerScanRole \
--assume-role-policy-document '{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"Service": "ec2.amazonaws.com"
},
"Action": "sts:AssumeRole"
}
]
}'
# Create inline policy
aws iam put-role-policy \
--role-name ProwlerScanRole \
--policy-name ProwlerScanPolicy \
--policy-document file://prowler-scan-policy.json
# Create instance profile
aws iam create-instance-profile \
--instance-profile-name ProwlerScanProfile
# Add role to instance profile
aws iam add-role-to-instance-profile \
--instance-profile-name ProwlerScanProfile \
--role-name ProwlerScanRole
# Attach to running instance (if applicable)
aws ec2 associate-iam-instance-profile \
--iam-instance-profile Name=ProwlerScanProfile \
--instance-id i-xxxxxxxxx
Step 1.2: Install Required Packages
SSH into your EC2 instance and update system packages:
bash
# Update package manager
sudo apt-get update
sudo apt-get upgrade -y
# Install pipx (required for Prowler)
sudo apt install -y pipx
pipx ensurepath # adds ~/.local/bin to PATH for new login shells
# Install Prowler
pipx install prowler
# Install unzip and curl
sudo apt install -y unzip curl
# Install AWS CLI v2
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
rm -rf aws awscliv2.zip
# Verify installations
prowler -v
aws --version
Step 1.3: Create Amazon S3 Bucket
bash
# Create S3 bucket with unique name
aws s3 mb s3://<your bucket name> \
--region ap-south-1
# Enable versioning for audit trail
aws s3api put-bucket-versioning \
--bucket <your bucket name> \
--versioning-configuration Status=Enabled
# Enable encryption (AES256 default)
aws s3api put-bucket-encryption \
--bucket <your bucket name> \
--server-side-encryption-configuration '{
"Rules": [{
"ApplyServerSideEncryptionByDefault": {
"SSEAlgorithm": "AES256"
}
}]
}'
# Block public access
aws s3api put-public-access-block \
--bucket <your bucket name> \
--public-access-block-configuration \
"BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true"
Phase 2: Prowler Test Scan
Step 2.1: Validate Prowler Permissions
Test Prowler with your EC2 instance credentials:
bash
# Run a quick test scan on one check
prowler aws --checks ec2_instance_with_outdated_ami -f ap-south-1
# Verify output directory
ls -la output/
Note:
If you encounter a “prowler: command not found” error, first locate where Prowler is installed by running:
which prowler
Example output:
/home/ubuntu/.local/bin/prowler
If Prowler is installed in a custom path, use the full path instead of just prowler in your command. For example:
/home/ubuntu/.local/bin/prowler aws --checks ec2_instance_with_outdated_ami -f ap-south-1
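Rather than typing the full path every time, you can put the pipx install location on PATH for the current session. The snippet below is a sketch that assumes the default pipx location of ~/.local/bin; `pipx ensurepath` makes the same change permanent for new login shells.

```shell
# Add pipx's bin directory to PATH for the current session if it is
# not already there ('pipx ensurepath' handles future login shells).
if ! printf '%s' "$PATH" | tr ':' '\n' | grep -qx "$HOME/.local/bin"; then
  export PATH="$HOME/.local/bin:$PATH"
fi
printf '%s' "$PATH" | tr ':' '\n' | grep -qx "$HOME/.local/bin" \
  && echo "~/.local/bin is on PATH"
```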
Phase 3: Directory and File Setup
Step 3.1: Create Directory Structure
bash
# Create output directory for Prowler reports
# Note: Prowler usually creates this directory on its own; you only
# need to adjust its permissions and ownership.
sudo mkdir -p /home/ubuntu/output
sudo chown ubuntu:ubuntu /home/ubuntu/output
sudo chmod 755 /home/ubuntu/output
# Create log directory
sudo mkdir -p /var/log
sudo touch /var/log/prowler-scan.log
sudo chown ubuntu:ubuntu /var/log/prowler-scan.log
sudo chmod 644 /var/log/prowler-scan.log
# Create lock file directory
sudo mkdir -p /var/run
sudo touch /var/run/prowler-scan.lock
sudo chown ubuntu:ubuntu /var/run/prowler-scan.lock
sudo chmod 744 /var/run/prowler-scan.lock
# Verify permissions
ls -la /home/ubuntu/output
ls -la /var/log/prowler-scan.log
ls -la /var/run/prowler-scan.lock
# Expected output (respectively):
drwxr-xr-x 2 ubuntu ubuntu 4096 Jan 11 12:34 /home/ubuntu/output
-rw-r--r-- 1 ubuntu ubuntu 0 Jan 11 12:34 /var/log/prowler-scan.log
-rwxr--r-- 1 ubuntu ubuntu 0 Jan 11 12:34 /var/run/prowler-scan.lock
Phase 4: Bash Script Creation
Step 4.1: Create the Main Automation Script
Create the file /usr/local/bin/prowler-scan.sh:
bash
sudo vi /usr/local/bin/prowler-scan.sh
Paste the following script:
bash
#!/bin/bash
##################################################################
# Prowler Security Scan Automation Script for EC2
##################################################################
set -uo pipefail # Treat unset variables as errors; propagate pipeline failures
IFS=$'\n\t' # Set Internal Field Separator for safer word splitting
umask 002
# Set PATH for cron environment
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/ubuntu/.local/bin
export PATH
################################################################################
# Configuration Variables
################################################################################
PROWLER_BIN="/home/ubuntu/.local/bin/prowler" # pipx install location (from 'which prowler')
OUTPUT_DIR="/home/ubuntu/output"
S3_BUCKET="ami-prowler-report-bucket"
S3_PREFIX="prowler-reports"
AWS_REGION="ap-south-1"
CHECKS="ec2_instance_with_outdated_ami"
LOG_FILE="/var/log/prowler-scan.log"
LOCK_FILE="/var/run/prowler-scan.lock"
################################################################################
# Color codes for output
################################################################################
readonly RED='\033[0;31m'
readonly GREEN='\033[0;32m'
readonly YELLOW='\033[1;33m'
readonly NC='\033[0m' # No Color
################################################################################
# Logging Functions
################################################################################
log_info() {
local msg="[INFO] $(date '+%Y-%m-%d %H:%M:%S') - $*"
echo -e "${GREEN}${msg}${NC}"
echo "$msg" >> "$LOG_FILE" 2>/dev/null || true
}
log_warn() {
local msg="[WARN] $(date '+%Y-%m-%d %H:%M:%S') - $*"
echo -e "${YELLOW}${msg}${NC}" >&2
echo "$msg" >> "$LOG_FILE" 2>/dev/null || true
}
log_error() {
local msg="[ERROR] $(date '+%Y-%m-%d %H:%M:%S') - $*"
echo -e "${RED}${msg}${NC}" >&2
echo "$msg" >> "$LOG_FILE" 2>/dev/null || true
}
################################################################################
# Error Handler
################################################################################
error_exit() {
log_error "$1"
exit "${2:-1}"
}
################################################################################
# Cleanup Function
################################################################################
cleanup() {
local exit_code=$?
if [ $exit_code -ne 0 ]; then
log_warn "Script exited with error code: $exit_code"
fi
# Remove lock file if it exists
if [ -f "$LOCK_FILE" ]; then
rm -f "$LOCK_FILE"
log_info "Lock file removed"
fi
}
trap cleanup EXIT
################################################################################
# Lock Management (Prevent concurrent runs)
################################################################################
acquire_lock() {
if [ -f "$LOCK_FILE" ]; then
local lock_pid=$(cat "$LOCK_FILE" 2>/dev/null || echo "")
if [ -n "$lock_pid" ] && kill -0 "$lock_pid" 2>/dev/null; then
log_warn "Another instance is already running (PID: $lock_pid)"
exit 0
else
log_warn "Stale lock file found, removing it"
rm -f "$LOCK_FILE"
fi
fi
echo $$ > "$LOCK_FILE"
log_info "Lock acquired (PID: $$)"
}
################################################################################
# Validation Functions
################################################################################
validate_prerequisites() {
log_info "Validating prerequisites..."
# Check if prowler binary exists
if [ ! -f "$PROWLER_BIN" ]; then
error_exit "Prowler binary not found at: $PROWLER_BIN" 1
fi
# Check if prowler is executable
if [ ! -x "$PROWLER_BIN" ]; then
error_exit "Prowler binary is not executable: $PROWLER_BIN" 1
fi
# Check if output directory exists, create if not
if [ ! -d "$OUTPUT_DIR" ]; then
log_warn "Output directory not found. Creating: $OUTPUT_DIR"
mkdir -p "$OUTPUT_DIR" || error_exit "Failed to create output directory: $OUTPUT_DIR" 1
fi
# Check if AWS CLI is installed
if ! command -v aws &> /dev/null; then
error_exit "AWS CLI is not installed or not in PATH" 1
fi
# Check AWS credentials
if ! aws sts get-caller-identity &> /dev/null; then
error_exit "AWS credentials not configured or invalid" 1
fi
log_info "Prerequisites validation completed successfully"
}
################################################################################
# Run Prowler Scan
################################################################################
run_prowler_scan() {
log_info "Starting Prowler security scan..."
log_info "Check: $CHECKS"
log_info "Region: $AWS_REGION"
# Change to output directory to ensure report files are created there
cd "$OUTPUT_DIR" || error_exit "Failed to change directory to $OUTPUT_DIR" 1
# Run prowler and capture exit code
# Prowler exit codes:
# 0 = All checks passed (no findings)
# 3 = Some checks failed (security findings detected) - THIS IS NORMAL
# Other = Actual execution error
"$PROWLER_BIN" aws --checks "$CHECKS" -f "$AWS_REGION" 2>&1 | tee -a "$LOG_FILE"
local prowler_exit_code=${PIPESTATUS[0]}
if [ $prowler_exit_code -eq 0 ]; then
log_info "Prowler scan completed - No security issues found"
return 0
elif [ $prowler_exit_code -eq 3 ]; then
log_warn "Prowler scan completed with findings (exit code 3)"
log_info "Security issues were detected - this is NORMAL behavior"
log_info "Reports will be processed and uploaded to S3"
return 0
else
error_exit "Prowler scan encountered an execution error (exit code: $prowler_exit_code)" $prowler_exit_code
fi
}
################################################################################
# Find Latest Prowler Output Files
################################################################################
find_latest_reports() {
log_info "Searching for generated report files in: $OUTPUT_DIR"
# Find all three report types (most recent files modified in last 10 minutes)
OCSF_FILE=$(find "$OUTPUT_DIR" -name "prowler-output-*.ocsf.json" -type f -mmin -10 2>/dev/null | sort -r | head -n1)
CSV_FILE=$(find "$OUTPUT_DIR" -name "prowler-output-*.csv" -type f -mmin -10 2>/dev/null | sort -r | head -n1)
HTML_FILE=$(find "$OUTPUT_DIR" -name "prowler-output-*.html" -type f -mmin -10 2>/dev/null | sort -r | head -n1)
# Validate that all files were found
if [ -z "$OCSF_FILE" ] || [ -z "$CSV_FILE" ] || [ -z "$HTML_FILE" ]; then
log_error "Missing report files:"
[ -z "$OCSF_FILE" ] && log_error " - OCSF JSON file not found"
[ -z "$CSV_FILE" ] && log_error " - CSV file not found"
[ -z "$HTML_FILE" ] && log_error " - HTML file not found"
error_exit "Could not find all required report files in $OUTPUT_DIR" 1
fi
# Validate files exist and are readable
for file in "$OCSF_FILE" "$CSV_FILE" "$HTML_FILE"; do
if [ ! -r "$file" ]; then
error_exit "Report file not readable: $file" 1
fi
done
log_info "Found OCSF JSON report: $(basename "$OCSF_FILE")"
log_info "Found CSV report: $(basename "$CSV_FILE")"
log_info "Found HTML report: $(basename "$HTML_FILE")"
}
################################################################################
# Upload Files to S3
################################################################################
upload_to_s3() {
local file="$1"
local s3_folder="$2"
local filename=$(basename "$file")
local s3_path="s3://${S3_BUCKET}/${S3_PREFIX}/${s3_folder}/${filename}"
log_info "Uploading: $filename -> $s3_path"
if aws s3 cp "$file" "$s3_path" \
--region "$AWS_REGION" \
--sse AES256 \
--no-progress 2>&1 | tee -a "$LOG_FILE"; then
log_info "Successfully uploaded: $filename"
return 0
else
log_error "Failed to upload: $filename"
return 1
fi
}
################################################################################
# Upload All Reports to S3
################################################################################
upload_all_reports() {
log_info "Starting S3 upload process..."
# Create S3 folder name with human-readable timestamp format
# Format: YYYY-MM-DD_HH-MM-SS
local S3_FOLDER_NAME=$(date '+%Y-%m-%d_%H-%M-%S')
log_info "Using S3 folder name: $S3_FOLDER_NAME"
local upload_failed=0
# Upload OCSF JSON
if ! upload_to_s3 "$OCSF_FILE" "$S3_FOLDER_NAME"; then
upload_failed=1
fi
# Upload CSV
if ! upload_to_s3 "$CSV_FILE" "$S3_FOLDER_NAME"; then
upload_failed=1
fi
# Upload HTML
if ! upload_to_s3 "$HTML_FILE" "$S3_FOLDER_NAME"; then
upload_failed=1
fi
if [ $upload_failed -eq 1 ]; then
error_exit "One or more file uploads failed" 1
fi
log_info "All reports uploaded successfully to S3"
log_info "S3 Location: s3://${S3_BUCKET}/${S3_PREFIX}/${S3_FOLDER_NAME}/"
}
################################################################################
# Main Function
################################################################################
main() {
# Initialize log file
touch "$LOG_FILE" 2>/dev/null || LOG_FILE="/tmp/prowler-scan.log"
log_info "=========================================="
log_info "Prowler Security Scan Script Started"
log_info "Version: 2.0 (Fixed Exit Code Handling)"
log_info "=========================================="
log_info "Scanning Account: $(aws sts get-caller-identity --query Account --output text)"
log_info "AWS Region: $AWS_REGION"
# Acquire lock to prevent concurrent runs
acquire_lock
# Step 1: Validate prerequisites
validate_prerequisites
# Step 2: Run Prowler scan
run_prowler_scan
# Step 3: Find generated report files
find_latest_reports
# Step 4: Upload reports to S3
upload_all_reports
log_info "=========================================="
log_info "Script Completed Successfully"
log_info "=========================================="
}
################################################################################
# Script Entry Point
################################################################################
main "$@"
Step 4.2: Set Script Permissions
bash
# Make script executable
sudo chmod +x /usr/local/bin/prowler-scan.sh
# Change ownership to ubuntu user
sudo chown ubuntu:ubuntu /usr/local/bin/prowler-scan.sh
# Verify permissions
ls -la /usr/local/bin/prowler-scan.sh
Step 4.3: Test Script Execution
bash
# Run the script manually to ensure it works
/usr/local/bin/prowler-scan.sh
# Check log output
tail -f /var/log/prowler-scan.log
# Verify S3 upload
aws s3 ls s3://<your_s3_bucket_name>/prowler-reports/
Phase 5: Cron Job Configuration
Step 5.1: Edit Crontab
bash
# Open crontab editor for current user
crontab -e
# If first time, select editor (usually nano or vi)
Step 5.2: Add Cron Expression
Add the following line to run the scan daily at 2 AM:
Note:
Use crontab.guru to generate custom schedules.
bash
# Daily Prowler scan - runs every day at 2:00 AM
0 2 * * * /usr/local/bin/prowler-scan.sh
# Alternative: Weekly scan on Sunday at 2:00 AM
0 2 * * SUN /usr/local/bin/prowler-scan.sh
# Alternative: Twice daily (2 AM and 2 PM)
0 2,14 * * * /usr/local/bin/prowler-scan.sh
Step 5.3: Verify Cron Configuration
bash
# List scheduled cron jobs
crontab -l
# Monitor cron execution
sudo tail -f /var/log/syslog | grep CRON
# Alternative: Check cron logs (varies by system)
sudo journalctl -u cron --follow
Bash Script Breakdown
Script Architecture
The automation script follows AWS best practices for production-grade bash scripting:
1. Header and Metadata
- Script version and purpose declaration
- License information for compliance
- Clear authorship for maintainability
2. Strict Mode Configuration
bash
set -uo pipefail
IFS=$'\n\t'
umask 002
- set -u: Fails on undefined variable references (prevents silent errors)
- set -o pipefail: A pipeline's exit status is that of the rightmost command that failed, rather than always the last command's
- IFS: Prevents word splitting issues in shell loops
- umask 002: Sets secure default file permissions
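The effect of these flags is easy to see in throwaway subshells. This is a standalone demo, not part of the scan script:

```shell
# Demonstrate the strict-mode flags in throwaway subshells so the
# failures are observable without killing the current shell.

# set -u: expanding an unset variable is a fatal error.
rc_unset=$(bash -c 'set -u; echo "$SOME_UNSET_VAR"' 2>/dev/null; echo $?)

# set -o pipefail: the pipeline reports the failing command's status...
rc_pipefail=$(bash -c 'set -o pipefail; false | true'; echo $?)

# ...whereas by default only the final command's status counts.
rc_default=$(bash -c 'false | true'; echo $?)

echo "unset-var: $rc_unset, pipefail: $rc_pipefail, default: $rc_default"
```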
3. Path and Environment Setup
bash
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/ubuntu/.local/bin
export PATH
Essential for cron execution, which doesn't inherit the user's PATH.
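You can reproduce a cron-like environment with env -i, which strips every variable except the ones you pass explicitly; this makes missing-PATH bugs visible before the first scheduled run:

```shell
# Launch a command under an almost-empty, cron-like environment.
# Anything not passed explicitly (including your interactive PATH)
# is gone, which is exactly why the script exports its own PATH.
out=$(env -i HOME="$HOME" SHELL=/bin/sh /bin/sh -c 'echo "cron-like PATH: ${PATH:-<empty>}"')
echo "$out"
```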
4. Logging System
- Color-coded output: Different colors for INFO, WARN, ERROR messages
- Dual logging: Console output + file logging for audit trails
- Timestamp inclusion: Every log entry includes date/time for forensics
5. Error Handling and Recovery
bash
trap cleanup EXIT
Ensures lock file removal and cleanup occurs regardless of exit condition.
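A tiny self-contained illustration of the pattern (the temp file here is hypothetical, not part of the scan script):

```shell
# An EXIT trap runs on every termination path, so cleanup happens
# even when the script fails part-way through.
tmpfile=$(mktemp)
bash -c "trap 'rm -f $tmpfile' EXIT; exit 1" || true # simulate a failing run
[ ! -e "$tmpfile" ] && echo "temp file was cleaned up despite the failure"
```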
6. Lock Mechanism
Prevents concurrent Prowler executions:
- Checks for existing lock file with PID validation
- Detects stale locks (process no longer running)
- Acquires new lock when safe to proceed
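An alternative worth knowing is flock(1) from util-linux: because the kernel releases the lock when the holding process exits, stale-lock detection becomes unnecessary. A sketch (the lock path is illustrative):

```shell
# flock-based locking: open a file descriptor on the lock file and
# take an exclusive, non-blocking lock on it. The lock disappears
# automatically when this process exits -- no stale-lock cleanup.
LOCK_FILE=/tmp/prowler-scan.lock # illustrative path
exec 200>"$LOCK_FILE"
if ! flock -n 200; then
  echo "Another instance is already running" >&2
  exit 0
fi
echo "lock acquired"
```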
7. Prerequisite Validation
Before running expensive operations:
- Verifies Prowler binary existence and executability
- Confirms AWS CLI availability
- Validates AWS credentials
- Creates missing directories with appropriate error handling
8. Prowler Execution
bash
"$PROWLER_BIN" aws --checks "$CHECKS" -f "$AWS_REGION" 2>&1 | tee -a "$LOG_FILE"
local prowler_exit_code=${PIPESTATUS[0]}
- Redirects stderr to stdout for complete logging
- Tees output to both console and log file
- Captures exit code from PROWLER_BIN (not tee)
9. Exit Code Handling
Prowler uses non-standard exit codes:
- 0: Checks passed (no findings)
- 3: Checks failed with findings (NORMAL, not an error)
- Other: Actual execution errors
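That mapping can be captured in a small helper, handy if you later extend the script to alert only on genuine errors (the function name is illustrative, not part of the script above):

```shell
# Map Prowler's exit codes to a human-readable outcome.
interpret_prowler_exit() {
  case "$1" in
    0) echo "clean" ;;    # no findings
    3) echo "findings" ;; # findings detected -- still a successful scan
    *) echo "error" ;;    # genuine execution failure
  esac
}
interpret_prowler_exit 3 # prints "findings"
```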
10. Report Discovery
bash
OCSF_FILE=$(find "$OUTPUT_DIR" -name "prowler-output-*.ocsf.json" -type f -mmin -10 2>/dev/null | sort -r | head -n1)
- Finds files modified in last 10 minutes (prevents stale report uploads)
- Sorts in reverse lexical order; because Prowler embeds a timestamp in each filename, this selects the most recent report
- Validates all three report formats exist before proceeding
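Because the name-based sort relies on Prowler's timestamped filenames, selecting strictly by modification time is slightly more robust. A GNU find sketch, demonstrated on a throwaway directory with dummy files:

```shell
# Pick the most recently modified report by mtime rather than by name.
demo_dir=$(mktemp -d)
touch "$demo_dir/prowler-output-z.csv" # older file, sorts last by name
sleep 1
touch "$demo_dir/prowler-output-a.csv" # newer file, sorts first by name
# %T@ prints the epoch mtime, so the numeric sort orders by time
latest=$(find "$demo_dir" -name 'prowler-output-*.csv' -type f -printf '%T@ %p\n' \
  | sort -n | tail -n1 | cut -d' ' -f2-)
basename "$latest" # prints "prowler-output-a.csv" (newest by mtime)
```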
11. S3 Upload with Security
bash
aws s3 cp "$file" "$s3_path" \
--region "$AWS_REGION" \
--sse AES256 \
--no-progress
- Enforces AES256 server-side encryption
- Organizes reports by timestamp for easy retrieval
- Validates upload success before declaring completion
Testing and Validation
Manual Execution Tests
Test 1: Manual Execution with Trace Output
bash
# Execute script with verbose output
bash -x /usr/local/bin/prowler-scan.sh
# Expected output:
# [INFO] ... Prowler Security Scan Script Started
# [INFO] ... Validating prerequisites...
# [INFO] ... Lock acquired (PID: XXXX)
# [INFO] ... Starting Prowler security scan...
# [INFO] ... Successfully uploaded: prowler-output-*.ocsf.json
# [INFO] ... All reports uploaded successfully to S3
Test 2: Verify Log Output
bash
# Check last 50 lines of log
tail -50 /var/log/prowler-scan.log
# Search for errors
grep ERROR /var/log/prowler-scan.log
# Count successful uploads
grep "Successfully uploaded" /var/log/prowler-scan.log | wc -l
Test 3: Validate S3 Bucket Structure
bash
# List all uploaded reports with timestamps
aws s3 ls s3://ami-prowler-report-bucket/prowler-reports/ --recursive
# Verify encryption on specific object
aws s3api head-object \
--bucket ami-prowler-report-bucket \
--key prowler-reports/2024-01-15_02-00-00/prowler-output-123456789012-20240115T020000Z.ocsf.json
# Expected output includes: ServerSideEncryption: AES256
Test 4: Concurrent Execution Prevention
bash
# Terminal 1: Start first execution
/usr/local/bin/prowler-scan.sh
# Terminal 2: While first is running, try to start another
/usr/local/bin/prowler-scan.sh
# Expected: Second script exits immediately with warning about concurrent instance
# Output: [WARN] Another instance is already running (PID: XXXX)
Cron Execution Validation
Simulate Cron Environment
bash
# Run the script under a minimal, cron-like environment
env -i HOME="$HOME" SHELL=/bin/sh /bin/sh -c /usr/local/bin/prowler-scan.sh
Monitor Cron Logs
bash
# Watch cron execution in real-time (requires enabling cron logging)
sudo tail -f /var/log/cron
# or
sudo journalctl -u cron -f
Create Test Cron Entry
bash
# Edit crontab
crontab -e
# Add a test job that runs every 2 minutes
*/2 * * * * /usr/local/bin/prowler-scan.sh
# Wait 3 minutes, then verify execution
sleep 180
tail -20 /var/log/prowler-scan.log
Monitoring and Troubleshooting
Common Issues and Solutions
Issue 1: Prowler Not Found
Symptom: prowler: command not found
Solutions:
bash
# Verify installation
which prowler
# If installed via pipx, run the binary by its full path
/home/ubuntu/.local/bin/prowler aws --checks ec2_instance_with_outdated_ami -f ap-south-1
# Reinstall if necessary
Issue 2: AWS Credentials Not Found
Symptom: AWS credentials not configured or invalid
Solutions:
bash
# Check IAM instance profile attachment
aws sts get-caller-identity
# If using IAM keys, verify in ~/.aws/credentials
cat ~/.aws/credentials
# Troubleshoot IAM role
aws ec2 describe-instances --instance-ids i-xxxxxxxxx \
--query 'Reservations[0].Instances[0].IamInstanceProfile'
Issue 3: Report Files Not Found in Output Directory
Symptom: Could not find all required report files
Solutions:
bash
# Check output directory exists and has correct permissions
ls -la /home/ubuntu/output/
df -h /home/ubuntu # Check disk space
# Run Prowler manually to debug
cd /home/ubuntu/output
/home/ubuntu/.local/bin/prowler aws --checks ec2_instance_with_outdated_ami -f ap-south-1
# Verify report generation
ls -lh prowler-output-*
Issue 4: S3 Upload Failures
Symptom: Failed to upload: prowler-output-*.csv
Solutions:
bash
# Verify S3 bucket exists and is accessible
aws s3 ls s3://<your_s3_bucket_name>/
# Check IAM permissions (the policy source is the instance's IAM role, not the bucket)
aws iam simulate-principal-policy \
--policy-source-arn <your_iam_role_arn> \
--action-names s3:PutObject s3:ListBucket \
--resource-arns <your_s3_bucket_arn> <your_s3_bucket_arn>/*
# Test S3 upload manually
echo "test" > /tmp/test.txt
aws s3 cp /tmp/test.txt s3://<your_s3_bucket_name>/test.txt --sse AES256
# Check S3 bucket policy
aws s3api get-bucket-policy --bucket <your_s3_bucket_name>
Issue 5: Concurrent Execution Lock Issues
Symptom: Script exits immediately, Another instance is already running
Solutions:
bash
# Check for stale lock
cat /var/run/prowler-scan.lock
# Check whether that PID is still running
ps -p <PID>
# If stale, manually remove lock
rm /var/run/prowler-scan.lock
# Increase scan timeout or adjust schedule
# Modify LOCK_FILE check in script if scans consistently exceed frequency
S3 Lifecycle Policies
bash
# Archive old reports to Glacier after 30 days
aws s3api put-bucket-lifecycle-configuration \
--bucket <your_s3_bucket_name> \
--lifecycle-configuration '{
"Rules": [{
"Id": "ArchiveOldReports",
"Status": "Enabled",
"Filter": { "Prefix": "prowler-reports/" },
"Transitions": [{
"Days": 30,
"StorageClass": "GLACIER"
}]
}]
}'