One Lambda function. Daily AWS cost logs. Instant alerts. Beautiful dashboards.
This is Serverless Cost Intelligence: a project I built from scratch using serverless AWS tools to automate cost tracking.
Alright, enough with the theory; let's build this intelligence system from scratch. Even a beginner can follow these instructions, and I have attached screenshots for everything we are going to build.
🚀 Tech Stack & AWS Services Used
Before we dive into the steps, here’s a quick summary of the core AWS services and tools we are using for this project:
- AWS Lambda – For running backend logic to fetch cost data automatically
- Amazon CloudWatch Logs – To log the cost data and debug if needed
- AWS Cost Explorer API – To fetch cost and usage details programmatically
- Amazon SNS – For optional cost alerts/notifications
- Amazon S3 – For storing logs or future data exports
- Amazon EventBridge – To automate the Lambda trigger on a schedule
- Amazon QuickSight – For building a dashboard to visualize the cost data
Step 1: IAM Role Setup for Lambda
To let our Lambda function communicate with the Cost Explorer service (plus S3 and SNS), we need an IAM role with the appropriate permissions. We'll create the role and attach an inline policy to it.
Go to the AWS Console, search for IAM, and open the IAM service. Then go to Roles and click Create role.
On the next page, for the use case, select Lambda from the dropdown.
Now give the role a name (anything works, but make it descriptive), click Next, review the changes, and create the role.
We have now created the role, but wait, we haven't attached any policy yet. Go to the Permissions tab and click Add inline policy.
Here is the policy; you can copy it as-is.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ce:GetCostAndUsage",
        "s3:PutObject",
        "s3:PutObjectAcl",
        "sns:Publish"
      ],
      "Resource": "*"
    }
  ]
}
- Now, name the policy and click Create policy.
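By the way, if you prefer to script this step, here is a minimal boto3 sketch that attaches the same inline policy, with Resource scoped down to the specific bucket and topic instead of "*" (Cost Explorer calls generally require a wildcard resource). The role name is a placeholder, and the bucket/topic ARNs match the example values we use later in the Lambda code:

import json
import boto3

iam = boto3.client("iam")

# Same permissions as the console policy above, but scoped to our
# bucket and topic. Replace the ARNs and role name with your own.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": ["ce:GetCostAndUsage"], "Resource": "*"},
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:PutObjectAcl"],
            "Resource": "arn:aws:s3:::smart-cost-tracker-logs/*",
        },
        {
            "Effect": "Allow",
            "Action": ["sns:Publish"],
            "Resource": "arn:aws:sns:ap-south-1:123456789012:aws-cost-alerts",
        },
    ],
}

iam.put_role_policy(
    RoleName="smart-cost-tracker-role",  # placeholder role name
    PolicyName="smart-cost-tracker-policy",
    PolicyDocument=json.dumps(policy),
)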
✅ Step 1 Complete – IAM Role Setup
In Step 1, we created a dedicated IAM Role and attached a custom inline policy with all the necessary permissions. This role allows our Lambda function to:
- Access AWS Cost Explorer to fetch cost data
- Write data to S3 for storing cost reports
- Publish messages to SNS Topics for alerts or notifications
Step 2: Creating an S3 Bucket to Store Our Cost Data
We'll set up an S3 bucket that our Lambda function uses to save the AWS cost data in CSV format. To create the bucket, navigate to the AWS Console and search for S3.
Just name the bucket and leave all the other settings at their defaults.
Click Next, review the bucket settings, click Create bucket, and wait for the bucket to be created.
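If you'd rather create the bucket from code, here is a minimal boto3 sketch. The bucket name and the ap-south-1 region are the example values assumed elsewhere in this walkthrough; note that outside us-east-1 you must pass a LocationConstraint:

import boto3

s3 = boto3.client("s3", region_name="ap-south-1")  # assumed region
s3.create_bucket(
    Bucket="smart-cost-tracker-logs",  # same example bucket name used in the Lambda code
    CreateBucketConfiguration={"LocationConstraint": "ap-south-1"},
)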
✅ Step 2 Complete – S3 Bucket Creation
In this step, we created a dedicated S3 bucket where our Lambda function can securely store the daily cost reports fetched from AWS.
This bucket is the central storage for our cost logs and data.
Step 3: Setting Up SNS for Email Alerts
Now we'll set up Amazon SNS (Simple Notification Service) to send cost alerts directly to our email inbox.
First, create a topic: give it a name and click Next step.
Then create a subscription: select the SNS topic we just created, choose Email as the protocol, enter your email address, and click Create subscription.
One more thing: you will receive a confirmation mail in your inbox. Click the link to verify, and you have completed this step.
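The same setup in a few lines of boto3, if you want to script it (the topic name and email address are placeholders, and the email subscription still has to be confirmed from your inbox):

import boto3

sns = boto3.client("sns")

# Create the topic and subscribe an email endpoint to it
topic_arn = sns.create_topic(Name="aws-cost-alerts")["TopicArn"]
sns.subscribe(
    TopicArn=topic_arn,
    Protocol="email",
    Endpoint="you@example.com",  # placeholder address
)
# SNS now sends a confirmation mail; the subscription stays pending until you confirm it.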
✅ Step 3 Complete – SNS Topic + Subscription Setup
We have created an SNS topic and added an email subscription; this is the system that will notify us as soon as AWS costs exceed our budget.
🚀 Step 4: Lambda Function Setup
In Step 4, we'll create the actual Lambda function with custom logic that fetches cost data from AWS Cost Explorer, stores the data in S3, and then notifies us via SNS. We'll also attach the IAM role we created before so Lambda has the right permissions to communicate with the other services.
Here, select Author from scratch and name your function. Select Python 3.13 as the runtime.
Below, select the existing role we created in Step 1 and attach it to our Lambda function.
Now click Next, review the changes, click Create function, and wait for the function to be created.
Let's start writing our function logic. Here is the full code snippet:
import boto3
import json
import datetime
import csv

# Replace with your values
SNS_TOPIC_ARN = "arn:aws:sns:ap-south-1:123456789012:aws-cost-alerts"
S3_BUCKET = "smart-cost-tracker-logs"

def lambda_handler(event, context):
    # Query the day before yesterday (Cost Explorer data can lag by up to a day)
    today = datetime.date.today()
    start = (today - datetime.timedelta(days=2)).strftime('%Y-%m-%d')
    end = (today - datetime.timedelta(days=1)).strftime('%Y-%m-%d')

    client = boto3.client('ce')
    response = client.get_cost_and_usage(
        TimePeriod={'Start': start, 'End': end},
        Granularity='DAILY',
        Metrics=['UnblendedCost'],
        GroupBy=[{'Type': 'DIMENSION', 'Key': 'SERVICE'}]
    )

    services = response['ResultsByTime'][0]['Groups']

    # Build the CSV content and collect per-service alert lines
    csv_lines = [["Service", "Cost (USD)"]]
    alert_lines = []
    total_cost = 0
    for service in services:
        name = service['Keys'][0]
        amount = float(service['Metrics']['UnblendedCost']['Amount'])
        total_cost += amount
        csv_lines.append([name, f"{amount:.4f}"])
        if amount > 1:  # Per-service alert threshold: $1 (roughly ₹80+)
            alert_lines.append(f"{name}: ${amount:.2f}")

    # Save to S3 (Lambda can only write locally under /tmp)
    file_name = f"daily-cost-{start}.csv"
    local_file_path = f"/tmp/{file_name}"
    with open(local_file_path, 'w', newline='') as file:
        writer = csv.writer(file)
        writer.writerows(csv_lines)

    s3 = boto3.client('s3')
    s3.upload_file(local_file_path, S3_BUCKET, file_name)

    # Send an alert only if the total is above the threshold
    if total_cost > 5:  # Total > $5 (roughly ₹400)
        sns = boto3.client('sns')
        msg = f"AWS Daily Cost Alert - {start}\n\nTotal: ${total_cost:.2f}\n\n" + "\n".join(alert_lines)
        sns.publish(TopicArn=SNS_TOPIC_ARN, Subject="AWS Daily Cost Alert 🚨", Message=msg)

    return {
        'statusCode': 200,
        'body': json.dumps('Cost fetched and logged successfully!')
    }
Basically, this is how the code works:
📅 It fetches daily AWS cost data from the Cost Explorer.
📂 Converts it into a CSV file and stores it in S3.
💸 Checks for services costing more than $1 and prepares an alert.
📬 If the total cost exceeds $5, it sends a notification via SNS to your email.
⚠️ NOTE
Replace the SNS topic ARN and S3 bucket name at the top of the code with your own values.
Another quick thing: we'll increase the memory limit and execution timeout of our Lambda function so it has enough headroom (the default 3-second timeout can be too short for the Cost Explorer call).
- Here is the screenshot for your reference.
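If you prefer scripting this configuration change, here is a minimal boto3 sketch; the function name is a placeholder, and 256 MB / 30 seconds are just example values to tune to your needs:

import boto3

lambda_client = boto3.client("lambda")
lambda_client.update_function_configuration(
    FunctionName="smart-cost-tracker",  # placeholder function name
    MemorySize=256,  # MB, example value
    Timeout=30,      # seconds, example value
)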
✅ Step 4 Complete – Lambda Function Ready!
Awesome 🎉! We have successfully created our Lambda function, attached the necessary IAM role, and plugged in the logic that:
- Fetches AWS cost data
- Stores it safely in S3
- Sends alerts if the cost crosses the limit
Your smart cost tracker is now fully functional and ready to go.
Step 5: Setting up an Amazon EventBridge (formerly CloudWatch Events) rule with a cron expression to automate this Lambda function, so it runs every day without manual intervention.
- Let's enter the cron.
Schedule expression: cron(0 3 * * ? *) (runs daily at 03:00 UTC, which is 8:30 AM IST). Change it as per your needs, and click Next.
Select Lambda function as the target, choose the Lambda function we created, then click Create.
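The same schedule can also be wired up with boto3; here is a sketch under the assumption that the function is named smart-cost-tracker in ap-south-1 (the rule name and ARNs are placeholders):

import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

# Create (or update) the scheduled rule
rule_arn = events.put_rule(
    Name="daily-cost-tracker",  # placeholder rule name
    ScheduleExpression="cron(0 3 * * ? *)",
)["RuleArn"]

# Allow EventBridge to invoke our function
lambda_client.add_permission(
    FunctionName="smart-cost-tracker",  # placeholder function name
    StatementId="daily-cost-tracker-schedule",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule_arn,
)

# Point the rule at the function
events.put_targets(
    Rule="daily-cost-tracker",
    Targets=[{
        "Id": "1",
        "Arn": "arn:aws:lambda:ap-south-1:123456789012:function:smart-cost-tracker",  # placeholder ARN
    }],
)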
We have successfully created a cron schedule that auto-triggers the Lambda function, which fetches the AWS cost logs and stores them in the S3 bucket.
- Finally, let's deploy our Lambda function and test it once manually. We can see a status code of 200 with the message body "Cost fetched and logged successfully!", which means our project is working end to end.
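You can also run this manual test from a script instead of the console; a quick boto3 sketch (the function name is a placeholder):

import json
import boto3

lambda_client = boto3.client("lambda")
resp = lambda_client.invoke(
    FunctionName="smart-cost-tracker",  # placeholder function name
    InvocationType="RequestResponse",   # wait for the result synchronously
)
print(json.loads(resp["Payload"].read()))
# A successful run should print statusCode 200 with the success message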
- Also, let's confirm that the cost report logs are stored in S3.
We can see that a cost log file has been created by the Lambda function and stored in S3.
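To double-check from code rather than the console, you can list the bucket's contents; a small sketch using the example bucket name from Step 2 and the daily-cost- key prefix from the Lambda code:

import boto3

s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket="smart-cost-tracker-logs", Prefix="daily-cost-")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])  # each daily CSV and its size in bytes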
Congratulations! You've successfully built and tested your Serverless Cost Tracker. Here's a quick recap of what we achieved so far:
✅ Created and configured the IAM role with an inline policy for S3, Cost Explorer, and SNS
✅ Set up a secure S3 bucket to store cost reports
✅ Created an SNS Topic with email subscription for alerts
✅ Built and deployed a fully working Lambda Function
✅ Manually tested the function – verified a 200 OK response and confirmed the log file was stored in S3
✅ Set up an EventBridge cron schedule to automate daily runs
📊 Bonus Visualization with QuickSight (Optional)
You can also integrate the S3 bucket (where our cost data is stored) with Amazon QuickSight to generate clear, beautiful cost graphs and dashboards.
I'll attach a sample screenshot to give you an idea. This part is optional, but very useful if you want a quick visual summary of daily AWS costs and the associated service usage, and it's a great way to showcase the project.
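One detail worth knowing if you try it: when you connect S3 as a QuickSight data source, QuickSight asks for a manifest file describing where the CSV files live. Here is a minimal sketch that uploads one to our bucket (the bucket name and manifest key are placeholders following this walkthrough; the manifest structure is QuickSight's standard S3 manifest format):

import json
import boto3

manifest = {
    "fileLocations": [
        {"URIPrefixes": ["s3://smart-cost-tracker-logs/"]}  # where the daily CSVs live
    ],
    "globalUploadSettings": {"format": "CSV", "containsHeader": "true"},
}

s3 = boto3.client("s3")
s3.put_object(
    Bucket="smart-cost-tracker-logs",  # placeholder bucket
    Key="quicksight-manifest.json",    # placeholder key
    Body=json.dumps(manifest).encode("utf-8"),
)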