
Winnie Kiragu


AWS CUR (Cost and Usage Reporting) for beginners

(image: the "ec2_to_poverty" meme)

When anyone sees this image, they instinctively laugh, because we have all heard the stories. We all know how cloud platform bills, like AWS bills, skyrocket because of the simplest things, like not decommissioning an EC2 instance.

While managing cloud computing costs is a topic on everyone's mind, it's not really something everyone wants to get involved with. To be honest, when this discussion comes up, most would rather avoid it and cross their fingers, hoping for the best.

I'd like to dispel that fear in this article.

Cost and usage reporting is such a broad topic that you can rarely find an article or blog that is specific to your needs.
This shouldn't dishearten you. On the contrary, you simply have to envision the journey as the following steps:

1. Understand your cloud expenditure.
2. Data collection and analysis on existing usage
3. Presentation of your report data

These 3 steps sum up what you need to do. But when said like that, it can still appear like an uphill task for someone new to creating CURs (Cost and Usage Reports).

Let's break these steps down a little further to de-mystify the whole process, shall we?

1. Understand your cloud expenditure

This can often be challenging, especially for individuals or organizations that do not have any kind of reporting procedures in place.

Ultimately this goal can be simplified into understanding these 3 areas:

1. How do you govern usage?
    - Are there specified individuals who can build cloud infrastructure?
    - Are there any existing policies/processes in place for this?
2. How do you monitor usage and cost?
3. How do you decommission resources?
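As one hedged example of what a decommissioning check can look like (assuming the AWS CLI with configured credentials), you could periodically list stopped EC2 instances, which are common candidates for clean-up:

```shell
# Sketch: list stopped EC2 instances as decommissioning candidates.
# Assumes the AWS CLI is installed and credentials are configured.
list_stopped_instances() {
  aws ec2 describe-instances \
    --filters Name=instance-state-name,Values=stopped \
    --query "Reservations[].Instances[].[InstanceId, LaunchTime]" \
    --output table
}
```

Stopped instances no longer bill for compute, but their attached EBS volumes still do, which is why they are worth reviewing.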

If you're the one establishing this new culture, you may need to evaluate these 3 areas and set up strategies around them. If a cost reporting culture already exists, you have a less uphill task ahead of you.

Like with any discussion on policies and procedures, you may need to have a conversation with the relevant project/product owners to gain a better understanding of the resources in use, resource creation policies and, of course, decommissioning policies.

For readers of this article who need to build these processes and procedures from scratch, take a deeper look at how current operations handle the following:

1. Review existing tagging strategies, if any. Cost Allocation Tags (CATs) are crucial to understand here, if any are in place.
2. Review existing billing alerts, if any.
3. Review decommissioning policies, if any.

If none of the above exists yet, you may need to define these before you proceed. Be sure to build them out with the relevant product owners, as stated earlier in this article.

To learn more about best practices for tagging your cloud resources, see AWS Tagging Strategies. Remember, the same strategies can be applied to other cloud platforms, not just AWS, with slight tweaks here and there.
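To make the tagging idea concrete, here is a minimal sketch using the AWS CLI. The instance ID and the Project tag key/value are hypothetical, and the tag key must also be activated as a cost allocation tag in the Billing console before Cost Explorer can filter on it:

```shell
# Tag a resource so its spend can be attributed to a project.
# Usage: tag_instance <instance-id> <project-name>  (both hypothetical)
tag_instance() {
  aws ec2 create-tags --resources "$1" --tags Key=Project,Value="$2"
}

# Fetch one month's unblended cost for a given Project tag value.
cost_for_project() {
  aws ce get-cost-and-usage \
    --time-period Start=2023-05-01,End=2023-06-01 \
    --granularity MONTHLY \
    --metrics UnblendedCost \
    --filter "{\"Tags\": {\"Key\": \"Project\", \"Values\": [\"$1\"]}}"
}

# Example (hypothetical instance ID and project name):
# tag_instance i-0123456789abcdef0 website
# cost_for_project website
```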

Creating billing alerts and decommissioning policies is outside the scope of this article, but both play a vital role in establishing a healthy cost reporting culture in any organisation. Please look them up.

2. Data collection and analysis on existing usage

In the first step, we did a high-level overview of how your particular org goes about reporting. Now we need to put values to those overviews.

All cloud providers surface these figures in a billing dashboard of some sort. You can always look these up for your preferred cloud platform.

Fetching this usage data from AWS, for example, can be done in two ways: the UI console or the CLI (Command Line Interface).

If you are working with multiple cloud providers, you will need to aggregate cost data from the different vendors. Same logic, just different user interfaces or CLI commands.

Via Console

AWS, for example, provides the following to help you get started:

1. AWS Cost and Usage Report feature.
2. AWS Cost explorer.

They sound the same, but they are slightly different.

AWS Cost and Usage Report (CUR) tracks AWS costs and usage at the account or organization level by usage type, operation, and product code.

To view the cost and usage breakdown by service, open the Cost Explorer tab and generate the same report by service for whichever timeline you need it for.

Via CLI

The aws ce command with the appropriate options returns your data easily. For example:

To get the daily expenses for the last 10 days (240 hours):

# note: --date is GNU date syntax (Linux); on macOS use e.g. date -v-10d +"%Y-%m-%d"
aws ce get-cost-and-usage \
    --time-period Start=$(date +"%Y-%m-%d" --date="-240 hours"),End=$(date +"%Y-%m-%d") \
    --granularity=DAILY \
    --metrics BlendedCost UnblendedCost \
    --query "ResultsByTime[].[TimePeriod.Start, Total.BlendedCost.[Amount][0], Total.BlendedCost.[Unit][0]]" \
    --output table

To get monthly expenses for the AWS Account

aws ce get-cost-and-usage \
    --time-period Start=2023-01-01,End=2023-06-01 \
    --granularity MONTHLY \
    --metrics "BlendedCost" "UnblendedCost" "UsageQuantity" \
    --query "ResultsByTime[].[TimePeriod.Start, Total.BlendedCost.[Amount][0], Total.UnblendedCost.[Amount][0], Total.UsageQuantity.[Amount][0], Total.BlendedCost.[Unit][0]]" \
    --output table

Either alternative gives you an easy way to acquire the usage data you need to get started on analysis.

For this part, no amount of explaining will give you a clear picture of what I'm saying. You just have to get your hands dirty and explore how to fetch this information from your preferred cloud provider and how to analyse the information you receive in turn.
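As one small sketch of that analysis step: the --query flag can flatten the response down to just the cost amounts, which a few lines of awk can then total. The query string in the comment below is an assumption built from the earlier examples:

```shell
# Sum whitespace-separated cost amounts read from stdin.
sum_costs() {
  awk '{ for (i = 1; i <= NF; i++) s += $i } END { printf "%.2f\n", s }'
}

# Example usage (assumes configured AWS credentials):
# aws ce get-cost-and-usage \
#   --time-period Start=2023-05-01,End=2023-06-01 \
#   --granularity DAILY \
#   --metrics BlendedCost \
#   --query "ResultsByTime[].Total.BlendedCost.Amount" \
#   --output text | sum_costs
```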

3. Presentation of your report data

AWS documentation states that a good Cost and Usage Report should offer comprehensive details on the following:

  • Metadata about AWS services,
  • Credit,
  • Pricing,
  • Fees,
  • Discounts,
  • Taxes,
  • Cost categories,
  • Savings Plans
  • Reserved Instances

But this should not trip you up. You can always start with a simple report containing the following columns:

| Service in Use | Expenditure Category | Credits | Taxes | Saving Opportunities |
| -------------- | -------------------- | ------- | ----- | -------------------- |
| RDS            | Storage              | 0.00    | 0.00  | Text                 |
| EC2            | Compute              | 0.00    | 0.00  | Text                 |
| EKS            | Outbound transfer    | 0.00    | 0.00  | Text                 |
| ECS            | Outbound transfer    | 0.00    | 0.00  | Text                 |
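If you want to fill the per-service rows straight from the CLI, one hedged sketch (the dates are placeholders) is to group costs by the SERVICE dimension:

```shell
# Sketch: per-service cost breakdown for one month (dates are placeholders).
# Assumes the AWS CLI is installed and credentials are configured.
service_breakdown() {
  aws ce get-cost-and-usage \
    --time-period Start=2023-05-01,End=2023-06-01 \
    --granularity MONTHLY \
    --metrics UnblendedCost \
    --group-by Type=DIMENSION,Key=SERVICE \
    --query "ResultsByTime[].Groups[].[Keys[0], Metrics.UnblendedCost.Amount]" \
    --output text
}
```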

N.B

  • The level of detail and granularity in cost and usage reporting and monitoring is dependent on your target audience, so always start simple and build according to the feedback you receive from your team.

The Saving Opportunities segment is also an important aspect to include in these discussions with your team, so make room for it on the report even if it is empty when you present. It will help you remember to raise the topic during your meeting.

Some terminologies you may need to know when working with AWS Billings are:

  • Blended costs - Blended costs are calculated by multiplying each account’s service usage against something called a blended rate. A blended rate is the average rate of on-demand usage, as well as Savings Plans- and reservation-related usage, that is consumed by member accounts in an organization for a particular service.
  • Unblended costs - Unblended costs represent your usage costs on the day they are charged to you.
  • Amortized costs - Amortized costs are a powerful tool if you seek to gain insight into the effective daily costs associated with your reservation portfolio or are looking for an easy way to normalize cost and usage information when operating at scale.
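To see how these three cost views differ for your own account, you can request all three metrics in a single call. This is a sketch assuming configured credentials, with placeholder dates:

```shell
# Sketch: compare blended, unblended and amortized cost for one month.
# Dates are placeholders; assumes configured AWS credentials.
compare_cost_metrics() {
  aws ce get-cost-and-usage \
    --time-period Start=2023-05-01,End=2023-06-01 \
    --granularity MONTHLY \
    --metrics BlendedCost UnblendedCost AmortizedCost \
    --output table
}
```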

As always, refer to the AWS documentation on costs for reference. Other cloud providers may use different terminologies, so get familiar with those as well.

Lastly, a Cost and Usage report may be needed weekly or monthly, so you can easily automate this process by building a simple REST API application and leveraging the AWS command line. I personally think that is a great next project for anyone who has gotten this far in my blog, don't you, reader?
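Even before building a full application, a cron-scheduled shell script is a minimal sketch of that automation. The output path and schedule below are hypothetical:

```shell
#!/usr/bin/env bash
# weekly_cur.sh: sketch of a scheduled cost snapshot (path is hypothetical).
# Schedule it with cron, e.g. every Monday at 07:00:
#   0 7 * * 1  /opt/reports/weekly_cur.sh

snapshot_costs() {
  # Month-to-date spend, written to a dated file.
  local outfile="/opt/reports/costs_$(date +%Y-%m-%d).txt"
  aws ce get-cost-and-usage \
    --time-period Start="$(date +%Y-%m-01)",End="$(date +%Y-%m-%d)" \
    --granularity MONTHLY \
    --metrics UnblendedCost \
    --output table > "$outfile"
}

# Uncomment when deploying as the cron entry point:
# snapshot_costs
```

Note that Start must be strictly before End, so this snapshot would fail on the first day of the month; a real script would handle that edge case.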
