Google Cloud Run always-on vs on-demand CPU allocation

As you may remember from our previous article, we discussed how to reduce costs in our Google Cloud environment. We mentioned that we work with several cloud environments and had already managed to cut our expenses.

As you can see here, these are the expenses we had:

*(Screenshot: our Google Cloud billing overview)*

We also explained that there is a way to configure Cloud Run CPU allocation: you can decide whether the CPU should be allocated to your Cloud Run service all the time, or only while the service is actually handling requests.
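
In case it helps, here is a rough sketch of how that switch could be flipped programmatically. It assumes the google-cloud-run Python client and that the v2 API's `cpu_idle` flag on the container resources is the field behind this setting (the option the console calls "CPU allocation"); the project, region and service names are just placeholders.

```python
# Sketch: toggle CPU allocation for an existing Cloud Run service.
# Assumes the google-cloud-run client library (pip install google-cloud-run)
# and that ResourceRequirements.cpu_idle controls the
# "CPU is only allocated during request processing" behaviour.
from google.cloud import run_v2


def set_cpu_allocation(service_name: str, always_allocated: bool) -> None:
    """service_name: projects/<PROJECT>/locations/<REGION>/services/<SERVICE>"""
    client = run_v2.ServicesClient()
    service = client.get_service(name=service_name)

    for container in service.template.containers:
        # cpu_idle=True  -> CPU allocated only while handling requests
        # cpu_idle=False -> CPU always allocated
        container.resources.cpu_idle = not always_allocated

    # update_service returns a long-running operation; wait for it to finish.
    client.update_service(service=service).result()


# Hypothetical usage:
# set_cpu_allocation(
#     "projects/my-project/locations/europe-west1/services/claimora",
#     always_allocated=False,
# )
```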

On the screen you can see an example: the same Cloud Run service costs €1.53 when the CPU is allocated all the time, and €0.36 when the CPU is only allocated while the service is being used. We said it's cheaper to allocate the CPU only when it's needed. However, this is not always true, and I want to clarify that here.

*(Screenshot: Cloud Run cost with CPU always allocated vs allocated only during request processing)*

In which cases is it not true? It is not true when the application needs the CPU all the time. Imagine an application that constantly requires CPU resources. To illustrate, I'm going to show you a blog post I found on the internet called "Google Cloud Run Monthly Pricing Breakdown 2024".

In this blog post, we can see that prices depend on the pricing tier we are in. For example, in tier one, always-allocated CPU costs $0.0000180 per vCPU-second. For on-demand usage, which is what we use, the price is $0.0000240 per vCPU-second, roughly 33% more expensive than always allocated.
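
Just to make the relationship between those two numbers explicit, here is the arithmetic (the rates are the tier-one CPU prices quoted above):

```python
# Tier 1 CPU prices quoted above, in dollars per vCPU-second.
ALWAYS_ALLOCATED = 0.0000180
ON_DEMAND = 0.0000240

premium = ON_DEMAND / ALWAYS_ALLOCATED - 1
print(f"On-demand CPU is {premium:.0%} more expensive per vCPU-second")
# -> On-demand CPU is 33% more expensive per vCPU-second
```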

*(Screenshot: Cloud Run pricing table from the blog post)*

Although the price per second is higher, the total cost can still be lower if our application is not used continuously. If the application is busy around the clock, though, the higher per-second rate dominates. For instance, for one vCPU and 0.5 GB of RAM, the always-allocated tier-one price is around $50 per month. If we use the same resources on-demand for the same duration (2.6 million seconds, roughly a month), the cost would be $65.

Here is how the monthly costs from that blog post compare for roughly a full month of usage:

| Configuration | Always allocated | On-demand |
| --- | --- | --- |
| 1 vCPU, 0.5 GB RAM | $49 | $65 |
| 1 vCPU, 1 GB RAM | $52 | $68 |
| 1 vCPU, 2 GB RAM | $57 | $75 |
| 2 vCPU, 2 GB RAM | $114 | $150 |

In every case, on-demand is significantly more expensive when the service is busy the whole time.
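
As a sanity check, these totals can be reproduced from the per-second rates. The CPU rates are the tier-one prices above; the memory rates (dollars per GiB-second) are my assumption — they match what I believe the tier-one memory prices to be, and they land almost exactly on the blog's numbers for the one-vCPU configurations:

```python
# Rough monthly cost for a Cloud Run service that is busy essentially
# the whole month (~2.6 million seconds).
# CPU rates come from the blog post; the memory rates ($/GiB-second)
# are my assumption -- they reproduce the blog's one-vCPU totals.
SECONDS_PER_MONTH = 2_600_000

RATES = {
    "always_allocated": {"cpu": 0.0000180, "mem": 0.0000020},
    "on_demand":        {"cpu": 0.0000240, "mem": 0.0000025},
}


def monthly_cost(vcpu: float, mem_gib: float, mode: str) -> float:
    rate = RATES[mode]
    return SECONDS_PER_MONTH * (vcpu * rate["cpu"] + mem_gib * rate["mem"])


for vcpu, mem in [(1, 0.5), (1, 1), (1, 2)]:
    always = monthly_cost(vcpu, mem, "always_allocated")
    on_demand = monthly_cost(vcpu, mem, "on_demand")
    print(f"{vcpu} vCPU / {mem} GiB: ${always:.2f} vs ${on_demand:.2f}")

# 1 vCPU / 0.5 GiB: $49.40 vs $65.65
# 1 vCPU / 1 GiB: $52.00 vs $68.90
# 1 vCPU / 2 GiB: $57.20 vs $75.40
```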

However, in our case, our application, Claimora, is only used when people log in to register their time. Therefore, on-demand usage is cheaper for us: it costs around $0.38 per day, a fraction of what always-allocated CPU would cost us.
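
To see why low utilization flips the result, here is a small sketch. The busy time per day is a made-up illustration (not Claimora's real traffic), and the rates are the same ones used above: with on-demand allocation you only pay for CPU and memory while requests are actually being processed, and the break-even point sits at roughly 75% utilization.

```python
# Sketch: daily cost of always-allocated vs on-demand CPU for a
# lightly used service. The busy time is a hypothetical example.
ALWAYS = {"cpu": 0.0000180, "mem": 0.0000020}     # $/second, always allocated
ON_DEMAND = {"cpu": 0.0000240, "mem": 0.0000025}  # $/second, on demand

VCPU, MEM_GIB = 1, 0.5
SECONDS_PER_DAY = 86_400
busy_seconds_per_day = 4 * 3600  # hypothetical: ~4 busy hours per day

always_daily = SECONDS_PER_DAY * (VCPU * ALWAYS["cpu"] + MEM_GIB * ALWAYS["mem"])
on_demand_daily = busy_seconds_per_day * (VCPU * ON_DEMAND["cpu"] + MEM_GIB * ON_DEMAND["mem"])
print(f"Always allocated: ${always_daily:.2f}/day")    # -> $1.64/day
print(f"On-demand:        ${on_demand_daily:.2f}/day") # -> $0.36/day

# On-demand costs ~33% more per busy second, so it only loses once the
# service is busy more than ~75% of the time:
break_even = (VCPU * ALWAYS["cpu"] + MEM_GIB * ALWAYS["mem"]) / (
    VCPU * ON_DEMAND["cpu"] + MEM_GIB * ON_DEMAND["mem"]
)
print(f"Break-even utilization: {break_even:.0%}")  # -> 75%
```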

So, always check how your application is actually used and compare the costs. We did this comparison and found that on-demand was cheaper for us. Which one will be cheaper for you is something you will have to decide for yourself.


