Discussion on: Cost of delay: simultaneous feature development

Kasey Speakman

Maybe for future posts: I'm curious about your ops infrastructure, how much each deployment costs (or a ballpark of what it cost the company to automate deployments), and what still has to be done manually. It seems like many companies do simultaneous feature development to minimize the costs and risks of deployment. Risks can include outages caused by mistakes, customers upset at things changing too frequently (depending on the app), and data migration problems. Costs might include dev/ops time, versioning, documentation, integration testing, etc. In an ideal world most of the costly steps would be automated, but that automation also takes time to develop, and some manual work often remains even afterward. Thanks for the post!

Eljay-Adobe

At my company, the cost of a deployment is in the ballpark of $3 million.

I look on with envy at Google Chrome and their release channels: Stable (a release every 3 weeks), Beta (weekly), Dev (weekly), Canary (daily), Waterfall (per check-in), and Last Known Good (tagged from the waterfall).

I think Chrome's deployment pipeline has been streamlined to the point that it costs pennies to deploy. Now that is an agile feather in their cap!

Kasey Speakman

I can't help but wonder how much it cost Google to develop the continuous delivery process they now enjoy.

Blaine Osepchuk

You are asking about economically optimal batch sizes. I covered a specific case of this in my post: Optimal pull request size. The graph in the linked post is directly applicable to your question.

You need to find your economically optimal outcome, which is a balance between the cost of waiting to deploy some set of features (the holding cost, which includes the cost of delay) and the cost of performing a single deployment (the transaction cost).
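
To make that balance concrete, here's a toy sketch (my own illustration, not the exact math from the linked post). It assumes features are finished at a steady rate, each undeployed feature accrues a constant holding cost per week, and every deployment has a fixed transaction cost; all names and numbers are made up:

```python
# Toy model of the batch-size trade-off. Assumes a steady feature
# rate, a constant per-feature holding cost, and a fixed transaction
# cost per deployment. Illustrative only.

def total_cost_per_week(batch_size, feature_rate, holding_cost, transaction_cost):
    """Average weekly cost of shipping in batches of `batch_size` features.

    feature_rate     -- features finished per week
    holding_cost     -- cost per undeployed feature per week (incl. cost of delay)
    transaction_cost -- fixed cost of one deployment
    """
    weeks_per_batch = batch_size / feature_rate
    avg_holding = holding_cost * batch_size / 2   # undeployed queue averages half a batch
    avg_transaction = transaction_cost / weeks_per_batch
    return avg_holding + avg_transaction

def optimal_batch_size(feature_rate, holding_cost, transaction_cost, max_batch=1000):
    """Brute-force the minimum of the U-shaped cost curve."""
    return min(range(1, max_batch + 1),
               key=lambda b: total_cost_per_week(
                   b, feature_rate, holding_cost, transaction_cost))
```

Holding cost grows linearly with batch size while the transaction cost gets amortized over more features, which is what produces the U-shaped curve in the post. In this simple model the minimum even has a closed form, the classic economic order quantity: B* = sqrt(2 * transaction_cost * feature_rate / holding_cost).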

In my workplace we can deploy our code in about 2 minutes. But I can imagine deployments that are very expensive. For example, manually reprogramming all the nuclear weapons in the US arsenal would likely cost hundreds of millions of dollars. Vastly different transaction costs affect the economically optimal batch size.

I can profitably push code updates into production several times per hour, but the US military might calculate that their optimal deployment schedule for nuclear weapons software updates is once every 5 years.
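
Plugging hypothetical numbers into the closed form above shows just how far apart the optima can sit (figures invented purely for illustration):

```python
from math import sqrt

# B* = sqrt(2 * transaction_cost * feature_rate / holding_cost)
# Assume 10 features/week and $100/week of delay cost per undeployed feature.
cheap  = sqrt(2 * 5 * 10 / 100)       # $5 deploy   -> batch of about 1 feature
pricey = sqrt(2 * 50_000 * 10 / 100)  # $50k deploy -> batch of about 100 features
print(cheap, pricey)  # 1.0 100.0
```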

However, if the US military found a critical error in their nuclear weapons software, the cost of delay would shoot up, and their economically optimal release time might become today, even if they deployed yesterday and even if it cost them billions of dollars to redeploy. Does that make sense?
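
That emergency case is the same economics applied as a one-off decision: redeploy now if the cost of waiting until the next scheduled release exceeds the transaction cost. A sketch with invented numbers:

```python
def should_redeploy_now(delay_cost_per_day, days_until_next_release, transaction_cost):
    """Redeploy immediately if waiting costs more than deploying."""
    return delay_cost_per_day * days_until_next_release > transaction_cost

# A critical defect can make even a multi-billion dollar redeploy the
# cheap option (all figures made up).
print(should_redeploy_now(delay_cost_per_day=100e6,
                          days_until_next_release=365,
                          transaction_cost=2e9))  # True
```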

Thanks for the question. I'm planning more advanced cost of delay posts, including one on optimal batch sizes. But I wanted to start with an easy case and work up to the harder ones.