Petr Filaretov

The Answer to Life, the Universe, and Project Estimate Failure (Spoiler: It's Not 42)

TL;DR

An accurate estimate is a kind of art, and there are several reasons why the original, simple estimate can turn out drastically wrong.

Intro

Are you tired of successful success stories? I'm continuing my series of posts on what I've learned, and here is an anecdote about how I failed to estimate a project.

What can go wrong with estimating a small project?

I was working on one of the product teams when I was tasked with estimating a small project: migrating plugin functionality from a legacy desktop product to its brand new and shiny web counterpart.

So, I had a look at the old code, broke it into logical pieces, and estimated every piece in man-days. That's how it's usually done, isn't it? The total came to around 20 man-days, which is two sprints for one senior developer.

Our manager communicated the estimate to the leadership group, and sometime later the team started the implementation.

And at the end of the very first sprint, it became obvious how wrong that estimate was. The person working on it was doing everything very quickly, but it still wasn't enough. I joined him to speed things up, and that helped a little.

Finally, we released the new functionality. It took about four sprints of calendar time and 105 man-days (10 days * 1 dev + 30 days * 2 devs + 35 days * 1 QA).

I felt terrible. Only sometime later did I understand the main reasons for that estimation error.

Optimism

First of all, developers tend to give optimistic estimates for tasks.

This is easy.

Everything is clear and fairly simple.

What can go wrong here?

Does that sound familiar to you?

But in reality, something goes wrong. Every time.

So, to make estimates look more or less realistic, I found PERT very useful. In a nutshell, you give three estimates for every task: optimistic (o), most likely (m), and pessimistic (p). And then, an expected (e) estimate can be calculated as

e = (o + 4m + p) / 6

The coefficients in this formula can be adjusted if you feel that would give a more accurate result, e.g.

e = (o + 3m + 2p) / 6

So, if I take my original estimate of 20 man-days as an optimistic one, and assume that most likely it would be 30, and in the worst case it would be 40, then the expected estimate would be:

(20 + 30*4 + 40) / 6 = 30 man-days
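To make the calculation concrete, here's a minimal Python sketch of the PERT formula. The function name and the adjustable weights parameter are just my own illustration, not part of PERT itself:

```python
def pert_estimate(optimistic: float, most_likely: float, pessimistic: float,
                  weights: tuple[float, float, float] = (1, 4, 1)) -> float:
    """Weighted PERT expected estimate; the classic weights are (1, 4, 1)."""
    wo, wm, wp = weights
    return (wo * optimistic + wm * most_likely + wp * pessimistic) / (wo + wm + wp)

# The example above: 20 optimistic, 30 most likely, 40 pessimistic
print(pert_estimate(20, 30, 40))  # 30.0
```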

That's a little bit better, but still far from reality.

We don't know what we don't know

There is always something unknown. And we don't know what it is until we face it. Otherwise, it wouldn't be unknown, right? So, how do we take it into account then?

Elementary, my dear Watson! Just multiply an estimate by some number. I found that 1.5 is good enough for this. (not 42, unfortunately)

Even though it may look unreasonably high at first glance, especially to managers, this is the reality. I had projects later where this coefficient proved itself and helped provide more realistic estimates.

So, if I take the expected estimate of 30 man-days calculated previously and multiply it by 1.5, I will get:

30 * 1.5 = 45 man-days
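For completeness, the same padding step as a tiny sketch; the helper name and the default multiplier of 1.5 are just my assumptions here:

```python
def with_unknowns(estimate: float, multiplier: float = 1.5) -> float:
    """Pad an estimate with a flat multiplier for the things we don't know yet."""
    return estimate * multiplier

print(with_unknowns(30))  # 45.0
```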

Better, but not enough.

Coding is not the only effort

Another reason for inaccurate estimates is overlooking part of the effort needed to complete the task.

Developers tend to estimate coding efforts only. However, there could be some more stuff to do:

  • Unit tests
  • Integration tests
  • QA (both manual and automated)
  • Documentation
  • Bug fixing
  • Release activities
  • Business analysis
  • UX
  • Solution architecture
  • Project management

You may not need all of these every time or may need something else.

In my case, I estimated the coding effort only and didn't take anything else into account.

The other work could be approximately estimated as a percentage of the coding, for instance:

  • Unit tests - 20%
  • Integration tests - 20%
  • QA - 50%
  • Documentation - 5%
  • Bug fixing - 10%
  • Release activities - 5%
  • Business analysis - 10%
  • UX - 10%
  • Solution architecture - 10%
  • Project management - 10%

You will eventually find numbers that suit you if you try this approach a few times.

So, if I take the previously calculated estimate of 45 man-days and add tests, QA, documentation, bug fixing, release activities, and business analysis (that's what we needed), I will get:

45 * (1 + 0.2 + 0.2 + 0.5 + 0.05 + 0.1 + 0.05 + 0.1) = 99 man-days

Now, that's close to 105 man-days spent in reality.
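Putting all three steps together, a rough sketch of the whole approach might look like this. The overhead shares and the 1.5 multiplier are the assumptions from this particular project, not universal constants:

```python
# Assumed overhead shares relative to the coding effort (the ones this project needed)
OVERHEADS = {
    "unit tests": 0.20,
    "integration tests": 0.20,
    "qa": 0.50,
    "documentation": 0.05,
    "bug fixing": 0.10,
    "release activities": 0.05,
    "business analysis": 0.10,
}

def full_estimate(optimistic: float, most_likely: float, pessimistic: float,
                  unknowns_multiplier: float = 1.5,
                  overheads: dict[str, float] = OVERHEADS) -> float:
    """PERT expected coding estimate, padded for unknowns, plus non-coding overheads."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    padded = expected * unknowns_multiplier
    return padded * (1 + sum(overheads.values()))

print(round(full_estimate(20, 30, 40)))  # 99 man-days vs. the 105 actually spent
```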

Conclusion

So, what did I learn?

I learned that an accurate estimate is a kind of art. You need a lot of practice to provide estimates that are close to reality. And practice here means not just estimating a project and forgetting about it, but also taking part in its implementation and launch; that's what gives you feedback on how accurate the estimate really was.

I also learned that you should be prepared to defend your estimate to a manager, especially the multiplier for unknowns.

And one more thing: a lesson learned from your own mistake is one you will never forget. The harder the lesson and the worse the result, the better you will learn it.

Take care. Tomorrow will be... better!
