5 Factors That Could Be Making Your Project Estimates Go Wrong
Why do our estimates for a project or a testing phase so often turn out wrong?
Improving our ability to estimate accurately is important: repeated failure to deliver to an estimate erodes the business's confidence in IT projects and teams.
What is puzzling is not simply that we make estimation errors, but that they are systematic errors of underestimation. If errors were merely random noise, we would see overestimates just as often as underestimates. Instead, we almost always underestimate, which suggests that something is systematically biasing our estimates.
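To make the noise argument concrete, here is a minimal Python sketch. Everything in it is invented for illustration (the task count, estimate, and noise distribution are assumptions, not data from real projects): with unbiased, symmetric errors, roughly half of tasks overrun and half underrun, so a persistent skew toward overruns points to a bias rather than noise.

```python
import random

random.seed(42)

# Hypothetical model: each task's actual duration equals its estimate
# plus symmetric, zero-mean noise (i.e., unbiased estimation).
tasks = 10_000
overruns = 0
for _ in range(tasks):
    estimate = 10.0                         # days; arbitrary illustrative value
    actual = estimate + random.gauss(0, 2)  # symmetric noise around the estimate
    if actual > estimate:
        overruns += 1

# With purely random errors, overruns and underruns are about equally common.
print(f"Overruns: {overruns / tasks:.1%}")  # ~50%, not the near-100% seen in practice
```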
Systematic underestimation is not the only effect we observe. Typically, the eventual delivery date is not just after the estimated date, but far outside the range of predicted delivery dates.
And whatever causes underestimation, we clearly do not learn from experience: we keep making the same errors, despite feedback showing how wrong our previous estimates were. It's a chronic problem.
What could be driving these errors? Here are five potential factors.
Technology uncertainty
IT projects frequently involve new technology that is not yet fully understood, and that unfamiliarity can lead to estimation errors.
Intentionally manipulated estimates
Estimates may be deliberately lowered, either to win project funding or out of a belief that teams are more productive when kept under pressure.
Developer gold-plating
Gold-plating occurs when the time saved by finishing a task early is consumed by polishing the task to a higher standard than is required. Because early finishes are absorbed while overruns are not recovered, gold-plating can cause a project to deliver late even if the estimates for individual tasks are accurate on average.
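A small simulation can illustrate this asymmetry. The Python sketch below is purely hypothetical (the task count, estimate, and noise parameters are invented): raw task durations are unbiased, but gold-plating absorbs every early finish while overruns pass through unchanged, so the project as a whole comes in late.

```python
import random

random.seed(1)

estimate_per_task = 10.0   # days; arbitrary illustrative value
num_tasks = 100

total_estimate = estimate_per_task * num_tasks
total_actual = 0.0
for _ in range(num_tasks):
    # Unbiased raw duration: symmetric noise around the estimate.
    raw = estimate_per_task + random.gauss(0, 2)
    # Gold-plating: any time saved is spent polishing, so a task
    # never finishes early; overruns pass through unchanged.
    total_actual += max(raw, estimate_per_task)

print(f"Estimated: {total_estimate:.0f} days, actual: {total_actual:.0f} days")
# Even though task estimates are accurate on average, the project overruns,
# because early finishes are absorbed but late finishes are not.
```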
Adverse selection
Adverse selection occurs when the least desirable members of a group are the ones preferentially selected. For example, an all-you-can-eat buffet attracts the customers who eat to excess. Adverse selection probably occurs in software projects as well: because an underestimated project looks cheaper, it will win funding in preference to an accurately estimated one.
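The funding effect can also be sketched in a few lines of Python (again, all numbers here are invented for illustration). Suppose many projects with the same true cost compete for funding, each with a noisy but unbiased estimate; if the lowest estimates win, the funded projects are, on average, exactly the underestimated ones.

```python
import random

random.seed(7)

true_cost = 100.0    # identical true cost for every candidate project
candidates = 1_000   # projects bidding for funding
funded = 100         # only the cheapest-looking bids are approved

# Unbiased estimates: noise is symmetric around the true cost.
estimates = sorted(true_cost + random.gauss(0, 15) for _ in range(candidates))

# Funding selects the lowest estimates, i.e., the biggest underestimates.
avg_funded_estimate = sum(estimates[:funded]) / funded
print(f"True cost: {true_cost:.0f}, "
      f"average funded estimate: {avg_funded_estimate:.0f}")
# The funded portfolio systematically underestimates, even though no
# individual estimator was biased: a winner's-curse effect.
```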
However, even on the (admittedly few) projects where the above causes are absent, underestimation still occurs. We must ask what other effects could contribute to estimation errors, and to find an answer, we need to shift our gaze away from external and project factors and look inside ourselves.
Human biases
Most of us jump to hasty conclusions, think that we are smarter than average, and believe that we would have spotted others’ past mistakes. All these effects are the result of cognitive biases. There are many such biases, but those most relevant to underestimation errors are the anchoring effect, optimistic bias, planning fallacy, and overconfidence effect.
Teams attempting to improve their estimation techniques tend to focus on the known external drivers, such as technology uncertainty, while ignoring human biases, and they are then disappointed when they continue to miss estimates.
If we wish to improve our estimates, perhaps we need to spend more time trying to understand our human biases and how they impact our estimates.
Andrew Brown is presenting the session Improve Planning Estimates by Reducing Your Human Biases at STARWEST 2018, September 30–October 5 in Anaheim, California.
Why do we estimate for a testing project?
Designing test cases depends on the requirements, so those must be complete and, one hopes, not change too much. But they always do. In waterfall shops, the business analyst is long gone by the time testers join the project, so even if testers analyze the requirements for ambiguities, contradictions, or missing requirements, there may be no one around to clarify, make decisions, or update the requirements doc.
Test execution relies on code being delivered to the testers. The testing effort could be estimated for a feature or cycle, but what good does that do when you don't know how many cycles of testing are needed?
I find estimation, in the traditional sense, a waste of time. Now, if you're talking about Scrum-type estimation, which is time-boxed and really serves as a communication facilitation device, I'm all for that.