In the last post we covered the importance of engaging the right people in the estimation process and the need to use more than one estimation approach. Today we will look at some examples of team based estimation approaches and practical ways to combine estimate result sets.
Estimation approaches (agile or traditional) can be divided into heuristic (expert-judgment based) and parametric (calculation based) approaches.
Heuristic approaches:
• Comparison to similar systems
• Expert Judgment
• Activity Based (top down)
• Task Based (bottom up)
Parametric approaches:
• Function Points
• Use Case Points
• Object Points
Both sets of approaches have merit, but they also have limitations and are open to misuse and over-reliance. For instance, Activity Based (top-down) estimating is the most commonly employed estimation approach, yet it has been found to be the least accurate. Capers Jones, in his book “Estimating Software Costs”, instead recommends Task Based (bottom-up) estimating approaches, which tend to yield better results by encouraging a more thorough investigation of the likely tasks.
Involving many stakeholders
We should ask the people who will be doing the work how long they think it will take. Not only are they closest to the technical details, and therefore theoretically in a better position to create an accurate estimate, but involving them also brings psychological benefits.
If someone just hands you an estimate for your work and tells you it should take two weeks, you can either comply and try to get it done in two weeks, or rebel and either finish early or explain why it will take much longer, to illustrate how poor the estimate was. Even complying and doing the work in two weeks can be a problem: given two weeks, work will expand to fill the time. If a solution is found early, the remaining time will likely be spent finding a better solution or refining the existing one. As Don Reinertsen observes in “Managing the Design Factory”, in engineering we get few early finishes, so there are not enough of them to cancel out the late ones.
Instead we need to engage the people doing the work in the estimation process. This way we not only get better technical insight, but also more commitment to the estimates. As the adage states, “no involvement, no commitment”: we build commitment to our estimates by involving the team.
As involvement increases, so does the commitment to meeting the estimates; people who helped create the estimates will work harder to meet them.
Wideband Delphi is a group-based estimation approach that has seen a resurgence with agile methods. Developed at the RAND Corporation in 1948, the Delphi estimation method asks a small team of experts to anonymously generate individual estimates from a problem description and to reach consensus on a final set of estimates through iteration. In the early 1970s, Barry Boehm and his RAND colleagues modified this method into Wideband Delphi, which added more estimation-team interaction via iterations, assumption discussions and re-estimation.
In a Wideband Delphi workshop the facilitator explains the component / unit to be estimated and brainstorms estimation assumptions with the group. For instance, estimate in days, assume a standard level developer, include time for unit testing, etc. Then the team members individually create lists of the tasks they believe necessary and estimate these tasks. The facilitator plots the estimates on a board, but keeps the values anonymous. We do not want the architect’s estimates unduly influencing those of junior developers for instance.
Tasks, issues and assumptions are discussed as a group and the estimation process is repeated. In light of this new information people’s estimates change slightly, and we usually see a convergence of the estimate figures as the process iterates. Driving to consensus, or to an acceptable degree of variance, is a powerful way of building support for estimates. The team was involved, sorted through the issues, and emerged with an estimate range everyone was comfortable with.
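The iterate-until-converged loop can be sketched in a few lines. This is a minimal illustration, not a prescribed tool: the 15% tolerance and the sample estimates are assumptions chosen for the example.

```python
from statistics import mean, stdev

def delphi_round(estimates, tolerance=0.15):
    """Summarize one anonymous estimation round.

    estimates: individual effort estimates (e.g. in days).
    tolerance: acceptable spread as a fraction of the mean estimate.
    Returns (converged, summary); the summary is what the facilitator
    would plot anonymously for the group discussion.
    """
    avg = mean(estimates)
    converged = stdev(estimates) <= tolerance * avg
    return converged, {"mean": round(avg, 1),
                       "low": min(estimates),
                       "high": max(estimates)}

# Round 1: wide disagreement -> discuss assumptions and estimate again
print(delphi_round([3, 5, 12, 4, 6]))
# Round 2: after discussion the figures have converged enough to stop
print(delphi_round([5, 6, 5, 6, 7]))
```

In practice the stopping rule is a judgment call by the facilitator; the point is simply that each round’s spread is measured and fed back into the next discussion.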
Karl Wiegers wrote up a great account of Wideband Delphi estimation in the February 2000 edition of Software Development magazine.
Today’s Agile Estimation Techniques
Today’s Agile estimation techniques build on the principles of Wideband Delphi estimation. They embody the idea that we need to engage the entire team in the creation of credible, supported estimates. While the names and terms used in agile estimation approaches vary, at their heart is some variant of the Wideband Delphi approach.
Planning Poker is a fast, card-based approach to team estimation. Team members are given cards with the numbers 0, 1, 2, 3, 5, 8, 13, 20, 40, and 100 on them to represent development effort estimates.
User stories are presented to the team and discussed as a group to round out understanding. Then everyone selects a card that represents their estimated effort for the development. The cards are kept private until everyone has selected one, and then they are all revealed together.
If the resulting estimates are close, it is safe to record the estimate and move on to the next story. If there is significant divergence, people are invited to discuss why they believe the effort will be so different (for example, “last time we added a column to the account table it took 8 hours to regression test the legacy applications”). Then, in light of this new information, people estimate again, as before revealing their cards only when everyone has selected one. The process iterates until sufficient consensus has been reached.
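The vote-and-check step above can be made concrete. A common convention, assumed here rather than mandated by Planning Poker itself, is that cards within one step of each other on the scale (say 5 and 8) count as consensus, while a wider gap triggers discussion:

```python
CARDS = [0, 1, 2, 3, 5, 8, 13, 20, 40, 100]

def nearest_card(raw):
    """Snap a raw effort guess onto the closest poker card."""
    return min(CARDS, key=lambda c: abs(c - raw))

def consensus(votes):
    """Consensus if all revealed cards are the same or adjacent
    on the scale (e.g. 5 and 8, but not 5 and 13)."""
    positions = sorted(CARDS.index(v) for v in votes)
    return positions[-1] - positions[0] <= 1

print(consensus([5, 5, 8, 5]))   # True  -> record and move on
print(consensus([3, 8, 20, 5]))  # False -> discuss and re-vote
```

The non-linear card values deliberately discourage false precision on large items: there is no point debating 40 versus 45.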
Planning Poker takes the group-based consensus building and anonymous estimating concepts of Wideband Delphi estimation and packages them in a faster, more enjoyable format. Research by Nils Haugen presented at the Agile 2006 conference in Minneapolis showed that Planning Poker based estimates were at least as good as traditional task-based estimates, yet they were rated as more enjoyable for the team to produce and faster to generate. You can read more about Planning Poker in James Grenning’s short 2002 paper.
XP, Scrum, Crystal and the Wisdom of Crowds
The idea of involving the whole group and rapidly iterating through discussion to reach consensus is also used in iteration planning within many agile methods. XP has the Planning Game, Scrum has the Sprint Planning Meeting, and Crystal has Blitz Planning. As usual in agile, the names are different, but the concepts are very similar. They all use a group of people to produce a better result than a single expert or small number of experts could.
This is an example of applying the Wisdom of Crowds to problem solving. The Wisdom of Crowds, popularized by James Surowiecki, is the counter-intuitive notion that a group will, under the right circumstances, repeatedly produce better answers to complex problems than a single expert or small number of experts. A classic example in Surowiecki’s book involves the search for the Scorpion, a missing submarine: the vessel was found remarkably close to the collective estimate aggregated from the locations suggested by experts polled individually.
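The statistical intuition behind this is that independent errors tend to cancel. A toy sketch with hypothetical numbers (the “true value” and guesses are invented for illustration):

```python
from statistics import mean

# Seven hypothetical expert guesses at a quantity whose true value is 100.
# Some guess high, some low; none is told the others' answers.
guesses = [70, 85, 130, 95, 140, 60, 115]
true_value = 100

crowd_error = abs(mean(guesses) - true_value)
best_individual_error = min(abs(g - true_value) for g in guesses)

print(f"crowd error: {crowd_error:.1f}")            # under 1 point off
print(f"best individual: {best_individual_error}")  # 5 points off
```

Here the averaged answer beats even the single best expert, because the high and low errors largely cancel. This only holds when the guesses are genuinely independent, which is exactly why Delphi-style approaches keep estimates anonymous until everyone has committed.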
In a similar way we hope to create a combined solution more accurate than a single person’s projection. The other important benefit in a project setting is to build support and consensus for the estimate / plan by engaging everyone in the process.
Don’t Over Analyze
It is important to remember that the fidelity of most estimation input data is poor, i.e. we are usually dealing with approximations and best guesses for work effort. Software development is difficult to predict, and beyond a certain point we get diminishing returns from investing more effort in the estimation process. Beyond engaging a group of people to independently estimate work at a task-by-task level, doubling the time and effort put into producing estimates is unlikely to yield noticeably better results.
So, be thorough, discuss implications, assumptions and risks, but move on once broad agreement on the likely estimate range has been established. A better use of estimation time is to checkpoint estimates frequently and re-estimate the project at regular intervals with the benefit of actual velocity data and project learnings.
Beware of Huge Estimation Teams
Just because several people estimating is better than one, it does not automatically follow that more will be even better. While in theory we increase the chances of identifying issues that some people may miss, as group sizes increase the potential for communication failures increases at a faster rate.
As team numbers grow beyond 10-12 people, face-to-face communication becomes harder to organize. Then, as people miss explanations of assumptions or issues, estimation quality deteriorates and the process slows down. So more is only better up to a point: keep the teams manageable and be on the lookout for scaling problems.
Estimating throughout the project
At the beginning of a project we have little concrete evidence of team effectiveness and production rates. We have upfront estimates based on how we think we will do, but nothing has been proven yet. Therefore our reliance on upfront estimates is high. Then, as the project progresses our ability to estimate improves as team velocity and domain knowledge increases.
We should acknowledge this and, as the project progresses, rely more on emerging velocity data than on our upfront estimates to predict final costs and end dates.
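A minimal sketch of that re-forecasting step, using made-up numbers: take the remaining backlog and the velocities actually observed in recent iterations, and project forward from the average.

```python
from math import ceil
from statistics import mean

def forecast(remaining_points, recent_velocities, iteration_weeks=2):
    """Forecast iterations and calendar weeks to completion
    from actual velocity data rather than upfront estimates."""
    velocity = mean(recent_velocities)
    iterations = ceil(remaining_points / velocity)
    return iterations, iterations * iteration_weeks

# 120 story points left; the last three iterations delivered 18, 22, 20
print(forecast(120, [18, 22, 20]))  # -> (6, 12): ~6 iterations, ~12 weeks
```

Re-running this every iteration keeps the end-date projection anchored to demonstrated throughput, so early optimism or pessimism in the upfront estimates washes out over time.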
Be Aware of Common Estimating Omissions
No article on estimating would be complete without a list of common omissions to watch out for. Capers Jones in his book “Estimating Software Costs” outlines Ten Estimating Omissions that, in his analysis of project estimates, most frequently get missed. These are:
1) Underestimating the effort of reviews, walkthroughs, inspections and testing
2) Underestimating the effort to produce any required paper documentation
3) Underestimating travel and meeting costs (especially large projects)
4) Ignoring requirements creep (you should estimate at 1-5% per month)
5) Exaggerating the effect of tools, languages and methods
6) Missing special testing requirements
7) Ignoring / underestimating project management / support effort
8) Forgetting a specialist area e.g. Estimators, FP specialists, performance tuners, QA, etc
9) Forgetting to include user effort
10) Ignoring maintenance after delivery if a support period is required
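Item 4 deserves a closer look, because monthly creep compounds. A quick sketch of the effect, using a 2% rate picked from within the 1-5% range purely for illustration:

```python
def creep_adjusted_scope(initial_points, months, monthly_creep=0.02):
    """Compound requirements creep over the project's duration."""
    return initial_points * (1 + monthly_creep) ** months

# A 500-point backlog on a 12-month project at 2% creep per month
print(round(creep_adjusted_scope(500, 12)))  # -> 634
```

Roughly a quarter more scope than was originally estimated, from a creep rate that sounds modest month to month.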
When you estimate for your project, scan the list for things you may have forgotten or underestimated. For instance, while we may hope the system hand-over document will be small and quick to produce, if we typically sink a week’s effort into it, why should this project be any different? Forewarned of likely problem areas, we can hopefully avoid the mistakes of others (and focus on our own unique estimating omissions!).