Risk Management

Are Late Software Projects Victims of 'The Planning Fallacy'?

Too many projects are late, over-budget, under-delivered, or a combination.  The problems continue despite widespread awareness and improvements in project management knowledge, tools, and process maturity.

A recent piece in the Washington Post business section identified a likely culprit: “the planning fallacy”.  Princeton psychologist Daniel Kahneman and Amos Tversky of Stanford described it as “the tendency to underestimate the time, costs, and risks of future actions and overestimate the benefit of those actions”.  The results are time and cost overruns as well as benefit shortfalls.  The concept is not new: the pair coined the term in the 1970s, and it has been studied extensively ever since.

According to the Post, cognitive biases such as optimism bias (the tendency to expect positive outcomes from one’s actions) and overconfidence can be causes of the planning fallacy. There is a growing body of evidence, collected by researcher Bent Flyvbjerg at Oxford University, that optimism bias significantly degrades the quality of forecasts in project planning.

Other explanations of the fallacy involve deliberate choices on the part of the planner, such as incentives, organizational pressures, and strategic deception.

Making Project Decisions Early is Risky Business

At QSM, we maintain one of the largest industry databases of completed software projects in the world. The data comes from our clients, with their permission, and it has been the backbone of our software estimation business for over 35 years. It shows us what is reasonable on software development projects in terms of cost, team size, effort, duration, size, and reliability. Because of this experience, we are often asked about risk factors and estimation accuracy early in the project lifecycle. We explain that increased accuracy comes from having historical data and good sizing information.

But what happens with early estimates when clients don’t have history and detailed sizing information? Can they still generate scope-level estimates that support good business decisions? The answer is yes. Risk management techniques can be applied and project uncertainty can be quantified so organizations can plan effectively. This matters because big business decisions are often made early: decision-makers need to know whether to move forward with a project and how much time and effort to allocate.
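To make this concrete, here is a minimal sketch, assuming nothing about QSM's actual methods: a simple Monte Carlo simulation over hypothetical size and productivity ranges (the scope_level_effort function and all numbers are illustrative placeholders) that turns broad scope-level assumptions into an effort range for early decision making.

```python
# A minimal sketch (not QSM's method; all ranges are made-up placeholders)
# of quantifying scope-level uncertainty early on: sample broad size and
# productivity assumptions and report the resulting effort percentiles.
import random

def scope_level_effort(size_rng, prod_rng, trials=10_000):
    """size_rng and prod_rng are (low, likely, high) triples.

    Size is in arbitrary units (e.g. function points); productivity is
    units per person-month, so effort comes out in person-months.
    """
    efforts = []
    for _ in range(trials):
        size = random.triangular(size_rng[0], size_rng[2], size_rng[1])
        prod = random.triangular(prod_rng[0], prod_rng[2], prod_rng[1])
        efforts.append(size / prod)
    efforts.sort()
    pick = lambda p: efforts[int(p / 100 * (trials - 1))]
    return {"P10": round(pick(10)), "P50": round(pick(50)), "P90": round(pick(90))}

# Hypothetical early-stage ranges: 300-900 function points, 4-8 FP/person-month.
print(scope_level_effort((300, 500, 900), (4, 6, 8)))
```

Even this crude model gives a decision-maker an effort range with stated confidence levels rather than a single point number.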

We use SLIM-Estimate, a leading estimation tool built on the Putnam Model. It generates reliable estimates from QSM’s time-tested forecasting models and historical data, and it can produce scope-level estimates even when detailed project information is hard to come by. It lets you see your chances of hitting your project goals and factor in your uncertainty.
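For context, the Putnam Model's central relationship is the software equation, Size = C · Effort^(1/3) · Time^(4/3). The sketch below is a simplified illustration, not SLIM-Estimate's implementation; the productivity parameter, size, and schedules are placeholder values chosen only to show the schedule-effort trade-off.

```python
# A simplified form of the Putnam software equation (illustrative only):
#   Size = C * K**(1/3) * td**(4/3)
# where K is life-cycle effort (person-years), td is development time
# (years), and C is a productivity parameter calibrated from history.

def effort_person_years(size_sloc: float, c: float, td_years: float) -> float:
    """Solve the software equation for effort, given size and schedule."""
    return (size_sloc / (c * td_years ** (4 / 3))) ** 3

C = 15_000          # hypothetical productivity parameter
SIZE = 100_000      # hypothetical size in SLOC

for td in (2.0, 1.6):
    print(f"{td:.1f}-year schedule -> {effort_person_years(SIZE, C, td):.0f} person-years")

# Because effort varies roughly with 1/td**4, even a modest schedule
# compression drives a steep increase in effort and cost.
```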

Managing Project Risk through Early Defect Detection

With the most recent spurt of inclement weather, there is really no denying that winter is here.  After awaking to about 4 inches of snow accumulation, I begrudgingly bundled myself up in my warmest winter gear and proceeded to dig out my car.  Perhaps the brisk air woke me up faster than usual because as I dug a path to the car, I began to think about software testing, specifically how effective early testing can reduce the chances of schedule slippages and cost overruns.  Allow me to explain.

Being an eternal optimist, I was grateful that the snow I was shoveling and later brushing off my car was light and powdery.  Despite the frigid temperature and large quantity of snow, I realized that it was good that I had decided to complete this task first thing in the morning.  At that point the snow was relatively easy to clear; had I waited until the afternoon, the sun would have partially melted it, making the task significantly more difficult and time-consuming.

From Proposal to Project: An Interview with Larry Putnam, Jr.

In the software project management field, projects go badly about 43% of the time and fail completely 18% of the time. While there are several reasons for this, and plenty of blame to go around, one of the easiest ways to reduce the risk is to start at the beginning – with the proposal. In a recent interview with Cameron Philipp-Edmonds of StickyMinds, Larry Putnam, Jr. talks about the importance of the proposal when executing a successful project. He identifies five key questions that should be answered before any project starts and how software estimation ties into the proposal process.

Read the full interview transcript here!

Modeling Uncertainty in Software Development Projects

I am a professional software project estimator.  While not blessed with genius, I have put in enough time that, by Malcolm Gladwell’s 10,000-hour rule, I have paid my dues to be competent.  In the past 19 years I have estimated over 2,000 different software projects.  For many of these, the critical inputs were provided and all I had to do was the modeling.  Other times, I did all of the legwork, too: estimating software size, determining critical constraints, and gathering organizational history to benchmark against.  One observation I have taken from those years of experience is that there is overwhelming pressure to be more precise with our estimates than the data can support.

In 2010 I attended a software conference in Brazil.  As an exercise, the participants were asked to estimate the numerical ranges into which 10 items would fall.  The items were such disparate things as the length of coastline in the United States, the gross domestic product of Germany, and the area of Mali in square kilometers: not things a trivia expert would be likely to know offhand.  Of 150 participants, only one person made all of the ranges wide enough.  One other person (me) got 9 out of 10 correct.
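For readers who want to score themselves the same way, here is a tiny sketch of the range-calibration check. It assumes the common 90%-confidence version of the exercise, and the ranges and "actual" values are illustrative placeholders rather than the conference data.

```python
# A tiny sketch of the calibration exercise: did each stated range contain
# the true value? Ranges and "actuals" below are illustrative placeholders,
# not the conference data or authoritative figures.

ranges = {                          # (low, high), stated with 90% confidence
    "US coastline (km)":        (15_000, 250_000),
    "Germany GDP (trillion $)": (1.0, 6.0),
    "Area of Mali (km^2)":      (400_000, 2_000_000),
}
actuals = {                         # look these up yourself before scoring
    "US coastline (km)":        20_000,
    "Germany GDP (trillion $)": 4.0,
    "Area of Mali (km^2)":      1_240_000,
}

hits = sum(lo <= actuals[item] <= hi for item, (lo, hi) in ranges.items())
print(f"{hits} of {len(ranges)} ranges contained the true value")
# Well-calibrated 90% ranges should contain the truth about 9 times in 10;
# most people's ranges turn out to be far too narrow.
```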

Seven Steps to Software Project Failure

In spite of 30 years of structured programming, CASE tools, OO development, 4GLs, CMMI, and PMI, the failure rate for larger projects has failed to respond to all of this love and attention. We normally think of failure as a negative thing, but it can have its upside. Saddling a competitor or enemy with a doomed project could stain their career, or at the very least inflict a high level of pain on them. A CEO about to retire, or whose focus is on near-term stock options, may be able to boost quarterly profits by continuing to add staff to a doomed effort: one for which the customer pays for the added staff, of course.

Since failure is a constant, here is a management guide on how to assure failure. While any one step in the process can be overcome, taken together they create the perfect software project storm.

Step 1: Start work as soon as you can

Come on. You don’t really need to spend all that time in requirements meetings and documenting assumptions. Real projects take the ball and run.  Be sure to begin coding as quickly as possible. Call it prototyping if you will, but do get started. You can always circle back to tweak things if needed.

Step 2: Estimation is overhead

Nothing is more frustrating and time-wasting than having to go to some external group who knows nothing about your project and having them tell you how long it should take, how many people should be on it, and what the trade-offs are. What can their mathematical models possibly know about your project? A good end run around this situation is to create a project plan and call it your estimate. Make sure that it is very detailed and contains decimal points, since these will make it more difficult to challenge.

How do the uncertainty ranges in SLIM-Estimate relate to Control Bounds in SLIM-Control?

I am often asked this question during SLIM Training classes, and I remember wondering about it myself.  It is a logical question, since SLIM-Estimate workbooks are often imported into SLIM-Control to create the baseline project plan.  The answer is that they are not directly related, because uncertainty ranges, probability curves, and control bounds are designed to perform different tasks.  This post is the first in a series looking at the risk associated with an estimate, the risk in your project plan, and handling deviations from the plan.

What are we talking about?

The first thing we need to do is define some very important terms that are often misused (I am the first to admit I have been guilty!).  I went to good old Dictionary.com and looked up the following:

For More Accurate Software Estimates, Avoid Hidden Risk Buffers

A colleague of mine recently sent me a blog post explaining the difference between project contingency and padding.  The blogger made the distinction that padding is what often gets added to an individual’s estimate of the effort required to perform a task (in her example, a software development task) to account for project ‘unknowns’.  The estimator determines the most likely required effort, then pads it with a little more effort in order to arrive at an estimate to which he or she can commit.  Thus, padding represents an undisclosed effort reserve (and implied schedule reserve) to buffer against potential risk.  Contingency reserve, she explains, is “an amount of money in the budget or time in the schedule seen and approved by management.  It is documented.  It is measured and therefore managed.”  Ms. Brockmeier is correct in promoting contingency as the better management tool.  The challenge is having a method to measure and document this contingency and the known unknowns it is buffering.

Implicit Risk Buffer

Padding is a natural result of bottom-up, effort-based estimation techniques.  Estimating low-level WBS elements creates more opportunity for padding, because the number of unknowns grows with the task list.  The estimator is consciously or unconsciously assessing the risk of each task, considering its dependencies and complexities.  Implicit in the effort estimate are: (1) an assessment of product size and complexity, and (2) a productivity assumption.
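One way to surface that implicit buffer is to estimate at a most-likely level, commit at a higher-confidence level, and record the difference as documented contingency. The sketch below is illustrative only: the lognormal effort distribution and its parameters are assumptions, not a QSM calibration.

```python
# Illustrative only: document contingency as the gap between the most likely
# estimate (P50) and the commitment level (here P80). The lognormal spread
# is an assumption, not a calibrated uncertainty model.
from random import lognormvariate

# Hypothetical effort uncertainty: median around 100 person-months.
samples = sorted(lognormvariate(4.6, 0.25) for _ in range(10_000))
percentile = lambda q: samples[int(q * (len(samples) - 1))]

p50, p80 = percentile(0.50), percentile(0.80)
print(f"Most likely effort (P50): {p50:.0f} person-months")
print(f"Commitment level (P80):   {p80:.0f} person-months")
print(f"Documented contingency:   {p80 - p50:.0f} person-months")
```

Because the buffer is computed and written down, it can be seen, approved, measured, and therefore managed, which is exactly what distinguishes contingency from padding.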

Managing Software Risk via "Whether Forecasting"

Here's a risk question for you:

If today’s weather forecast predicts a 40% chance of rain and it actually rains, was that forecast “inaccurate”?  If the weather channel predicts a 40% chance of rain, but the sun shines all day, was the forecast “accurate”?

Software project estimates, like weather forecasts, should always be accompanied by some explicit attempt to quantify the risk that the actual outcome will differ significantly from the estimated outcome.  Estimates delivered without explicit risk assessment are more like targets: goals someone wants to achieve.

It turns out that whether it rains or not is actually a poor measure of forecasting accuracy.  A 40% chance of rain forecast is accurate if, on 100 days where the forecast said 40%, it rained on 40 days and didn’t rain on 60. Likewise, the accuracy of an individual software project estimate is not determined by whether the project actually achieves its committed estimate.  We can see this with a simple example: if SLIM-Estimate predicts only a 10% probability of achieving a schedule but the organization decides to commit to a plan with a 90% chance of failing, we could actually consider the estimate to be “more accurate” if the software project fails than if it is successful.
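A minimal sketch of that calibration idea, using hypothetical forecast data: group forecasts by their stated probability and compare each group's stated probability with how often the outcome actually occurred.

```python
# A minimal sketch of the calibration idea above: a "40% chance" forecast is
# judged over many forecasts, not on a single day. Data below is hypothetical.
from collections import defaultdict

# (stated probability, did it actually rain / did the project hit its date)
forecasts = [(0.4, True), (0.4, False), (0.4, False), (0.4, True),
             (0.4, False), (0.9, True), (0.9, True), (0.9, False)]

buckets = defaultdict(list)
for stated, happened in forecasts:
    buckets[stated].append(happened)

for stated, outcomes in sorted(buckets.items()):
    observed = sum(outcomes) / len(outcomes)
    print(f"stated {stated:.0%} -> observed {observed:.0%} over {len(outcomes)} forecasts")
# Well-calibrated forecasts show observed frequencies close to the stated ones.
```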

What most organizations are really looking for is not so much accurate estimates as accurate commitments, where the commitment is based on the estimate plus an appropriate level of risk resourcing.  But even with the best contingency planning, there is always some chance of “failure”; it is just much smaller than if we don’t resource risk.

Why Does Project Size Grow?

Seen from an airplane window, the ground looks almost two-dimensional.  Only the largest features (cities, rivers, and mountain ranges) stand out against the background.  The true complexity of the terrain only becomes apparent after we land and have to navigate through congested traffic, bad weather, and one-way streets.

Software projects are similar.  Staffing and budget plans are often based on high-level requirements that tell us what needs to be done, but not how to accomplish it.  As business objectives are translated into the actions that must be taken and the work products that must be produced, the size of the project, whether expressed in lines of code, function points, or RICEF objects, grows, and with it the time and effort required.

This level of detail cannot be seen at the requirements stage; it is effectively invisible.  But it can be accounted for and managed.  Software consultant Capers Jones has stated that software projects grow 1.5% per month.  A QSM study of IT projects found that 90% of them were larger than initially estimated, with an average size growth of 15%.  This bias toward size growth was not the result of poor estimating: at the time the initial estimates were done, the components that accounted for the growth were simply not apparent.
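As a back-of-the-envelope illustration, those growth figures can be applied to an initial size estimate; the starting size and duration below are hypothetical, and the monthly figure is treated here as compounding growth.

```python
# A small sketch of accounting for expected size growth, using the figures
# cited above (1.5% per month; 15% average growth in the QSM IT study).
# The initial size and schedule are hypothetical.
initial_size = 2000            # e.g. function points at the requirements stage
months = 12                    # planned development duration

monthly_growth = initial_size * (1 + 0.015) ** months   # ~1.5%/month, compounded
average_growth = initial_size * 1.15                     # QSM study average

print(f"Initial size:            {initial_size:,.0f}")
print(f"With 1.5%/month growth:  {monthly_growth:,.0f}")
print(f"With 15% average growth: {average_growth:,.0f}")
```

Building an allowance like this into the estimate acknowledges growth that cannot yet be seen, rather than treating it as an estimating failure later.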
