Estimation

Software Cost Estimation Article in The DACS Journal

The February issue of the DACS Journal of Software Technology focuses on Software Cost Estimation and Systems Acquisition. My contribution, which you can read here, addresses the challenges faced by estimators and the value of establishing a historical baseline to support smarter planning, counter unrealistic expectations, and maximize productivity.

Using several recent studies, my paper addresses the following questions:

  • What is estimation accuracy, and how important is it really?
  • What is the connection between the Financial Crisis of 2008 and software estimation?
  • Why do small team projects outperform large team projects?
  • How can you find the optimal team size for your project?

Read the full article.

Part III: The Caveats

In Part 1 of How Much Estimation? we noted that there is an optimal amount of time and effort to spend producing an estimate, based on the target cost of the project and the business practice being supported.

In Part 2: Estimate the Estimate, we saw the formula for this optimal spend (as measured at NASA): the Cost of Estimate is calculated as the Target Cost raised to the power 0.35 (approximately the cube root of the Target Cost), scaled by a factor that defines the business practice (either by early lifecycle phase or perhaps by the “expected precision” of the estimate).  That factor is linear and ranges from 24 to 115.
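
To make the arithmetic concrete, here is a minimal sketch in Python.  It assumes the business-practice factor acts as a simple multiplier on the power term and that the target cost and the cost of the estimate are expressed in the same (unspecified) monetary units; the function and parameter names are illustrative, not part of the NASA formulation.

    # A minimal sketch of the calculation described above (Python).
    # Assumptions: the business-practice factor is a plain multiplier on the
    # power term, and target cost and estimation cost share the same monetary
    # units. Names are illustrative, not part of the NASA formulation.

    def cost_of_estimate(target_cost: float, practice_factor: float) -> float:
        """Suggested spend on producing the estimate itself."""
        if not 24 <= practice_factor <= 115:
            raise ValueError("practice_factor is expected to fall between 24 and 115")
        return practice_factor * target_cost ** 0.35

    # Example: a project with a $1,000,000 target cost.
    print(cost_of_estimate(1_000_000, 24))    # roughly 3,000
    print(cost_of_estimate(1_000_000, 115))   # roughly 14,500

Under those assumptions, a $1,000,000 project would warrant very roughly $3,000 to $14,500 of estimation spend, depending on the business practice being supported.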

Those Caveats!

I mentioned that there were caveats with the calculation.  Here they are:

Part II: Estimate the Estimate

In Part 1 of How Much Estimation, we observed that spending either too much or too little time and effort on estimating is less than optimal.  Combining:

  • The cost of producing an estimate—which is a function of the number of people working on the estimate and how long they work
  • The cost of variance in the results of the estimate—that is, how much the estimate varies from the actuals eventually experienced and what that variance will likely cost the project.  This is typically a function of the number of unknowns at the time of estimating that the project cannot easily adjust for and that will require additional unplanned time, effort, and staff.

We get a U-shaped curve, at the bottom of which is the optimal time: we’ve spent enough time and effort to minimize the sum of the cost of estimate and the cost of variance.
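
To visualize the shape of that trade-off, here is a toy sketch with invented cost functions; it is not a QSM or NASA model, it simply sums the two costs over a range of estimation efforts and picks the minimum.

    # Toy illustration of the U-shaped total-cost curve (Python). Both cost
    # functions are invented purely to show the shape of the trade-off; they
    # are not QSM or NASA models.

    def estimating_cost(effort_days: float) -> float:
        # Grows with the effort spent producing the estimate.
        return 500.0 * effort_days

    def variance_cost(effort_days: float) -> float:
        # Shrinks as additional estimation effort resolves unknowns.
        return 200_000.0 / (1.0 + effort_days)

    efforts = [d / 2 for d in range(1, 121)]   # 0.5 to 60 days of estimating
    total = {d: estimating_cost(d) + variance_cost(d) for d in efforts}
    best = min(total, key=total.get)
    print(f"Optimal estimation effort: {best} days, total cost: {total[best]:,.0f}")

With these made-up curves the minimum lands at about 19 days; for a real project the curves themselves are unknown, which is exactly the problem.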

The question is: how do we calculate this point?  It will not be the same for a very large, complex project as for a very small, simple one.  Also, we don’t want a complicated, time-consuming approach to calculating the cost of the estimate—it should be quick and simple.

NASA’s Deep Space Network (DSN) project developed a mechanism for this calculation based on two simple parameters:

Target Cost of Project

This is the goal cost of the project as first envisaged in the project concept.  It is NOT the estimated cost of the project (which hasn’t been calculated yet).  Projects on which we expect and plan to spend a lot of money should clearly have more time and effort spent on estimating, simply because more is at risk.

How Much Estimation?

How much time and effort should we spend to produce an estimate?  Project estimation, like any other activity, must balance the cost of the activity with the value produced.

There are two extreme situations that organizations must avoid:

The Drive-By Estimate

The drive-by estimate occurs when a senior executive corners a project manager or developer and requires an immediate answer to the estimation questions: “When can we get this project done?”  “How much will it cost?” and “How many people do we need?” (the equally pertinent questions: “How much functionality will we deliver?” and “What will the quality be?” seem to get much less attention).

Depending on the pressure applied, the estimator must cough up some numbers rather quickly. Since the estimate has not been given much time and attention, it is usually of low quality. Making a critical business decision based on such a perfunctory estimate is dangerous and often costly.

The Never-Ending Estimate

Less common is the estimation process that goes on and on.  In order to make an estimate “safer”, an organization may seek to remove uncertainty from the project and from the data used to create the estimate. One way to do that is to analyze the situation more and more. Generally, the more time and effort we spend producing an estimate, the more precise and defensible the result. The trouble is that the work we have to do to remove all the uncertainty is pretty much the same work we have to do to run the project. So companies can end up in the odd situation where, in order to decide whether they should do the work and what resources they should allocate to the project, they actually do the work and use up the resources.

What If? The Power of the Question

After being away from QSM and the software world for three years, I was blown away by SLIM v8.0's dynamic product integration. I knew it was coming, yet I was still impressed by the simplicity and power of analysis enabled by real-time data and tool links across the SLIM Suite, which free managers to focus on the important program issues.

SLIM-MasterPlan is the center of the SLIM Suite product integration.  It extends the existing program management features for aggregating multiple SLIM-Estimate projects and ancillary tasks with two new capabilities:

  • Linking SLIM-Control workbooks to provide real-time project tracking and control at the program level 
  • Performing What If analysis at this higher management view to consider a wider range of potential outcomes.

The What If analysis feature is what I want to highlight.

A good personal development coach knows the "power of the question."  Questions lead to discovery, innovation, and action that brings about positive change.  Better questions lead to better answers.  SLIM's power and distinction have always been the fast, easy evaluation of the impact of change and the exploration of the realm of possible outcomes.  That's what we are doing when we ask ourselves "What If…?" (or when our boss asks us - and we had better know the answer!).  SLIM's solution logs make it easy to compare estimates, plans, and forecasts to alternative solutions, QSM trends, and your historical project database.

Why Does Project Size Grow?

Seen from an airplane window, the ground looks almost two-dimensional.  Only the largest features, such as cities, rivers, and mountain ranges, stand out against the background.  The true complexity of the terrain only becomes apparent after we land and have to navigate through congested traffic, bad weather, and one-way streets.

Software projects are similar.  Staffing and budget plans are often based on high level requirements that tell us what needs to be done, but not how to accomplish it.  As business objectives are translated into the actions that need to be taken and the work products that must be produced, the size of the project, whether expressed in lines of code, function points, or RICEF objects, increases along with the time and effort required to create them.

This level of detail cannot be seen at the Requirements stage; it is invisible.  But it can be accounted for and managed.  Software consultant Capers Jones has stated that software projects grow by 1.5% per month.  A QSM study of IT projects found that 90% of those projects were larger than initially estimated; the average size growth was 15%.  This bias towards size growth was not the result of poor estimating.  At the time the initial estimates were done, the components that accounted for the size growth were simply not apparent.
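
As a back-of-the-envelope illustration (the compounding assumption, the 10-month schedule, and the starting size are mine, not figures from the QSM study), building an explicit growth allowance into a size estimate might look like this:

    # Back-of-the-envelope sketch (Python): compound the 1.5%-per-month growth
    # figure over an assumed schedule to size a growth allowance. The 10-month
    # duration and 100,000 SLOC starting size are made-up examples.

    def expected_size(initial_size: float, months: float,
                      monthly_growth: float = 0.015) -> float:
        """Project size after `months` of compounded monthly growth."""
        return initial_size * (1.0 + monthly_growth) ** months

    grown = expected_size(100_000, 10)
    print(f"{grown:,.0f} SLOC ({grown / 100_000 - 1:.0%} growth)")   # ~116,000 SLOC, ~16%

Over a 10-month schedule the 1.5% monthly figure compounds to roughly 16%, in the same ballpark as the 15% average growth the QSM study observed.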

Two Tools Are Better Than One

Have you ever been excited to discover a new use for something familiar, like learning that lighter fluid can be used to remove ink stains from your clothes?  I recently discovered a way to leverage the tie between SLIM-Estimate and SLIM-DataManager that I was previously unaware of.  

My limited view of SLIM-DataManager as a tool for historical data and SLIM-Estimate as a tool for software project estimation had constrained my creativity in applying the rich set of capabilities in the entire SLIM tools suite. I recently watched a more experienced SLIM user apply both applications together to model a historical project for which very little data was available. Here is a description of the situation.

Scenario: 

You have gathered metrics from a completed project to serve as the basis of estimation for your next project.  Software size, lifecycle effort, lifecycle duration (phases 1-3), and defects are known, but you do not have a breakdown of the individual phase data.  How can you best model this project and capture the results in SLIM-DataManager?

Solution A: 

Introducing the SLIM-Estimate Certification Program

QSM is pleased to announce the SLIM-Estimate® Certification Program.  Specifically designed to help our customers ensure the technical excellence of their SLIM users, this program increases the business value of our software project estimation tools to your organization. 

How Certification Is Achieved:

In order to be awarded certification, a user must demonstrate competence in the following areas:

Replay Now Available for "Best in Class SLIM Estimation Processes for Package Implementations"

If you missed our webinar, Best in Class SLIM Estimation Processes for Package Implementations, a replay is now available.

To be able to estimate package implementations, we need to be able to size them and support productivity assumptions with relevant data. Presented by Keith Ciocco, this webinar demonstrates package implementation sizing processes and how to calibrate SLIM to package implementation project trends.

As Vice President of QSM, Keith has more than 23 years of experience working in sales and customer service, with 15 of those years spent at QSM. Keith’s primary responsibilities include managing business development, existing client relations, customer retention and response.

View the replay.

Webinar: Best in Class SLIM Estimation Processes for Package Implementations

On Thursday, June 16 at 1:00 PM EDT, QSM will host a webinar focused on estimating package implementations.

To be able to estimate package implementations, we need to be able to size them and support productivity assumptions with relevant data. Presented by Keith Ciocco, this webinar demonstrates package implementation sizing processes and how to calibrate SLIM software estimation tools to package implementation project trends.

As Vice President of QSM, Keith has more than 23 years of experience working in sales and customer service, with 15 of those years spent at QSM. Keith’s primary responsibilities include managing business development, existing client relations, customer retention and response.

Watch the replay!
