Software Estimation Best Practices

Blogs

"Building an Estimation Center of Excellence" Webinar Replay and Q&A Highlights

If you were unable to attend "Building an Estimation Center of Excellence," the webinar replay and slides are now available. Here are the Q&A highlights:

How do you handle estimating change requests (scope creep)? Do you estimate the entire project again, or just the impact of the changes?

It depends on where we are in the project lifecycle. If we are still fairly early on (somewhere between the feasibility assessment and the refined estimate), I would add the changes to my sizing assumptions and re-estimate the project. If the changes arrive later, once the system is under construction, I would use adaptive forecasting and fold them in within the context of everything else that must be built for the release. The impact is bigger late in the lifecycle, when everything is integrated and we are into testing, than early on, when little has been constructed. QSM's forecasting capabilities can tell us the impact on schedule and cost.

Should the center of excellence estimate all projects regardless of size, or if the project is small, then have the project teams estimate it?

Blog Post Categories 
Webinars Estimation

Updated Performance Benchmark Table

The latest version of QSM’s Performance Benchmark Table is live!

QSM is excited to announce the release of the latest version of its Performance Benchmark Table. Last updated in 2009, the table provides a high-level reference for benchmarking and estimating IT, Engineering, and Real Time Systems. It displays industry-average duration, effort, staff, and SLOC (or FP) per person month for the full range of project sizes encompassed by each trend group.

The results were derived from a database of 1,115 high- or moderate-confidence projects completed between 2008 and 2012. Sixteen countries and 52 different languages are represented in this sample. In addition to the industry average, minimum and maximum values are provided for each metric to give a range of possible results.

The project sizes differ somewhat from the previous version to accommodate the new range of sizes present in the data. Rather than using the same project sizes across trend groups, we selected sizes specific to each trend. Since Business projects are typically smaller than Engineering or Real Time projects, this lets readers select a size relevant to the type of project they're estimating or benchmarking.

This tool can be particularly useful to developers and/or project managers who are new to estimation or do not have historical project data.

Blog Post Categories 
Benchmarking Estimation

How Does Uncertainty Expressed in SLIM-Estimate Relate to Control Bounds in SLIM-Control? Part III

In the previous articles in this series I presented SLIM-Estimate’s use of uncertainty ranges for size and productivity to quantify project risk, and how to build an estimate that includes contingency amounts that cover your risk exposure.  In this post I will identify the project work plan reports and charts that help you manage the contingency reserve.  You will see how to use SLIM-Control bounds and default metrics to keep your project on track. 

Understand the project work plan documents.

In our example so far, you have estimated a project to deliver a software product in 11.7 Months, with a budget of $988,223. This estimate reflects the 80% probability solution, providing a contingency reserve, or risk buffer, on both effort and duration. Your work plan is based upon SLIM-Estimate's 50% solution: 11 Months and $755,400. Thus, the uncertainty about size and productivity is accounted for; it is built into your plan. The probability that you will meet the project goals is driven by many factors, too many to measure. You can only manage what is within your control, and escalate issues so they can be resolved in a timely manner.
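The arithmetic behind the reserve is simple: the buffer is the gap between the committed (80% probability) solution and the 50% work plan. A trivial sketch using the figures from this example:

```python
# Illustrative sketch: the contingency reserve is the gap between the
# committed estimate (80% probability solution) and the work plan
# (50% solution). Figures are taken from the example in this post.

def contingency(committed: float, plan: float) -> float:
    """Return the risk buffer between the committed value and the plan."""
    return committed - plan

# Duration: committed at 11.7 months, planned at 11 months.
duration_buffer = contingency(11.7, 11.0)      # 0.7 months of schedule reserve

# Cost: committed at $988,223, planned at $755,400.
cost_buffer = contingency(988_223, 755_400)    # $232,823 of budget reserve

print(f"Schedule reserve: {duration_buffer:.1f} months")
print(f"Budget reserve:   ${cost_buffer:,.0f}")
```

Spending against this buffer, rather than against the plan itself, is what the SLIM-Control bounds discussed below help you track.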

Managing the project well begins with a solid understanding of the detailed project plan.  SLIM-Estimate provides several default and customizable charts and reports that document the plan.  Here are a few key reports to study in order to identify the core metrics you will want to monitor closely.

Blog Post Categories 
SLIM-Control SLIM-Estimate

Webinar - Building an Estimation Center of Excellence

On Thursday, June 13, at 1:00 PM EDT, Larry Putnam, Jr. will present Building an Estimation Center of Excellence.

The pressure to succeed in software development is higher than ever - the current economic climate demands we do more with less, there is fierce global and domestic competition, time-to-market expectations are high, and your company's reputation is on the line. When projects fail, the failure to meet expectations is more often an estimation or business decision failure than a production or execution issue. In this webinar, industry expert Larry Putnam, Jr. takes you through the key elements and step-by-step process for setting up an estimation center of excellence that will ensure your projects succeed.

Larry Putnam, Jr. has 25 years of experience using the Putnam-SLIM Methodology. He has participated in hundreds of estimation and oversight service engagements, and is responsible for product management of the SLIM Suite of software measurement tools and customer care programs. Since becoming Co-CEO, Larry has built QSM's capabilities in sales, customer support, product requirements and most recently in creating a world class consulting organization. Larry has delivered numerous speeches at conferences on software estimation and measurement, and has trained - over a five-year period - more than 1,000 software professionals on industry best practice measurement, estimation and control techniques and in the use of the SLIM Suite.

Watch the replay!

Blog Post Categories 
Webinars Estimation

How does uncertainty expressed in SLIM-Estimate relate to Control Bounds in SLIM-Control? Part II

Several months ago, I presented SLIM-Estimate’s use of uncertainty ranges for size and productivity to quantify project risk.  Estimating these two parameters using low, most likely, and high values predicts the most probable effort and time required to complete the project.  This post shows you how to use SLIM-Estimate’s probability curves to select the estimate solution and associated work plan that includes contingency amounts appropriate to your risk.

Begin with an unconstrained solution

The default solution method used for new estimates, whether you are using the Detailed Method or another solution option, is what we call an unconstrained solution.  Just as it sounds, no limits have been placed on the effort, schedule, or staffing SLIM-Estimate can predict.  It will calculate the resources required to build your product (size) with the capabilities of your team (PI).  Assuming you have configured SLIM-Estimate to model your life cycle and based your inputs on historical data, you have produced a reasonable, defensible estimate.  
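SLIM-Estimate's solution math is its own, but the way a probability curve falls out of low, most likely, and high inputs can be sketched with a generic Monte Carlo simulation. Everything below (the triangular distributions, the size and productivity ranges) is an illustrative assumption, not QSM's algorithm:

```python
# Generic Monte Carlo illustration (NOT SLIM-Estimate's actual math) of how
# low / most likely / high inputs produce a probability curve from which
# 50% and 80% solutions can be read.
import random

random.seed(42)

def simulate_effort(n=100_000):
    samples = []
    for _ in range(n):
        size = random.triangular(400, 900, 600)    # function points: low, high, mode
        fp_per_pm = random.triangular(6, 14, 10)   # productivity: low, high, mode
        samples.append(size / fp_per_pm)           # effort in person-months
    return sorted(samples)

efforts = simulate_effort()

def percentile(sorted_vals, p):
    """Read the p-th percentile off the sorted sample (the probability curve)."""
    return sorted_vals[int(p / 100 * (len(sorted_vals) - 1))]

p50 = percentile(efforts, 50)   # work-plan effort
p80 = percentile(efforts, 80)   # committed effort, including the risk buffer
print(f"50% solution: {p50:.0f} PM, 80% solution: {p80:.0f} PM, "
      f"buffer: {p80 - p50:.0f} PM")
```

Reading the 50th and 80th percentiles off the sorted samples mirrors selecting a 50% work plan and an 80% committed estimate from the probability curve.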

Solution Panel

Blog Post Categories 
SLIM-Control SLIM-Estimate

QSM Announces Latest Update to the QSM Project Database

We are pleased to announce the latest update to the QSM Project Database! The 8th edition of this database includes more than 10,000 completed real-time, engineering, and IT projects from 19 different industry sectors.

The QSM Database is the cornerstone of our business. We leverage this project intelligence to keep our products current with the latest tools and methods, to support our consulting services, to inform our customers as they move into new areas, and to develop better predictive algorithms. It ensures that the SLIM Suite of tools is providing customers with the best intelligence to identify and mitigate risk and efficiently estimate project scope, leading to projects that are delivered on-time and on-budget. In addition, the database supports our benchmarking services, allowing QSM clients to quickly see how they compare with the latest industry trends.

To learn more about the project data included in our latest update, visit the QSM Database page.

Blog Post Categories 
QSM Database

New Article: Data-Driven Estimation, Management Lead to High Quality

Software projects devote enormous amounts of time and money to quality assurance. It's a difficult task, considering most QA work is remedial in nature - it can correct problems that arise long before the requirements are complete or the first line of code has been written, but has little chance of preventing defects from being created in the first place. By the time the first bugs are discovered, many projects are already locked into a fixed scope, staffing, and schedule that do not account for the complex and nonlinear relationships between size, effort, and defects. 

At this point, these projects are doomed to fail, but disasters like these can be avoided. When armed with the right information, managers can graphically demonstrate the tradeoffs between time to market, cost, and quality, and negotiate achievable deadlines and budgets that reflect their management goals. 

Leveraging historical data from the QSM Database, QSM Research Director Kate Armel equips professionals with a replicable, data-driven framework for future project decision-making in an article recently published in Software Quality Professional.

Read the full article here.

Blog Post Categories 
Articles Data Quality

Let's Get Serious About Productivity

Recently I conducted a study of projects sized in function points that covers projects put into production from 1990 to the present, with a focus on those completed since 2000. For an analyst like me, one of the fun things about a study like this is that you can identify trends and then consider possible explanations for why they are occurring. A notable trend from this study of over 2,000 projects is that productivity, whether measured in function points per person month (FP/PM) or hours per function point, is about half of what it was in the 1990 to 1994 time frame.

Median Productivity

           1990-1994  1995-1999  2000-2004  2005+
FP/PM          11.1      17         9.21     5.84
FP/Mth         17.1      63.92      9.74    22.10
PI             15.3      16.4      13.9     10.95
Size (FP)       394      167       205       144

Part of this decline can be attributed to a sustained decrease in average project size over time. The overhead on small projects doesn't scale down with their size, so they are inherently less productive. Technology has changed, too. But aren't the tools and software languages of today more powerful than they were 25 years ago?
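The two productivity measures used in this study come from the same size and effort figures. The sketch below is illustrative only; the 152 hours-per-person-month conversion and the 24.7 person-month effort figure are assumptions, not QSM data:

```python
# Sketch of the two productivity measures discussed above. The 152-hour
# person-month is an illustrative conversion factor, not a QSM constant.
HOURS_PER_PM = 152

def fp_per_pm(size_fp: float, effort_pm: float) -> float:
    """Function points delivered per person-month of effort."""
    return size_fp / effort_pm

def hours_per_fp(size_fp: float, effort_pm: float) -> float:
    """Labor hours consumed per function point delivered."""
    return (effort_pm * HOURS_PER_PM) / size_fp

# A hypothetical 144 FP project (the 2005+ median size) taking 24.7 PM:
print(round(fp_per_pm(144, 24.7), 2))     # ≈ 5.83 FP/PM
print(round(hours_per_fp(144, 24.7), 1))  # ≈ 26.1 hours per function point
```

Note the two measures are reciprocals once the hours-per-person-month conversion is fixed, which is why the study can quote either one interchangeably.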

Blog Post Categories 
Productivity Project Management

They Just Don't Make Software Like They Used to… Or do they?

With the release of SLIM-Suite 8.1 quickly approaching, I thought I'd take a moment to share a preview of the updated QSM Default Trend Lines and how they affect your estimates.  In this post I wanted to focus on the differences in quality and reliability between 2010 and 2013 for the projects in our database.  Since our last database update, we've included over 200 new projects in our trend groups.

Here are the breakouts of the percent increases in the number of projects by Application Type:

  • Business Systems: 14%
  • Engineering Systems: 63%
  • Real Time Systems: 144%

Below you will find an infographic outlining some of the differences in quality between 2010 and 2013.

Changes in Software Project Quality between 2010 and 2013

From the set of charts above, we can see trends emerging that could indicate changes in quality between 2010 and 2013.  Looking at the data, two distinct stories are being told:

1. The Quality of Engineering Systems has Increased

Blog Post Categories 
Software Reliability Quality

Updated Function Point Gearing Factor Table

Version 5.0 of QSM's Function Point Gearing Factor table is live!

The Function Point Gearing Factor table provides average, median, minimum, and maximum gearing factors for recently completed function point projects. A gearing factor is the average number of basic work units in your chosen function unit. The table was originally designed as a common reference point for comparing different sizing metrics by mapping them to the smallest sizing unit common to all software projects. QSM recommends that organizations collect both code counts and final function point counts for completed software projects and use this data for estimates. Where no completed project data is available for estimation, the table gives customers a starting point for choosing an appropriate gearing factor for their programming language.
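As an illustration of how a gearing factor is applied, the sketch below converts a function point count to a SLOC-based size. The gearing values here are hypothetical placeholders, not values from QSM's table:

```python
# Sketch: applying a gearing factor (average SLOC per function point) to
# derive a SLOC-based size from a function point count. The factors below
# are hypothetical placeholders, not values from QSM's table.
GEARING = {"java": 55, "cobol": 100, "c": 120}   # illustrative SLOC per FP

def fp_to_sloc(fp_count: int, language: str) -> int:
    """Estimate total SLOC from a function point count via the gearing factor."""
    return fp_count * GEARING[language.lower()]

print(fp_to_sloc(300, "java"))   # 300 FP * 55 SLOC/FP = 16,500 SLOC
```

In practice, the factor should come from your own completed projects whenever possible; a published table value is a fallback, not a substitute for historical data.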

For this version of the table, we looked at 2,192 recently completed function point projects out of the 10,000+ in QSM's historical database. The sample included 126 different languages, 37 of which had enough data to be included in the table. Interestingly, this year we added three new languages: Brio, Cognos Impromptu Scripts, and Cross Systems Products (CSP).

One trend we noticed is that, in general, the range for gearing factors has decreased over time. Similarly, the average and median values have decreased, which we attribute to having more data to work with.

Read the full press release or visit the new table!

Blog Post Categories 
QSM News Function Points