Software Estimation Best Practices

Blogs

Webinar Replay Now Available: Shifting to Agile Methods - The Keys for Long-Term Success

If you were unable to attend our webinar, Shifting to Agile Methods - The Keys for Long-Term Success, a replay is now available. 

Changes to the software development process, such as moving toward Agile methods, must demonstrate sustainable results over time versus just short-term wins.  There are two keys to reaching long-term success that should be considered up front – the new process must be repeatable and measurable. 

In this session, AccuRev’s Chris Lucca and QSM’s Larry Putnam, Jr. explore these two keys to success.  

Specifically, they cover:

  • The state of software development projects yesterday versus today and the impact to the software development process
  • The techniques and tools that can help a team to build a process that is repeatable and scalable, even across a distributed team
  • Which metrics and measurement processes are important to measuring the results and improvements of implementing repeatable and scalable processes
  • How to use metrics to estimate project schedules, resources and reliability, and monitor project progress and forecast completion
  • Ways to benchmark the results at project completion for time to market, cost performance and reliability – all of which provide the business case for continued investments in technology and repeatable and scalable processes

View the webinar replay.

View recordings of all our past webinars.

Blog Post Categories 
Webinars Agile

Part III: Finding the Optimal Team Size for Your Project

In part one of our team size series, we looked at Best and Worst in Class software projects and found that using small teams is a best practice for top performing projects. Part two looked at differences in cost and quality between small and large team projects and found that small teams use dramatically less effort and create fewer defects.  But simply knowing that small teams perform better doesn’t tell us how small a team to use. Most software metrics scale with project size, and team size is no exception. Management priorities must also be taken into account. Small projects can realize some schedule compression by using slightly larger teams but for larger projects, using too many people drives up cost but does little to reduce time to market:

Larger teams create more defects, which in turn beget additional rework… These unplanned find/fix/retest cycles take additional time, drive up cost, and cancel out any schedule compression achieved by larger teams earlier in the lifecycle.

In the spring of 2011, QSM consultant Don Beckett conducted a study that takes both system size and management priorities into account. He divided 1920 IT projects into four size quartiles. Using median effort productivity (SLOC/PM) and schedule productivity (SLOC/Month) values for each size bin, he then isolated top performing projects for schedule, effort, and balanced performance (better than average for effort and schedule):
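The binning and classification steps above can be sketched in a few lines of Python. The sample data, field layout, and function names below are illustrative inventions (QSM's actual dataset and cutoffs are not public); the sketch uses the bin median as the "top performer" threshold for both productivity measures:

```python
from statistics import median

# Hypothetical mini-sample: (size_sloc, effort_pm, duration_months).
projects = [
    (10_000, 20, 6), (12_000, 35, 9), (50_000, 80, 12),
    (55_000, 200, 20), (120_000, 300, 18), (130_000, 900, 30),
    (400_000, 1500, 30), (450_000, 4000, 48),
]

def quartile_bins(projects):
    """Split projects into four equal-count bins, ranked by size."""
    ranked = sorted(projects, key=lambda p: p[0])
    n = len(ranked)
    return [ranked[i * n // 4:(i + 1) * n // 4] for i in range(4)]

def classify(bin_projects):
    """Flag projects beating their bin's median effort and/or schedule productivity."""
    eff_prod = [size / effort for size, effort, _ in bin_projects]      # SLOC/PM
    sched_prod = [size / months for size, _, months in bin_projects]    # SLOC/Month
    eff_med, sched_med = median(eff_prod), median(sched_prod)
    out = []
    for p, ep, sp in zip(bin_projects, eff_prod, sched_prod):
        out.append({
            "size": p[0],
            "top_effort": ep > eff_med,
            "top_schedule": sp > sched_med,
            "balanced": ep > eff_med and sp > sched_med,   # better on both counts
        })
    return out
```

Binning by size quartile first, then comparing productivity within each bin, keeps small projects from being compared against large ones, which matters because most software metrics scale with size.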

Effort vs. Schedule

Blog Post Categories 
Team Size

Tuning Effort for Best in Class Analysis and Design

After reading Best Projects/Worst Projects in the QSM IT Almanac, a SLIM-Estimate® user noted that the Best in Class Projects expended around 28% of their total project effort in analysis and design (SLIM Phase II) compared to 10% for the Worst in Class Projects. She wanted to know how she could tune her SLIM-Estimate templates to build in the typical best in class standard for Analysis and Design.

In SLIM-Estimate, effort and duration for phases I and II are calculated as a percentage of Phase III time and effort. To create a template for estimating phases II and III that will automatically allocate 28% of total project effort to analysis and design (Phase II), follow these simple steps.

  • From the Estimate menu, select Solution Assumptions.  Make sure the “Include” check boxes for Phases II and III are selected.  Then click on the Phase Tuning tab.
  • Click on the tab for Phase II.  (If you have previously customized the phase names, the default name for Phase II will reflect that).
  • Click on the Manual button under Effort, and enter 28% for the effort percent.

That’s it. Your estimates based on this template will now automatically allocate 28% of total project effort to Analysis and Design (Phase II).

This procedure assumes that your estimates will be for SLIM Phases II and III, which, we have found, is the typical scope for most project estimates. However, if your estimates include Phases I and/or IV, you may have to increase the effort percent a bit to achieve the desired result.
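The arithmetic behind that caveat can be made concrete. Assuming the estimate covers only Phases II and III, a 28% Phase II share of the two-phase total implies Phase II effort of 0.28/0.72 ≈ 38.9% of Phase III effort (the function name and sample figures below are hypothetical, not SLIM-Estimate's internal calculation):

```python
def phase2_effort(phase3_effort_pm, phase2_share=0.28):
    """Phase II effort, assuming the estimate covers only Phases II and III
    and Phase II should be `phase2_share` of that two-phase total."""
    # total = II + III and II = share * total  =>  II = share/(1 - share) * III
    return phase2_share / (1 - phase2_share) * phase3_effort_pm

# Example: a 72 person-month Phase III implies about 28 PM of analysis and
# design, for a 100 PM two-phase total with Phase II at exactly 28%.
p2 = phase2_effort(72.0)
total = p2 + 72.0
```

This also shows why adding Phases I and/or IV to the estimate dilutes the percentage: they enlarge the total, so holding Phase II at 28% of the grand total requires entering a somewhat higher effort percent.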

Blog Post Categories 
SLIM-Estimate Tips & Tricks Effort

Webinar: Shifting to Agile Methods - The Keys for Long-Term Success

On Thursday, February 16, 2012 at 1:00 PM EST, QSM will co-host "Shifting to Agile Methods - The Keys for Long-Term Success" together with AccuRev.

Changes to the software development process, such as moving toward Agile methods, must demonstrate sustainable results over time versus just short-term wins.  There are two keys to reaching long-term success that should be considered up front – the new process must be repeatable and measurable. 

In this session, AccuRev’s Chris Lucca and QSM’s Larry Putnam, Jr. will explore these two keys to success.  

Blog Post Categories 
Webinars Agile

Part II: Small Teams Deliver Lower Cost, Higher Quality

This is the second post in a three part investigation of how team size affects project performance, cost, quality, and productivity. Part one looked at cost and schedule performance for Best in Class and Worst in Class IT projects. For this study, Best in Class projects were those that delivered more than one standard deviation faster and used more than one standard deviation less effort than the industry average for projects of the same size. A key characteristic of these top performing projects was the use of small teams: median team size for best in class projects was 4 FTEs (full time equivalent) people versus 17 FTEs for the worst performers.

What is the relationship between team size and management metrics like cost and defects? To find out, I recently looked at 1060 medium and high confidence IT projects completed between 2005 and 2011. These projects were drawn from the QSM database of over 10,000 completed software projects. The projects were divided into two staffing bins:

  • Small team projects (4 or fewer FTE staff)
  • Large team projects (5 or more FTE staff)

Average Staff vs. System Size

These size bins bracket the median team size of 4.6 for the overall sample, producing roughly equal groups of projects that cover the same size range. Our best/worst in class study found a 4 to 1 team size ratio between the best and worst performers. 
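Splitting a sample into two bins that bracket the median staff level, then comparing a metric across the bins, is straightforward to sketch. The records and numbers below are invented for illustration, not drawn from the QSM database:

```python
from statistics import median

# Illustrative records: (avg_fte_staff, effort_pm, defects) -- invented values.
sample = [(2, 10, 3), (3, 18, 5), (4, 30, 8),
          (5, 90, 25), (9, 160, 40), (20, 400, 120)]

# Bins bracketing the median team size, as in the study.
small = [p for p in sample if p[0] <= 4]   # small-team bin: 4 or fewer FTEs
large = [p for p in sample if p[0] >= 5]   # large-team bin: 5 or more FTEs

# Compare a management metric (here, effort) across the two bins.
median_effort_small = median(e for _, e, _ in small)
median_effort_large = median(e for _, e, _ in large)
```

Bracketing the sample median produces roughly equal-sized groups, which keeps the comparison from being dominated by whichever bin happens to hold more projects.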

Blog Post Categories 
Team Size

Top Performing Projects Use Small Teams

Last week, Carl Erickson of Atomic Spin referenced a study performed by Doug Putnam several years ago:

A study done by consultancy QSM in 2005 seems to indicate that smaller teams are more efficient than larger teams. Not just a little more efficient, but dramatically more efficient. QSM maintains a database of 4000+ projects. For this study they looked at 564 information systems projects done since 2002. (The author of the study claims their data for real-time embedded systems projects showed similar results.) They divided the data into “small” teams (less than 5 people) and “large” teams (greater than 20 people).

To complete projects of 100,000 equivalent source lines of code (a measure of the size of the project) they found the large teams took 8.92 months, and the small teams took 9.12 months. In other words, the large teams just barely (by a week or so) beat the small teams in finishing the project!

Since then, QSM has performed several studies investigating the relationship between team size and metrics like project scope, productivity, effort/cost, and reliability. The results have been surprisingly consistent regardless of application domain, technology, or year group.  I’ll be reviewing what we found in a series of posts.

Blog Post Categories 
Team Size

QSM Consulting Receives Four “Exceptional” Ratings from Army CPAR

We are pleased to announce that QSM has received four "Exceptional" ratings from the Army Contractor Performance Assessment Report (CPAR). A CPAR assessment is based on objective facts and supported by program and contract management data, such as cost performance reports, customer comments, quality reviews, and earned contract incentives. The Contractor Performance Assessment Reporting System (CPARS) is the Department of Defense (DoD) Enterprise Solution for collection of contractor Past Performance Information (PPI) as required by the Federal Acquisition Regulation (FAR).

Having provided software estimation tools, training and consulting services to the Army since 2004, we very much appreciate their feedback. We try to hire the best people in the business and hold ourselves to a high standard of exceptional performance on all our contracts. These recent CPAR ratings clearly validate the quality of our staff.

Read the full press release.

Blog Post Categories 
Consulting QSM News

QSM Presentation at Better Software West 2012

Paul Below will be presenting "Optimal Project Performance: Factors that Influence Project Duration" at the 2012 Better Software West Conference on Thursday, June 14 at 4:00 PM.

Speedy delivery is almost always a primary project goal or a significant project constraint. To shorten project duration without sacrificing quality or budget, you need to know where to focus the team’s efforts. Mining the QSM database containing many quantitative metrics and numerous qualitative attributes, Paul Below shares the factors that have the greatest influence on project duration. While he’s at it, Paul debunks a couple of myths. For example, many managers consider team skill to be important in determining duration of software projects—not so. The most important factors are certain types of tooling, architecture, testing efficiency, and management/leadership skills, which Paul explores in depth. Learn a technique for normalizing your projects for size by computing the standardized residual of duration. Leave with new facts and data on how to improve your development skills and practices to increase velocity and keep the quality your customers expect.
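One common way to compute a standardized residual of duration is to regress log(duration) on log(size) and scale the residuals by their standard deviation. The talk's exact method isn't spelled out here, so treat this as an assumed, minimal version of the idea; the function name and data are hypothetical:

```python
import math
from statistics import mean, stdev

def standardized_duration_residuals(sizes, durations):
    """Standardized residuals of log(duration) regressed on log(size).

    Normalizes each project's duration for its size: a residual near 0 means
    a typical duration for that size; positive means slower than the size
    trend, negative means faster.
    """
    x = [math.log(s) for s in sizes]
    y = [math.log(d) for d in durations]
    mx, my = mean(x), mean(y)
    # Ordinary least-squares fit of y = intercept + slope * x.
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    residuals = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
    s = stdev(residuals)
    return [r / s for r in residuals]
```

Once every project has a size-normalized residual, projects of very different sizes can be compared on the same scale, which is what makes cross-sample duration analysis meaningful.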

Blog Post Categories 
QSM News

QSM Awarded Contract with Army Cost Center

We are pleased to announce that the Office of the Deputy Assistant Secretary of the Army for Cost and Economics (ODASA-CE) has contracted QSM to provide a comprehensive cost methodology, which includes tools, consulting support, and on-site training. This contract is a continuation of a successful working relationship between QSM and the Army that began in 2004. With the new contract, QSM will be providing estimation assistance on major ACAT 1 programs (major acquisition programs).  QSM will also help establish a metrics database with software data from the Defense Cost and Resource Center (DCARC) and from DASA-CE data collection efforts. This database will be used for program analysis and evaluation. 

Read the full press release here.

 

Blog Post Categories 
Consulting QSM News SLIM Suite

How's Your Metrics Program Doing?

"Everything should be made as simple as possible, but not simpler."

-  Albert Einstein

How’s your software measurement program doing?  Is it well funded and supported by management, or do you worry about your job the next time the organization decides it needs to be “leaner and meaner”?  Many measurement programs are cancelled or fade into meaningless obscurity.  Why?  Some things are out of your control; but here are a few that will improve your odds for success:

Blog Post Categories 
Metrics Benchmarking