Practical Software Estimation Measurement

Roots Run Deep: The Journey to Software Application Estimation and Risk Management

The story of QSM and software application estimation begins during my time in the Army. I was assigned to Sandia Base, NM, to research methods for protecting soldiers from the effects of nuclear explosions. I had to perform several calculations to determine the impact of an explosion (blast calculations) on soldiers using a slide rule, which was very tedious. Sandia National Laboratory was next door to my office, and it had just gotten the biggest and best engineering computer available at the time. The laboratory offered computer time to anyone who needed it and even offered to teach me programming, so I decided to take a course in FORTRAN programming over my lunch hour so I could do my blast calculations more quickly. These lessons aided me in completing my work at Sandia and followed me to my future assignment at the Pentagon.

For my tour at the Pentagon in the 1970s, there was not a lot of need for my nuclear experience, so I was assigned to the Army’s computer program. We had to defend our program budget to the Department of Defense (DoD) budget review authority, the Office of the Secretary of Defense (OSD). One system, SIDPERS, the Army enterprise personnel system, had been in development for five years, and after having a peak staff of 110, we were projecting 93 people for the next five years. The analyst looking at the budget asked what should have been a simple question: “What are these people going to do?” I did not have a good answer, and later, going back to the project team, neither did they. Because of this, we lost $10M from our budget.

My boss, who was a major general in the Corps of Engineers, said, “You know, Larry, when we have a big construction project, such as building a dam or a new airfield, we have some high-level parameters, like the number of cubic yards of concrete it will take to do the job, that allow us to get in the ballpark for cost and schedule. Whenever I talk to the IT guys, I hear about bits and bytes, programming languages, and bandwidth, but nothing that relates to time, effort, and cost.”

It became my mission to figure out a way to project and explain software engineering staffing and effort as easily as we did real-world engineering.

I started this journey with the Rayleigh equation as the ideal way to apply people to a design-intensive project.  You start at zero, build up staff as the problem is decomposed, and reduce staffing as the release is constructed, integrated, and tested. This equation was being applied mostly to hardware and microprocessor chip designs, but from what I could tell software followed a similar design process.
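The staffing pattern described above can be sketched numerically. This is a minimal illustration of the Rayleigh curve's shape, not QSM's calibrated model; the function name and the parameter values (100 person-years of total effort, peak staffing at year 2) are made up for demonstration.

```python
import math

def rayleigh_staffing(t, total_effort, t_peak):
    """Staff level at time t on a Rayleigh curve.

    total_effort: area under the curve (e.g., person-years)
    t_peak: time at which staffing peaks
    """
    a = 1.0 / (2.0 * t_peak ** 2)  # shape parameter derived from the peak time
    return total_effort * 2.0 * a * t * math.exp(-a * t ** 2)

# Staffing starts at zero, builds to a peak, then tails off through
# integration and test -- the pattern described in the text.
curve = [rayleigh_staffing(t, total_effort=100.0, t_peak=2.0) for t in range(7)]
```

Plotting `curve` shows the characteristic rise-and-decay shape: zero staff at project start, a single peak, and a long tail as the release is finished out.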

I began collecting some Army data to see if it followed the Rayleigh pattern.  Just from comparing the budget data from a group of about 15 systems, I knew we had a match.  Initially, we just used simple projections of the Rayleigh curves to get our 50 systems in development under financial control.  The harder part was figuring out what the right Rayleigh curve should be when we needed to provide an early estimate.

I applied it to historical data from 19 product releases from a single organization. In my situation I had two unknowns (time and effort) and four or five candidate equations. I graphed several of them in the Rayleigh curve dimensions of schedule and effort, and they all intersected in a very similar region.

Larry Putnam, Sr.'s software application estimation equation

The resulting software production equation was:

Size = Productivity × Time^(4/3) × Effort^(1/3)

The world has changed a lot since I started on this journey. Waterfall development methodologies have morphed into Incremental and Agile methods. Languages have totally changed. Platforms went from centralized time-sharing to desktops, then to servers, and have now come full circle back to cloud-based platforms. Today people “configure packages” and develop in much more powerful languages. Developers are certainly better at reusing code. However, in the final analysis we are still writing some type of code, so the software equation that we started out with close to 40 years ago is still relevant.

I think that I uncovered something close to a fundamental law of nature in our software production equation. When matched up with the Rayleigh staffing model, we have a very powerful decision tool: sophisticated enough to work, yet simple enough to use.

The QSM Software Almanac: 2016 Edition focuses on the history of software estimation and explores how estimation principles remain relevant despite significant changes in software development methodologies. Download this free resource to read the full foreword by industry pioneer Larry Putnam, Sr.
