Practical Software Measurement
One of the things I hear from many C-level managers is how difficult and time-consuming it is to generate reliable resource plans at the enterprise level. Many organizations take months to produce their annual budgets, and oftentimes the negotiated budgets end up being unrealistic. To fix this problem we need to combine good capacity planning with good demand management. A number of project portfolio management tools can help with the capacity planning; the problem is that the numbers will be off if we don't get the demand management part right.
Outsourcing was supposed to make government IT executives’ lives easier. Yet in too many cases, it’s had the opposite effect, leading to cost overruns, inefficiencies, and solutions that do not work. Remember the initial rollout of Healthcare.gov? Exactly.
It doesn’t have to be this way. Believe it or not, there’s a proven solution that has stood the test of time. In 1977, Lawrence Putnam Sr. discovered the “physics” of how engineers build software by successfully modeling the nonlinear relationship among the five core metrics of software: product size, process productivity, schedule duration, effort, and reliability.
The story of QSM and software application estimation begins during my time in the Army. I was assigned to Sandia Base, NM to research methods for protecting soldiers from the effects of nuclear explosions. I had to do several calculations to determine the impact of an explosion (blast calculations) on soldiers using a slide rule, which was very tedious. Sandia National Laboratory was next door to my office, and they had just gotten the biggest and best engineering computer available at the time. They offered computer time for anyone needing it and even offered to teach me programming, so I decided to take a course in FORTRAN programming over my lunch hour so I could do my blast calculations quicker. These lessons aided me in completing my work at Sandia and followed me to my future assignment at the Pentagon.
QSM's C. Taylor Putnam-Majarian and Doug Putnam recently published an article, Understanding Quality and Reliability, on InfoQ.
Often I speak with IT project measurement folks about the permeation of agile into their project estimation processes. While the Agile Manifesto recently celebrated its 15th birthday, it’s only in the last several years that I’ve seen agile gain substantial momentum in becoming the official method by which many companies shepherd their projects. Or has it…?
The time has come, once again for QSM’s annual March Madness tournament. As we enter our 6th year of friendly office competition, I looked back at some of my previous strategies to help me figure out how I wanted to go about completing my bracket this year. In doing this, I realized that many of these concepts can be applied towards IT project management.
I recently came across this blog post by David Gordon and I believe it nicely summarizes the problem many C-level executives find with the #NoEstimates agile view. At the end of the day, they need realistic numbers in order for their organizations to make a profit and remain competitive in their markets. It's simply not enough to start development without some sort of upfront plan as to how much the overall project will cost. Gordon gives a nice analogy:
In software estimation, some discovered relationships turn out to be true fundamental principles of software development.
Way back in 1978, Larry Putnam, Sr. discovered that the relationship between project duration and project effort was nonlinear. His equation was:

Duration in months = 2.15 × (effort in man-months)^(1/3)

In his 1981 book, Barry Boehm described the nominal relationship in COCOMO as:

Duration in months = 2.5 × (effort in man-months)^(1/3)
Very similar results. Is that something specific to the way projects were managed way back then? Or, is this a true fundamental law of software project management?
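To see just how similar the two formulas are, here is a minimal sketch that evaluates both for a few effort levels (the function names and sample effort values are illustrative, not from either source):

```python
def duration_putnam(effort_mm: float) -> float:
    """Putnam (1978): duration in months = 2.15 * effort^(1/3)."""
    return 2.15 * effort_mm ** (1 / 3)

def duration_cocomo(effort_mm: float) -> float:
    """Boehm's nominal COCOMO (1981): duration in months = 2.5 * effort^(1/3)."""
    return 2.5 * effort_mm ** (1 / 3)

# Compare the two predictions across a range of project sizes (man-months).
for effort in (27, 125, 1000):
    print(f"{effort:5d} man-months -> "
          f"Putnam {duration_putnam(effort):5.1f} mo, "
          f"COCOMO {duration_cocomo(effort):5.1f} mo")
```

Because both models share the same cube-root exponent and differ only in the leading coefficient (2.15 vs. 2.5), their duration estimates stay within about 15% of each other at every project size.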
Many of the project managers that I speak with track their software and systems projects at a very detailed level. They use detailed spreadsheets or other tools to track hours and tasks on a daily basis. This is fine, but it's important to manage the big picture so we can avoid tying detailed tasks to unrealistic duration and budget goals.

By "big picture" I mean tracking at the project release level and focusing on a few key actuals: size, duration, effort, reliability, and efficiency. It's important to track these actuals against a reliable plan. These are the measures that can give us the biggest and quickest insight into a project’s potential success or failure. You can see this analysis in the SLIM-Control graphs below, showing the blue plans versus the red actuals.