Software Estimation Best Practices


The “Secret Sauce” in SLIM-Estimate


For over 20 years I’ve been an advocate of using metrics to improve IT processes. Shortly into my career as a COBOL developer, I was introduced to Function Point Analysis, and ever since it’s been the most powerful tool in my toolkit. After all: size matters! Once I learned to quantify the amount of functionality delivered by a project or an application, I could zoom in on cost, effort, duration, productivity, and quality, because I now had a normalization factor for making comparisons (Cost per Function Point, Hours per Function Point, Defects per Function Point, etc.).
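To make the normalization idea concrete, here is a minimal sketch (not QSM tooling); all of the project figures in it are illustrative.

```python
# A minimal sketch of using function point counts as a normalization factor
# so projects of different sizes can be compared. Figures are illustrative.
projects = [
    {"name": "Billing rewrite", "function_points": 450, "cost": 540_000,
     "effort_hours": 7_200, "defects": 36},
    {"name": "Claims portal", "function_points": 820, "cost": 1_100_000,
     "effort_hours": 15_600, "defects": 49},
]

for p in projects:
    fp = p["function_points"]
    # Normalize cost, effort, and quality by delivered function points.
    print(f'{p["name"]}: ${p["cost"] / fp:,.0f}/FP, '
          f'{p["effort_hours"] / fp:.1f} hours/FP, '
          f'{p["defects"] / fp:.3f} defects/FP')
```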

Shortly after earning my Certified Function Point Specialist certification, I became obsessed with the different measures and metrics pertaining to software and IT. Soon I became a Certified Software Measurement Specialist, learning everything there was to know about how to measure everything there was to measure in software (or so I thought). It’s a pretty powerful feeling to be able to help organizations baseline their current capabilities so they can determine whether implementing the latest and greatest silver bullet will really give them the gains in productivity they have been striving for.


Estimating Program Increment Capacity in Scaled Agile (SAFe)

The Scaled Agile Framework (SAFe) is a methodology that applies Agile concepts to large, complex environments. QSM recently worked with an organization that had implemented SAFe, helping it develop an estimation methodology tailored specifically to that framework. This article discusses how it was implemented.

Software estimation typically addresses three concerns: staffing, cost/effort, and schedule. In the SAFe environment, however, development is done in program increments (PIs), which in this case were three months in duration with two-week sprints throughout. Staffing was set at a predetermined level and varied very little during the PI. Thus, the three variable elements that are normally estimated (staff, cost/effort, and schedule) had already been determined in advance. So our job was done, right? Wrong! What remained to be determined was capacity: the amount of work that can be accomplished in a single PI. And that was a major pain point for the organization.
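To make the capacity question concrete, here is a minimal sketch assuming the classic form of Putnam's software equation, Size = C × Effort^(1/3) × Duration^(4/3); the productivity constant and staffing figures are illustrative assumptions, not a SLIM calibration.

```python
# A minimal sketch: with staff and schedule fixed for a program increment,
# capacity (size) is the unknown we solve for using the classic form of
# Putnam's software equation. The constant C and the staffing figures are
# illustrative assumptions, not an actual SLIM calibration.

def increment_capacity(productivity_c: float, staff: int, duration_months: float) -> float:
    duration_years = duration_months / 12.0
    effort_person_years = staff * duration_years   # level staffing across the increment
    return productivity_c * effort_person_years ** (1 / 3) * duration_years ** (4 / 3)

# Example: a 25-person train working a 3-month program increment.
print(f"Capacity: {increment_capacity(5000, staff=25, duration_months=3):,.0f} size units")
```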


New Article - The Three Software Project Development Traps (And How to Avoid Them)


Why do projects fail? There are a multitude of reasons, from a lack of up-front planning, to failing to make necessary adjustments as requirements change, to overstaffing when a project is running late. Whatever the reason, there are steps you can take to avoid these common traps. In this article for Software Executive Magazine, Larry Putnam, Jr. explains how focusing on scope-based estimates, agile forecasting, and smaller teams will help your development team deliver products on time and on budget.

Read the article!


Using Business Analytics to Set Realistic Customer Expectations

I was recently reading an article by Moira Alexander titled “Why Planning Is the Most Critical Step in Project Management,” and I was struck by her observation that one of the primary reasons projects fail is that they commit to unrealistic expectations. In my 35 years of experience, I believe this is the number one reason projects fail. Yet setting realistic expectations is a competency that few organizations or product owners ever get good at.

Today there are good simulation tools that make it simple to establish realistic project boundaries. The results can be used effectively to communicate and negotiate expectations with clients.

For example, imagine that you are a product owner planning out your next release.  Your team of 10 people has been working on a 5-month release cadence.   A backlog refinement has shown that there are approximately 100 story points to be completed in this release.  The project plan is shown in the figure below.

[Image: Agile Uncertainty]

However, there are some uncertainties, and we need to deal with them in a realistic way. Since the schedule and the team size are fixed, the only area that can give is the functionality. Simulations are a great way to quantify uncertainty. In our case, we are confident in our team’s productivity and labor cost, but we are somewhat more uncertain about the new capabilities in this release. It is easy to adjust the uncertainty settings, shown as slider bars in the tool, and run a business simulation.
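As an illustration of what such a simulation tells you, here is a minimal Monte Carlo sketch (not SLIM's simulation engine); the velocity distribution below is an illustrative assumption, not data from the engagement.

```python
# A minimal Monte Carlo sketch: with team size and schedule fixed, the
# uncertain quantity is how much functionality fits in the release.
# The velocity distribution is an illustrative assumption.
import random

SPRINTS = 10        # roughly a 5-month release cadence of two-week sprints
BACKLOG = 100       # refined story points planned for the release
TRIALS = 10_000

# Sample total points completed across the release, trial by trial.
completed = [sum(random.gauss(11, 3) for _ in range(SPRINTS)) for _ in range(TRIALS)]
p_finish = sum(c >= BACKLOG for c in completed) / TRIALS
print(f"Chance of delivering all {BACKLOG} points in {SPRINTS} sprints: {p_finish:.0%}")
```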


Estimation Is Good. Tracking and Oversight Are Even Better!

Now that the baseline estimate has been created and stakeholders feel their inputs and concerns have been addressed, we as purveyors of the estimate have done our job. In the world of IT project measurement, many organizations will deservedly feel accomplished that they have armed their development staff with an empirically based roadmap to navigate the next x months toward delivering a product. Now let the construction and testing begin! But wait, there’s more!

It’s always wise to have a sound estimate, but for added assurance of hitting budget, schedule, staffing, and risk targets, organizations have the option of tracking the project mid-flight. Just as estimating is often conflated with planning, tracking can be equally confused with other, one-dimensional monitoring of projects underway. So many things can change from the time an estimate is created to the time the first iterations are built. It’s likely that our estimate assumptions will change after some time has passed in the construction process, and we will have to react to inevitable, unforeseen forces: requirement changes, staff turnover, management demanding the project x weeks/months earlier while still expecting all the original functionality. These are all very real events thrown at the PM after the project is underway. We at QSM have provided a solution for this since the mid-1980s with SLIM-Control, a module in our SLIM Suite.

[Image: Software Project Tracking]
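As a rough illustration of the tracking idea (this is not SLIM-Control; all figures are made up), a mid-flight check can be as simple as comparing cumulative actuals against the baseline plan each period and flagging variances large enough to warrant a re-forecast:

```python
# A minimal tracking sketch, not SLIM-Control. Compare cumulative actuals
# against the baseline plan each month and flag large variances.
PLAN = {1: 8, 2: 18, 3: 30, 4: 44}     # cumulative person-months, baseline estimate
ACTUAL = {1: 8.5, 2: 20, 3: 37}        # cumulative person-months reported so far
THRESHOLD = 0.10                       # flag anything more than 10% off plan

for month, actual in ACTUAL.items():
    planned = PLAN[month]
    variance = (actual - planned) / planned
    status = "RE-FORECAST" if abs(variance) > THRESHOLD else "on track"
    print(f"Month {month}: plan {planned} pm, actual {actual} pm, {variance:+.0%} -> {status}")
```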


10 Tips for Better Software Estimation

This year, QSM will be celebrating our 40th anniversary! Over the years, we have helped many project managers figure out what their software projects should cost, how long they should take, and how to mitigate project and portfolio risk.  Here are 10 tips that every organization should remember for effective software estimation.
  1. Capture some historical data on your projects and keep it simple. The more data, the better, but you can get a good start to your estimation program with just a few projects and a small amount of data from each of those projects. Focus on the core metrics: size, duration, reliability, productivity, and effort. 
  2. Estimate at the release level before detailed planning takes place. This will enable you to tailor your detailed plan to goals that are reasonable. Many analysts spend hours laying out detailed plans for projects that end up over budget and late because they don’t figure out the big picture first. 
  3. Use an empirically based model that enables you to manage uncertainty. When making big decisions, it’s important to see the estimate you have a 90% chance of meeting alongside the one you have only a 50% chance of meeting (a minimal sketch of this comparison follows this list). 
  4. Sanity-check your estimates with industry analytics. It’s always good to see typical cost and duration trends from projects that are similar to yours. 
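The sketch below illustrates tip 3, assuming (purely for illustration) that cost outcomes are roughly lognormally distributed around a median estimate; neither the figures nor the distribution come from any particular model.

```python
# A minimal sketch of comparing a 50% vs. a 90% assurance budget, assuming
# project cost is roughly lognormally distributed. Figures are illustrative.
import math
import random

MEDIAN_COST = 1_000_000   # P50 cost estimate in dollars
SIGMA = 0.35              # assumed spread of the log-cost

samples = sorted(random.lognormvariate(math.log(MEDIAN_COST), SIGMA) for _ in range(100_000))
p50 = samples[len(samples) // 2]
p90 = samples[int(len(samples) * 0.9)]
print(f"Budget with a 50% chance of success: ${p50:,.0f}")
print(f"Budget with a 90% chance of success: ${p90:,.0f}")
```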

Pentagon Acquisition Needs Consistent Data-Driven Approach for Accountability


This post was originally published on LinkedIn. Join the QSM LinkedIn Group and Company Page to stay up to date with more content like this.

When the Honorable Ellen M. Lord, Undersecretary of Defense for Acquisition & Sustainment (USD/A&S), told the Senate Armed Services Committee on Dec. 7 that she intends to demand a higher level of accountability from program managers, you could feel mixed emotions from DoD acquisition professionals. Many are applauding the vocal prioritization of accountability. However, struggling acquisition program managers and support contractors are likely feeling they have a more focused target on their backs. There will certainly be other major changes from the reorganization of the former Acquisition, Technology and Logistics (AT&L) office into two new USD-level offices, USD/A&S and Research & Engineering (USD/R&E). Each will surely be eager to show its value to the Pentagon in its responsibility to improve the DoD acquisition process. In particular, as the DoD continues to focus on business transformation priorities and on ensuring that it is acquiring effective defense business systems with capabilities to support those priorities, I’d like to offer some firsthand observations suggesting there still remains a lack of consistency in how we manage that process.

Accountability Requires Consistency


How Can You Leverage Big Data to Reduce Your IT Costs?

Today, more than ever, we have access to large amounts of information. You've probably heard the term "big data," which in essence means having access to large amounts of data and examining the trends within it. But many executives want to know how they can leverage this information to solve business problems, like lowering IT costs. One way is to use the data to do a better job of estimating IT projects.

Better estimating helps avoid signing up for schedules and budgets that are unrealistic; it helps avoid overstaffing a project or a portfolio of projects; and it helps calculate how much work can be completed within project constraints. In addition, it improves communication internally across the enterprise and externally between the vendor and the client. You can apply estimation to in-house projects, and you can use it to generate better proposals or to do a better job of evaluating proposals. It can also help you negotiate more effectively.

To do a better job of estimating, you need to make good decisions about which metrics to leverage. You might have thousands of data points, but it's important to streamline the focus to the core release-level metrics: cost, duration, effort, reliability, and productivity. Next, you need to find a centralized place to organize and store the data so you can analyze it. There are tools that can help you do this. In the view below, you can see a portfolio of projects stored in a centralized place, with the ability to manage access and security.

[Image: Big Data to Reduce IT Costs]
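Here is a minimal sketch of the kind of analysis centralized data enables (not a QSM tool; the project records and the log-log trend form are illustrative assumptions): fit a simple size-versus-duration trend and use it to sanity-check new work.

```python
# A minimal sketch: keep core release-level metrics in one structure and
# fit a log-log least-squares trend of duration against size. All project
# records and the trend form are illustrative assumptions.
import math

portfolio = [
    {"project": "A", "size": 200, "duration_months": 6,  "effort_pm": 30},
    {"project": "B", "size": 450, "duration_months": 9,  "effort_pm": 70},
    {"project": "C", "size": 900, "duration_months": 12, "effort_pm": 150},
]

xs = [math.log(p["size"]) for p in portfolio]
ys = [math.log(p["duration_months"]) for p in portfolio]
n = len(portfolio)
slope = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / \
        (n * sum(x * x for x in xs) - sum(xs) ** 2)
intercept = (sum(ys) - slope * sum(xs)) / n

size = 600  # sanity-check a new project of 600 size units against the trend
expected = math.exp(intercept + slope * math.log(size))
print(f"Typical duration for size {size}: {expected:.1f} months")
```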


Do We All Define "Estimate" the Same Way? Maybe Not, but We Should.


In any development methodology, we throw around the word “estimate” freely, without really understanding how it is interpreted. In many cases an estimate, regardless of its content and the process by which it was created, is implicitly received as a pinpoint number accurate to multiple decimal places. This presents a problem for all parties involved.

I recently had a discussion with a gentleman who told me that, prior to using our SLIM tool, the estimates in his organization were arrived at through casual hallway conversations, often started with, “How much do you think this will cost, and how long will it take?” A typical response was, “Hmmm, I’d say about 6 months and $500K.” That innocent musing then becomes the information upon which business decisions are based, leadership bonuses are won or lost, and the credibility of the dev team is put on the line.

I’d strongly recommend adhering to some definitions when talking about estimates. These definitions will help mitigate potential misunderstandings about what constitutes an estimate:


Derived PI: Is PI from Peak Staff “Good Enough”?

Are you having a hard time collecting total effort for SLIM Phase 3 on a completed project?

Can you get a good handle on the peak staff?

Maybe we can still determine PI!

It is difficult and often time-consuming to collect historical metrics on completed software projects. However, some metrics are commonly easier to collect than others, namely peak staff, the start and end dates of Phase 3, and the size of the completed project. Asking these questions can get things started:

  • So, how many people did you have at the peak? 
  • When did you start design and when was integration testing done?
  • Can we measure the size of the software?

That gives us the minimum set of metrics to dig up.

However, the PI (Productivity Index) formula also requires Phase 3 effort. Can we use SLIM to generate a useful PI using peak staff instead of total effort?

A statistical test on historical metrics can answer this question.

What are we comparing?

  • Projects used in this study had all four of the following: actual reported effort, size, peak staff, and duration.
  • For each project, a derived effort is generated from peak staff, size and duration. 
  • A derived PI is generated from the derived effort, size, and duration. This derived PI is then compared to the actual PI (a rough sketch of the derivation follows this list).
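Here is that rough sketch, under two loudly flagged assumptions: average staff is taken as roughly 0.6 of peak staff (a stand-in for a Rayleigh-style staffing profile, not QSM's actual conversion), and productivity is shown as the raw Putnam parameter Size / (Effort^(1/3) × Duration^(4/3)), whereas SLIM reports PI as a calibrated index of that parameter.

```python
# A minimal sketch of deriving a productivity value from peak staff instead
# of total effort. Both the 0.6 average-to-peak ratio and the use of the raw
# Putnam productivity parameter in place of the PI index are assumptions.
AVG_TO_PEAK_RATIO = 0.6   # assumed stand-in for a Rayleigh-style staffing profile

def derived_productivity(size: float, peak_staff: float, duration_months: float) -> float:
    duration_years = duration_months / 12.0
    derived_effort_py = AVG_TO_PEAK_RATIO * peak_staff * duration_years
    return size / (derived_effort_py ** (1 / 3) * duration_years ** (4 / 3))

def actual_productivity(size: float, effort_py: float, duration_months: float) -> float:
    duration_years = duration_months / 12.0
    return size / (effort_py ** (1 / 3) * duration_years ** (4 / 3))

# Compare the two for one illustrative completed project.
print(f"Derived: {derived_productivity(size=40_000, peak_staff=12, duration_months=10):,.0f}")
print(f"Actual:  {actual_productivity(size=40_000, effort_py=6.5, duration_months=10):,.0f}")
```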

Definitions for terms:
