Practical Software Estimation Measurement

Blogs

How Can We Leverage Summary Level Analytics to Support Enterprise Planning?

What if you could leverage summary-level cost, duration, and productivity data to support estimates for future projects at the release and enterprise level? C-level executives, development managers, and project stakeholders are all involved at some level in project planning. They want quick access to information on a regular basis, and they want web-based solutions to make it happen. So how does it all work? There are web-based analytics tools that allow you to create a centralized database for all of your projects. These tools store the data, leverage it to generate project and portfolio estimates, and then provide a communication vehicle throughout the organization to ensure that everyone involved is on the same page. It all starts with having the data in one place.

Software Project Database

Once you have all of your project data in one place, you can focus on analyzing the completed projects. You can compare them against industry trends and leverage a 5-star report to show how they rate on performance in the industry. The initial measures to focus on would be size, duration, effort, reliability, and productivity. A project's productivity is calculated automatically once you have entered the size, duration, and effort. We call this measure the Productivity Index. It can be compared to industry trends and used as a benchmark to measure process improvements over time. These numbers give you a quantitative picture of your current project environment.
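To illustrate that kind of benchmarking, the sketch below scores a hypothetical project's duration against an assumed log-linear industry trend, measured in standard deviations from the trend line. The trend coefficients, the standard deviation, and the project figures are all invented for this example; they are not QSM's actual curves.

```python
import math

# Hypothetical industry trend (illustrative coefficients, not QSM data):
#   log10(duration_months) = a + b * log10(size)
a, b, stddev = 0.2, 0.35, 0.15

def deviation_from_trend(size, duration_months):
    """Standard deviations of a project's duration from the assumed trend.
    Negative means faster than the trend average."""
    expected_log = a + b * math.log10(size)
    return (math.log10(duration_months) - expected_log) / stddev

# Hypothetical project: 400 user stories delivered in 9 months
d = deviation_from_trend(400, 9.0)
print(f"{d:+.1f} standard deviations from the industry trend")
```

A report card of this kind, computed for each core metric, is what gives the quantitative picture of performance described above.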

Software Project Closeout

5 Star Report

The 2017 Software Almanac: Development Research Series

QSM Software Almanac: 2017 Edition

Software plays an increasingly vital role in our everyday lives. It powers everything from autonomous cars and aircraft to life-saving medical equipment and the data that allows the government to protect our country. When companies develop software, there’s no room for error.

That’s why software predictive analysis and estimation are still extremely important. Last year, with the release of the 2016 Software Almanac, we learned that the last 35 years of predictive analytics and estimation principles were still incredibly relevant for providing reliable and applicable business intelligence for implementing successful software projects.

This year’s edition of QSM’s annual Software Almanac further strengthens those findings. The 2017 Software Almanac builds on the principles identified in last year’s publication and highlights the dangers of not applying predictive analysis and estimation processes. As stated by Angela Maria Lungu, Almanac Editor and Managing Director at QSM, these principles can be a “double-edged rearview mirror.” If you move forward without applying the historical principles of estimation and analysis correctly, their value is diminished. Here’s what else you can expect from this year’s Almanac:


New Article: Big Rock Estimation in Agile

Agile Big Rock Sizing

Big Rock Estimation: Using Agile Techniques to Provide a Rough Software Schedule / Resource Estimate is the third article in the QSM Agile Round Table series. The QSM Agile Round Table was formed to discuss the role of estimation in agile environments. QSM customers shared their questions, challenges, and experiences on the relevance and benefits of scope-based estimation in an agile environment. The Round Table spent several meetings on the key topic of sizing an agile release. The discussion centered on two main questions:

  1. How can you determine the size of a release early, in the absence of a “big upfront requirements phase,” when the requirements are known only at a very high level and are subject to refinement and change?
  2. How can you determine size in a consistent way across multiple products, projects, and agile teams so that you have good historical data on which to base an estimate?

This and the next article in the QSM Agile Round Table series are based on those discussions. Aaron Jeutter, a participant in the Round Table from Rockwell Automation, presented the technique of “Big Rock Sizing.”  This technique is used at Rockwell Automation for early sizing and estimating based on high level requirements that will be refined using agile techniques as the work progresses.

Read the full article!


Agile Development and Software Estimation: Two Processes That Go Great Together

This post was originally published on Linkedin. Join the QSM Linkedin Group and Company Page to stay up-to-date with more content like this.

New approaches to software development can sometimes seem at odds with the needs of business customers. For instance, ardent practitioners of the agile development methodology continue to advocate for rapid response approaches and the need for constant iteration to solve complex problems. On the other hand, companies and customers are demanding a strategic approach that provides insight into process, timing, and costs.

So, which of these yin and yang scenarios should developers employ? The answer is “both.”

Enter scope-based software estimation, which I maintain can be a powerful tool for keeping projects on course and on budget. Schedules and budgets can be estimated without sacrificing any of the things that make agile development so potent.

Not everyone feels the same. Some would argue that there’s simply no place for estimates in an agile development world; that estimates cannot coexist with agile or “lean” methodologies like Scrum, which encourage teamwork, speed, and communication without constraints.

What do Sports and Agile Software Productivity Have in Common?

I am a big sports fan, and since I work for a software metrics company, I started thinking about the similarities in productivity measurement between the two industries. From draft picks to game-planning decisions, managers in sports measure their team’s productivity to help them make better decisions in the future. Software executives and product owners do the same thing; they measure productivity in order to make better planning decisions regarding upcoming projects.

To measure productivity in baseball, we look at measures like batting average, on-base percentage, slugging percentage, and earned run average. When measuring the productivity of agile software projects, we often look at the velocity, which takes into account the number of user stories completed in each sprint. This type of historical data helps us plan effectively at the detailed level.
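At its simplest, that detailed-level velocity measure is just an average over recent sprints. A minimal sketch, using hypothetical sprint counts:

```python
# Hypothetical counts of user stories completed in each of five sprints
completed_per_sprint = [12, 15, 11, 14, 13]

# Velocity: average stories completed per sprint
velocity = sum(completed_per_sprint) / len(completed_per_sprint)
print(f"Average velocity: {velocity:.1f} stories per sprint")  # 13.0
```

With a stable velocity, a team can project roughly how many sprints a backlog of known size will take, which is exactly the detailed-level planning described above.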

As part of our work at QSM, we are often asked to provide plans at the release and portfolio level. To provide these plans reliably, we use a macro-level, empirically based productivity measure that encapsulates a number of project-related factors. This measure is called the Productivity Index, an integral part of the Putnam Model. Also known as the SLIM Model, the Putnam Model was invented by Larry Putnam Sr. almost 40 years ago and has a greater impact on software measurement today than ever.

Once we know the total number of user stories (or any size measure), the release-level effort, and the duration, we can calculate the Productivity Index of a project. The Productivity Index also takes into account the project environment, including the experience level of the team, the complexity of the software, and the quality of the tools and methods being used on the project.
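For readers curious about the mechanics, here is a minimal sketch based on the widely published form of Putnam’s software equation, Size = C × Effort^(1/3) × Duration^(4/3). It computes only the raw productivity parameter C; the Productivity Index itself is QSM’s calibrated mapping of that parameter onto a simple integer scale, which is not reproduced here, and the project figures below are hypothetical.

```python
def productivity_parameter(size, effort_person_years, duration_years):
    """Raw productivity parameter C, solved from the published Putnam
    software equation: Size = C * Effort^(1/3) * Duration^(4/3)."""
    return size / (effort_person_years ** (1 / 3) * duration_years ** (4 / 3))

# Hypothetical release: 50,000 lines of code, 20 person-years, 1.5 years
c = productivity_parameter(50_000, 20.0, 1.5)
print(f"Raw productivity parameter: {c:.0f}")
```

A higher C (and hence a higher Productivity Index) means the team delivered more size for the same effort and schedule, which is why the measure can serve as a benchmark for process improvement over time.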


Historical Data Isn’t Playing "Hard to Get"

Historical Data Collection

“No, we don’t have any historical project data collected” is a statement I hear with some frequency when speaking to organizations about their IT project estimating processes. Ideally, we use client history to calibrate and tune the project estimates we provide. In my quest to spread the word about parametric estimating, I often encounter organizations that don’t believe they have historical data in a retrievable form. In almost every case I have been involved in, it turned out that the historical data was present, just not in the form of a 1,000-row spreadsheet. Oftentimes the data is more available than the client is aware.

Our approach works at a macro level, so we seek overall project metrics for cost, schedule, size, staffing, and defects. If formal documentation of these five core metrics is not available, the history can usually be assembled from various sources within the organization. We have found it’s common to reconstruct a project’s outcome by seeking feedback from the team that worked on the project; however, if that’s not possible due to attrition, a re-org, or other disruptions, we can usually find the project metrics through other means, such as time and defect tracking tools, requirements analysis tools, and accounting systems. The data is almost always documented somewhere.


Three Strategies for Successful 2017 Project Portfolio Planning

This post was originally published on Linkedin. Join the QSM Linkedin Group and Company Page to stay up-to-date with more content like this.

As we move closer to the end of the year, many of us are in planning mode. We’re working hard to determine which development projects are going to get done next year, and which ones may have to wait their turn until 2018.

No one should go it alone, though. Business executives need input from IT managers to truly gauge the feasibility of developing the projects that are on their list. Likewise, IT managers need insight into the expectations of business executives so they can produce the products they need.

That’s what makes project portfolio planning so essential. It brings business stakeholders and IT managers together by allowing them to communicate with each other about needs and expectations, and to find common ground that leads to realistic project estimates that help shape the course of successful development for the next 12 months.

It also helps establish a clear product roadmap. It’s not uncommon for organizations to start out with a long list of “to-do’s” every year, but doing everything is simply unrealistic. Therefore, it’s important to identify and prioritize projects that will bring your company the best ROI and help it meet overall strategic goals over the course of the next year.

New Article: In Agile, What Should We Estimate?

In Agile, What Should We Estimate?

Instead of debating #YesEstimate vs. #NoEstimates, we can ask a more useful question: “What should we estimate, and why?” To answer this, we need to distinguish between consumable value and potentially deliverable software. Both are useful concepts, but for different purposes. By choosing small enough developer-sized bites, we can time-box potentially deliverable software to get frequent feedback and review. But a meal that provides consumable value satisfying our users and customers must consider the tradeoff of benefits to both the business and the consumer. In the second article of QSM's Agile Round Table series, Andy Berner explains why setting goals for consumable value and estimating what it takes to reach those goals are both needed to guide the choices every organization must make about what to develop and how to allocate resources.

Read the full article!


New Article: Using Software Project Metrics

Compare Project Plan to History

Software measurement by itself does not resolve budget, schedule, or staffing issues for projects or portfolios, but it does provide a basis upon which informed decisions can be made. Here are examples of how to use metrics to determine present capabilities, assess whether plans are feasible, and explore trade-offs if they are not. This is the third article of a three-part series by QSM's Don Beckett for Projects at Work. You can read the first article here and the second here.

Read the article!

The More Things Change: The Evolution of Software Estimation and Development Over the Past 35 Years

This post was originally published on Linkedin. Join the QSM Linkedin Group and Company Page to stay up-to-date with more content like this.

The term “true original” is used to describe someone who is a trailblazer -- and it describes my father to a T. My dad was an early architect of software estimation, the process of predicting the time, effort, and manpower it takes to complete a software development project.

Thirty-five years ago, my father was a budget director for the Army’s computer programs. He had the unfortunate experience of having his funding significantly reduced when his IT team failed to properly articulate its software development goals in ways that were relatable to leaders. As a superior put it, “Whenever I talk to the IT guys, I hear about bits and bytes, programming languages, and bandwidth, but nothing that relates to time, effort, and cost.”

That comment sent my dad on a mission to develop a software estimation framework that addressed the three points his boss was most concerned about. He sought to expose what he called “a fundamental law of nature in our software production equation.”
