Keith Ciocco's blog

Using Big Picture Analytics to Power Software Estimation

Imagine a software development process where “Big Picture” estimates can be generated before detailed planning takes place, where the estimates can be accessed on the web, and where only three or four inputs are needed to generate reliable information. This process would include intelligent models that take historical data into account, and a back-office team that specializes in software customization would be available to do the heavy lifting. Finally, there would be business analytics and industry data (plenty of both) to help with project target negotiations and risk trade-off analysis.

Thankfully, there are science-based estimation solutions available that include the capabilities mentioned above. These packages can make estimation easier and more transparent, and they can help manage the uncertainty that comes with early planning. Technology organizations are already using these types of tools to improve their time to market and the accuracy of their software development estimates.

But many organizations still struggle with estimation. They spend millions of dollars each year developing and delivering software. The planning usually starts with senior management asking tough questions about cost and schedule targets. The project leads then try to come up with effort estimates for each person on the project based on experience and gut feel. These effort numbers are tallied up in spreadsheets to produce an overall estimate, the numbers are negotiated, and a final estimate is born. The problem with this process is that it takes a long time to carry out and the estimates are usually inaccurate, because the method doesn’t account for the non-linear relationships, integration, and overhead that we often see in software development. When the estimates are off, millions of dollars are spent trying to change course. The rest is history.
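To picture why a simple linear tally breaks down, here is a minimal Python sketch of a Putnam-style software equation, the kind of non-linear model that science-based estimation tools are built around. The productivity constant and project numbers below are purely illustrative, not calibrated values from any real dataset.

```python
# Minimal sketch of a Putnam-style software equation:
#   size = productivity * effort**(1/3) * duration**(4/3)
# rearranged to show the effort implied by a fixed size and schedule.
# All constants here are illustrative, not calibrated values.

def implied_effort_person_years(size: float, duration_years: float,
                                productivity: float) -> float:
    """Effort (person-years) implied by the model for a given size and schedule."""
    return (size / (productivity * duration_years ** (4.0 / 3.0))) ** 3


size = 100_000          # e.g., lines of code or another size measure
productivity = 20_000   # illustrative productivity constant

for months in (18, 15, 12):
    effort = implied_effort_person_years(size, months / 12.0, productivity)
    print(f"{months}-month schedule -> about {effort * 12:,.0f} person-months of effort")
```

In this model, effort grows with roughly the fourth power of schedule compression, so shaving a third off the timeline multiplies the effort several times over. A per-person spreadsheet tally simply adds numbers and will never surface that trade-off.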

Blog Post Categories 
Estimation SLIM-Collaborate

Can Estimation & Analytics Improve Vendor-Client Relations?

It happens time and time again. Clients look to their vendors to provide software development or configuration services and both sides are often left with big questions. Is the price fair? Can we really get the project done within our duration and resource goals? How can we negotiate for a successful outcome?

There are estimation solutions available that can help. The good ones will leverage empirically-based models, historical data, and industry analytics to uncover which proposals are feasible and which ones are risky.

In the first view below, there are two columns: the “Desired Outcome,” which is one vendor’s proposal, and the data-driven “Recommended Estimate.” The vendor is promising to complete the work in 3 months with a $750,000 price tag. You can see that this proposal is rated “Risky” and that the vendor will probably finish late and will either have to ask for more money or lose money in the long run. The charts in the view provide a graphical representation of the comparison.

Vendor Bid

In the second view for the same project, you see a second vendor’s proposal compared to the “Recommended Estimate.” The vendor’s bid is for 8 months with a $1,000,000 price tag and there is a “Moderately Conservative” rating. In other words, this vendor has a much better chance of achieving what they are promising. 

Vendor Bid
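As a rough illustration of how a bid rating like “Risky” or “Moderately Conservative” could be derived, the sketch below treats the data-driven “Recommended Estimate” as the median of a lognormal uncertainty distribution and converts each vendor’s promised duration into a probability of finishing on time. The six-month median, the spread, and the rating thresholds are assumptions made for this example, not QSM’s actual model.

```python
# Hedged sketch: rate a vendor's promised duration against a data-driven
# median estimate by modeling the actual outcome as lognormal around that
# median. Spread (sigma) and rating thresholds are illustrative assumptions.

import math
from statistics import NormalDist

def probability_within(bid_months: float, median_months: float,
                       sigma: float = 0.35) -> float:
    """P(actual duration <= bid) under a lognormal model centered on the median."""
    return NormalDist(mu=math.log(median_months), sigma=sigma).cdf(math.log(bid_months))

def rating(p: float) -> str:
    if p < 0.20:
        return "Risky"
    if p < 0.60:
        return "Realistic"
    return "Moderately Conservative" if p < 0.85 else "Very Conservative"

recommended_months = 6.0  # illustrative data-driven median for the work

for vendor, months in (("Vendor A", 3.0), ("Vendor B", 8.0)):
    p = probability_within(months, recommended_months)
    print(f"{vendor}: {months:.0f} months -> {p:.0%} chance of finishing in time ({rating(p)})")
```

With these made-up numbers, the 3-month bid has only a few percent chance of success and rates “Risky,” while the 8-month bid lands in “Moderately Conservative” territory, which mirrors the comparison in the two views above.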

Blog Post Categories 
Vendor Management Estimation

Is Software Estimation Needed When the Cost and Schedule Are Fixed?

In many agile environments, the budget, team size, and schedule are fixed based on an organization’s predetermined targets for sprints or iterations. This leads many project managers to question whether software estimation is even necessary. The problem is that without a reliable size estimate, the amount of functionality promised within the time and money constraints can be difficult to achieve, leaving the product delivery short on features, or late and over budget.

This is where scope-level estimation tools come into play. They can help evaluate whether targets are reasonable and, even if the schedule and budget are both set in stone, they can help figure out how much work can be delivered. This type of analysis helps set customer expectations and provides data-driven leverage for negotiations.

The best estimation tools leverage empirically-based models, industry analytics, and historical data. They can even be used before iteration level planning takes place. They ensure that the overall goals are reasonable before detailed plans are developed. 

In the views below, we see an estimate generated with a “Time Boxed” method, where the product manager inputs the predetermined time, a productivity measure (PI), and a team size to see how many story points can be completed within the set constraints. The analysis also includes a “sanity check” of the estimate, comparing it to an agile industry trend from the QSM Industry Database and the organization’s own agile historical data.

Time Box

Time Box
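One way to picture the “Time Boxed” calculation is to fix the duration and team size and solve a Putnam-style equation for the amount of work that fits inside the box. The sketch below does exactly that; the productivity constant is a stand-in for a calibrated PI, and every number is illustrative rather than SLIM’s actual calibration.

```python
# Hedged sketch of a time-boxed estimate: fix duration and team size,
# then solve size = productivity * effort**(1/3) * duration**(4/3)
# for the size (story points) that fits. Constants are illustrative.

def story_points_that_fit(duration_months: float, team_size: float,
                          productivity: float) -> float:
    effort_person_months = duration_months * team_size
    return productivity * effort_person_months ** (1.0 / 3.0) * duration_months ** (4.0 / 3.0)

duration = 4.0       # fixed time box, months
team = 6             # fixed team size
productivity = 10.0  # stand-in for a calibrated productivity (PI) value

points = story_points_that_fit(duration, team, productivity)
print(f"A {team}-person team over {duration:.0f} months fits roughly {points:,.0f} story points.")
```

The point of the exercise is the direction of the calculation: instead of asking “how long will this scope take?”, the fixed constraints are taken as given and the model answers “how much scope fits?”, which is then sanity-checked against industry and historical trends.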

Blog Post Categories 
Agile Estimation

10 Tips for Better Software Project Tracking & Oversight

Software Project Tracking

During QSM’s 40 years in business, we have often been asked to help with software projects that are out of control and riddled with unrealistic goals and soaring costs. Project managers often ask, "Where is the light at the end of the tunnel?" In honor of Larry Putnam, Sr., who started QSM back in 1978, here are 10 tips for better project tracking and oversight.

Blog Post Categories 
SLIM-Control SLIM-Collaborate Tracking

10 Tips for Better Software Estimation

This year, QSM will be celebrating our 40th anniversary! Over the years, we have helped many project managers figure out what their software projects should cost, how long they should take, and how to mitigate project and portfolio risk.  Here are 10 tips that every organization should remember for effective software estimation.
  1. Capture some historical data on your projects and keep it simple. The more data, the better, but you can get a good start to your estimation program with just a few projects and a small amount of data from each of those projects. Focus on the core metrics: size, duration, reliability, productivity, and effort. 
  2. Estimate at the release level before detailed planning takes place. This will enable you to tailor your detailed plan to goals that are reasonable. Many analysts spend hours laying out detailed plans for projects that end up over budget and late because they don’t figure out the big picture first. 
  3. Use an empirically-based model that enables you to manage uncertainty. When making big decisions, it’s important to see the 90% chance compared to the 50% (a quick sketch of the idea follows this list).
  4. Sanity-check your estimates with industry analytics. It’s always good to see typical cost and duration trends from projects that are similar to yours. 
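To make tip 3 concrete, here is a small sketch that turns a 50% (median) estimate into 80% and 90% confidence figures by assuming a lognormal uncertainty distribution around the median. The median cost and the spread are made-up numbers for illustration only.

```python
# Illustrative sketch: convert a median (50%) cost estimate into higher
# confidence figures under a lognormal uncertainty model. Numbers are made up.

import math
from statistics import NormalDist

median_cost = 1_000_000  # the 50% estimate, in dollars
sigma = 0.3              # illustrative log-space spread, e.g. from historical scatter

for confidence in (0.50, 0.80, 0.90):
    z = NormalDist().inv_cdf(confidence)
    budget = math.exp(math.log(median_cost) + z * sigma)
    print(f"{confidence:.0%} confidence budget: ${budget:,.0f}")
```

Seeing that the 90% figure runs well above the median makes the management conversation much easier: committing to the median number means accepting a coin-flip chance of overrunning it.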
Blog Post Categories 
Estimation

How Can You Leverage Big Data to Reduce Your IT Costs?

Today, more than ever, we have access to large amounts of information. You've probably heard the term "big data," which in essence means having access to large amounts of data and examining the trends in it. But many executives want to know how they can leverage this information to solve business problems, like lowering IT costs. One way is to use the data to do a better job of estimating IT projects.

Better estimating helps avoid signing up for schedules and budgets that are unrealistic; it helps avoid overstaffing a project or a portfolio of projects; and it helps calculate how much work can be completed within project constraints. In addition, it improves communication internally across the enterprise and externally between the vendor and the client. You can apply estimation to in-house projects, and you can use it to generate better proposals or to do a better job of evaluating proposals. It can also help you negotiate more effectively.

To do a better job of estimating, you need to make good decisions regarding which metrics to leverage. You might have thousands of data points, but it's important to narrow the focus to the core release-level metrics: cost, duration, effort, reliability, and productivity. Next, you need a centralized place to organize and store the data so you can analyze it, and there are tools that can help. In the view below, you can see a portfolio of projects stored in a centralized place with the ability to manage access and security.

Big Data to Reduce IT Costs
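As a simple picture of what "core metrics in one place" can look like, the snippet below keeps each completed project down to a handful of release-level fields and pulls a quick portfolio trend out of them. The record layout, field names, and numbers are illustrative, not a particular tool's schema.

```python
# Illustrative sketch: a portfolio of completed projects reduced to core
# release-level metrics, with a couple of quick trend numbers pulled out.

from dataclasses import dataclass
from statistics import median

@dataclass
class ProjectRecord:
    name: str
    cost_usd: float
    duration_months: float
    effort_person_months: float
    defects_found: int
    size_points: float

    @property
    def productivity(self) -> float:
        """Size delivered per person-month of effort."""
        return self.size_points / self.effort_person_months

portfolio = [  # made-up historical records
    ProjectRecord("Billing 2.1", 900_000, 7, 60, 42, 300),
    ProjectRecord("Claims 4.0", 1_400_000, 9, 95, 57, 410),
    ProjectRecord("Portal 1.3", 450_000, 5, 28, 18, 150),
]

print(f"Median duration: {median(p.duration_months for p in portfolio)} months")
print(f"Median productivity: {median(p.productivity for p in portfolio):.1f} points per person-month")
```

Even a handful of records like these is enough to start sanity-checking new estimates against your own history rather than against gut feel.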

Blog Post Categories 
Data IT Budgeting Estimation

Can Someone Get Me A Big Picture Estimate?

It’s a story we hear a lot in the software business these days, especially with agile development. New functionality is needed within a certain amount of time and within a certain budget. 

Some might say, "no problem! We can figure it out as we go along." They might feel comfortable because each sprint has already been set in stone.  But there are business-related questions that need to be answered before sprint-level planning takes place and before we commit to goals that might not be achievable at the release and portfolio levels. Should we agree to do this project? Can we really get all of the work done within our constraints? Will the software be reliable at delivery? How does this project impact our annual and multi-year forecasts? 

This is where having reliable big picture numbers can be helpful. Wouldn’t it be great if senior management and the technical team were on the same page early? There are empirically-based estimation tools that can help. The naysayers might say that the technical requirements aren’t firm enough to come up with early estimates before sprint planning takes place. But the fact is that some of these models (the good ones) allow for managing uncertainty, and they do it based on historical data. The slide below shows a summary example of a release-level estimate for cost, duration, and reliability.

Software Estimate

Blog Post Categories 
Estimation Agile

IT Cost Optimization and Cloud Solutions at Gartner Symposium/ITXpo

QSM at Gartner Symposium/ITXpo

The QSM team had a productive trip to the Gartner Symposium in Orlando. It's always helpful for us to discuss current IT trends and challenges with the people in our industry. Many of these themes came to light as we provided SLIM-Suite product demonstrations along with question and answer sessions at the QSM exhibit.

One of the big areas of interest at the conference was IT cost optimization, which is also one of QSM’s main areas of expertise. I hosted a presentation called “Cost Optimization Best Practices for IT Portfolio Budgeting.” The main focus of the presentation was to show how we can leverage empirically-based models and predictive analytics to balance enterprise demand with capacity and, at the same time, save big money in the IT budgeting process. The presentation was well-attended, and a meet-and-greet session followed where our QSM team, consisting of Ethan Avery, Richard Pelaez, Greta Moen, and me, provided solution demonstrations and answered questions.

Another big focus of the conference was related to cloud solutions and how they will affect the internet of things and artificial intelligence. Our team featured our cloud solution, SLIM-Collaborate, which provides portfolio analytics and the ability to estimate the cost and risk of creating new software technologies. We provided examples of how we support all types of software & systems projects and explained the benefits of having a secure process for leveraging this information across the enterprise.

Blog Post Categories 
IT Budgeting QSM News

How Can Organizations Optimize Costs in the IT Budgeting Process?

It’s that time of year again for many C-level executives: time to figure out the IT budget for next year. This means bringing the business side of the organization to the table with the technical side to forecast how much IT is going to spend. It can be a complicated process, but there are ways to make it easier and more accurate, and there are ways to save a lot of time and money. The challenges often relate to short planning time frames, minimal information available to generate accurate forecasts, political agendas within the organization, and, unfortunately, only a small number of estimation methods in place. But there are tools and processes available to help face these challenges. Here are the basic steps that we recommend for cost optimization in the budgeting process.

Start by analyzing the historical data that is available. The process can be streamlined by focusing on the core metrics within the organization. This data can include release-level size, effort, staff, and duration information. Historical data showing typical effort spending by role, by month, is also valuable to leverage. Ideally, this type of data should be captured on 8-15 projects.

The next step is to pull together scope-level sizing data on projects that are being considered for the new year. This information can include epics, themes, user stories, business requirements, or use cases, to name a few. The goal is to get as close as possible to determining how much work needs to be done on each release in the pipeline. Once there is a large enough sample of data, release-level estimates can be created for the coming year. There are tools available to help streamline this process, and the best ones allow for risk mitigation and sanity checking against historical data.
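A very simple sketch of the final roll-up might look like the following: sum the release-level estimates for the candidate projects and compare that demand against next year's capacity. The release names, effort figures, and capacity number are illustrative.

```python
# Illustrative sketch: roll release-level estimates up into an annual
# demand figure and compare it against available capacity. Numbers are made up.

releases = [
    {"name": "CRM upgrade",    "effort_pm": 180, "cost_usd": 2_100_000},
    {"name": "Mobile app v2",  "effort_pm": 95,  "cost_usd": 1_150_000},
    {"name": "Data warehouse", "effort_pm": 240, "cost_usd": 2_900_000},
]

capacity_person_months = 420  # staff available next year, illustrative

demand_pm = sum(r["effort_pm"] for r in releases)
budget = sum(r["cost_usd"] for r in releases)

print(f"Planned demand: {demand_pm} person-months against a capacity of {capacity_person_months}")
print(f"Forecast budget: ${budget:,.0f}")
if demand_pm > capacity_person_months:
    print("Demand exceeds capacity -- defer scope or add staff before committing to the plan.")
```

This is the point where the political conversation turns into a numbers conversation: the portfolio either fits within capacity or it does not, and the trade-offs have to be made before the year starts rather than halfway through it.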

Blog Post Categories 
IT Budgeting Estimation

Agile On-Time, But Is It Reliable?

With agile projects, we hear a lot about the planning benefits of having a fixed number of people and a fixed number of sprints. All great stuff when it comes to finishing on time and within budget. But one of the things we also need to focus on is the quality of the software. We often hear stories about functionality being put on hold because reliability goals were not met.

There are some agile estimation models available to help with this, and they can provide this information at the release level, before the project starts or during those early sprints. They do this by leveraging historical data along with time-tested forecasting models built to support agile projects.

In the first view, you can see the estimate for the number of defects remaining. This is a big picture view of the overall release. Product managers and anyone concerned with client satisfaction can use these models to predict when the software will be reliable enough for delivery to the customer.

MTTD over Time

In the second view, you can see the total MTTD (Mean Time to Defect) and the MTTD by severity level. The MTTD is the amount of time that elapses between discovered defects. Each chart shows the months progressing on the horizontal axis and the MTTD (in days) improving over time on the vertical axis.

Mean Time to Defect
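For readers who want to see the arithmetic behind the charts, here is a minimal sketch that computes MTTD from a log of defect discovery dates, grouped by severity. The dates and severities are made up for illustration.

```python
# Illustrative sketch: compute Mean Time To Defect (MTTD) -- the average
# time between discovered defects -- from discovery dates, by severity.

from collections import defaultdict
from datetime import date

defects = [  # (discovery date, severity), made-up data
    (date(2018, 1, 3), "critical"), (date(2018, 1, 9), "major"),
    (date(2018, 1, 20), "major"),   (date(2018, 2, 14), "critical"),
    (date(2018, 3, 30), "major"),   (date(2018, 5, 25), "critical"),
]

by_severity = defaultdict(list)
for found_on, severity in defects:
    by_severity[severity].append(found_on)

for severity, dates in sorted(by_severity.items()):
    dates.sort()
    gaps = [(later - earlier).days for earlier, later in zip(dates, dates[1:])]
    mttd = sum(gaps) / len(gaps)
    print(f"{severity}: MTTD = {mttd:.1f} days across {len(dates)} defects")
```

A rising MTTD, meaning longer and longer gaps between newly discovered defects, is the signal that the release is stabilizing and getting close to delivery-ready reliability.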

Blog Post Categories 
Agile Quality Estimation Software Reliability