QSM IT Software Almanac
Today's fast-paced IT environments leave little room for error. But increasingly, software managers are asked to function in a climate of uncertainty. Project estimates must be prepared with little notice and insufficient data. Customers keep changing their minds - and the delivery date - upsetting even the most carefully planned projects. Now more than ever, software managers need answers:
- What will it cost?
- What can I deliver?
- What level of reliability should we commit to?
- How should we staff our maintenance phase?
Without facts, you can't make the right decisions. Your customers won't be happy. Your project will suffer.
Unfortunately, the information you need hasn't always been readily available. The IT Software Almanac was commissioned to give managers, developers, and industry leaders an inside look at the current state of software development. We designed it to answer the critical need for information: a need that all too often goes unmet.
Throughout the almanac, we approach each topic from a variety of perspectives. We start by establishing industry benchmarks for the typical small, medium, and large project. We go on to examine the tradeoffs between time and effort and take an empirical look at the cost (in money, time, and defects) of staffing up to achieve schedule compression.
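The nonlinear cost of schedule compression mentioned above can be sketched with a Putnam-style software equation of the kind QSM's SLIM tools are built on. The specific form, parameter names, and numbers below are illustrative assumptions for this sketch, not figures from the almanac:

```python
def putnam_effort(size_sloc: float, productivity_param: float, time_years: float) -> float:
    """Effort (person-years) from a Putnam-style software equation.

    Assumed form: Size = PP * Effort^(1/3) * Time^(4/3)
    Solving for effort: Effort = (Size / (PP * Time^(4/3)))^3
    """
    return (size_sloc / (productivity_param * time_years ** (4.0 / 3.0))) ** 3


# Illustrative values only: 100 KSLOC, a hypothetical productivity parameter.
baseline = putnam_effort(100_000, 10_000, time_years=2.0)
compressed = putnam_effort(100_000, 10_000, time_years=1.8)  # 10% faster delivery

# Under this model, compressing the schedule by 10% inflates effort
# by a factor of (1/0.9)^4, roughly 1.52x.
print(compressed / baseline)
```

The fourth-power dependence on schedule is what makes "staffing up to compress the schedule" so expensive in money, time, and defects, independent of the particular productivity parameter chosen.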
Becoming more productive is always a big concern to any organization, so we study best and worst in class performers to see what they're doing right - and wrong. But there's another way to look at performance. What kinds of factors affect productivity on in-progress projects? What caveats should you consider when using different metrics to assess progress? Our Size, Language, and Reuse section examines these important questions and comes up with some surprising answers.
Finally, we raise our sights to the long-term implications of advances in the industry. What changes have we seen since the 1980s? What do current trends indicate for the future of system size, productivity, defects, and reuse? What lessons can we derive from the data, and what predictions can we make about the future? The Conclusions section ties together what we've learned from our short- and long-term views of the industry. Are they telling us the same things? We hope to provide accurate, timely, and focused insights that make your projects and organizations more successful.
The software business is full of interesting questions. The answers are in the data.
We're living proof that measurement needn't be a painful process. For almost thirty years QSM has helped Fortune 1000 firms all over the world develop successful metrics programs. As a result, the QSM metrics database is one of the most comprehensive repositories of modern day software projects in existence. It allows us to draw on over 10,000 completed software projects from North and South America, Australia, Europe, Africa, and Asia, representing over 740 million lines of code, 600+ development languages, and 105,833 person years of effort.
We began building the database in 1978. Since that time we've updated it continuously, adding an average of 500 validated projects a year over the last 6 years. Constantly refreshing the database keeps our software estimation and benchmarking tools current and informs our clients as they deal with the challenges of designing and developing quality software in a constantly changing environment. Clients are our main source of project metrics, but estimates, productivity assessments, and cost-to-complete engagements present other opportunities to collect data.
When analyzing software metrics, we consider it vital to make only valid comparisons, so we classify projects into complexity or application domains. IT projects represent by far the largest segment of the database, followed by engineering class, real time, and microcode projects. For the past 27 years we've studied development productivity with respect to cost reduction, time to delivery, and quality improvement, but in 2001 Doug Putnam conducted a study that surprised us a bit.
He sorted IT applications completed between 1982 and 2000 into 3-year bins and ran regression trends on project size, average Productivity Index (an indexed measure of overall project efficiency), schedule, effort, staff, mean time to defect (MTTD), and software reuse metrics.
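The binning-and-trending procedure described above can be sketched in a few lines. The function names, the synthetic data, and the use of ordinary least squares are assumptions for this sketch; the almanac does not specify the exact regression method used:

```python
from collections import defaultdict


def bin_projects(projects, start=1982, width=3):
    """Group (completion_year, metric_value) records into fixed-width year bins,
    keyed by the first year of each bin (1982, 1985, 1988, ...)."""
    bins = defaultdict(list)
    for year, value in projects:
        bins[start + ((year - start) // width) * width].append(value)
    return dict(bins)


def linear_trend(points):
    """Ordinary least-squares fit y = slope * x + intercept for (x, y) pairs."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    slope = sxy / sxx
    return slope, my - slope * mx


# Hypothetical projects: (completion year, Productivity Index).
sample = [(1982, 10.0), (1983, 11.0), (1985, 12.0), (1986, 14.0), (1988, 15.0)]
binned = bin_projects(sample)

# Regress each bin's mean metric value against the bin's start year.
bin_means = [(year, sum(vals) / len(vals)) for year, vals in sorted(binned.items())]
slope, intercept = linear_trend(bin_means)
print(f"trend: {slope:+.2f} PI per year")
```

The same two helpers apply unchanged to each of the metrics in the study (size, schedule, effort, staff, MTTD, reuse): only the value pulled into the `(year, value)` pairs changes.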