Thomas C. Redman recently wrote about data quality on the Harvard Business Review blog. In his post, he creates a vignette of an executive who finds an error in data provided by the "Widgets Department" for an important meeting. The executive corrects the error, the meeting is a huge success, and the story ends there. Redman argues that someone should have gone back to the Widgets Department to report the error, not to complain that it could have ruined the presentation, but to warn that it could ruin the next person's.
David McCandless gave a TED talk in July 2010 that focused on pairing data and design to help visualize patterns. In his talk, McCandless takes subsets of data (Facebook status updates, spending, global media panic, etc.) and creates diagrams which expose interesting patterns and trends that you wouldn't think would exist. Although the focus of McCandless' talk was about how to effectively use design to present complex information in a simple way, I was struck by his own claim that data is not the new oil, but rather that data is the new soil. For QSM, this is certainly true!
If you were unable to attend our recent webinar, Using Function Points and SLIM to Support a Complete Estimation Process, a replay is now available.
The thirty years I have spent in software have spanned a period of remarkable and ever-accelerating change. Mercifully, coding an online system on a black-and-white CRT that accesses an IMS database is mostly a quaint memory. Technology, tools, and processes have all evolved. Why is it, then, that we continue to have the same problems we experienced in the Information Technology Dark Ages? Here are the symptoms:
- Software projects continue to overshoot their schedules
- Quality problems have neither disappeared nor lessened to an acceptable level
- Budgets are regularly exceeded, sometimes wildly
- Project estimates are inaccurate
I see two principal reasons, though I'm certain there are others.