“Agile Metrics” at SD Best Practices Conference – Boston
September 29, 2005
Just returned from Boston where Joshua Kerievsky and I gave a talk entitled “Agile Metrics: Keeping Deadlines from Killing Your Projects and Ruining Your Life.”
A lively reaction came from the audience of 100+ attendees, especially when I commented that “Agile Metrics” seemed an odd oxymoron, akin to terms like “Homeland Security”, “Central Intelligence”, or “Federal Emergency Management.” (Couldn’t resist…) But seriously, what we mean by the term Agile Metrics is actually twofold: first, how to measure quickly and reliably (which you can do on your own), and second, how to gain measurement insights into the “before and after” (think ROI) of implementing an Agile approach.
Joshua began by talking about the void in the industry when it comes to productivity metrics on XP projects. To some extent, he even laid the blame at the doorstep of the Agile community for the absence of measures that senior managers would believe. (“Metrics? We don’t need no stinking metrics.”) He commented that many proponents of XP sought to go it alone, to their eventual chagrin. They didn’t see how to collaborate with those in the field of metrics to understand how XP projects were behaving versus the status quo, and to make a compelling enough case to the senior decision makers who ultimately hold the purse strings on approving an Agile initiative.
Kerievsky then described his excitement at the prospect of synthesizing what we are able to accomplish with the modern benchmarking practices I’ve written about over the last 10+ years, and how this collaboration yielded powerful results that were clear and concise when it came to schedule and defect patterns on XP projects.
My portion of the talk gave insight into how we can quickly and reliably get a diagnostic reading on the outcomes of any development paradigm, and what we saw when we measured “before and after” an Industrial XP implementation at a major medical devices company.
(Rather than repeat the findings described in an earlier post, you can view the results elsewhere on this blog. See the entry on “Reassessing XP.”)
One thing I believe we conveyed in this session was the debunking of some myths about Agile and about software measurement.
Myth #1: Agile is another term for “License to Hack”. I’ve heard critics of XP and other Agile methods dismiss them as undisciplined. But, truth be told, what seems like programmers running amok is actually one of the most disciplined executions of software invention that one can imagine. It seems to me that, done properly, something like Industrial XP is more rigorous than the practices of most organizations that claim a high degree of process maturity. Tim Lister says such teams are at least the equivalent of CMM Level 3.
Myth #2: Metrics is a Frederick Taylor-inspired “heavy methodology” that slows things down by detracting from real work. Ironically, companies that know how to use measures well quickly gain insight into their current IT capacity, can reliably generate project scenarios using advanced modeling techniques in days rather than weeks, and are better able to respond to changes on a project from a management perspective.
So to sum it up, we’re discovering that Agile can be highly disciplined, and that measurement can deliver fast, accurate knowledge about software productivity along with accelerated decision making. In my opinion, we’ve only scratched the surface of this subject.