Tuesday, November 26, 2013

Waterfall vs Agile

Clay Shirky, who is not a software developer but who is both a very smart guy and a very good writer, has written an essay about the healthcare.gov development process that is well worth reading: Healthcare.gov and the Gulf Between Planning and Reality.

I'm not sure how plugged-in Shirky was to the actual healthcare.gov development effort, so his specific comments on that endeavor are perhaps inaccurate, but he has some fascinating observations about the overall software development process.

Shirky starts by describing the challenges that arise when senior management have to oversee a project whose technology they don't understand, and draws an analogy to the historical changes that occurred when technology changed the media industry:

In the early days of print, you had to understand the tech to run the organization. (Ben Franklin, the man who made America a media hothouse, called himself Printer.) But in the 19th century, the printing press became domesticated. Printers were no longer senior figures — they became blue-collar workers. And the executive suite no longer interacted with them much, except during contract negotiations.

It's certainly a problem when technology executives don't understand the technology in the projects they oversee. However, Shirky has another point to make, which is about the choice of development processes that can be used in a software development project.

The preferred method for implementing large technology projects in Washington is to write the plans up front, break them into increasingly detailed specifications, then build what the specifications call for. It’s often called the waterfall method, because on a timeline the project cascades from planning, at the top left of the chart, down to implementation, on the bottom right.

As Shirky observes, in a wonderfully pithy sound bite:

By putting the most serious planning at the beginning, with subsequent work derived from the plan, the waterfall method amounts to a pledge by all parties not to learn anything while doing the actual work. Instead, waterfall insists that the participants will understand best how things should work before accumulating any real-world experience, and that planners will always know more than workers.

This is just a brilliant point, so true, and so well stated. The great breakthrough of agile techniques is to realize that each step you take helps you comprehend what the next step should be, so allowing feedback and change into the overall cycle is critical.

Shirky then spends the remainder of his wonderful essay discussing policy-related matters such as the federal government's procurement policies, the implications of civil service bureaucracy, etc., which are all well and good, but not things I really feel I have an informed opinion about.

Where I wish to slightly object to Shirky's formulation, though, is in the black-and-white way that he portrays the role of planning in a software development project:

the tradeoff is likely to mean sacrificing quality by default. That just happened to this administration’s signature policy goal. It will happen again, as long as politicians can be allowed to imagine that if you just plan hard enough, you can ignore reality. It will happen again, as long as department heads imagine that complex technology can be procured like pencils. It will happen again as long as management regards listening to the people who understand the technology as a distasteful act.

This is, I think, a common misstatement of the so-called "agile" approach to software development: Agile development processes do NOT eliminate planning! Shirky worsens the problem, in my opinion, by setting up a dichotomy between "planning" and "reality", starting with the title of his essay and running all through its content.

To someone not deeply immersed in the world of software development process, Shirky's essay makes it sound like:

  • Waterfall processes involve complete up-front planning
  • That typically fails with software projects, because we're trying to do something new that's never been done before, and hence cannot be fully planned out ahead of time
  • Therefore we should replace all that futile planning with lots of testing ("reality")

It's that last step where I object.

Agile approaches, properly executed, solve this up-front planning problem by decomposing the overall project into smaller and smaller and smaller sub-projects, and decomposing the overall schedule into smaller and smaller and smaller incremental milestones. HOWEVER, we also decompose the planning into smaller and smaller and smaller plans (in one common formulation, captured on individual 3x5 index cards on a team bulletin board or wall), so that each little sub-project and each incremental milestone is still planned and described before it is executed.
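As a concrete, if toy, illustration of that plan-a-little, build-a-little, learn-a-little loop, here is a short Python sketch. It's purely my own illustration, not anything from Shirky's essay or from any particular agile framework, and the names in it (Card, plan_next_card, and so on) are invented for the example: each small card gets its own short plan, and that plan is written only after we've absorbed the lessons from the cards that came before it.

    from dataclasses import dataclass, field

    @dataclass
    class Card:
        """One small, fully planned unit of work (think: a 3x5 index card)."""
        description: str
        plan: str
        lessons_learned: list = field(default_factory=list)

    def plan_next_card(backlog, lessons_so_far):
        """Plan only the next small step, folding in everything learned so far."""
        description = backlog.pop(0)
        plan = f"do '{description}', taking into account: {lessons_so_far or 'nothing yet'}"
        return Card(description, plan)

    def execute(card):
        """Stand-in for actually doing the work; returns what the team learned."""
        return f"what we learned while doing '{card.description}'"

    backlog = ["sign-up form", "eligibility check", "plan comparison"]
    lessons = []
    while backlog:
        card = plan_next_card(backlog, lessons)  # this card's plan reflects everything learned so far
        lessons.append(execute(card))            # real-world feedback feeds the next plan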

That is, we're not just winging it.

Rather, we're endeavoring to make the units of work small enough so that:

  • Everyone on the team can understand the task being undertaken, and the result we expect it to have.
  • Regularly and frequently, everyone on the team can reflect on the work done so far, and incorporate lessons learned into the planning for the next steps.

Shirky does a good job of conveying the value of the latter point, but I think he fails to understand the importance of the former point.

You can't settle for a situation in which management doesn't understand the work you're doing. Shirky is clearly aware of this, but perhaps he's never been close enough to a project run using agile approaches to see the techniques such projects use to ensure that all members of the team are able to understand the work being done. (Or perhaps he just despairs of the possibility of changing the behavior of politicians and bureaucrats.)

Regardless, don't just listen to my rambling; go read Shirky's essay and keep it in mind the next time you're involved in a large software development project.
