Welcome to another chapter in the feature-rich story of DevOps!
Last time, we discussed why the history of software development is important and how waterfall development fit into that narrative. Remember that waterfall development was ironically rather rigid. It lacked the flexibility to adapt to change, a noticeable weakness in a world that is increasingly volatile.
In this chapter, we’re going to explore how (and to what extent) engineers iterated on the waterfall model through Agile Software Development. Instead of trying to control change by locking down phases of development, Agile methodology is about embracing change. Risk is lessened not by devising the perfect plan, but by cutting projects into small chunks and adapting on the fly.
But enough spoilers! Let’s dig around in the roots of Agile philosophy.
While it would be really convenient if waterfall development led directly to agile development, the actual timeline isn’t so clean. Even in Winston Royce’s 1970 paper, there was already a general awareness of the need for iteration and lightweight processes. This awareness only grew as developers discovered weaknesses in prevailing workflows.
These workflows were heavy, inflexible things. Software was still treated as something to be manufactured, and the methods in vogue reflected this mindset. Behemoths like Microsoft focused on planning and eliminating surprises during development. Perfectionism was encouraged, manifesting in astronaut architecture and Big Design Up Front.
Astronaut architecture is very hard to change.
Generally, Agile methodology was a reaction to overzealous documentation and low thresholds for change. The software industry, advocates believed, was too focused on process and planning — to the detriment of the product and customer. While features languished in bureaucratic, corporate purgatories, everyone suffered: developers couldn’t ship their creations, products fell behind the competition, and users were condemned to either use outdated technology or flee to a competitor.
All this pressure to improve led to a slew of workflow experiments in the 90s. The results of these experiments were the precursors to Agile methodology and shared common beliefs around speed, risk, and chaos.
Rapid Application Development
At IBM, James Martin created Rapid Application Development, which emphasized an adaptive process and prototyping over the heavy up-front planning of waterfall development. He consolidated his ideas into a 1991 book, creatively titled Rapid Application Development (RAD).
RAD had no trouble finding fans, as prototyping became increasingly viable across the industry. But its fans disagreed on the proper way to RAD, in part because of the intentionally vague instructions left by Dr. Martin. Businesses were cautious, hesitant to embrace a fad without a guarantee that they wouldn’t be sacrificing quality for speed.
Dynamic Systems Development Method
In 1994, the Dynamic Systems Development Method emerged to unify the various versions of RAD. Unlike its predecessor, DSDM was spearheaded not by an individual, but by a consortium of motivated vendors and experts. And instead of just outlining a process, DSDM got its hands dirty by defining explicit techniques and project roles.
Two of those noteworthy techniques are timeboxing and MoSCoW prioritization. Timeboxing involves cutting a project into chunks that have resource caps on both time and money. If either resource begins to run out, efforts are focused on the highest priorities, as defined by MoSCoW. While the actual acronym is questionable, MoSCoW's intents are sound. Look at what those capital letters stand for:
Must have
Should have
Could have
Won't have
By grouping requirements into these four levels, engineers always know exactly which ones to drop when pressed for time or money. Which is inevitable.
Must have every color ever.
Both timeboxing and MoSCoW are concerned with time, safeguarding against the software industry’s traditional perfectionism. By implementing constraints on how much work can be done and establishing frameworks for the value of that work, DSDM kept people moving.
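The interplay between timeboxing and MoSCoW can be sketched in a few lines of Python. Everything here is hypothetical (the requirement names, the cost figures, the greedy strategy); the point is only that when the budget runs out, lower-priority levels are the first to be dropped.

```python
# A minimal sketch of MoSCoW prioritization inside a timebox.
# Requirement names and day costs below are invented for illustration.

MOSCOW_ORDER = {"Must": 0, "Should": 1, "Could": 2, "Won't": 3}

def plan_timebox(requirements, budget_days):
    """Greedily fill a timebox, highest MoSCoW level first.

    requirements: list of (name, level, cost_in_days) tuples.
    Returns the names of requirements that fit within budget_days.
    """
    planned = []
    # Consider "Must" items before "Should", "Should" before "Could", etc.
    for name, level, cost in sorted(requirements, key=lambda r: MOSCOW_ORDER[r[1]]):
        if level == "Won't":
            continue  # explicitly out of scope for this timebox
        if cost <= budget_days:
            planned.append(name)
            budget_days -= cost
    return planned

backlog = [
    ("export report", "Could", 3),
    ("user login", "Must", 5),
    ("dark mode", "Won't", 2),
    ("password reset", "Should", 2),
]
print(plan_timebox(backlog, budget_days=8))  # → ['user login', 'password reset']
```

Shrink the budget from 8 days to 5 and only the "Must" item survives, which is exactly the behavior MoSCoW is designed to make automatic.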
But folks weren't done iterating! In 1995, Ken Schwaber and Jeff Sutherland went to a research conference on Object-Oriented Programming, Systems, Languages & Applications. The acronym for this is OOPSLA, which sounds accidental but is actually very intentional.
There, Schwaber and Sutherland presented a jointly-authored paper on a new process called Scrum. Scrum was yet another framework for developing software and focused on collaboration and flexibility. The term comes from rugby, where both teams restart play by putting their heads down and focusing on getting the ball. It’s not a subtle metaphor.
Diligent, athletic software developers with their heads down.
Scrum is one of the first philosophies that explicitly emphasized agility and empirical process control. All of Scrum’s values revolve around a single principle: uncertainty. Are you certain the customers will like this? Are you certain you understand everything about this project?
True certainty, argue the Scrumsters, is impossible to achieve: an asymptote of productivity. Where waterfall attempted to define everything ahead of time, Scrum abandoned that fantasy in favor of smaller batch sizes, honing the delivery process instead and focusing on adaptation and emerging requirements. This is the essence of Scrum and drives all of its tactics.
Those tactics include familiar routines like sprints, daily standups, and retrospectives. All of these are meant to drive work in focused bursts of productivity. The regular check-ins provide opportunities to course correct and aim energy at emerging priorities.
One important tool of Scrum is the backlog. There are usually multiple backlogs at any given point: one for the product generally, and then a backlog per sprint. These backlogs give the team an efficient method for prioritizing work. These artifacts are so important to proper Scrum execution that whole industries have grown up around them. Pivotal Labs' Tracker and Atlassian's Jira are just two of the issue trackers built to serve them.
The final entrant into this motley crew of methodologies is Extreme Programming (XP), coined by Kent Beck in March 1996. Beck was working on a payroll project with Chrysler when he devised the basic tenets of XP. While the project was cancelled, the methodology survived. And what does the methodology teach? Gather all the best practices and ratchet them up to an EXTREME level.
For example, if some testing is a Good Thing, then more testing must be a Better Thing. This translates to test-driven development: where tests describing behavior are written before the code proper. This practice not only increases the total number of tests in an application, but also changes how the code itself is written.
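As a toy illustration of that test-first rhythm (a payroll-flavored example as a nod to Beck's Chrysler project; the function, rates, and rules here are all invented), the tests get written before the function they describe even exists:

```python
import unittest

# Step 1 (red): describe the behavior we want first. gross_pay does not
# exist yet, so these tests fail until the function below is written.
class TestGrossPay(unittest.TestCase):
    def test_regular_hours(self):
        self.assertEqual(gross_pay(hours=40, rate=10.0), 400.0)

    def test_overtime_paid_at_time_and_a_half(self):
        self.assertEqual(gross_pay(hours=45, rate=10.0), 475.0)

# Step 2 (green): write just enough code to make the tests pass.
def gross_pay(hours, rate):
    regular_hours = min(hours, 40)
    overtime_hours = max(hours - 40, 0)
    return regular_hours * rate + overtime_hours * rate * 1.5

# Step 3: run the suite (e.g. `python -m unittest`), then refactor
# with the tests as a safety net.
```

Because the test existed first, the overtime rule is pinned down as an executable specification rather than a comment, which is the behavioral change TDD is after.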
Another example involves optimization of feedback loops. Advocates of XP argue that a ruthless focus on short feedback loops increases the odds that you’re making the right thing. To that end, Extreme Programmers should write code for the present and receive feedback on that code as quickly as possible.
This commitment to feedback manifests in practices like pair programming, where developers write code in two-person units. By having to articulate what they’re writing, engineers write better code and find bugs before they happen. Teams also commit to frequent releases, often manifesting in processes like continuous integration, which is somewhat selfish foreshadowing for a future chapter in this series!
Fantasia on an Agile Theme
If you’ve noticed a theme here, you’re not alone. While these philosophies technically emerged independently of one another, there was a general awareness of the need for speed. But along with that collective recognition, there was also a genuine lack of semantic unification: words like evolutionary and adaptive were being tossed around, but these had different nuances depending on which methodology you happened to be living in.
There were simply too many cooks in the kitchen, and some of these cooks hadn’t even met each other! Some of these thought leaders had crossed paths at other conferences, but it took a motivated soul named Bob Martin to herd all the cats into one place. In September of 2000, Martin sent out an email to gauge interest, and the rest is… well, the rest is below.
Like all great philosophies, Agile proper originated in Utah. There, seventeen developers — many of them representatives of the aforementioned methodologies — met to organize their thoughts on what Being Agile even meant. The actual Agile Manifesto isn’t very long, consisting of twelve principles total. These principles adhered to four values that summarized how Agile advocates were to make decisions:
Individuals and interactions > processes and tools
Working software > comprehensive documentation
Customer collaboration > contract negotiation
Responding to change > following a plan
The Agile Manifesto consolidated several existing methodologies under one proverbial roof: from Extreme Programming, they lifted customer satisfaction and collaboration; from Scrum, they incorporated the introspection of standups and retrospectives; and from Pragmatic Programming, they took the… pragmatism.
It's important to note that the Agile Manifesto was not itself a new methodology, but rather a synthesis of existing methodologies. The "Agile Alliance" brought all the best parts together, hoping that the whole would be better than its various components. And it seems to have worked: VersionOne publishes an annual "State of Agile" report measuring the methodology's consistently positive impact on the industry.
An early adopter of Agile methodology.
To many engineers, however, “Agile” has become a buzzword — a term brandished by companies desperately seeking to prove their relevance to a shallow pool of talent. The “Agile Alliance” itself can seem somewhat pompous: why should a group of self-proclaimed leaders at a ski resort be able to decide what the industry should or shouldn’t become?
The idea of Agile can sometimes distract from the Agile Manifesto itself. Ultimately, Agile methodology is nothing more than a collection of engineering principles, created to help software teams make practical decisions more quickly. It is not a well-defined process or magical recipe for success, and companies that say they "are Agile" are often the furthest from living those principles.
Why So Cynical?
But even if Agile is just a collection of engineering principles, does that make it unhelpful or impractical? Not necessarily. Agile is a movement. And like all movements, it’s easy to become emotionally invested without understanding everything about it. It doesn’t help that “agile” is also just a normal word that means something simpler than an Entire Movement.
Agile Methodology is a reaction to — and revision of — a way of doing things that developers didn’t think was working. It irked enough of them that they had to retreat to a safe space to collect their thoughts on the matter. What came out of that was a list of principles that weren’t themselves processes, but could be used to generate processes.
And that’s the larger theme here: the mission of this gathering was to help people change the way they thought about developing software. Instead of a product to be manufactured, code should be an art to be crafted. That subtle difference implies a host of alternate strategies, such as timeboxing, pair programming, and TDD.
Waterfall was slow and rigid. It couldn’t adapt to change, so engineers tried to control it. But that just wasn’t responsive enough to deliver quality software customers actually wanted. Agile methodology was a response to this weakness, and its advocates realized that they could never completely eliminate risk or uncertainty from a project. Instead, they focused on containing risk through rapid development and constant validation that their work was heading the right way.
This idea of constant validation will be the seed for the next chapter in the history of DevOps: continuous integration and delivery!