My enforcement of Test-Driven Development has earned me a lot of nicknames. There's Test <insert name of WWII-era German socialist party>1. Some people "lighten it up" to Test<last three syllables of a Nicaraguan socialist party>2. I personally prefer TDD Sith Lord, because I can type that without any footnotes or disclaimers, and as far as I can tell, the Sith have not been terribly active in the last century.
All of the names connote that I'm a misguided, power-hungry TDD advocate. So why am I such a fervent enforcer of TDD?
Years before I started TDD, I knew plenty of arguments in favor of developing software that way, but on paper they weren't convincing enough; I needed concrete reasons to convert me to the dark side (which, by the way, I now know is the correct side).
TDD forces us to write tests.
How do you get started with TDD? My TDD career began on a project where we (fortunately) were already sold on the benefits of automated testing. We knew we needed those tests; we knew we wanted to know as soon as possible when we'd broken something. We just didn't know how to incorporate test writing into our development process.
Perhaps my original decision was for logistical reasons, but following the Red-Green-Refactor TDD cycle forces me to write tests; it's built in as the "Red" step of the cycle!
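To make the cycle concrete, here's a hypothetical turn through Red-Green-Refactor in NUnit-style C#. The Greeter class and test are invented for this sketch (and assume the NUnit package); the point is that the failing test exists before the production code does.

```csharp
using NUnit.Framework;

// Red: this test is written first, and fails, because Greeter doesn't exist yet.
[TestFixture]
public class GreeterTests
{
    [Test]
    public void Greet_ReturnsHelloWithName()
    {
        Assert.AreEqual("Hello, Ada!", new Greeter().Greet("Ada"));
    }
}

// Green: the simplest production code that makes the test pass.
public class Greeter
{
    public string Greet(string name) => $"Hello, {name}!";
}

// Refactor: with the test green, rename and restructure freely,
// rerunning the test after each change to confirm nothing broke.
```

The "Red" step is what makes the test non-negotiable: you can't get to "Green" without having written it first.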
TDD prevents us from writing untestable code.
About as soon as we adopt TDD, we run into one of the most popular impediments: legacy code. We've all worked on that one DoEverything() method that's approximately a jillion3 lines long, unindented, and overwrites most of the data in the database (the production database, because that's the only one there is). How do we test such a monolith?
Through the modern miracles of source control integration, I'm one click away from seeing the developer for whom I can harbor a secret resentment. Now, however, I'm writing my test code at the same time as my production code. When following the TDD cycle, it's virtually impossible for me to write untestable code, so I'm no longer the target of resentment for writing untestable code4.
TDD discourages us from creating unnecessary code.
How much time have we spent troubleshooting and correcting bugs in code that isn't even necessary to solve the problem?
One of the most elegant pieces of code I've ever seen came from a Scrum Alliance Certified Scrum Developer course. The final exam comprised a few requirements for a cash register, and we had a couple of hours to write tests and implement the code to make them pass. The requirements were simple:
- Buying 1 banana costs 79 cents.
- Buying 1 orange costs 99 cents.
I over-engineered a bunch of classes to manage products and orders and calculations and all kinds of things I didn't need5. I'd have been proud of myself for finishing my grandiose order entry system in only 45 minutes, but another student finished in just a few minutes. Here's all the code needed to pass the exam:
public static int CalculatePrice(Product product)
{
    return product.Name == "Banana" ? 79 : 99;
}
The requirements didn't mention any other product types, orders, or any of the extra classes and code I'd written. Yet I wrote a lot of unnecessary code, which took much longer to write and debug6. When following the TDD cycle, I can write only the code needed to make the tests pass.
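For illustration, here's what the tests driving that one-liner might have looked like. The CashRegister and Product names are my guesses (the original snippet didn't show its enclosing class), and the NUnit package is assumed:

```csharp
using NUnit.Framework;

// Assumed supporting types, reconstructed to make the sketch self-contained.
public class Product { public string Name { get; set; } }

public static class CashRegister
{
    public static int CalculatePrice(Product product)
    {
        return product.Name == "Banana" ? 79 : 99;
    }
}

[TestFixture]
public class CashRegisterTests
{
    [Test]
    public void Banana_Costs79Cents() =>
        Assert.AreEqual(79, CashRegister.CalculatePrice(new Product { Name = "Banana" }));

    [Test]
    public void Orange_Costs99Cents() =>
        Assert.AreEqual(99, CashRegister.CalculatePrice(new Product { Name = "Orange" }));
}
```

Two requirements, two tests, one expression. Nothing about future product types earns a line of code until a test demands it.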
This is a rather extreme example. Certainly, if I know what's coming down the road on a real-world project, I might enhance my code accordingly. However, I've become cautious about writing solutions for problems that don't exist yet. All too often, I see projects include stories to prepare for future features many sprints or releases down the road. When those features get de-prioritized or cancelled, all that's left is the unnecessary code.
TDD lets us focus on small portions of code.
Have you ever had an entire sprint's work rejected because the tester's password expired? All too often, the scope of our tests is a bit too broad. Our debugging process is often pretty bloated, too. We've followed this development cycle far too often:
- Run the web site7.
- Log in.
- Navigate to a particular page.
- Enter some data.
- Maybe it's a lot of data.
- Click a button to generate a result.
- Wait for the result.
- Realize that you didn't change whatever code you were going to change.
- Make whatever tiny change you were going to make.
- Goto 1.
I've worked on projects where running the web site, navigating to the right page, and setting up all the data takes over 5 minutes. Imagine trying to troubleshoot when you have a 5-minute penalty every time you run!
Following the TDD cycle, I can write my automated test code once and let the computer do all of that setup work for me. Running a test takes a fraction of a second, which means I can make a lot of little changes without wasting a lot of time for setup.
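Compare that manual loop with a unit test that performs the setup in code. Everything here is invented for illustration (the Order class, the line-item API), but the shape is the point: arrange in code, act, assert, rerun in milliseconds.

```csharp
using NUnit.Framework;
using System.Collections.Generic;

[TestFixture]
public class OrderTests
{
    [Test]
    public void Total_SumsLineItems()
    {
        // Arrange: the "log in, navigate, enter data" steps become plain object setup.
        var order = new Order();
        order.AddLine(price: 79, quantity: 2);
        order.AddLine(price: 99, quantity: 1);

        // Act: the "click the button" step.
        int total = order.Total();

        // Assert: the "wait and eyeball the result" step.
        Assert.AreEqual(257, total);
    }
}

// A hypothetical class under test.
public class Order
{
    private readonly List<int> _lines = new List<int>();

    public void AddLine(int price, int quantity) => _lines.Add(price * quantity);

    public int Total()
    {
        int sum = 0;
        foreach (var line in _lines) sum += line;
        return sum;
    }
}
```

The ten-step manual cycle collapses into one keystroke, which is exactly what makes lots of tiny changes cheap.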
These are the things that hooked me on TDD.
Your experience may vary, but it took me a while -- years, perhaps -- to get to the point where I was faster with TDD than without it. Early on, writing both tests and code slowed me down, but eventually test writing, abstraction, and dependency injection became second nature to me. Now I work best when I deal with small, test-sized snippets. I'm more efficient -- and happier -- when I follow the TDD cycle. Even if it makes me a TDD Sith Lord.
Footnotes (in case your browser doesn't show tooltips for title tags)
1 Test Nazi. No more soup for you.
3 Jillion. It's a lot.
4 Besides, I'm too busy being the target of resentment for my TDD enforcement.
5 If the hardware had been available, I'd have wired up a bar code scanner and a produce scale.
6 I still got my certification; it just took me almost 10 times longer than it had to.
7 Heavens forbid it's an old Windows Service with those 10-second delays coded in so we can get the debugger attached.