Episode 4 is the longest episode so far. It is based on an hour-long interview I did with J. B. Rainsberger a few weeks ago. The interview was very intense and we got into a lot of detail. Nevertheless, I had to cut the hour down by at least half, so if something is not clear, it's my fault as the editor.

J. B. Rainsberger helps software companies better satisfy their customers and the businesses they support. He has learned to write valuable software, overcome social deficiencies, and build a life he enjoys. He has traveled the world helping people get what they want out of work and out of their lives. Recently he launched MyAgileTutor.com to help even more people start getting the advice they need with minimal investment. He lives in Atlantic Canada with his wife, Sarah, and three cats.

You can contact Joe through http://www.jbrains.ca/

Q: When was your first contact with TDD and what did you think of it at the time?

JBR: It was either in 1999 or 2000. I was working as a programmer at IBM, about three years into my professional career, and I had reached the point where I really felt uncomfortable with the quality of my work. I would come into the office and find out that I had created a bunch of bugs; I would spend the day trying to fix them. If I was lucky I'd even get to spend some part of the day trying to add some new behavior. I would go home believing that I had done good work, and when I came back the next day there'd be more bug reports. And this cycle just kept continuing. Eventually I reached the point where I felt like I just wasn't making any progress at all, and it started to really make me feel quite anxious. I would go in to work thinking 'Not only am I not making this better, but I don't know when I'm going to finish'.

So finally one day I just said 'No, I have to do something else', I said the magic sentence for any professional programmer: 'There's got to be a better way'. So I opened up a search engine and typed in 'How to test Java code'. I found junit.org and I also found an early draft of the book 'Extreme Programming Installed'. So I downloaded JUnit and started playing around with it. I read this book that had some examples of how to do test-first programming in this very strange language, Smalltalk, which I didn't understand. And with these two things together I started practicing test-first programming - as we knew it at the time. I just started with this idea of: If I write the test then I'll know exactly when I've made a mistake and I can fix it as soon as I have made it.
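The idea in that last sentence, writing the test first so a mistake announces itself the moment it is made, can be sketched in plain Java with no JUnit dependency, so the file runs standalone. The leap-year example and all names here are hypothetical illustrations, not from the interview:

```java
// A minimal sketch of the test-first cycle JBR describes, using plain Java
// asserts instead of JUnit so the file is self-contained. Hypothetical example.
public class TestFirstSketch {

    // Step 1: write the test first -- it fails (won't even compile)
    // until the production code below exists.
    static void testLeapYear() {
        assert !isLeapYear(1900) : "1900 is not a leap year";
        assert isLeapYear(2000)  : "2000 is a leap year";
        assert isLeapYear(2012)  : "2012 is a leap year";
        assert !isLeapYear(2013) : "2013 is not a leap year";
    }

    // Step 2: write just enough code to make the test pass.
    static boolean isLeapYear(int year) {
        return year % 4 == 0 && (year % 100 != 0 || year % 400 == 0);
    }

    public static void main(String[] args) {
        testLeapYear();
        System.out.println("all tests pass");
    }
}
```

Run with `java -ea TestFirstSketch` so the `assert` statements are enabled; in the test-first cycle the test exists, and fails, before `isLeapYear` is written, so any mistake is caught as soon as it is made.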

What I always tell people is that Extreme Programming will divide the world into two groups: the group of people that immediately thinks 'This is crazy and will never work', and the group of people that immediately says 'This is the only way it could possibly work, and how stupid are we for not doing it this way all along'. And I was in the second group.

Q: So nobody had to convince you that this was a worthwhile approach?

JBR: I believed it straightaway. But I wasn't really convinced by it until I had my first industrial-strength experience, and that happened soon after. I had gradually moved into a team of one; I was working on my own component, which was fairly independent from the rest of the system. That gave me a lot of autonomy and the opportunity to try any technique I wanted to make my work better. And then the release deadline was coming up. I remember I was in a project-status meeting, Friday afternoon, about three weeks from the code-freeze date, and I had been going through this cycle for a few weeks: fixing a couple of bugs while creating two new ones. I said to my manager in this meeting: 'Bad news, I'm not going to make this deadline; the ship is leaking faster than I can plug the leaks. The good news is I have been playing around for a couple of months with this idea of test-first programming, where I write the test to make it clear what I'm trying to do, and then I write the code to make that test pass. So here is my proposal: Let me work on this at home where I have no distractions. I'll start again, I'll build the whole thing again, and if I can't finish it in three weeks, you can fire me.' I was 24-25 years old, so I could afford to be this reckless with my career. So he said: 'Fine, go home, go as fast as you can, come back when it's done.'

And that's what I did. I went home and spent every day following the steps of test-first programming. I rebuilt my little component from scratch. I worked until I couldn't stay awake any longer, went to sleep, woke up the next day and kept working. I ended up finishing all the work in nine 14-hour days, and towards the end my rhythm was: write one line of code, then run the tests, which would take 10-12 minutes. Because I was exhausted, I would take a nap while the tests were running, wake up when they had finished, and then take the next step. Towards the end I was making only three or four code changes every hour, but I managed to redo three months of work in nine days. Now I know that some of that was the second-system syndrome, but that was what gave me a feeling of the so-called ratchet effect: the idea that you take a step forward, the ratchet locks in place, and now you can't fall very far.

What is funny to me now: Back then I didn't know anything about using this as a design technique. I was using it purely as a technique for avoiding bugs, for discovering mistakes as soon as I made them. But that experience of rewriting three months of work in nine days really crystallized for me the potential of this technique. I wasn't worried about going faster, I wasn't worried about good design; I was only worried about making steady progress and eliminating this feeling of never going to be finished, of always making it worse instead of better. This episode made it crystal clear for me that test-first programming and I were very compatible, that it matched the way I worked, the way I thought, and it helped solve real problems I had. I was hooked.

Q: What has changed in the way you practice and teach TDD since those early days?

JBR: Early on my emphasis was on removing or avoiding defects, on discovering mistakes quickly. The big change happened when I began to see how it changed the way I thought about design. The first obvious change was that I felt more comfortable making fewer design decisions upfront, that I could let the design emerge and that that was okay. So for a long time I was thinking 'Oh, this is a design technique'. And then a few years later I really started to understand that practicing TDD became a way, not just to design software, but to think about design. TDD became a technique for learning about the principles of design. When I teach TDD, I talk about this progression: it goes from a testing technique to a design technique to a learning technique.

It's like chess. You can learn the rules in five minutes, but the strategies of chess take a lifetime to master. The bad news is: You are never going to be a perfect designer. The good news is: There is always something more to learn about design and here's the way to learn it.

Q: Are there situations in which you consider TDD not to be the right approach for developing software?

JBR: Absolutely. If I'm going to throw it away in five hours, then TDD is not all that valuable. If I'm working in a situation where I know I will throw it away soon, if I really understand the technology, if I really master the libraries, then I tend not to bother. Now, as a long-time practitioner of TDD, it feels like I'm doing TDD really quickly in my head, even when I don't write the tests. So I certainly don't do TDD with the same discipline in those situations as I do in others. If I don't need to maintain software over a long period of time, I can feel more comfortable about writing fewer tests. If the cost of it failing is very low, then I don't worry too much about whether I write tests for it. That's particularly true in cases where there's something small, and I know that if I get it wrong I can just rewrite it from scratch. I feel pretty comfortable doing something a second time. I don't like doing it a third, fourth and fifth time, but I don't mind doing it once, hacking it together, and then - if I see it's going to take on a longer life-span - throwing it away and rebuilding it with more discipline.

Ron Jeffries has his old saying: 'Whenever I don't do things in a disciplined way, sooner or later that causes problems for me, and I wish I had done it in a disciplined way.' For a long time I used to take that to mean: You should always do everything in a disciplined way. And I think everyone goes through that phase and I think that's okay.

Financial people use the term 'overpaying for a guarantee'. When you apply too much discipline too often, you're overpaying for a guarantee. People have to go through this period in order to really understand when they are being wasteful. It's one thing to say 'You don't always have to be perfectly disciplined'; that doesn't mean you understand when to be perfectly disciplined and when not to be. I only know some very obvious places where there's too much discipline, like when I'm going to throw something away in five hours or two days. But as soon as somebody else wants to use my software, I want to write tests. And the more comfortable I feel about redoing some part of it or adding tests later, the more situations there are where I feel comfortable lowering my discipline a little bit.

That's what technical debt is supposed to be all about. Technical debt is not supposed to be all about sloppy code, it's about making that conscious informed decision. I'm willing to let my discipline suffer now knowing that I might have to add some extra discipline back later. That's good debt. That's the kind of debt where you borrow a little bit of money, as long as you know that this will help you obtain more profit later. I think that a lot of people overestimate how good they are at making that trade-off. So when people ask me about "When should I not do TDD?" I say: "As long as you're prepared for the consequences of not working with discipline, you can afford to work with less discipline."

Q: I've seen this work for individuals, maybe for small teams, but for a company I'm always skeptical whether it will ever recover from its technical debt.

JBR: Enterprises tend not to make those kinds of conscious, good-debt decisions. They tend instead to panic because they have low cash flow; then they do all the typical afraid-project-manager things; they make all the typical project-manager mistakes that come out of fear. I've never seen a larger organization, dozens or hundreds of people, that has institutionalized a sensible approach to technical debt. I've seen individual project managers who understand "The Mythical Man-Month" and "Peopleware" and books like that, who succeed in insulating one or two teams from the insanity of the rest of the enterprise. But I've never seen that successfully institutionalized. It usually only takes one or two people to panic, and then the rest of the enterprise panics too.

Q: What do you think is TDD's relevance in today’s world of Lean Startups, functional and concurrent programming and mobile everywhere?

JBR: There are a few things. I worry about TDD’s relevance. I have to make the distinction here between the mechanics of TDD and what I believe is the intent of TDD. The mechanics are following the rules, writing tests first, and so on, while the intent is providing a way for people to understand more deeply how to design well over time.

Let's go into some of those things: The little experience I have with mobile so far has shown me that mobile development seems to rely a lot more on frameworks that don't have competitors. On Android and iOS and so on, there is a small number of very widely adopted frameworks that weren't designed with TDD in mind, and the mechanical part of TDD is difficult to do in those environments. You seem to spend a lot of time side-stepping the frameworks. It's just the same as in the early days of Java web development, when there was just Struts and people spent a lot of their time side-stepping Struts. And then Spring Web MVC came into the picture, which was for me the first Java web framework that made TDD easier, because of its more decoupled, more interface-heavy design style. I haven't seen the Android version of Spring Web MVC yet. It's possible that there will never be one. So it worries me a little that we're going to have a generation of programmers who are used to saying: 'It's just a 99-cent game' or 'The failure modes in this app are so unimportant that we don't have to get it very right, we just have to get it somewhat right'.

Lean Startup encourages people to move in this direction a little bit, and that makes me uncomfortable. There are going to be a bunch of Lean Startups who - once they get going and realize how many corners they've cut in order to get early earned revenue - will realize that their discipline deficit is costing them money and that they need more disciplined techniques. That's one reason, I think, Lean Startup folks need to understand the discipline of TDD - or something like it. Early on, going fast enabled them to bootstrap and to do a better job of getting going with less cash. But eventually cash flow will no longer be the bottleneck; the bottleneck will be profit, and they are going to wish they had a more disciplined approach. I assume that the Lean Startups that are still doing well in 2016, 2017 and 2018 will be the ones that have some kind of disciplined approach that they deploy at the appropriate moment. It doesn't have to be TDD; it might be something like it.

Functional is the one area where I really don't know because I haven't done a lot of functional stuff. I like to say I'm an object-oriented guy who has a functional style, but I don't understand functional design deeply enough to say that with confidence. The little bit of functional design I've seen and understood, I got from a friend who taught me Haskell for three solid days. I had this vague feeling that the libraries are so good that it doesn't feel like there's anything to get wrong, so it doesn't feel like there's anything to test. You just figure out how the pieces go together and it just works. The kind of isolated object testing or isolated function testing that I talk about in "Integrated Tests are a Scam" doesn't apply very much in the functional world. What I think will matter more is what TDD has taught me: the design principles; the discipline in judgment; context-independence and modularity; the mindful approach to programming; the idea that TDD encourages me to stop every few minutes and really think about what I'm doing, rather than trying to figure out everything upfront. Maybe I'll do TDD in my head instead of writing the tests out.
When working in functional languages you need to be even more disciplined, and the question is: How did you learn that discipline? TDD happens to be the way I developed it, some other people have some other techniques, and in the 2020s there might be a whole generation of programmers who gained that discipline without ever using TDD. When that starts to come out, I'll be really interested to learn what that is. And maybe I'll have to unlearn a little bit of this TDD stuff.

Q: You already mentioned your talk "Integrated Tests are a Scam". As I understand the talk, you recommend replacing most or all integration tests with collaboration and contract tests. Is that a fair description?

JBR: Yes, the only thing I would change is the word 'integration' to 'integrated'. Steve Freeman and Nat Pryce [in "Growing Object-Oriented Software, Guided by Tests"] talk about integration tests. They specifically mean the tests that show that my stuff integrates with their stuff correctly, which is what I think of as collaboration and contract tests. 'Integrated' tests are those tests where we put clusters of objects together, and when a test fails we don't know which object is to blame. So let me refine your statement a little bit: I don't like to use integrated tests to find mistakes that, if I had just paid a little more attention, I wouldn't have made. I use the term 'basic correctness', which means: if we had perfect technology with infinite resources and infinite patience, would we compute the correct answer? I don't like to find mistakes in basic correctness with big tests. I prefer to find them with collaboration and contract tests, and then I get all kinds of really good design feedback from those small tests. However, there are a bunch of things that integrated tests are better at finding: they are better at finding emergent behavior that we didn't intend, and better at illustrating when the system does something that it shouldn't do.
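A tiny sketch may make the distinction concrete. Assuming a hypothetical `InvoiceSender` that collaborates with a `MailGateway` (all names invented for illustration, not taken from the talk), a collaboration test uses a hand-rolled spy to check that the sender talks to its collaborator correctly; a matching contract test would then run the same expectations against every real `MailGateway` implementation:

```java
// Hypothetical example of a collaboration test, self-contained (no mock library).
import java.util.ArrayList;
import java.util.List;

// The contract: what any gateway promises to its clients.
interface MailGateway {
    void send(String recipient, String body);
}

// The object under test, which collaborates with a MailGateway.
class InvoiceSender {
    private final MailGateway gateway;
    InvoiceSender(MailGateway gateway) { this.gateway = gateway; }
    void sendInvoice(String customer, double amount) {
        gateway.send(customer, "You owe " + amount);
    }
}

public class CollaborationTestSketch {
    // Hand-rolled spy: records calls instead of sending real mail.
    static class SpyGateway implements MailGateway {
        final List<String> sent = new ArrayList<>();
        public void send(String recipient, String body) {
            sent.add(recipient + ": " + body);
        }
    }

    public static void main(String[] args) {
        SpyGateway spy = new SpyGateway();
        new InvoiceSender(spy).sendInvoice("alice@example.com", 42.0);
        // Collaboration test: did we ask the collaborator the right question?
        if (!spy.sent.get(0).equals("alice@example.com: You owe 42.0"))
            throw new AssertionError("wrong message to gateway");
        System.out.println("collaboration test passed");
        // A matching contract test would run the same expectation against
        // every real MailGateway implementation, closing the loop.
    }
}
```

An integrated test would instead wire `InvoiceSender` to a real mail server; when it failed, either object could be to blame. The pair of small tests above splits that one question in two.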

The scam part is a different question altogether. The scam part is where I think big tests encourage us to design poorly; and designing poorly makes it harder to write small tests, so we write more big tests, which encourage us to design even more poorly; and that cycle continues. That's the scam. I think that integrated tests actually make the very problem we think they solve worse. It's like buying aspirin that gives you a bigger headache.

Q: I see the theoretical beauty of your basic-correctness approach. What I find missing is that, in order to make this approach work, people have to track their contracts in a more formal way.

JBR: Yes, I've heard this a bunch of times. When somebody says this to me I reply: 'What you're saying means that you have to really understand how to integrate with other people's stuff. Well, of course you do!' Imagine for a moment that you could perfectly understand every aspect of my API that you intend to use. Imagine that by some magic you could do that without writing everything down. Then you don't need to write everything down. But for most people, most of the time, if they don't write that stuff down, they don't think about it precisely enough to get it right. I'm just offering one way to write it down: contract and collaboration tests.

Many thanks, Joe, for answering my questions!

Other episodes of the series: