re: I use test-driven development every day, ask me anything


I've been trying to get into TDD, and find myself able to do it only in specific isolated cases. I still feel like I'm missing the bigger picture.

  1. What's the hardest stuff to test?
  2. If something is hard to test, what approaches could we take to make it easier to test?
  3. How much effort should I put into testing? Should it get to the point where I can hit deploy and, after a while, that code goes to production (CI/CD)? Or is this just a pipe dream for those with a lot of manpower haha.

Thanks 😄


Hey Brad,

In my opinion there are two categories of code that are difficult to test: code I don't understand, and code that is tightly coupled (I include testing across the wire and asynchronous code in the tightly coupled group).

Code I don't understand

If I don't really understand the details of what I am trying to implement, attempting to write it with TDD will be very difficult. For instance, if I had never implemented the observer pattern before, I would first create a Spike in order to learn how I would structure that code. Once I have learned that, I throw away the untested code in the Spike and use TDD to write the production code.
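To make that concrete, here is a minimal sketch of what the post-Spike, test-first code might look like. All the names (`Subject`, `subscribe`, `notify`) are illustrative, not from the original post; the point is that the test is written against behavior you now understand from the Spike.

```python
# Illustrative observer-pattern subject, written test-first after a Spike.
# Names are hypothetical, not from the original post.

class Subject:
    """Notifies every subscribed observer callback when an event occurs."""
    def __init__(self):
        self._observers = []

    def subscribe(self, observer):
        self._observers.append(observer)

    def notify(self, event):
        for observer in self._observers:
            observer(event)

# The test that was written first, before the class above existed:
def test_subscribed_observer_receives_event():
    received = []
    subject = Subject()
    subject.subscribe(received.append)
    subject.notify("saved")
    assert received == ["saved"]

test_subscribed_observer_receives_event()
```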

Tight coupling

Tightly coupled code prevents you from using bits of code in isolation; you end up bringing everything and the kitchen sink into the test just to run the function or class. There are two strategies I could use, depending on who owns the code.

If the code is part of the system I am working on, I will make an attempt to decouple it, using the patterns laid out in Working Effectively With Legacy Code by Michael Feathers. I also balance delivering working software against adding coverage: I only decouple enough code to allow me to test my new feature. The rest might happen at a later time. When learning TDD, I think it is important not to "boil the ocean", and to just take the wins when you can get them. There are times for bigger refactorings, but it is up to you to determine the right moment.
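One of the simplest of those patterns is introducing a seam: a class that constructed its dependency inline gets it through the constructor instead, which is just enough decoupling to test the new behavior. A hedged sketch, with every name (`OrderService`, `FakeDb`, `place_order`) invented for illustration:

```python
# Sketch of "decouple just enough" via a constructor seam.
# All names are hypothetical, not from the original post.

class FakeDb:
    """Test double standing in for the real database client."""
    def __init__(self):
        self.saved = []

    def save(self, record):
        self.saved.append(record)

class OrderService:
    def __init__(self, db):
        # Seam: the db used to be constructed inline, which dragged the
        # whole persistence layer into every test. Injecting it lets a
        # test substitute a fake without touching the rest of the system.
        self.db = db

    def place_order(self, item, qty):
        # The new feature under test: validation before saving.
        if qty <= 0:
            raise ValueError("quantity must be positive")
        self.db.save({"item": item, "qty": qty})

db = FakeDb()
service = OrderService(db)
service.place_order("book", 2)
assert db.saved == [{"item": "book", "qty": 2}]
```

Note that nothing else about the system had to change; the seam covers the new feature, and any further decoupling can wait.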

If I don't own the code, I test at a higher level of abstraction. For instance, if two classes are using framework code that requires them to be tightly coupled, I don't try to break them apart. The framework has already made a decision about coupling, and it is going to be nearly impossible for me to change it without making the code unreadable. Instead of fighting a losing battle, I create a class or a function that wraps that interaction and test from there. The tests call out the high-level behavior the framework code provides.
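A sketch of that wrapper approach, with hypothetical names throughout (`PageFetcher`, `FakeSession`; there is no real framework here): the coupled framework pieces hide behind one intention-revealing call, and the test exercises that high-level behavior through a fake.

```python
# Sketch of wrapping a framework interaction instead of decoupling it.
# All names are illustrative, not from the original post.

class PageFetcher:
    """Wraps the coupled framework pieces behind one high-level call."""
    def __init__(self, session_factory):
        # The factory is the only knob the test needs: production passes
        # the real framework session, the test passes a fake.
        self._session_factory = session_factory

    def fetch_title(self, url):
        session = self._session_factory()
        page = session.get(url)
        return page["title"]

class FakeSession:
    """Test double for the framework session; no network involved."""
    def get(self, url):
        return {"title": f"Title of {url}"}

fetcher = PageFetcher(FakeSession)
assert fetcher.fetch_title("/home") == "Title of /home"
```

The framework classes stay coupled exactly as the framework intended; only the wrapper's behavior is under test.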

How much?

It is my personal opinion that a team should have enough tests in place to be confident that things will work when the system is deployed to production. When a team pushes to prod and feels so scared that they spend an hour verifying the behavior of the system by hand, that tells me they didn't write enough automated tests.

CI vs CD

Every team I have worked on has had a CI pipeline in place, since making sure the master branch is always in a working state is necessary. However, due to the industries I have worked in, I have never seen automated hourly deploys to production. It was often the case that my team was one of a handful that deployed twice a week, and that seemed pretty good.

Whether to go further than that is more a function of your current company. Asking, "Is that much automation valuable to our users?" will tease out whether that work is needed.

Thanks for the questions!
