TDD - Listening to Your Tests

The practice of TDD is a handy design tool, especially around the levels of classes and methods within those classes. It’s only useful in this way if you’re actively listening to your tests, however.
The most obvious piece of feedback is whether or not your code works, but there’s far more feedback you get when writing a test than this.
Here’s a quick list of things that come to mind when I’m writing tests:
- How much friction am I feeling when writing the test?
- How long is it taking me to write each test?
- Am I required to think about many different concepts at once while writing the test?
- How dependent is my test on the implementation? A test will always depend on the implementation, but by how much?
- Have I written a similar test before in another area in the system?
- Is my test file growing too big with too many tests?
- Are there too many mocks to keep track of?
- How easy to read are the tests? How much knowledge of the implementation is required to understand the test? How closely do they relate to the use case I’m implementing?
- Is the interface I’m designing of benefit to the rest of the system and is it easy to use?
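To make the "how dependent is my test on the implementation?" question concrete, here is a minimal sketch in Python. The `PriceCalculator` class and its method names are hypothetical, invented purely for illustration. The first test pins down a private helper and breaks under harmless refactors; the second asserts only the observable behaviour.

```python
from unittest.mock import patch

# Hypothetical class for illustration only.
class PriceCalculator:
    TAX_RATE = 0.2

    def total(self, net):
        return round(self._apply_tax(net), 2)

    def _apply_tax(self, net):
        return net * (1 + self.TAX_RATE)

# Implementation-coupled: this test breaks the moment _apply_tax is
# renamed or inlined, even though the behaviour is unchanged.
def test_total_calls_apply_tax():
    calc = PriceCalculator()
    with patch.object(calc, "_apply_tax", return_value=120.0) as spy:
        calc.total(100.0)
    spy.assert_called_once_with(100.0)

# Behaviour-focused: survives any refactor that keeps the contract,
# and reads like the use case being implemented.
def test_total_includes_tax():
    assert PriceCalculator().total(100.0) == 120.0
```

Both tests pass today; only the second keeps passing as the internals evolve, which is the kind of feedback this list of questions is trying to surface.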
Beginners at TDD tend to skip some or all of these questions. Perhaps they're under pressure, or working in an environment that only measures how long a story takes, so they plough forward rather than taking the extra time to think about and reflect on the tests. Each time this happens, damage accumulates and slowly spreads throughout the system. Competitive environments also encourage developers to take these shortcuts, as nobody wants to look slow compared to the rest of the team.
Dealing with these questions is the process of refactoring. Refactoring is a continuous process. Refactoring isn't putting tickets on a board with a plan to do it later, or prioritising it amongst stories. It's deeply mixed in as you implement new features.
The best time to refactor is precisely the moment you realise you need to refactor - there are no dependencies to deal with, there are no past features you need to retest; this is the easiest it gets. Realising when you need to refactor and finding out as early as possible is where TDD fits in. TDD gives you feedback on the design of your code for every test you write, immediately.
The alternative is to ignore the issues you feel, and continue implementing the rest of the features you need, carrying small pieces of debt as you go. Small amounts of debt then build up and spread to other classes and other tests. Now, if you want to refactor, you’ve got to go and change this other stuff too. If the tests are high maintenance you’ll likely have to fix the structure of some of the existing tests, which further increases the risk that you might break something.
On short projects with small teams, you’ll probably get away with avoiding these questions and ploughing forward. But, on larger projects with larger teams, the risk will only keep growing until it can’t be ignored.
The most common reason I hear for skipping these refactors is that the issues are too small and don't really matter much. But the whole point of refactoring is that the issues we're dealing with are so tiny that we can do it continuously. Ignoring the small issues creates larger ones, until eventually 'refactoring' becomes a type of story on the board that has to be negotiated and prioritised with the product owner. I also find this excuse to be a very slippery slope, where more and more falls under "doesn't really matter much".
Start Listening
With every test I write, I’m ‘feeling’ how good or bad things are based on the questions in the previous section, along with things like the SOLID principles and thinking about coupling and cohesion.
I have two different options when I want to resolve any of my doubts:
- Change the test code
- Change the implementation code
You'll often use a combination of both, improving the test code and the implementation code together. Choosing which one to change can be difficult, and the judgement comes with experience gained by trying different things out.
For example, say you have a class that is very complicated to set up in a test. One solution is to hide all this complicated setup in test helper classes or functions that are used by the test code. Entity Framework comes to mind here; I've seen some pretty extravagant mock setups that create mocks and pass other mocks into the mocks, and it's all far too complicated. When tests become complicated to set up, something is wrong - there's always a simpler way.
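As a sketch of hiding complicated setup behind a helper, here is a Python example using `unittest.mock`. The repository/session/connection names are hypothetical stand-ins for the kind of object graph (mocks inside mocks) described above; the point is that one intention-revealing helper keeps the wiring out of every individual test.

```python
from unittest.mock import MagicMock

# Hypothetical object graph for illustration: a repository wrapping a
# session wrapping a connection - the shape that invites mocks-in-mocks.
def make_order_repository(orders=()):
    """Test helper: hides the mock graph behind one readable call."""
    connection = MagicMock(name="connection")
    session = MagicMock(name="session", connection=connection)
    repository = MagicMock(name="repository", session=session)
    repository.all.return_value = list(orders)
    return repository

# The test now reads at the level of the use case, not the wiring.
def test_lists_orders():
    repo = make_order_repository(orders=["order-1", "order-2"])
    assert repo.all() == ["order-1", "order-2"]
```

Hiding setup like this is a stopgap, though - as the next paragraph argues, the better fix is often to make the class itself easier to construct.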
If a class is hard to test, don't battle against test setups; instead, look at making the class you're testing easier to test. A class that is easy to test is usually easier to use and definitely easier to understand.
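One common way to make a class easier to test is to inject its dependencies rather than letting it construct them internally. This sketch (the `DailyReport` class is a hypothetical example, not from the article) injects a clock, so the test passes a deterministic one instead of patching module internals:

```python
from datetime import datetime

class DailyReport:
    def __init__(self, clock=datetime.now):
        # Injected dependency; production code uses the real clock by default.
        self._clock = clock

    def header(self):
        return f"Report for {self._clock().date().isoformat()}"

# The test supplies a fixed clock - no mocking framework needed at all.
def test_header_shows_report_date():
    fixed = lambda: datetime(2024, 1, 15)
    assert DailyReport(clock=fixed).header() == "Report for 2024-01-15"
```

The same injection that makes the class testable also makes it more flexible for callers, which is the "easier to test is easier to use" point above.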
It’ll Feel Slow
These questions will make you feel slow; this is where the temptation to ignore them comes in. The problem is, the more you ignore the questions, the more often they will keep coming up as you deal with the existing code that's already been written.
The productivity I have with TDD in a codebase developed with these questions in mind is much higher than a codebase that has ignored these questions. Code that hasn’t been developed in this way usually generates more friction when writing the tests. I then have to investigate and see if I can improve the situation. I can’t always solve it entirely (especially with legacy code), so I end up carrying around this extra bit of weight in the other tests I need to write.
Resolving these questions as you go won’t make the code perfect; you’ll still need to refactor and adapt the code as you learn more as the project goes on. But, it does put you in the best possible position to deal with any changes in the future as soon as they come up. That’s what agility (and agile) is about. Needing to prioritise and plan a refactor is very far from what I would describe as agile.