TDD is not a one-size-fits-all approach to software, but it's pretty damn useful. It's also a reasonably difficult habit to form.
For those who TDD regularly, what made things click?
Top comments (62)
One of the biggest "aha" moments I've had came while working on a legacy codebase with very few unit tests.
I needed to change the behavior of a couple of methods, but I wasn't super sure what their current behavior was to begin with.
A lot of folks would reach for their debugger and spend the next while meticulously stepping through the code. But that felt slow to me...
So I put together a suite of tests around them (prior to making any code changes) to verify and document what they were really doing. Once that was done, the overall change was as simple as adjusting a couple of assertions, causing those tests to fail and reimplementing the methods to meet their new expectations.
It's one thing to see TDD as a way to confidently write a brand new feature, it's a whole other ballgame to see it as a way to confidently navigate and tease apart legacy code. It was like being in a super dark tunnel, and finding a flashlight. Literally everything got easier.
And don't even get me started on working in an old codebase with a good suite of tests. Pure. Bliss.
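For illustration, a characterization test along those lines might look like the sketch below. apply_discount is a made-up stand-in for the legacy method, defined inline only so the example is self-contained; in a real codebase it would be imported from the legacy module instead.

```python
# A hedged sketch of the approach described above: "characterization" tests
# that pin down what a legacy method does today, before changing anything.
import unittest


def apply_discount(price, customer):
    # Stand-in for the tangled legacy method whose behavior we are documenting;
    # in the real codebase this would be imported, not defined here.
    return price * 0.9 if customer == "vip" else price


class ApplyDiscountCharacterizationTests(unittest.TestCase):
    def test_regular_customer_pays_full_price(self):
        # Records current behavior, whether or not it is "right".
        self.assertEqual(apply_discount(100, "regular"), 100)

    def test_vip_customer_currently_gets_ten_percent_off(self):
        # When the requirement changes, adjust this assertion first,
        # then reimplement the method until it passes again.
        self.assertEqual(apply_discount(100, "vip"), 90.0)


if __name__ == "__main__":
    unittest.main()
```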
Same here. I was implementing an embedded client of a client/server system. I did write tests, but not TDD. Then our backend developer left and I needed to take over the code. It was a complete mess: slow, spaghetti code, and no tests. A developer horror story. So I wrote test after test to freeze the current behavior, found and fixed tons of bugs, and threw away half of the code after refactoring.
From that moment on I was in love with TDD. The next project I did was all TDD: Zero bugs in production.
Fantastic example. Sounds like a great way to understand exactly what the code is doing, document it, and keep it safe for future devs.
Can't say I've ever taken this approach before, but I'm gonna try it out next time I'm working on modifying some legacy code.
I'm kind of a coding puzzle fanatic, and CodeWars has TDD built in. I've found it's way easier to teach people TDD after they've used CodeWars, because they've already used testing!
Oh no, another seemingly great source of knowledge for me to sink into, rather than study! I'll check this out in the next few days, thanks!
Initially, it didn't click because I was introducing testing to Legacy code, which wasn't very test-friendly...
The first time we had a brand new class in the code, and it was started with testing in mind - it was amazing.
And then what really brought the point home was when we started changing that class and breaking tests (in a good way). That feeling of confidence while introducing breaking changes was what got me hooked.
I don't use TDD in my normal coding workflow because it's not something that makes sense in that context (still waiting for that aha moment, but I expect it will never come).
For fixing bugs, I think it was when a more senior developer ran me through the process: crystallising the conditions that caused the bug made it seem very appealing.
For those who regularly use TDD, can you explain how it's meant to work for greenfield work?
I'm currently working on a project which we started back in Jan 2015, greenfield, with TDD through and through.
It's an API, and the first test we wrote was to successfully do a GET to an endpoint, say /v1/user. Then we add the route, create the view and the serializer (it's a Django REST Framework API) and we get a response. Completely dummy at first, since the test only expects a 200_OK status_code. Then we add an assertion for the content, which must be an array. Now we create the model, with some fields, and make sure the wiring on the serializer is in place.
Later on, we would add tests that represent a bunch of business rules, like: if you create a Foo object, a related Bar object should exist in the database, with a given set of characteristics. And that's how we keep the system evolving, adding new features and modifying behavior while making sure everything else stays in place.
EDIT: For a more practical example, one can have a look at this test file, which is from a personal project of mine. Even though it's not based on the library I mentioned above (this one uses Flask), I took the same approach with it. I started with the most basic test, then moved bit by bit, writing tests that would depict the outcome I wanted.
I can see how it would make sense for a REST API where you have a defined endpoint and response format, and actually not a use case I had considered TDD for, nice one.
My issue with TDD is how it's meant to apply, if at all, when you're adding a high-level feature, since TDD implies you write tests for the units (functions) first. How do you know what functions to have, or do you go full waterfall and do all the design up front? In that kind of case I go for TLD, but TDD advocates really don't like that and I'm not sure why.
It's hard to describe it without a clear example, but I can assure you I have created and evolved dozens of high-level features using TDD. And, yes, it's true that TDD needs clear and defined requirements, but that does not mean that these requirements are final.
You see, if you have any given requirement, fully detailed or not, and you need to start implementing it, you will always have a scenario to implement; if it was not defined by the Product Owner (or similar role), you will take a guess and go for it, right? So you create a test for that scenario. Make it pass. Validate the feature with the P.O.
Oh, that's not what he had in mind? Then ask him to point out how it is not. Now you have a clear requirement. Modify the test. Change the code. Validate the work.
TDD, at its core, is about incremental change. Don't take giant leaps of faith; instead, take small, clear steps.
Finally, if you could provide me with the scenario you have in mind I could help you out with it. Hit me up on Twitter. :)
I don't think I've ever heard TDD described in terms of incremental steps; the focus has always been at the unit level.
I don't have an example at the moment; this is mainly from my last job, where I couldn't figure out how to fit TDD in for new features. It was still very much a start-up in how the day-to-day went, so devs were given a feature to implement and that was it. Aside from the requirement, there wasn't even a function to call, so you couldn't really write a high-level test at the start.
You seem to be confusing unit testing with TDD.
Unit tests are one type of tests, the bottom layer of the Test Pyramid, which also defines the Services and UI tests.
Whereas TDD means Test-Driven Development. Simply put, it says that you should write a test first, and only then write the code to make it pass. After that, you refactor, add a new assertion (requirement) to make it fail again, and write the code to fix it. It can be any kind of test: an integration test that talks to a 3rd-party system, a unit test that mocks all external calls but asserts that the internals of an endpoint behave properly, or an automated UI test that validates user input and the error-message display.
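As a hedged illustration of that cycle (not from the book, and using a made-up price_with_discount function), the flow might look like this with pytest:

```python
# A minimal red-green-refactor sketch; price_with_discount is a hypothetical
# function that did not exist when the first test was written.

def price_with_discount(price, rate):
    # "Green": the simplest code that makes the current tests pass.
    return round(price * (1 - rate), 2)


def test_discount_is_applied():
    # "Red" first: this test was written before price_with_discount existed.
    assert price_with_discount(100.0, 0.2) == 80.0


def test_zero_rate_keeps_the_original_price():
    # A later assertion (a new requirement) that failed until the code handled it.
    assert price_with_discount(59.99, 0.0) == 59.99
```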
It seems people these days have not read Test-Driven Development. They end up getting to know TDD from different places, with different levels of adoption, etc. I highly recommend that anyone read it. It is simple, clear, and small, and on top of that it is super effective. I usually name it as one of my career-changing books (the other being The Agile Samurai).
This is all good - one quick clarification: a Unit test does not necessarily test a function/class/object/whatever.
It tests a unit of behaviour.
TDD does not mean adding one test per function/class/whatever. It should mean writing a test that describes the behaviour of your program, and then writing code that implements that behaviour.
Still waiting.
From pure observation of the mobile teams I've been in contact with, I've noticed how much TDD is not a thing.
To be frank, TDD is quite hard when a UI is involved, but you can still practice it on the behaviors, I guess (I don't know much about iOS development :D).
It is a very difficult concept to wrap your brain around. I like the idea of it, and I'm very interested in learning it more thoroughly, but in practice it's just not clicking.
Yeah, I get that. Sometimes I do TDD and sometimes I write tests after. If I write tests after, I write the shells of the various tests so I won't forget to check them later.
TDD helps me a lot with refactoring and with legacy code. I don't always practice it when I'm exploring.
Maybe you can go back to the source and read the "original" TDD book (still quite expensive but worth it): Test-Driven Development: By Example
OoOoO! Thanks! I'll add it to the list!
When I realized that writing test-able code resulted in me writing better code regardless of the tests I wrote. Just by ensuring my code is easy to test meant that my code was more focused, organized and well designed.
"Ugh, this massive class is going to be a pain to test. I better break it down into smaller, more composable pieces to make my life easier" => Single Responsibility Principle
"Ugh, I don't want to have to mock out these constructors and static methods. I better pass those dependencies in to make my life easier" => Dependency Inversion Principle
That's the S and the D in SOLID - 2 out of 5 key design principles, just to make my life easier in the short term. How can you not follow something that gives you that?
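To make the second point concrete, here is a hedged sketch (with made-up ReportService and EmailSender classes) of how passing a dependency in, rather than constructing it internally, keeps the test trivial:

```python
# Injecting the dependency lets the test substitute a fake instead of
# mocking constructors or static calls.
from dataclasses import dataclass


class EmailSender:
    def send(self, to: str, body: str) -> None:
        raise NotImplementedError  # the real implementation talks to SMTP


@dataclass
class ReportService:
    sender: EmailSender  # dependency is passed in, not constructed inside

    def send_report(self, to: str) -> None:
        self.sender.send(to, "weekly report")


class FakeSender(EmailSender):
    def __init__(self):
        self.sent = []

    def send(self, to, body):
        self.sent.append((to, body))


def test_send_report_uses_the_injected_sender():
    fake = FakeSender()
    ReportService(sender=fake).send_report("a@example.com")
    assert fake.sent == [("a@example.com", "weekly report")]
```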
TDD also puts strong pressure on the developer to honor the L, the Liskov Substitution Principle, because mocking/fakes require it.
And it even puts some light pressure toward the I, the Interface Segregation Principle... depending on how deep the code goes into DIP.
Very good points as well!
I'm struggling with linking Open/Closed Principle to TDD. Anyone else want to step up?
They seem orthogonal IMO.
Perhaps we can say that a module that's closed for modification is not going to see as much rewriting of tests. In other words, the need to throw away tests could be a signal that you're not making as much use of inheritance as you could be. (Though perhaps not a useful signal, since if you're modifying the code then it's already obvious that it isn't closed, and there might also be a good reason for that.)
For the open-for-extension part -- if you're inheriting from something, you don't need to touch its tests, so I wouldn't expect any interplay with TDD.
To be honest, I think that the other parts tie in is just due to the fact that TDD is most practical when unit testing small, decoupled modules, and if your code is organized around classes, that mostly implies SOLID. In other words, to the extent that TDD and SOLID correlate, modularity might be a mediating factor.
I say this because I think you can argue that in the FP paradigm, SOLID-like properties are easier to hit just as a matter of course, in which case TDD wouldn't have as much architectural impact. (That said, there's certainly nothing preventing one from writing large, tightly-coupled functions.)
Guard is a gem - github.com/guard/guard
It has plugins like guard-rspec.
What Guard does is watch a number of files and restart your server when needed (when the Gemfile, routes, etc. change).
guard-rspec adds to that by watching your test files and automatically running them when you save.
So my 'aha' moment is a bit of an outlier, because I learned to code using TDD! It's weird for me to think about writing code without there being some sort of expectation about what it's going to do - even if it's just 1 + 1, I expect that to do something even as I write it. Tests are those expectations written down, and TDD is writing those expectations first.
I've written a lot of bad tests - too coupled, too slow - I'm no expert, but I'll always try to write a test first.
For me it was more the other way round: I got the gist of TDD, but breaking bad habits is hard. It was hard to break the habit of diving straight into building the function/feature instead of writing the tests first. But once I made the conscious effort to write tests before building the feature, I started seeing my programming skills go up a notch. To the point where I now feel like a fish out of water when I don't write unit tests.
The thing that TDD forced me to do (as a junior dev back then) was to question the purpose and outcome of the task and feature requirements. Only then can I write my unit tests. If I can't even write unit tests for a task, it means that I am still unclear on the task's requirements. This is especially true in larger companies, where the chain of command may be long and, with that, there are greater chances of errors and misunderstandings in communication...
Although I've been solving code katas for quite a while, they didn't bring me the aha moment - even though I was following TDD. But when I was working on a data migration tool where quite complex but very well-specified conversions had to be implemented, there I had it. I think it would have taken much longer to implement and debug those tasks without TDD.