Discussion on: Is testability a reason to change your design?

Chris Raser

Code design is always a game of trade-offs between concerns. But yes, testing/testability is an important consideration. It may help you and Person A find common ground if you're clearer about what concrete benefits you hope to achieve by testing. It's easy to write off testing as "just testing", but it's harder to dismiss the usefulness of those tests in supporting refactoring efforts, preventing regressions, etc.

I love Sarah Mei's note on Five Factor Testing. It breaks down the goals of testing, and offers some great insight into how to write better tests and better code if you're clear about which of those goals is most important to you.

Lars Richter

Hey Chris,

Thanks a lot for the tip about "Five Factor Testing". The article is very good and pretty insightful.
Valuable stuff.
Thanks.

Eljay-Adobe

Thanks for the link to Sarah Mei's article on Five Factor Testing.

The one thing I'd change a little bit in Sarah's article is where she talks about integration tests. I don't think developers should write any integration tests (and system tests, and functional tests, and acceptance tests, and performance tests, and...); that is what the quality engineers should create. Otherwise the chance that an integration test has a "blind spot" matching the implementation's self-same "blind spot" approaches unity.

In a previous project, the unit test suite (~70% code coverage) took about a second to run. Unit tests are what the developers should create. They're the proof of basic correctness, the design-in-the-small guidance, the refactoring safety net, the regression catcher for violations of basic correctness, and the documentation-via-code of the functionality.

Unit tests make sure the nut passes all its requirements and the bolt passes all its requirements, but they say nothing about the nut and the bolt working together. Effectively, unit tests fill the gap for languages that do not provide facilities for design-by-contract -- which is (unfortunately) most of them.
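
To make that concrete, here's a minimal sketch (Python with pytest; the function and its numbers are purely hypothetical illustrations) of a contract living in unit tests because the language can't express it directly:

```python
import pytest


def thread_depth(pitch_mm: float) -> float:
    """Basic external-thread depth for a metric fastener.

    The contract a design-by-contract language could state in the declaration:
      pre:  pitch_mm > 0
      post: 0 < result < pitch_mm
    """
    if pitch_mm <= 0:
        raise ValueError("pitch must be positive")
    return 0.61343 * pitch_mm  # illustrative thread-depth factor


# Since the language can't enforce the contract, the unit tests restate it.
def test_rejects_non_positive_pitch():
    with pytest.raises(ValueError):
        thread_depth(-1.25)


def test_depth_is_positive_and_less_than_pitch():
    depth = thread_depth(1.25)
    assert 0 < depth < 1.25
```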

On that same project, the integration test suite took over 600 hours to run. The integration suite answered the question: when you put this nut and this bolt together, do they work together correctly?

Integration tests (and system tests, and acceptance tests, and performance tests) serve a very different purpose than unit tests.
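
Here's a toy sketch of that nut-and-bolt distinction (the classes are hypothetical, and a real integration suite would of course exercise real subsystems rather than two in-memory objects):

```python
class Bolt:
    def __init__(self, diameter_mm: float, pitch_mm: float):
        self.diameter_mm = diameter_mm
        self.pitch_mm = pitch_mm


class Nut:
    def __init__(self, diameter_mm: float, pitch_mm: float):
        self.diameter_mm = diameter_mm
        self.pitch_mm = pitch_mm

    def fits(self, bolt: Bolt) -> bool:
        # A nut only threads onto a bolt with the same diameter and pitch.
        return (self.diameter_mm == bolt.diameter_mm
                and self.pitch_mm == bolt.pitch_mm)


# Unit tests: each part meets its own requirements.
def test_bolt_has_requested_dimensions():
    bolt = Bolt(diameter_mm=8.0, pitch_mm=1.25)
    assert bolt.diameter_mm == 8.0 and bolt.pitch_mm == 1.25


def test_nut_has_requested_dimensions():
    nut = Nut(diameter_mm=8.0, pitch_mm=1.25)
    assert nut.diameter_mm == 8.0 and nut.pitch_mm == 1.25


# Integration-style test: do the two parts work *together*?
def test_nut_threads_onto_matching_bolt():
    assert Nut(8.0, 1.25).fits(Bolt(8.0, 1.25))
```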

Joe Rainsberger has a good presentation, Integrated Tests Are A Scam, where he argues passionately that integration tests are no substitute for unit tests. I think the title is deliberately a bit inflammatory, to pique curiosity.

Also, Behavior-Driven Development style stories that are written so they can be executed -- for example, with the Cucumber story runner and the Gherkin story language -- should be written by the product owner, perhaps with the assistance of the business analysts. If they are being written by testers or by developers, it's being done wrong.

Chris Raser

All excellent points. For a large, fully functional development organization, I wholeheartedly agree with everything you've pointed out.

I think Mei and Lars (the original poster) are in similar situations, in that they are either working on small teams where roles blur, or with company/team cultures that don't fully value automated testing. In those circumstances, it's a victory just to have automated unit and integration tests, regardless of who's writing them. As they say, "Perfect is the enemy of good."