Unit Testing – When Is It Appropriate to Skip Unit Tests?

Tags: tdd, unit-testing

I work at a small company as a solo developer; in fact, I'm the only developer at the company. I have several (relatively) large projects I've written and maintain regularly, and none of them have tests to support them. As I begin new projects, I often wonder whether I should try a TDD approach. It sounds like a good idea, but I honestly can never justify the extra work involved.

I work hard to be forward-thinking in my design. I realize that one day another developer will certainly have to maintain my code, or at least troubleshoot it. I keep things as simple as possible, and I comment and document anything that would be difficult to grasp. The fact is, these projects aren't so big or complicated that a decent developer would struggle to comprehend them.

A lot of the examples I've seen of tests get down to the minutiae, covering all facets of the code. Since I'm the only developer and I'm very close to the code in the entire project, it is much more efficient to follow a write-then-manually-test pattern. I also find requirements and features change frequently enough that maintaining tests would add a considerable amount of drag on a project, time that could otherwise be spent solving business needs.

So I end up with the same conclusion each time: the return on investment is too low.

I have occasionally set up a few tests to ensure I've written an algorithm correctly, such as calculating the number of years someone has been at the company based on their hire date. But from a code-coverage standpoint, I've covered about 1% of my code.
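To illustrate, a test like that can be as small as the sketch below. It's Python with pytest, and the function name and the anniversary-counting rule are assumptions for illustration, not my actual code:

```python
from datetime import date

def years_at_company(hire_date: date, today: date) -> int:
    """Whole years of service, counting completed anniversaries."""
    years = today.year - hire_date.year
    # Subtract one if this year's anniversary hasn't arrived yet.
    if (today.month, today.day) < (hire_date.month, hire_date.day):
        years -= 1
    return years

def test_exactly_one_year():
    assert years_at_company(date(2010, 6, 1), date(2011, 6, 1)) == 1

def test_day_before_anniversary():
    assert years_at_company(date(2010, 6, 1), date(2011, 5, 31)) == 0

def test_leap_day_hire():
    # A Feb 29 hire's anniversary falls on Mar 1 in non-leap years.
    assert years_at_company(date(2020, 2, 29), date(2021, 2, 28)) == 0
    assert years_at_company(date(2020, 2, 29), date(2021, 3, 1)) == 1
```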

In my situation, would you still find a way to make unit testing a regular practice, or am I justified in avoiding that overhead?

UPDATE: A few things about my situation that I left out: My projects are all web applications. To cover all my code, I'd have to use automated UI tests, and that is an area where I still don't see a great benefit over manual testing.

Best Answer

A lot of the examples I've seen of tests get down to the minutiae, covering all facets of the code.

So? You don't have to test everything. Just the relevant things.

Since I'm the only developer and I'm very close to the code in the entire project, it is much more efficient to follow a write-then-manually-test pattern.

That's actually false. It's not more efficient. It's really just a habit.

What other solo developers do is write a sketch or outline, write the test cases, and then fill in the outline with final code.

That's very, very efficient.
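For instance, the whole cycle fits in one file. A minimal sketch, assuming pytest; the function and its spec here are hypothetical:

```python
import pytest

# Step 1: sketch the outline -- just the signature and intent.
#     def parse_employee_id(raw: str) -> int:
#         """Accept 'EMP-1234' style IDs; reject anything else."""
#         raise NotImplementedError

# Step 2: write the test cases against that outline.
def test_valid_id():
    assert parse_employee_id("EMP-1234") == 1234

def test_rejects_missing_prefix():
    with pytest.raises(ValueError):
        parse_employee_id("1234")

# Step 3: fill in the outline with final code until the tests pass.
def parse_employee_id(raw: str) -> int:
    """Accept 'EMP-1234' style IDs; reject anything else."""
    prefix, _, digits = raw.partition("-")
    if prefix != "EMP" or not digits.isdigit():
        raise ValueError(f"bad employee id: {raw!r}")
    return int(digits)
```

The tests were cheap to write because the outline already fixed the interface; the final code just has to satisfy them.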

I also find requirements and features change frequently enough that maintaining tests would add a considerable amount of drag on a project.

That's also false. The tests are not the drag; the requirements changes are the drag.

You have to fix the tests to reflect the requirements, whether they're minutiae or high-level, written first or written last.

The code's not done until the tests pass. That's the one universal truth of software.

You can have a limited "here it is" acceptance test.

Or you can have some unit tests.

Or you can have both.

But no matter what you do, there's always a test to demonstrate that the software works.

I'd suggest that a little bit of formality and a nice unit-test tool suite make that test a lot more useful.
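For example, assuming pytest and a web application served locally (the URL below is hypothetical), the limited "here it is" acceptance test can sit right next to the unit tests and run with the same one-word command:

```python
import requests  # third-party HTTP client: pip install requests

BASE_URL = "http://localhost:8000"  # hypothetical local deployment

def test_homepage_responds():
    # The "here it is" check: the app is up and serving HTML.
    resp = requests.get(f"{BASE_URL}/", timeout=5)
    assert resp.status_code == 200
    assert "html" in resp.headers.get("Content-Type", "")
```

Running `pytest` then demonstrates, in one pass, both that the pieces work and that the assembled application answers.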