We have all unit tests (for a module) in one executable.
The tests are put into groups. I can execute a single test (or some tests) or a group of tests by specifying a (test/group) name on the command line of the test runner.
The build system can run the group "Build", the test department can run "All". A developer can put some tests into a group like "BUG1234", with 1234 being the issue tracker number of the case they're working on.
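As a rough sketch of that kind of grouping (using Python's stdlib `unittest` for illustration; the class names and the group-per-class convention are hypothetical, not the original runner):

```python
import unittest

# Hypothetical convention: one TestCase class per group; the class name is the group name.
class BuildGroup(unittest.TestCase):
    def test_module_smoke_check(self):
        self.assertTrue(True)  # placeholder for a real build-verification test

class Bug1234Group(unittest.TestCase):
    def test_reproduces_the_reported_case(self):
        self.assertTrue(True)  # placeholder for the reproduction of issue 1234

def run_group(name):
    """Run only the tests whose group (class) name matches the requested name."""
    loader = unittest.TestLoader()
    suite = unittest.TestSuite(
        loader.loadTestsFromTestCase(cls)
        for cls in (BuildGroup, Bug1234Group)
        if cls.__name__.startswith(name)
    )
    return unittest.TextTestRunner(verbosity=0).run(suite)
```

Here `run_group("Bug1234")` stands in for passing a group name on the runner's command line, and an empty name plays the role of "All".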
They're NOT ABSOLUTE Reference Documentation
Note that a lot of the following applies to comments as well, as they can get out of sync with the code, like tests (though it's less enforceable).
So in the end, the best way to understand code is to have readable working code.
That is, whenever possible; the exceptions are hard-wired low-level code sections or particularly tricky conditions, where additional documentation will be crucial.
- Tests can be incomplete:
- The API changed and wasn't tested,
- The person who wrote the code wrote the tests for the easiest methods to test first instead of the most important methods to test, and then didn't have the time to finish.
- Tests can be obsolete.
- Tests can be short-circuited in non-obvious ways and not actually executed.
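A minimal sketch of that last trap (stdlib `unittest`; the empty-fixture scenario is an invented illustration): the test is reported green while verifying nothing.

```python
import unittest

class ShortCircuitedTests(unittest.TestCase):
    def test_every_item_is_positive(self):
        items = []  # fixture accidentally left empty, e.g. a data file failed to load
        for item in items:
            self.assertGreater(item, 0)  # never executes: the loop body is skipped
        # No assertion ran, yet the test passes.
```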
BUT They're STILL a HELPFUL Documentation Complement
However, when in doubt about what a particular class does, especially if it's rather lengthy, obscure, and lacking comments (you know the kind...), I do quickly try to find its test class(es) and check:
- what they actually try to check (this hints at the most important tidbits, unless the developer made the mistake mentioned above of only implementing the "easy" tests),
- and if there are corner cases.
Plus, if written using a BDD-style, they give a rather good definition of the class's contract. Open your IDE (or use grep) to see only method names and tada: you have a list of behaviors.
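For instance (a hypothetical `ShoppingCartContract` suite, again in stdlib `unittest`), listing only the method names already reads as the class's contract:

```python
import unittest

# Hypothetical BDD-style suite: each method name states one behavior of the class under test.
class ShoppingCartContract(unittest.TestCase):
    def test_new_cart_is_empty(self): ...
    def test_adding_an_item_increases_the_total(self): ...
    def test_removing_the_last_item_empties_the_cart(self): ...

# The grep/IDE trick, done in code: keep only the test method names.
behaviors = [name for name in dir(ShoppingCartContract) if name.startswith("test_")]
```

Print `behaviors` and you get the promised list of behaviors, without reading a single test body.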
Regressions and Bugs Need Tests Too
Also, it's a good practice to write tests for regression and for bug reports: you fix something, you write a test to reproduce the case. When looking back at them, it's a good way to find the relevant bug report and all the details about an old issue, for instance.
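For example (everything here is invented for illustration: the `parse_quantity` function, the thousands-separator bug, and the reuse of issue number 1234), naming the test after the report makes it point straight back to the tracker:

```python
import unittest

def parse_quantity(text):
    # Fixed version: the original (hypothetically) called int(text) directly
    # and crashed on inputs like "1,000" with a thousands separator.
    return int(text.replace(",", ""))

class RegressionTests(unittest.TestCase):
    def test_bug_1234_quantity_with_thousands_separator(self):
        """Reproduces the report in issue 1234: '1,000' raised ValueError."""
        self.assertEqual(parse_quantity("1,000"), 1000)
```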
I'd say they're a good complement to real documentation, and at least a valuable resource in this regard. It's a good tool, if used properly. If you start testing early in your project, and make it a habit, the tests COULD become very good reference documentation. On an existing project with bad coding habits already stinking up the code base, handle them with care.
Best Answer
There are two subtly different aspects to the documentation of unit tests.
The unit test name should be just as descriptive and chatty as you can make it. This is not for the purpose of documenting the test, but for providing information about a failed test at a higher level. If you see a failed test in a report, the name will tell you why the code failed, and you can go right to the code without having to check the unit test code to get the details. In other words, it documents the purpose of the test.
The unit test code itself should be clear, self-documenting code, like anything else. It should only have additional comments if there is something tricky going on; and with unit tests, there should almost never be something tricky going on.
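A sketch of both points together (the `apply_discount` function under test is hypothetical): the method name alone explains a red entry in a test report, and the body needs no comments at all.

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical code under test."""
    return round(price * (1 - percent / 100), 2)

class DiscountTests(unittest.TestCase):
    # If this fails, the report line already says what broke and for which inputs;
    # the body adds nothing a reader couldn't guess from the name.
    def test_ten_percent_off_100_yields_90(self):
        self.assertEqual(apply_discount(100, 10), 90.0)
```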
So, absolutely not. No formalized, extractable comments. The use of unit tests to "document" the behavior of the code under test stems naturally from their nature as a collection of "example code snippets". But those "code snippets" don't require any additional comments to serve this purpose -- just read the code.