Why to let / not let developers test their own work

Tags: development-process, testing

I want to gather some arguments as to why letting a developer test his/her own work as the last step before the product goes into production is a bad idea, because unfortunately, my place of work sometimes does this. The last time this came up, the argument boiled down to most people being too busy with other things and not having the time to get another person familiar with that part of the program (it's very specialised software).

There are test plans in this case (though not always), but I am very much in favor of having the final testing done by someone who didn't make the changes being tested. So I am asking if you could provide me with a good, solid list of arguments I can bring up the next time this is discussed. Or provide counter-arguments, in case you think this is perfectly fine, especially when there are formal test cases to run.

Best Answer

As others (and yourself) have noted, developers should unit test their own code. However, after that, any nontrivial product should also be tested by independent person(s) (QA department and/or the client herself).

Developers normally work with the developer mindset of "how do I make this work?". A good tester thinks "how do I break this?" - a very different mindset. Unit testing and TDD do teach developers to change hats to some extent, but you shouldn't rely on it. Moreover, as others have noted, there is always a possibility of misunderstanding requirements. Therefore final acceptance tests should be conducted by someone as close to the client as possible.
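To make the two mindsets concrete, here is a small illustrative sketch (the `parse_quantity` function and its rules are hypothetical, invented just for this example): the developer's tests confirm the intended behaviour, while the tester's tests deliberately feed in inputs the happy path never considers.

```python
def parse_quantity(text):
    """Hypothetical example: parse a user-entered quantity, which must be a positive integer."""
    value = int(text.strip())
    if value <= 0:
        raise ValueError("quantity must be positive")
    return value

# Developer mindset: "how do I make this work?" - verify the intended usage.
assert parse_quantity("3") == 3
assert parse_quantity(" 10 ") == 10

# Tester mindset: "how do I break this?" - probe edge cases and invalid input.
for bad in ["0", "-1", "abc", "", "2.5"]:
    try:
        parse_quantity(bad)
        raise AssertionError(f"accepted invalid input: {bad!r}")
    except ValueError:
        pass  # rejected as expected
```

A developer who just wrote `parse_quantity` tends to write only the first two assertions; the loop of hostile inputs is exactly the kind of check an independent tester, unburdened by knowledge of the implementation, is more likely to think of.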
