Unit-testing – the effect of creating unit tests during development on time to develop as well as time spent in maintenance activities

maintenance, quality, unit testing

I'm a consultant, and I am going to introduce unit tests to all developers at my client site. My goal is to ensure that all new applications have unit tests for every class created.

The client has a problem with high maintenance costs from fixing bugs in their existing applications. Their applications have a life span of between 5 and 15 years, during which they continuously add new features. I'm quite confident that they would benefit greatly from starting to write unit tests.

I'm interested in the effect of unit tests on the time and cost of development:

  • How much time will writing unit tests as part of the development process add?
  • How much time will be saved in maintenance activities (testing and debugging) by having good unit tests?
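For concreteness, here is a minimal sketch of the kind of per-class test I have in mind, written with Python's built-in unittest module (the Account class is a made-up example, not the client's actual code):

    import unittest


    class Account:
        """Hypothetical domain class standing in for the client's real code."""

        def __init__(self, balance=0):
            self.balance = balance

        def deposit(self, amount):
            if amount <= 0:
                raise ValueError("deposit amount must be positive")
            self.balance += amount


    class AccountTest(unittest.TestCase):
        def test_deposit_increases_balance(self):
            account = Account(balance=100)
            account.deposit(50)
            self.assertEqual(account.balance, 150)

        def test_deposit_rejects_non_positive_amounts(self):
            with self.assertRaises(ValueError):
                Account().deposit(0)


    if __name__ == "__main__":
        unittest.main()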

Best Answer

Are there any statistics available on how much longer it takes to develop applications when creating unit tests during development, compared to just coding?

There is some very interesting research about this. Read the following whitepaper:

Realizing quality improvement through test driven development: results and experiences of four industrial teams

The whitepaper and other research from one of its authors, Nachi Nagappan, are discussed here: http://research.microsoft.com/en-us/news/features/nagappan-100609.aspx

The study and its results were published in a paper entitled Realizing quality improvement through test driven development: results and experiences of four industrial teams, by Nagappan and research colleagues E. Michael Maximilien of the IBM Almaden Research Center; Thirumalesh Bhat, principal software-development lead at Microsoft; and Laurie Williams of North Carolina State University. What the research team found was that the TDD teams produced code that was 60 to 90 percent better in terms of defect density than non-TDD teams. They also discovered that TDD teams took longer to complete their projects—15 to 35 percent longer.

“Over a development cycle of 12 months, 35 percent is another four months, which is huge,” Nagappan says. “However, the tradeoff is that you reduce post-release maintenance costs significantly, since code quality is so much better. Again, these are decisions that managers have to make—where should they take the hit? But now, they actually have quantified data for making those decisions.”
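To put those numbers together, here is a back-of-the-envelope sketch of the tradeoff. Only the 12-month cycle, the 15-35 percent overhead, and the 60-90 percent defect reduction come from the article; the maintenance baseline is a purely hypothetical number for illustration:

    # Rough tradeoff arithmetic based on the figures quoted above.
    dev_months = 12                        # development cycle from the quote

    for overhead in (0.15, 0.35):          # TDD took 15-35% longer
        extra = dev_months * overhead
        print(f"{overhead:.0%} overhead -> {extra:.1f} extra dev months")
    # 35% of 12 months is 4.2 months, Nagappan's "another four months".

    # Assume bug-fixing effort scales with defect density (a strong
    # simplification) and a hypothetical 24 months of maintenance work.
    maintenance_months = 24
    for reduction in (0.60, 0.90):         # 60-90% lower defect density
        saved = maintenance_months * reduction
        print(f"{reduction:.0%} fewer defects -> ~{saved:.1f} months saved")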

Additionally, Jason Gorman has proposed such an experiment for this year's Software Craftsmanship conference. He has been trying the experiment himself, creating the same application using both a TDD and a non-TDD approach, and he recently blogged about his results:

Over 3 iterations, the average time taken to complete the kata without TDD was 28m 40s. The average time with TDD was 25m 27s. Without TDD, on average I made 5.7 passes (delivering into acceptance testing). With TDD, on average I made 1.3 passes (in two attempts they passed first time; in one it took 2 passes).

Now, this was a baby experiment, of course. And not exactly laboratory conditions. But I note a couple of interesting things, all the same.

It will be interesting to see the full results of this experiment when more people perform it.
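If you haven't seen the rhythm that these experiments compare, here is one red-green TDD cycle in miniature, again using Python's unittest (FizzBuzz is just a stand-in kata, not necessarily the one Gorman used):

    import unittest


    # Step 1 (red): write the test first, before fizzbuzz exists,
    # and watch it fail.
    class FizzBuzzTest(unittest.TestCase):
        def test_multiples_of_three_say_fizz(self):
            self.assertEqual(fizzbuzz(3), "Fizz")

        def test_other_numbers_pass_through(self):
            self.assertEqual(fizzbuzz(1), "1")


    # Step 2 (green): write the simplest code that makes the tests pass.
    # Step 3 (refactor): clean up, then repeat with the next test.
    def fizzbuzz(n):
        if n % 3 == 0:
            return "Fizz"
        return str(n)


    if __name__ == "__main__":
        unittest.main()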

Are there any statistics available that show how much maintenance time (in hours) decreases when you have (good) unit tests?

From the whitepaper above:

The results of the case studies indicate that the pre-release defect density of the four products decreased between 40% and 90% relative to similar projects that did not use the TDD practice.