Why Automated Testing Keeps Failing in Companies

tdd, test-automation, unit-testing

We have tried to introduce automated developer testing several times at my company. Our QA team uses Selenium to automate UI tests, but I always wanted to introduce unit and integration tests. Each time we tried in the past, everyone got excited for the first month or two. Then, several months in, people simply stopped doing it.

A few observations and questions:

  1. Does automated testing actually work? Most of my colleagues who used to work at other companies have tried and failed to implement an automated testing strategy. I still haven't seen a real-life software company that actually uses it rather than just talking about it. Many developers see automated testing as something that is great in theory but doesn't work in practice. Our business team says it would love developers to do it, even at a cost of 30% extra time. But the developers are skeptical.

  2. No one really knows how to do automated testing properly. Yes, we have all read the unit testing examples on the internet, but applying them to a big project is something else altogether. The main culprit is mocking/stubbing the database, or anything else that is non-trivial: you end up spending more time writing mocks than writing actual tests. And once writing the tests takes longer than writing the code, that's when people give up.

  3. Are there any good examples of unit tests/system integration tests used in complex, data-centric web applications? Any open source projects? Our application is data centric but also has plenty of domain logic. I tried the repository approach at some point and found it pretty good for unit testing, but it came at the price of making data access harder to optimize, and it added another layer of complexity.
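On the mocking point in (2): mocks don't have to be elaborate. Here is a minimal sketch using Python's `unittest.mock`, with a hypothetical `UserService` standing in for a class that talks to a database (all the names here are illustrative, not from any particular codebase):

```python
import unittest
from unittest.mock import Mock

# Hypothetical class under test: it depends on any object exposing
# a fetch_user(user_id) method, so a plain Mock can stand in for the DB.
class UserService:
    def __init__(self, db):
        self.db = db

    def display_name(self, user_id):
        user = self.db.fetch_user(user_id)
        if user is None:
            return "<unknown>"
        return f"{user['first']} {user['last']}"

class UserServiceTest(unittest.TestCase):
    def test_display_name_formats_first_and_last(self):
        db = Mock()
        db.fetch_user.return_value = {"first": "Ada", "last": "Lovelace"}
        service = UserService(db)
        self.assertEqual(service.display_name(42), "Ada Lovelace")
        # Also verify the collaboration: the DB was queried exactly once.
        db.fetch_user.assert_called_once_with(42)

    def test_display_name_handles_missing_user(self):
        db = Mock()
        db.fetch_user.return_value = None
        self.assertEqual(UserService(db).display_name(7), "<unknown>")

if __name__ == "__main__":
    unittest.main()
```

The trick that keeps the mock cheap is that `UserService` takes its dependency through the constructor, so the test never touches a real database.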
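And on the repository approach in (3): it can stay lightweight if tests use an in-memory fake behind the same interface instead of mocking SQL. A sketch under made-up names (`OrderRepository`, `Order`, and `apply_discount` are all hypothetical):

```python
from abc import ABC, abstractmethod

class Order:
    def __init__(self, order_id, total):
        self.order_id = order_id
        self.total = total

# The abstraction that domain logic depends on; production code would
# implement this against the real database.
class OrderRepository(ABC):
    @abstractmethod
    def get(self, order_id): ...

    @abstractmethod
    def save(self, order): ...

# Test double: a full, trivial implementation backed by a dict.
class InMemoryOrderRepository(OrderRepository):
    def __init__(self):
        self._orders = {}

    def get(self, order_id):
        return self._orders.get(order_id)

    def save(self, order):
        self._orders[order.order_id] = order

# Domain logic sees only the interface, never the database.
def apply_discount(repo, order_id, pct):
    order = repo.get(order_id)
    order.total = round(order.total * (1 - pct), 2)
    repo.save(order)
    return order.total

repo = InMemoryOrderRepository()
repo.save(Order(1, 100.0))
print(apply_discount(repo, 1, 0.1))  # prints 90.0
```

The in-memory fake is written once and reused across many tests, which avoids the per-test mocking overhead that makes people give up.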

We have a big project undertaken by 20 experienced developers. This would seem to be an ideal environment to introduce unit testing/integration testing.

Why doesn't it work for us? How did you make it work at your company?

Best Answer

The hardest part of unit testing is finding the discipline to write tests first, or at least early. Most developers are used to just diving into code. Testing also slows down development at the start, while you are still figuring out how to write a test for the code; as you get better at it, that speeds up. And because you are writing tests, the initial quality of the code starts off higher.

When starting out, just try to write tests. Don't worry so much about mocking/stubbing in the beginning. Keep the tests simple. Tests are code, and can and should be refactored. Along those lines, if something is hard to test, that may point to a design problem; TDD tends to drive you toward design patterns (in my experience, particularly the Factory pattern).
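As a concrete example of the "keep it simple" starting point, here is the kind of plain, mock-free test of pure logic I mean (`price_with_tax` is a hypothetical function, not anyone's real code):

```python
import unittest

# Pure logic: no database, no mocks, nothing to stub.
def price_with_tax(net, rate):
    if net < 0:
        raise ValueError("net price cannot be negative")
    return round(net * (1 + rate), 2)

class PriceTest(unittest.TestCase):
    def test_adds_tax(self):
        self.assertEqual(price_with_tax(100.0, 0.2), 120.0)

    def test_rejects_negative_price(self):
        with self.assertRaises(ValueError):
            price_with_tax(-1.0, 0.2)

if __name__ == "__main__":
    unittest.main()
```

Tests like these cost almost nothing to write and build the habit; the harder mocking questions can wait until the team is comfortable.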

Make sure the tests get a level of visibility. Integrate them into the release process, and ask about them during code review. Any bug found should get a test. These are the places where TDD shines.

Here are a couple of resources that I have found useful:

http://misko.hevery.com/attachments/Guide-Writing%20Testable%20Code.pdf

http://www.agitar.com/downloads/TheWayOfTestivus.pdf

Edit:

One thing to keep in mind when writing tests: you are not trying to specify anything about the implementation of the code, only its behavior. When you write code, you test it all the time anyway, by executing it with debug statements and so on. Writing tests formalizes this and leaves a record of the tests you have, so you can check your functionality with confidence instead of accidentally skipping a test case you only remembered halfway through development.
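To illustrate the behavior-versus-implementation distinction, here is a small sketch (the `Cache` class is hypothetical): the test checks the observable behavior, namely that the expensive computation runs once per key, rather than poking at the internal dict the cache happens to use.

```python
import unittest

class Cache:
    def __init__(self, compute):
        self._compute = compute
        self._store = {}  # implementation detail; tests should not inspect it

    def get(self, key):
        if key not in self._store:
            self._store[key] = self._compute(key)
        return self._store[key]

class CacheBehaviorTest(unittest.TestCase):
    def test_computes_once_per_key(self):
        calls = []
        # Record each invocation of the compute function as a side effect.
        cache = Cache(lambda k: calls.append(k) or k * 2)
        self.assertEqual(cache.get(3), 6)
        self.assertEqual(cache.get(3), 6)  # second call served from cache
        self.assertEqual(calls, [3])       # compute ran exactly once

if __name__ == "__main__":
    unittest.main()
```

Because the test never touches `_store`, the cache could later switch to an LRU dict or an external store without breaking it, which is exactly the point of specifying behavior rather than implementation.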