Techniques for assessing team test confidence

automated-tests, testing

I am trying to determine what level of confidence my development team has in our automated test suite (unit, integration, web tests). Ideally, I would like to get sensible answers to questions like:

  • Which refactorings do you expect to be covered/not covered by the tests?
  • Would you trust a green build to auto-deploy to the acceptance test environment?
  • Would you trust a green build to auto-deploy directly to production?

I was hoping there were some pre-existing metrics, and possibly questionnaires, I could use to explore this area. My goal is to increase the degree of deployment automation, but I believe we can only automate the steps that the developers actually trust.

Does anyone know of techniques to explore this?

Best Answer

I'm not aware of any existing work in this area. Your questions are good; putting them on a questionnaire with a 1-5 star rating for each should give you a baseline (a small sketch for aggregating these numbers follows the list below). Other factors to take into account:

  • How often are bugs found in later stages that the tests ought to have caught? You can track this with a dedicated field in the bug database.
  • How often do developers run the tests? Confidence roughly tracks frequency: if they believe in the tests, they'll run them often; if they don't, they won't run them and may even see them as a pain.
  • How often does the build fail on your CI server? Frequent failures usually indicate that the tests are brittle, that they pass only on developer machines, or that developers don't run them before committing.
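
If you want to turn these signals into numbers, a small script is enough. The sketch below is only an illustration: the file names (confidence_survey.csv, bugs.csv, builds.csv) and column names are assumptions, not the format of any existing tool, so adapt them to whatever your survey tool, bug tracker, and CI server actually export.

    # Minimal sketch for aggregating the test-confidence signals above.
    # All file names and column names ("question", "score",
    # "should_have_been_caught_by_tests", "status") are assumptions
    # made for illustration; map them to your own data sources.
    import csv
    from collections import defaultdict


    def average_survey_scores(path="confidence_survey.csv"):
        """Average the 1-5 star answers per question (columns: developer, question, score)."""
        scores = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                scores[row["question"]].append(int(row["score"]))
        return {question: sum(s) / len(s) for question, s in scores.items()}


    def escaped_bug_rate(path="bugs.csv"):
        """Share of bugs flagged in the dedicated 'should have been caught by tests' field."""
        with open(path, newline="") as f:
            bugs = list(csv.DictReader(f))
        escaped = sum(1 for b in bugs if b["should_have_been_caught_by_tests"] == "yes")
        return escaped / len(bugs) if bugs else 0.0


    def ci_failure_rate(path="builds.csv"):
        """Fraction of CI builds whose 'status' column is not 'success'."""
        with open(path, newline="") as f:
            builds = list(csv.DictReader(f))
        failed = sum(1 for b in builds if b["status"] != "success")
        return failed / len(builds) if builds else 0.0


    if __name__ == "__main__":
        print("Survey averages (1-5):", average_survey_scores())
        print("Escaped-bug rate:", escaped_bug_rate())
        print("CI failure rate:", ci_failure_rate())

The trend over a few releases matters more than any absolute value: rising survey scores, a falling escaped-bug rate, and a falling CI failure rate are the kind of evidence that makes extending automated deployment an easier conversation.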