Practice on existing bugs/defects.
This is a really tough situation. I've never gone all the way to TDD from nothing before, but in my experience, getting a team from no unit tests to proactively writing them has required a very "one step at a time" approach.
First, get them comfortable writing unit tests, understanding what they really are, and seeing their benefits. For my teams, it's been best to write unit tests for existing bugs. Existing bugs give you the two things people need in order to learn to write unit tests well:
- an expected precondition and postcondition
- an actual outcome that is not what was expected and violates that precondition/postcondition
This gives members very concrete practice examples. They can write a test before fixing the bug, so that it fails; then they can fix the code so that the test passes and the bug is gone. Once they're comfortable with this, you can take them the rest of the way: writing unit tests with no code up front, then writing new code to make those tests pass.
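To make that concrete, here is a minimal sketch of the bug-first workflow in Python's `unittest`. The `apply_discount` function, its bug, and its contract are all hypothetical, invented just to illustrate the red-then-green sequence:

```python
import unittest

# Hypothetical function under test. The imagined bug: the original code
# returned `price - discount` directly, so the result could go negative,
# violating the postcondition "the discounted price is never below zero".
def apply_discount(price: float, discount: float) -> float:
    """Precondition: price >= 0 and discount >= 0.
    Postcondition: 0 <= result <= price."""
    return max(price - discount, 0.0)  # the fix: clamp at zero


class ApplyDiscountRegressionTest(unittest.TestCase):
    def test_discount_larger_than_price_clamps_to_zero(self):
        # Written BEFORE the fix: it fails against the buggy
        # implementation (which returned -2.0) and passes after it.
        self.assertEqual(apply_discount(3.0, 5.0), 0.0)

    def test_normal_discount_is_subtracted(self):
        self.assertEqual(apply_discount(10.0, 2.5), 7.5)


if __name__ == "__main__":
    unittest.main(exit=False)
```

The point of the exercise is the order of operations: the regression test is written against the buggy version first (and fails), then the one-line fix makes it pass, and the test stays behind as a permanent guard against that bug returning.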
I think the trick is to give them something to practice on where there are clear method pre/post-conditions. If the requirements for a method are fuzzy, it's hard for even experienced TDD people to know exactly where to start. Take it a step at a time and you'll get there. Good luck!
It's difficult to provide a definitive answer without better understanding the development processes that you usually use, the experience of your team, and more importantly, the complexity of the tasks at hand. To this I should also probably add the way in which you generally write your methods and the design choices you have made.
Unit testing always adds time to development, and if you are testing your code properly, you should be spending more time writing tests than implementation code. This sounds quite horrendous at first and is probably one of the main reasons why many choose to avoid unit testing in the first place. Even in my present workplace, it took me more than 4 years to convince everybody that their concerns over testing overheads stemmed from not fully understanding the longer-term benefits.
In the shorter term, and particularly when you first start unit testing, you'll find that your development slows down considerably. This is a necessary learning period. You'll find most developers will be selective about where they apply unit tests, or will write their code first and apply tests after the implementation is essentially complete. In doing so, they will find bugs, which will require a change in the implementation code, requiring a subsequent change in the tests. This proves to be a very counter-productive process, and makes most developers wonder why they bother if they are always playing catch-up with the tests. The real epiphany comes when your developers start writing complete tests before the implementation code, because the resulting code will most likely be more concise, and will be much quicker to write.
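As a small illustration of that test-first rhythm, here is a hypothetical `slugify` helper where the tests are written first and the implementation is then written only to satisfy them; the helper and its rules are invented for the example, not taken from any real codebase:

```python
import re
import unittest


# Step 1 (red): the tests are written first. Initially they fail,
# because slugify does not exist yet.
class SlugifyTest(unittest.TestCase):
    def test_lowercases_and_replaces_spaces(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_strips_punctuation(self):
        self.assertEqual(slugify("TDD: a how-to!"), "tdd-a-how-to")


# Step 2 (green): write just enough implementation to make the
# tests pass - no more behaviour than the tests demand.
def slugify(text: str) -> str:
    text = text.lower()
    text = re.sub(r"[^a-z0-9]+", "-", text)  # collapse runs of non-alphanumerics
    return text.strip("-")


if __name__ == "__main__":
    unittest.main(exit=False)
```

Because the tests pin down the behaviour before any code exists, the implementation tends to stay as small as the tests require, which is exactly the conciseness the paragraph above describes.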
An advantage of the test-first approach is that the test indicates the implementation code works at the exact moment the code is written to satisfy the conditions of the test. Further, if the implementation code has been kept relatively concise, it will likely be easier to modify alongside the tests, giving you the real advantage of testing first: the assurance of a system that will identify the exact moment the code has been modified to include an error. Modifications to code are much easier with supporting tests already in place, which significantly reduces your maintenance overheads and makes debugging efforts much less onerous.
When you are very comfortable with unit testing and test-first, you will find that your estimates for tasks become easier to make, and won't be thrown off by lengthy and unexpected delays caused by bugs and the correspondingly hard-to-estimate debugging effort, because the majority of your potential bugs will have been dealt with simply by focusing your development efforts around testing. Where you may previously have found yourself dealing with a lengthy debugging process during the user acceptance testing period, you should in future find the post-implementation debugging phase and user acceptance tests taking much less time.
So the real issue becomes less about how much additional time testing adds, and more about a trade-off between the longer time to implement versus the shorter post-implementation time spent debugging and maintaining your product, and about how your test suite will benefit you when it comes time to change any part of your code base. More importantly, it becomes about how all of this translates into a real and measurable value to your business. As a very rough example of what I am talking about: when my team first started seriously writing unit tests, I took some measurements relating to bug issues and time to develop modules for our product, and added a comparison between the amount of time spent on new development versus maintenance issues. Before test-first unit testing, our maintenance efforts were triple our development efforts, whereas on those projects where we had moved entirely to unit testing first (and after a few months of experience), the effort expended on new development work was greater than the effort expended on maintenance. This amounts to a measurable saving with a real dollar value attached. It means that the team has been able to take on more work without needing to expend more cash or juggle resources.
How this will translate into your own working environment is difficult to predict, and I can only recommend having a couple of people act as champions for the method and work in a test-first manner for a while, to get a real feel for how it will fit within your working environment. One suggestion would be to do a little experimental development. Have a couple of people work on a couple of similar tasks (or duplicate the same task, if you can afford to allocate the time) for a short while and see what the difference is in terms of time to implement versus time to debug after completion, and conduct code reviews to examine code-quality issues as each task progresses. Give it a little time, and see whether your team's objections diminish or grow stronger.
Personally, I can't think of any real reasons why you shouldn't be unit testing; however, I don't work in your particular environment, so I can't really speak for you, and at the end of the day unit testing is merely a practice. It's a tool that, used well, can provide you with great results, yet if the entire team is unable to use the tool, or if using it results in a pattern of failures within the team, then perhaps that tool isn't right for you. The only way you can know for certain is to try it and see for yourselves.
Best Answer
Short Answer: Absolutely positively.
Long Answer: Unit tests are one of the most important practices I try to influence at my place of work (a large bank, FX trading). Yes, they are extra work, but it's work that pays back again and again. Automated unit tests not only help you actually execute the code you're writing and verify your expectations, they also act as a kind of watchdog for future changes that you or someone else might make. Test breakage will result when someone changes the code in undesirable ways. I think the relative value of unit tests rises and falls with the level of expected change and growth in a code base, but the initial verification of what the code does makes them worthwhile even where the expected change is low. Unit test value also depends on the cost of defects. If the cost (where cost is loss of time/money/reputation/future effort) of a defect is zero, then the relative value of a test is also zero; however, this is almost never the case in a commercial environment.
We generally don't hire people anymore who don't routinely create unit tests as part of their work - it's just something we expect, like turning up every day. I've not seen a pure cost-benefit analysis of having unit tests (someone feel free to point me to one); however, I can say from experience that in a commercial environment, being able to prove code works in a large, important system is worthwhile. It also lets me sleep better at night knowing that the code I've written provably works (to a certain level), and if it changes, someone will be alerted to any unexpected side effects by a broken build.
Test-driven development, in my mind, is not a testing approach. It's actually a design approach/practice, with the output being a working system and a set of unit tests. I'm less religious about this practice, as it's a skill that is quite difficult to develop and perfect. Personally, if I'm building a system and I don't have a clear idea of how it will work, I will employ TDD to help me find my way in the dark. However, if I'm applying an existing pattern/solution, I typically won't.
In the absence of mathematical proof to you that it makes sense to write unit tests, I encourage you to try it over an extended period and experience the benefits yourself.