Practice on existing bugs/defects.
This is a really tough situation. I've never gone all the way to TDD from nothing before, but in my experience, getting a team from writing no unit tests to proactively writing them has had to be a very "one step at a time" process.
First, get them comfortable writing unit tests, really knowing what they are, and understanding their benefits. For my teams, it's been best to start by writing unit tests for existing bugs, because existing bugs give you the two things people need in order to write unit tests well:
- an expected precondition and postcondition
- an actual outcome that is not what is expected, i.e. one that violates that postcondition
This gives members very concrete practice examples. They can write a test before fixing the bug, so that it fails; then they can fix the code so that the test passes and the bug is gone. Once they're comfortable with this, you can take them the rest of the way: writing unit tests with no code up-front, then writing new code to make those tests pass.
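To make that concrete, here is a minimal sketch in C# with NUnit (the `Order` class, the bug, and the numbers are all hypothetical): the test encodes the expected postcondition, fails against the buggy code, and passes once the fix is in.

```csharp
using NUnit.Framework;

// Hypothetical bug report: Order.Total() ignores the discount.
[TestFixture]
public class OrderTests
{
    [Test]
    public void Total_AppliesDiscount()
    {
        // Precondition: an order of 100.00 with a 10% discount.
        var order = new Order(subtotal: 100.00m, discountRate: 0.10m);

        // Postcondition: the total should be 90.00.
        // This test fails against the buggy code (which returned 100.00)
        // and passes once the fix below is applied.
        Assert.That(order.Total(), Is.EqualTo(90.00m));
    }
}

public class Order
{
    private readonly decimal _subtotal;
    private readonly decimal _discountRate;

    public Order(decimal subtotal, decimal discountRate)
    {
        _subtotal = subtotal;
        _discountRate = discountRate;
    }

    // The buggy version simply returned _subtotal;
    // the fix applies the discount.
    public decimal Total() => _subtotal * (1 - _discountRate);
}
```

As a bonus, the test then stays in the suite as a regression guard: if the bug ever reappears, the build tells you immediately.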
I think the trick is to give them something to practice on where there are clear method pre- and postconditions. If the requirements for a method are fuzzy, it's hard for even experienced TDD practitioners to know exactly where to start. Take it a step at a time and you'll get there. Good luck!
It's difficult to provide a definitive answer without better understanding the development processes that you usually use, the experience of your team, and more importantly, the complexity of the tasks at hand. To this I should also probably add the way in which you generally write your methods and the design choices you have made.
Unit testing always adds time to development, and if you are testing your code properly, you should expect to spend more time writing tests than implementation code. This sounds quite horrendous at first and is probably one of the main reasons why many choose to avoid unit testing in the first place. Even in my present workplace, it took me more than four years to convince everybody that their concerns about testing overhead stemmed from not fully understanding the longer-term benefits.
In the shorter term, and particularly when you first start unit testing, development slows down considerably. This is a necessary learning period. You'll find most developers will be selective about where they apply unit tests, or will write their code first and apply tests after the implementation is essentially complete. In doing so, they will find bugs, which will require changes to the implementation code, which in turn require changes to the tests. This proves to be a very counter-productive process, and it makes most developers wonder why they bother if they are always playing catch-up with the tests. The real epiphany comes when your developers start to write complete tests before the implementation code, because the resulting code will most likely be more concise and much quicker to write.
An advantage of the test-first approach is that a test confirms the implementation works at the exact moment the code is written to satisfy its conditions. Furthermore, if the implementation has been kept relatively concise, it will be easier to modify alongside the tests, which is the real advantage of testing first: the assurance of a safety net that identifies the exact moment a modification introduces an error. Modifications to code are much easier with supporting tests already in place, which significantly reduces your maintenance overhead and makes debugging far less onerous.
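As a tiny illustration of that rhythm (all names here are hypothetical), the test below is written first and pins down the contract; the implementation is then the simplest code that satisfies it:

```csharp
using NUnit.Framework;

[TestFixture]
public class SlugGeneratorTests
{
    // Written before SlugGenerator exists: this defines the contract.
    [Test]
    public void Slugify_LowercasesAndHyphenates()
    {
        Assert.That(SlugGenerator.Slugify("Hello World"),
                    Is.EqualTo("hello-world"));
    }
}

// The most concise implementation that makes the test pass.
public static class SlugGenerator
{
    public static string Slugify(string input) =>
        input.Trim().ToLowerInvariant().Replace(' ', '-');
}
```

The moment `Slugify` satisfies the test, you know it works; the moment a later change breaks it, the test says so.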
Once you are very comfortable with unit testing and test-first development, you will find that your estimates become easier to make and are no longer thrown off by lengthy, unexpected delays caused by bugs and the difficult-to-estimate debugging effort that follows, because the majority of potential bugs will have been dealt with simply by focusing your development efforts around testing. Where you may previously have faced a lengthy debugging process during user acceptance testing, you should in future find both the post-implementation debugging phase and the user acceptance tests taking much less time.
So the real issue becomes less about how much additional time testing adds, and more about the trade-off between a longer time to implement versus a shorter post-implementation time spent debugging and maintaining your product, as well as how your test suite will benefit you when it comes time to change any part of your code base. More importantly, it's about how all of this translates into real and measurable value for your business. As a very rough example, when my team first started seriously writing unit tests, I took some measurements relating to bug counts and time to develop modules for our product, and compared the time spent on new development versus maintenance. Before test-first unit testing, our maintenance effort was triple our development effort, whereas on those projects where we had moved entirely to testing first (and after a few months of experience), the effort expended on new development work was greater than the effort expended on maintenance. That amounts to a measurable saving with a real dollar value attached: the team has been able to take on more work without needing to spend more cash or juggle resources.
How exactly this will translate into your own working environment is difficult to predict, and I can only recommend having a couple of people champion the method and work test-first for a while to get a real feel for how it fits. One suggestion would be to do a little experimental development: have a couple of people work on a couple of similar tasks (or duplicate the same task, if you can afford to allocate the time) for a short while, compare time to implement versus time to debug after completion, and conduct code reviews to examine code quality as each task progresses. Give it a little time and see whether your team's objections diminish or grow stronger.
Personally, I can't think of any real reason why you shouldn't be unit testing, but I don't work in your particular environment, so I can't speak for you; at the end of the day, unit testing is merely a practice. It's a tool that, if used well, can give you great results, yet if the entire team is unable to use the tool, or if using it results in a pattern of failures within the team, then perhaps that tool isn't right for you. The only way to know for certain is to try it and see for yourselves.
There is certainly some overlap between the two. NUnit came first and is, as a result, more mature. MS Test (the Visual Studio unit testing framework) is younger, but it comes integrated with Visual Studio.
I've come from the Java world, where JUnit is king (there are others, but none as popular), and NUnit 2.5.x is a really good match for the way JUnit 4 works. In fact, it works even better than its Java counterpart.
As far as test definition is concerned, I favor NUnit. Sure, MS Test has similar (but different) attributes that you apply to your classes, but I believe it lacks some features that can save you work in certain kinds of testing. You may want to check out a similar discussion on StackOverflow.
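For a rough side-by-side of the core attributes, here is a sketch (the two frameworks are aliased into one file purely for comparison, which you wouldn't normally do):

```csharp
using NF = NUnit.Framework;
using MT = Microsoft.VisualStudio.TestTools.UnitTesting;

[NF.TestFixture]
public class NUnitStyleTests
{
    [NF.SetUp]              // runs before each test
    public void BeforeEach() { }

    [NF.Test]
    public void Adds() { NF.Assert.That(1 + 1, NF.Is.EqualTo(2)); }
}

[MT.TestClass]
public class MsTestStyleTests
{
    [MT.TestInitialize]     // MS Test's per-test setup
    public void BeforeEach() { }

    [MT.TestMethod]
    public void Adds() { MT.Assert.AreEqual(2, 1 + 1); }
}
```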
I like the newer assertion model built into NUnit (it uses Hamcrest-style asserts) because the constraints are both easy to read and easy to extend.
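To illustrate what that constraint-based style looks like (the values here are made up; `Assert.That`, `Is`, and `Has` are the NUnit constraint API):

```csharp
using System.Collections.Generic;
using NUnit.Framework;

[TestFixture]
public class ConstraintModelTests
{
    [Test]
    public void ConstraintSyntaxReadsLikeProse()
    {
        var names = new List<string> { "alpha", "beta" };

        // Classic model:  Assert.AreEqual(2, names.Count);
        // Constraint (Hamcrest-style) model:
        Assert.That(names.Count, Is.EqualTo(2));
        Assert.That(names, Has.Member("alpha"));
        Assert.That(names, Is.Not.Empty & Has.Member("beta")); // constraints compose
        Assert.That("beta", Is.EqualTo("BETA").IgnoreCase);
    }
}
```

Extensibility comes from the same model: you can write your own `Constraint` subclass and it plugs straight into the `Assert.That` syntax alongside the built-ins.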