attempting it in the fashion of TDD will merely make it a maintenance nightmare and impossible for the team to maintain.
You can't win that argument. They're making this up. Sadly, you have no real facts, either. Any example you provide can be disputed.
The only way to make this point is to have code that is cheaper to maintain.
Furthermore, as it's a front-end application (not web-based), adding tests is pointless,
Everyone says this, and it may be partially true. If the application is reasonably well designed, the front-end does very little.
If the application is poorly designed, however, the front-end does too much and is difficult to test. This is a design problem, not a testing problem.
as the business drives changes (by changes they mean improvements, of course), the tests will become out of date; other developers who join the project in the future will not maintain them, and the tests will become more of a burden for them to fix, etc.
This is the same argument as above.
You can't win the argument. So don't argue.
"I am fully responsible for the rewrite of this product"
In that case,
Add tests anyway. But add tests as you go, incrementally. Don't spend a long time getting tests written first. Convert a little. Test a little. Convert a little more. Test a little more.
Use those tests until someone figures out that testing is working and asks why things go so well.
I had the same argument on a rewrite (from C++ to Java) and I simply used the tests even though they told me not to.
I was developing very quickly. I asked for concrete examples of correct results, which they sent in spreadsheets. I turned the spreadsheets into unittest.TestCase (without telling them) and used these to test.
When we were in user acceptance testing -- and mistakes were found -- I just asked for the spreadsheets with the examples to be reviewed, corrected and expanded to cover the problems found during acceptance testing.
I turned the corrected spreadsheets into unittest.TestCase (without telling them) and used these to test.
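A minimal sketch of that spreadsheet-to-TestCase workflow, assuming the examples were exported as input/expected-output pairs (the `convert` function and the rows here are hypothetical stand-ins, since the source doesn't show the real product logic or data):

```python
import csv
import io
import unittest

# Hypothetical function under test; stands in for the real ported logic.
def convert(value):
    return value * 2

# In practice the rows would come from the reviewed spreadsheet, e.g.
# csv.reader(open("examples.csv")); inlined here to stay self-contained.
EXAMPLE_ROWS = list(csv.reader(io.StringIO("3,6\n10,20\n-4,-8\n")))

class SpreadsheetExamplesTest(unittest.TestCase):
    def test_examples(self):
        # Each spreadsheet row becomes one checked example; subTest
        # reports every failing row, not just the first.
        for raw_input, raw_expected in EXAMPLE_ROWS:
            with self.subTest(input=raw_input):
                self.assertEqual(convert(int(raw_input)), int(raw_expected))

if __name__ == "__main__":
    unittest.main()
```

When the users correct or extend the spreadsheet, you regenerate or reload the rows and rerun; the test suite grows exactly as fast as the reviewed examples do.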
No one needs to know in detail why you are successful.
Just be successful.
Test-driven design is about getting your API right, not the code.
The benefit of writing the simplest failing tests first is that you get your API (which is essentially what you are designing on the fly) as simple as possible, up front.
Any future uses (which the next tests you write are) will build on that initial simple design, instead of on a suboptimal design shaped by coping with more complex cases.
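To illustrate, here is a hypothetical first test in that style (the `total_price` name and its signature are invented for the example): the simplest failing test pins down the call shape and return type before any complex cases exist.

```python
import unittest

def total_price(items):
    # Simplest implementation that satisfies the tests so far.
    return sum(price for _, price in items)

class TotalPriceTest(unittest.TestCase):
    def test_empty_cart_costs_nothing(self):
        # The very first failing test: it fixes the API -- a list of
        # (name, price) pairs in, a single number out.
        self.assertEqual(total_price([]), 0)

    def test_single_item(self):
        # The next test builds on that same simple shape.
        self.assertEqual(total_price([("book", 10)]), 10)

if __name__ == "__main__":
    unittest.main()
```

Every later test (discounts, taxes, whatever) starts from this minimal interface, rather than the interface starting from the hardest case.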
Best Answer
In my experience, it's more than 50%.
Once you've written the test, the solution has a tendency to come very easily. So I don't think it's odd to spend 70% - 75% of your time writing tests, because you're spending much less time writing the 'production code' (code-being-tested) and spending virtually no time in the debugger.
The sooner you find a bug, the cheaper it is to fix, and TDD helps with that tremendously. I've worked on projects where the last 2 months (of an 8 month project) were spent fixing bugs, and that phase would be almost entirely eliminated with TDD.
To me though, the real value is in maintenance. Inheriting a code base with tests makes you less scared to alter it. You feel like you didn't break anything when the tests still pass. Since you aren't scared to make changes you're willing to refactor if something isn't right. Which means the code can be made cleaner, the design can fit better, and, in theory, changes can be applied. Contrast that with voodoo code everyone's scared to touch.