Code Quality – Is There Something Wrong with Our Version Control?

code-quality, source-code, team-foundation-server

I work with a team of programmers as the business analyst. We just released version 2.0 of our product and are working on the next version, to be released in 3 months (it's an internal software product). Unfortunately version 2.0 has some issues that they have had to fix, and we're going to deploy those fixes in a couple of weeks. The problem is that we don't want to deploy the changes that are still being worked on and are not slated to be released for another 3 months.

The programmers decided to manage this by checking in only the code for the defect fixes and keeping the code for the new enhancements on their local machines until it is done. I will have to get local builds from their machines to test, because if they checked the enhancement code in and we had to push out another patch to fix defects, we would not want to include those enhancements yet. There is also the problem that the same code file can contain both defect fixes and enhancements, so they have to copy the file locally, make the bug-fix change and check that in, then resume work on the enhancements from the local copy they made.

It seems quite convoluted – is there a better way to handle this type of scenario? We're using Team Foundation Server and Visual Studio 2010.

Best Answer

V2.0 should have had what we used to call a 'steady-state branch' (we used Perforce, not TFS) created for it once it was released. Any fixes for v2 would be made on this branch and then propagated back into the v3 development branch while v3 features were being worked on, i.e. a defect reported against v2 would be fixed on the v2 branch and that fix would also be carried into v3.
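In TFS 2010 terms this is a release (servicing) branch created from the changeset or label that shipped as v2.0: defect fixes are checked in on that branch, patches are built from it, and each fix is merged back into the main development branch. Here is a minimal sketch using the tf.exe command line, assuming hypothetical server paths $/Product/Main and $/Product/Releases/2.0 and a label named v2.0-RTM marking the released code:

    rem Create the servicing branch from the labelled v2.0 release point
    tf branch $/Product/Main $/Product/Releases/2.0 /version:Lv2.0-RTM /checkin /comment:"Branch for v2.0 servicing"

    rem Defect fixes are checked in against $/Product/Releases/2.0 only,
    rem and patch builds come from that branch.

    rem Merge the accumulated fixes forward into ongoing v3 development
    tf merge $/Product/Releases/2.0 $/Product/Main /recursive
    tf checkin /comment:"Merge v2.0 defect fixes into main"

With this layout the enhancement work stays checked in on the main branch the whole time, so nothing has to live only on a developer's machine.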

Having changes reside on developers' machines for a long time will likely result in an integration nightmare.