Unit Testing – Should Code Be Designed for Unit Testing from the Start?

asp.net-core, interfaces, unit-testing, web-api

There's a debate going on in our team at the moment as to whether modifying a code design to allow unit testing is a code smell, or to what extent it can be done without being one. This has come about because we're only just starting to adopt practices that are in place at just about every other software development company.

Specifically, we will have a Web API service that will be very thin. Its main responsibility will be marshalling web requests/responses and calling an underlying API that contains the business logic.
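
To make that shape concrete, here is a minimal sketch of such a thin controller. `OrdersController`, `IOrderService`, and `Order` are hypothetical stand-ins for whatever the underlying business-logic API actually exposes:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

// Hypothetical facade over the underlying business-logic API.
public record Order(int Id);

public interface IOrderService
{
    Task<Order?> GetOrderAsync(int id);
}

// The controller stays thin: it marshals HTTP in and out and delegates
// everything else to the business-logic API.
[ApiController]
[Route("api/[controller]")]
public class OrdersController : ControllerBase
{
    private readonly IOrderService _orders;

    public OrdersController(IOrderService orders) => _orders = orders;

    [HttpGet("{id}")]
    public async Task<IActionResult> Get(int id)
    {
        var order = await _orders.GetOrderAsync(id);
        return order is null ? NotFound() : Ok(order);
    }
}
```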

One example is that we plan on creating a factory that will return an authentication method type. We have no need for it to implement an interface, as we don't anticipate it ever being anything other than that one concrete type. However, to unit test the Web API service we will need to mock this factory.
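
To illustrate the tension (all names here are hypothetical sketches, not our actual code): the concrete factory is everything production needs, yet substituting it in a test forces an interface into existence:

```csharp
using System;

// Hypothetical "authentication method" strategies the factory hands out.
public interface IAuthenticationMethod { }
public class BearerAuthentication : IAuthenticationMethod { }
public class ApiKeyAuthentication : IAuthenticationMethod { }

// The concrete factory is all that production code ever needs...
public class AuthenticationMethodFactory : IAuthenticationMethodFactory
{
    public IAuthenticationMethod Create(string scheme) => scheme switch
    {
        "Bearer" => new BearerAuthentication(),
        "ApiKey" => new ApiKeyAuthentication(),
        _ => throw new NotSupportedException($"Unknown scheme: {scheme}"),
    };
}

// ...but this interface exists purely so a unit test can supply a mock.
public interface IAuthenticationMethodFactory
{
    IAuthenticationMethod Create(string scheme);
}
```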

This essentially means we must either design the Web API controller class to accept dependency injection (through its constructor or a setter), which means designing part of the controller purely to enable DI and implementing an interface we don't otherwise need, or use a third-party framework like Ninject to avoid structuring the controller that way. Either way, we still have to create the interface.
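
For context, this is roughly what the test side would look like if we go the constructor-injection route. The sketch assumes xUnit and Moq; `AuthController` and its `Login` method are made-up stand-ins for our real controller:

```csharp
using Moq;
using Xunit;

// Hypothetical controller under test: it takes the factory through its
// constructor, which is the seam the unit test exploits.
public class AuthController
{
    private readonly IAuthenticationMethodFactory _factory;

    public AuthController(IAuthenticationMethodFactory factory) => _factory = factory;

    public void Login(string scheme) => _factory.Create(scheme);
}

public class AuthControllerTests
{
    [Fact]
    public void Login_ResolvesAuthenticationMethodViaFactory()
    {
        // Hand the controller a mock instead of the concrete factory.
        var factory = new Mock<IAuthenticationMethodFactory>();
        factory.Setup(f => f.Create("Bearer"))
               .Returns(Mock.Of<IAuthenticationMethod>());

        var controller = new AuthController(factory.Object);

        controller.Login("Bearer");

        // Verify the collaboration without touching real authentication code.
        factory.Verify(f => f.Create("Bearer"), Times.Once());
    }
}
```

If we go the container route instead, the production-side cost with Ninject is a one-line binding in the composition root, such as `kernel.Bind<IAuthenticationMethodFactory>().To<AuthenticationMethodFactory>();`, and the interface is still required either way.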

Some on the team seem reluctant to design code purely for the sake of testing. It seems to me that some compromise is necessary if we hope to unit test at all, but I'm unsure how to allay their concerns.

Just to be clear, this is a brand new project, so it's not really about modifying code to enable unit testing; it's about designing the code we're going to write to be unit testable.

Best Answer

Reluctance to modify code for the sake of testing shows that a developer hasn't understood the role of tests, and, by implication, their own role in the organization.

The software business revolves around delivering a code base that creates business value. We have found, through long and bitter experience, that we cannot create such code bases of nontrivial size without testing. Therefore, test suites are an integral part of the business.

Many coders pay lip service to this principle but subconsciously never accept it. It is easy to understand why: the awareness that our own mental capability is not infinite, and is, in fact, surprisingly limited when confronted with the enormous complexity of a modern code base, is unwelcome and easily suppressed or rationalized away. The fact that test code is not delivered to the customer makes it easy to believe that it is a second-class citizen, non-essential compared to the "essential" business code. And the idea of shaping the business code itself to accommodate testing seems doubly offensive to many.

The trouble with justifying this practice is that the full picture of how a software business creates value is often understood only by higher-ups in the company hierarchy, while those same people lack the detailed technical understanding of the coding workflow needed to see why testing cannot be dispensed with. They are therefore too often pacified by practitioners who assure them that testing may be a good idea in general, but that "we are elite programmers who don't need crutches like that", or that "we don't have time for that right now", and so on. Because business success is a numbers game, and because avoiding technical debt and assuring quality pay off only in the longer term, those practitioners are often quite sincere in that belief.

Long story short: making code testable is an essential part of the development process, no different from other engineering fields (many microchips are designed with a substantial proportion of their elements serving only testing purposes), but it is very easy to overlook the very good reasons for that. Don't fall into that trap.