Dangers of Huge Monolithic Applications

architecture, project, risk-assessment, scalability

The big project I've been working on for a couple of years now is the control (and everything else) application for an advanced device, the heart of its firmware.

The device is quite advanced, with more distinct functions than I could list from memory, and 98% of them are handled by this one huge executable. On the one hand, the program is quite maintainable: it is well modularized internally and properly documented, with a reasonable separation of functionality across directories and files, and so on.

But in the end it all gets lumped into one application that does everything: remote database communication, touchscreen handling, a dozen different communication protocols, measurements, several control algorithms, video capture, sunrise times and the date of Easter (seriously, and they are needed for very serious purposes!)… In general, this is functionality that is only thinly related, often connected merely by some data that trickles between distant modules.

It could have been built as several separate executables, each with a more specific purpose, communicating with each other over, say, sockets, and perhaps loaded and unloaded as needed. There is no specific reason it was made this way.
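
To make that alternative concrete, here is a minimal sketch of what one of those smaller binaries might look like: a hypothetical "measurement" process pushing a reading to a hypothetical "logger" process over a Unix domain socket. POSIX is assumed, and the socket path and wire format are invented for illustration.

```cpp
// Hypothetical sketch: the "measurement" binary publishing one reading to a
// separate "logger" binary over a Unix domain socket. The socket path and
// the ad-hoc wire format are made up for illustration.
#include <sys/socket.h>
#include <sys/un.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>

int main() {
    int fd = socket(AF_UNIX, SOCK_STREAM, 0);
    if (fd < 0) { perror("socket"); return 1; }

    sockaddr_un addr{};
    addr.sun_family = AF_UNIX;
    std::strncpy(addr.sun_path, "/tmp/logger.sock", sizeof(addr.sun_path) - 1);

    // The logger process must already be listening on this path.
    if (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
        perror("connect");
        return 1;
    }

    // A tiny ad-hoc wire format: every field that crosses this boundary
    // needs an encoding agreed upon by both binaries.
    const char msg[] = "TEMP 42.5\n";
    if (write(fd, msg, sizeof(msg) - 1) < 0) perror("write");
    close(fd);
    return 0;
}
```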

On the one hand, it works, and it works okay. The project is simpler without having to maintain builds for multiple binaries. The internal structure is easier too, when you can just call a method or read a variable instead of talking over sockets or shared memory, as the sketch below shows.
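
For contrast, here is the in-process version of the same interaction: the "logger" is just another module, so the measurement code calls it directly. The names are again hypothetical.

```cpp
// In-process sketch of the same interaction: no sockets, no serialization,
// just a plain function call on a shared object.
#include <string>
#include <vector>

struct Logger {
    std::vector<std::string> entries;
    void log(const std::string& line) { entries.push_back(line); }
};

Logger g_logger;  // one shared instance, visible to every module

void publish_temperature(double celsius) {
    g_logger.log("TEMP " + std::to_string(celsius));
}

int main() {
    publish_temperature(42.5);
    return 0;
}
```

No framing, no connect/retry logic, no second binary to keep in sync: that is the simplicity being described.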

On the other hand, the sheer size and scale of this thing creeps me out; it feels like piloting the Titanic. I was always taught to modularize, and clumping everything into one gargantuan executable feels wrong. One problem I do know of: a hard crash in even an insignificant module crashes them all, though code quality assures this doesn't really happen in release versions. Otherwise, internal separation and defensive programming ensure it will keep running mostly correctly even if half of the internal modules fail normally for some reason.
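
A minimal sketch of what that internal separation might look like, assuming a cooperative scheduler that steps each module in turn (the Module structure and names are hypothetical):

```cpp
// Each module's periodic step is fenced off; a module that throws is
// disabled rather than taking the whole application down. Note that this
// only contains failures that surface as exceptions: a segfault in any
// module still kills the whole process, exactly the crash risk noted above.
#include <cstdio>
#include <functional>
#include <stdexcept>
#include <string>
#include <vector>

struct Module {
    std::string name;
    std::function<void()> step;  // the module's periodic work
    bool enabled = true;
};

void run_cycle(std::vector<Module>& modules) {
    for (auto& m : modules) {
        if (!m.enabled) continue;
        try {
            m.step();
        } catch (const std::exception& e) {
            // Contain the failure: log it, stop scheduling this module,
            // and let everything else keep running.
            std::fprintf(stderr, "module %s failed: %s\n",
                         m.name.c_str(), e.what());
            m.enabled = false;
        }
    }
}

int main() {
    std::vector<Module> modules = {
        {"heartbeat", [] { std::puts("heartbeat ok"); }},
        {"flaky", [] { throw std::runtime_error("sensor timeout"); }},
    };
    run_cycle(modules);  // "flaky" gets disabled; "heartbeat" keeps running
    run_cycle(modules);
    return 0;
}
```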

What other dangers have I overlooked? Why does this creep me out? Is it just an irrational fear of the unknown? Is building serious, big projects this way an accepted practice? Either calm my fears or give me a good reason to refactor version 2.0 into multiple smaller binaries.

Best Answer

Except for the tiny comment at the end (a second, supervising CPU), you could be describing my company. Yup, we need Easter too.

Well, we're a bit further along. We did split the big executable and tried to use standard components for the standard bits. Not exactly the big improvement you'd hope for. In fact, performance is becoming a major pain, even on beefier hardware. And maintenance costs haven't really gone down, now that there is a ton of code to serialize and synchronize data over narrow interfaces.
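
To show the kind of boilerplate that accumulates: once a value crosses a process boundary, every field needs an explicit encode/decode pair that both binaries must keep in lockstep. A sketch, with an invented struct and wire layout:

```cpp
// A struct that used to be read directly now needs explicit packing and
// unpacking on both sides of the boundary.
#include <cstdint>
#include <cstring>
#include <vector>

struct Measurement {
    uint32_t channel;
    double   value;
};

// Sender side: pack into an agreed byte layout.
std::vector<uint8_t> encode(const Measurement& m) {
    std::vector<uint8_t> buf(sizeof(m.channel) + sizeof(m.value));
    std::memcpy(buf.data(), &m.channel, sizeof(m.channel));
    std::memcpy(buf.data() + sizeof(m.channel), &m.value, sizeof(m.value));
    return buf;
}

// Receiver side: the mirror image, which must match the encoder exactly.
Measurement decode(const std::vector<uint8_t>& buf) {
    Measurement m{};
    std::memcpy(&m.channel, buf.data(), sizeof(m.channel));
    std::memcpy(&m.value, buf.data() + sizeof(m.channel), sizeof(m.value));
    return m;
}

int main() {
    Measurement m{7, 21.5};
    Measurement back = decode(encode(m));  // round-trip across the "boundary"
    return back.channel == m.channel ? 0 : 1;
}
```

Multiply that by every struct on every interface, then add endianness, versioning, and partial-read handling, and the maintenance cost becomes clear.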

The lesson I've learned? Having a single executable is a well-proven solution for small systems, and we have decades of experience in managing it. All our tools support it natively. Modularity can be achieved within a single executable too, and when you do need to compromise on modularity for other reasons, the hack stays small.
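
As a sketch of that in-executable modularity (the interface layout is hypothetical, though the Easter computation itself is the well-known anonymous Gregorian computus): other modules see only a narrow abstract interface, so the boundary is enforced by the compiler without any IPC.

```cpp
// Modules depend only on a narrow abstract interface; the implementation
// lives behind it, and callers never touch its internals.
#include <cstdio>
#include <string>

// The only thing other modules are allowed to see.
class IEasterCalendar {
public:
    virtual ~IEasterCalendar() = default;
    virtual std::string easter_date(int year) const = 0;
};

class GregorianEasterCalendar : public IEasterCalendar {
public:
    std::string easter_date(int year) const override {
        // Anonymous Gregorian (Meeus/Jones/Butcher) computus.
        int a = year % 19, b = year / 100, c = year % 100;
        int d = b / 4, e = b % 4, f = (b + 8) / 25, g = (b - f + 1) / 3;
        int h = (19 * a + b - d - g + 15) % 30;
        int i = c / 4, k = c % 4;
        int l = (32 + 2 * e + 2 * i - h - k) % 7;
        int m = (a + 11 * h + 22 * l) / 451;
        int month = (h + l - 7 * m + 114) / 31;
        int day = ((h + l - 7 * m + 114) % 31) + 1;
        return std::to_string(year) + "-" + std::to_string(month)
             + "-" + std::to_string(day);
    }
};

int main() {
    GregorianEasterCalendar impl;
    const IEasterCalendar& calendar = impl;  // callers see only the interface
    std::printf("%s\n", calendar.easter_date(2024).c_str());  // 2024-3-31
    return 0;
}
```

Callers hold a reference to IEasterCalendar, so the implementation can be swapped or stubbed in tests; the boundary is as narrow as a socket protocol would be, minus the serialization.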
