C++ – Is measuring the binary size of a C++ program a good indication of code complexity? Or of something else?

c++ code-metrics development-process

In my company we have an obligatory practice: before a review, the binary size impact of the code change must be measured and provided.

We must use -Os for this metric to avoid unpredictable inlining. There are no customer- or product-driven arguments for it. It is a big server that is installed as a singleton on a dedicated machine, where the executable size is completely negligible in comparison to all the other resources involved in the system.

The main argument for this practice is that -Os binary size correlates with code complexity.

Is binary size a reliable metric for evaluating code complexity? Or for evaluating anything else?

Best Answer

As noted in the comments, the size of the binary can be very important for some embedded systems, especially older ones.

However, as you've noted in the update to the question:

There are no customer- or product-driven arguments for it. It is a big server that is installed as a singleton on a dedicated machine, where the executable size is completely negligible in comparison to all the other resources involved in the system.

This is one of the most pointy-haired schemes I've heard in a long time. You'll be penalized for including a library that's well tested and solves a lot of problems, but they'll let a Bubble sort get past?

Seriously, it would be useful to see some justification for their main argument that the binary size correlates with other qualities of the code. It's entirely possible that I'm dead wrong and there is such a correlation, but I kind of doubt it.
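To make that doubt concrete, here is a minimal C++ sketch (my own illustration, not from the original post) of how the two properties can pull in opposite directions under -Os: trivial code that instantiates heavy templates versus branch-heavy code that compiles to almost nothing.

    // size_vs_complexity.cpp -- hypothetical example for illustration.
    #include <map>
    #include <string>
    #include <vector>

    // Simple, obviously correct code: one line of logic, but the
    // container instantiation drags in template machinery (tree
    // insertion, string handling, destructors), so the -Os object
    // code grows by kilobytes.
    std::map<std::string, std::vector<int>> make_index() {
        return {{"answer", {42}}};
    }

    // Branch-heavy code with a loop and data-dependent control flow:
    // higher cyclomatic complexity, yet it compiles to only a handful
    // of instructions.
    int collatz_steps(unsigned long long n) {
        int steps = 0;
        while (n > 1) {
            n = (n % 2 == 0) ? n / 2 : 3 * n + 1;
            ++steps;
        }
        return steps;
    }

    // A typical way to compare the footprints with a GNU toolchain:
    //   g++ -Os -c size_vs_complexity.cpp
    //   nm --size-sort -C size_vs_complexity.o

On a typical build, the size report would be dominated by the template instantiations even though the one-liner is by far the simpler function, which is exactly the failure mode of using binary size as a complexity proxy.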
