C++ – care about micro performance and efficiency

Tags: c, code-organization, optimization

Many questions and answers on the C/C++ pages directly or indirectly discuss micro-performance issues (such as the overhead of an indirect vs. direct vs. inline function call), or the choice between an O(N²) and an O(N log N) algorithm on a 100-item list.
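
To make the second comparison concrete, here is a small sketch (the function names and the 100-element workload are illustrative, not from the question) of an O(N²) pairwise duplicate check next to an O(N log N) sort-based one; at this input size either choice finishes in microseconds, which is exactly the kind of difference being asked about.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    // O(N^2): compare every pair of elements.
    bool has_duplicates_quadratic(const std::vector<int>& v) {
        for (std::size_t i = 0; i < v.size(); ++i)
            for (std::size_t j = i + 1; j < v.size(); ++j)
                if (v[i] == v[j]) return true;
        return false;
    }

    // O(N log N): sort a copy, then look for adjacent equal elements.
    bool has_duplicates_sorted(std::vector<int> v) {
        std::sort(v.begin(), v.end());
        return std::adjacent_find(v.begin(), v.end()) != v.end();
    }

    int main() {
        std::vector<int> items(100);
        for (int i = 0; i < 100; ++i) items[i] = i;  // 100 distinct values
        std::printf("%d %d\n", has_duplicates_quadratic(items),
                               has_duplicates_sorted(items));
    }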

I always code with no concern for micro performance and little concern for macro performance, focusing on easy-to-maintain, reliable code, unless and until I know I have a problem.

My question is: why do so many programmers care so much? Is it really an issue for most developers, have I just been lucky enough not to have to worry about it, or am I a bad programmer?

Best Answer

In practice, performance is seldom an issue that needs to be managed at that level of detail. It's worth keeping an eye on the situation if you know you're going to be storing and manipulating huge amounts of data, but otherwise you're right: you're better off keeping things simple.

One of the easiest traps to fall into -- especially in C and C++, where you have such fine-grained control -- is optimizing too early and at too fine a level. In general the rule is: A) don't optimize until you find out you have a problem, and B) don't optimize anything that you haven't proven to be a problem area by using a profiler.

A corollary to B) is: programmers are notoriously bad at predicting where their performance bottlenecks are, even though, to a one, they think they're good at it. Use a profiler, optimize the parts that are actually slow, or change the algorithm if one section of code is being called so many times that it becomes a problem.
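
As a minimal sketch of measuring before optimizing -- using std::chrono as a stand-in for a real profiler, with a placeholder work() function and iteration count that are not from the original answer -- you might time only the suspect section:

    #include <chrono>
    #include <cstdio>

    // Placeholder for the code you suspect is hot (hypothetical example).
    static long work() {
        long sum = 0;
        for (int i = 0; i < 1000000; ++i) sum += i % 7;
        return sum;
    }

    int main() {
        using clock = std::chrono::steady_clock;

        auto start = clock::now();
        long result = work();          // time only the suspect section
        auto stop = clock::now();

        auto us = std::chrono::duration_cast<std::chrono::microseconds>(stop - start);
        std::printf("result=%ld took %lld us\n", result,
                    static_cast<long long>(us.count()));
    }

A dedicated profiler (perf, gprof, or Valgrind's callgrind, for example) gives per-function breakdowns without hand-instrumenting the code, which is usually where the surprising hotspots show up.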
