Is micro-optimisation important when coding

code-quality, optimization, performance

I recently asked a question on Stack Overflow to find out why isset() was faster than strlen() in PHP. This raised questions about the importance of readable code, and whether microsecond performance improvements were even worth considering.
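For context, the comparison was between two ways of checking whether a string is shorter than a given length – roughly like the sketch below (not the original benchmark; the exact code from the Stack Overflow question isn't reproduced here):

<?php
$str = "Hello, world";

// Using strlen(): a function call that returns the full length, then a comparison.
if (strlen($str) < 6) {
    echo "shorter than 6 characters\n";
}

// Using isset() on a string offset: true only if the character at index 5 exists,
// so !isset($str[5]) is equivalent to strlen($str) < 6. isset() is a language
// construct rather than a function call, which is why it benchmarks faster.
if (!isset($str[5])) {
    echo "shorter than 6 characters\n";
}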

My father is a retired programmer, and I showed him the responses. He was absolutely certain that a coder who does not consider performance, even at the micro level, is not a good programmer.

I'm not so sure – perhaps the increase in computing power means we no longer have to consider these kinds of micro-performance improvements? Perhaps that kind of consideration is best left to the people who implement the language itself (PHP, in the case above)?

The environmental factors could also be important – the Internet is said to consume 10% of the world's energy. I wonder how wasteful a few microseconds of code is when replicated trillions of times on millions of websites?

I'd prefer answers based on facts about programming.

Is micro-optimisation important when coding?

My personal summary of 25 answers, thanks to all.

Sometimes we really do need to worry about micro-optimisations, but only in very rare circumstances. Reliability and readability are far more important in the majority of cases. Still, considering micro-optimisation from time to time doesn't hurt: a basic understanding can help us avoid obviously bad choices when coding, such as

if (expensiveFunction() || counter < X)

Should be

if (counter < X || expensiveFunction())

(Example from @zidarsk8.) Even if expensiveFunction() turned out to be cheap, reordering the condition afterwards would count as micro-optimisation; with a basic understanding you would never need to, because you would write it correctly in the first place.
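As a concrete illustration (a sketch only; expensiveFunction() here is a made-up stand-in for any slow call), PHP's || operator short-circuits, so the expensive call is skipped whenever the cheap comparison already decides the result:

<?php
// Hypothetical slow check, e.g. a database lookup or remote call.
function expensiveFunction(): bool {
    usleep(100000); // simulate 100 ms of work
    return true;
}

$counter = 3;
$X = 10;

// Cheap comparison first: when it is true, || short-circuits and
// expensiveFunction() is never called at all.
if ($counter < $X || expensiveFunction()) {
    echo "condition met\n";
}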

Best Answer

I both agree and disagree with your father. Performance should be thought about early, but micro-optimization should only be thought about early if you actually know that a high percentage of the time will be spent in small CPU-bound sections of code.

The problem with micro-optimization is that it is usually done without having any concept of how programs actually spend more time than necessary.
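One way to gain that concept before changing anything is simply to measure. A minimal sketch using PHP's microtime() follows; the function and data are invented for illustration, and a real profiler such as Xdebug or XHProf gives far more detail:

<?php
// Hypothetical candidate section to time; the name and workload are illustrative only.
function buildReport(array $rows): string {
    return implode("\n", array_map('json_encode', $rows));
}

$rows = array_fill(0, 10000, ['id' => 1, 'name' => 'example']);

$start = microtime(true);
buildReport($rows);
$elapsed = microtime(true) - $start;

// Only sections that actually dominate measurements like this one
// are worth micro-optimizing.
printf("buildReport took %.4f seconds\n", $elapsed);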

This knowledge comes from experience doing performance tuning, as in this example, in which a seemingly straightforward program, with no obvious inefficiencies, is taken through a series of diagnosis and speedup steps, until it is 43 times faster than at the beginning.

What it shows is that you cannot really guess or intuit where the problems will be. If you perform diagnosis, which in my case is random-pausing, lines of code responsible for a significant fraction of time are preferentially exposed. If you look at those, you may find substitute code, and thereby reduce overall time by roughly that fraction.

Other things you didn't fix still take as much time as they did before, but since the overall time has been reduced, those things now take a larger fraction, so if you do it all again, that fraction can also be eliminated. If you keep doing this over multiple iterations, that's how you can get massive speedups, without ever necessarily having done any micro-optimization.
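To make that arithmetic concrete (the fractions below are invented purely for illustration): if each tuning pass removes half of the remaining time, four passes leave 1/16 of the original time, a 16x overall speedup, even though no single pass looked dramatic.

<?php
// Model successive tuning passes, each removing a fraction of the remaining
// run time. The fractions are made up for illustration only.
$remaining = 1.0;                          // total time, normalised to 1
$fractionsRemoved = [0.5, 0.5, 0.5, 0.5];

foreach ($fractionsRemoved as $i => $fraction) {
    $remaining *= (1 - $fraction);
    printf("After pass %d: %.3f of original time, %.1fx faster overall\n",
        $i + 1, $remaining, 1 / $remaining);
}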

After that kind of experience, when you approach new programming problems, you come to recognize the design approaches that initially lead to such inefficiencies. In my experience, it comes from over-design of the data structures, non-normalized data, massive reliance on notifications, that sort of thing.
