90/10 Rule of Program Optimization – What Does It Mean?

Tags: optimization, program, theory

According to Wikipedia, the 90/10 rule of program optimization states that "90% of a program's execution time is spent executing 10% of the code" (see the second paragraph here).

I really don't understand this. What exactly does it mean? How can 90% of the execution time be spent executing only 10% of the code? And what about the other 90% of the code: how can it run in just 10% of the time?

Best Answer

There are two basic principles in play here:

  • Some code is executed much more often than other code. For example, some error handling code might never be used. Some code will be executed only when you start your program. Other code will be executed over and over while your program runs.
  • Some code takes much longer to run than other code. For example, a single line that runs a database query or fetches a file over the internet will probably take longer than millions of in-memory arithmetic operations (see the sketch after this list).
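
To make the first point concrete, here is a minimal, hypothetical Python sketch (my own illustration, not part of the original answer): the configuration loading and error handling account for most of the lines, yet almost all of the running time is spent inside one small loop.

```python
import time

def load_config():
    # Runs exactly once at startup: a big share of the *code*,
    # but a negligible share of the *runtime*.
    return {"iterations": 5_000_000}

def handle_error(err):
    # Error-handling path that may never run at all.
    print(f"something went wrong: {err}")

def hot_loop(n):
    # A handful of lines the program spends almost all of its time in.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    config = load_config()
    start = time.perf_counter()
    try:
        result = hot_loop(config["iterations"])
    except Exception as err:
        handle_error(err)
    else:
        print(f"result={result}, took {time.perf_counter() - start:.2f}s")
```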

The 90/10 rule isn't literally true. It varies by program (and I doubt the specific numbers 90 and 10 have any real basis; someone probably pulled them out of thin air). But the point stands: if you need your program to run faster, only a small number of lines probably matter for making that happen. Identifying those slow parts of your software is often the biggest part of optimisation.
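
The usual way to identify those slow parts is with a profiler rather than by guessing. As a rough sketch, Python's built-in cProfile module reports how much time is spent in each function; the slow_step/fast_step names below are made up for illustration.

```python
import cProfile

def slow_step():
    # Stand-in for the 10% of code that dominates the runtime.
    return sum(i * i for i in range(2_000_000))

def fast_step():
    # Stand-in for the other 90% of the code.
    return sum(range(100))

def main():
    for _ in range(5):
        slow_step()
        fast_step()

if __name__ == "__main__":
    # Report sorted so the functions eating the most time appear first.
    cProfile.run("main()", sort="cumulative")
```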

This is an important insight, and it means that decisions that seem counterintuitive to a new developer can often be correct. For example:

  • There is lots of code that is not worth your time to make "better", even if it does things in a dumb, simplistic way. Could you write a more efficient search algorithm for application XYZ? Yes, but a simple comparison of every value already takes a trivial amount of time, even with thousands of values, so it's just not worth it (see the timing sketch after this list). It can be tough for new developers to avoid unnecessary optimisation, because their degree programme spent so much time on writing the "correct" (meaning most efficient) algorithm. But in the real world, the correct algorithm is any one that works and runs fast enough.
  • Changes that make your code much longer and more complex may still be a performance win. For example, in application FOO it may be worth adding hundreds of lines of new logic just to avoid a single database call (see the caching sketch after this list).
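
To put a rough number behind the "simple comparison of every value" point, here is a small sketch of my own (not from the answer): linearly scanning a few thousand values typically finishes in well under a millisecond on modern hardware, which is rarely worth optimising away.

```python
import time

values = list(range(5_000))                 # "thousands of values"
target = 4_999

start = time.perf_counter()
found = any(v == target for v in values)    # the "dumb" linear scan
elapsed = time.perf_counter() - start

print(f"found={found} in {elapsed * 1e6:.0f} microseconds")
```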
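And one common shape of the second trade-off is caching: extra code whose only purpose is to avoid repeating a slow call. A hypothetical sketch, where expensive_lookup stands in for the database or network call:

```python
import functools
import time

def expensive_lookup(key: str) -> str:
    # Hypothetical stand-in for a slow database or network call.
    time.sleep(0.1)  # simulate latency
    return key.upper()

# The "longer, more complex" version: extra code that exists solely
# to avoid repeating the slow call for keys we have already seen.
@functools.lru_cache(maxsize=1024)
def cached_lookup(key: str) -> str:
    return expensive_lookup(key)

if __name__ == "__main__":
    start = time.perf_counter()
    for _ in range(100):
        cached_lookup("user-42")   # only the first call pays the 0.1 s cost
    print(f"100 lookups took {time.perf_counter() - start:.2f}s")
```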