Algorithms – How to Determine if an Algorithm is O(log n)

algorithms, big-o, complexity

I'm refreshing my CS theory, and I want to know how to identify that an algorithm has O(log n) complexity. Specifically, is there an easy way to identify it?

I know that with O(n) you usually have a single loop; O(n^2) is a double loop; O(n^3) is a triple loop, and so on. How about O(log n)?

Best Answer

I know that with O(n) you usually have a single loop; O(n^2) is a double loop; O(n^3) is a triple loop, and so on. How about O(log n)?

You're really going about it the wrong way here. You're trying to memorize which big-O expression goes with a given algorithmic structure, but you should really just count up the number of operations that the algorithm requires and compare that to the size of the input. An algorithm that loops over its entire input has O(n) performance because it runs the loop n times, not because it has a single loop. Here's a single loop with O(log n) performance:

// The loop body runs only log2(n) times, so this single loop is O(log n).
for (int i = 0; i < log2(input.count); i++) {
    doSomething(...);
}

So, any algorithm where the number of required operations is on the order of the logarithm of the size of the input is O(log n). The important thing big-O analysis tells you is how the execution time of an algorithm changes relative to the size of the input: if you double the size of the input, does the algorithm take one more step (O(log n)), twice as many steps (O(n)), four times as many steps (O(n^2)), and so on?
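To see that "one more step per doubling" behavior concretely, here is a minimal C sketch (not from the original answer; the function name binary_search_steps and the sample sizes are made up for illustration) that counts how many times a binary-search-style loop can halve a range before it is empty:

#include <stdio.h>

/* Count how many times a binary-search-style loop halves a range
   of size n before the range is empty (a worst-case step count). */
static int binary_search_steps(int n) {
    int steps = 0;
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        steps++;
        int mid = lo + (hi - lo) / 2;
        hi = mid - 1;   /* always discard the upper half */
    }
    return steps;
}

int main(void) {
    /* Each doubling of n adds about one step: the O(log n) signature. */
    for (int n = 1000; n <= 16000; n *= 2) {
        printf("n = %5d -> %2d steps\n", n, binary_search_steps(n));
    }
    return 0;
}

The printed step counts go up by one each time n doubles, which is exactly the growth pattern described above.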

Does it help to know from experience that algorithms that repeatedly partition their input typically have 'log n' as a component of their performance? Sure. But don't look for the partitioning and jump to the conclusion that the algorithm's performance is O(log n) -- it might be something like O(n log n), which is quite different.
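As a rough sketch of that distinction (both function names here are hypothetical, chosen only for illustration), compare a function that partitions its input and recurses into just one half with one that partitions but also does linear work at every level, the way merge sort does:

#include <stddef.h>

/* Partitions and recurses into only ONE half, doing constant work
   per level: about log2(n) calls in total, so O(log n). */
static int find_in_half(const int *a, size_t lo, size_t hi, int target) {
    if (lo >= hi) return -1;
    size_t mid = lo + (hi - lo) / 2;
    if (a[mid] == target) return (int)mid;
    if (a[mid] < target)  return find_in_half(a, mid + 1, hi, target);
    return find_in_half(a, lo, mid, target);
}

/* Partitions and recurses into BOTH halves, and also touches every
   element at each level (like merge sort's merge pass): the log n
   levels each cost O(n), so the total is O(n log n). */
static long work_on_both_halves(int *a, size_t lo, size_t hi) {
    if (hi - lo <= 1) return 0;
    size_t mid = lo + (hi - lo) / 2;
    long ops = work_on_both_halves(a, lo, mid)
             + work_on_both_halves(a, mid, hi);
    for (size_t i = lo; i < hi; i++) {
        ops++;              /* stand-in for the O(n) per-level work */
    }
    return ops;
}

Both functions halve their input, but only the first has a step count proportional to log n; the second's per-level work pushes it to n log n.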
