Algorithm Complexity – Single For-Loop Runtime Explanation Problem

Tags: algorithms, big-o, complexity, runtime

I am analyzing the running times of different for-loops, and as I learn more, I'm curious about a problem I still haven't been able to work out.
I have this exercise called "How many stars are printed":

for (int i = N; i > 1; i = i/2) System.out.println("*");

The answers to pick from are:

A: ~log N
B: ~N
C: ~N log N
D: ~0.5N^2

So the answer should be A, and I agree with that, but on the other hand: let's say N = 500. What would log N be then? It would be 2.7. So what if we set N = 500 in the exercise above? It would most definitely print more than 2.7 stars. How are the two related?

Because it makes sense to say that if the for-loop looked like this:

for (int i = 0; i < N; i++)

it would print N stars.

I hope to find an explanation for this here; maybe I'm interpreting all of this wrong and thinking about it the wrong way. Thanks in advance.

Best Answer

You've overlooked a key detail: the base of the logarithm.

Because i is divided by 2 in each iteration, the running time is logarithmic with base 2. And

log2(500) ~ 8.9

What you are looking at is

log10(500) ~ 2.7

(logarithm with base 10)
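
To see this concretely, here is a small sketch (the class and variable names are just illustrative) that counts how many times the loop from the question runs for N = 500 and compares that count to log2(500):

public class StarCount {
    public static void main(String[] args) {
        int N = 500;
        int stars = 0;
        for (int i = N; i > 1; i = i / 2) {
            stars++;                                    // one star printed per iteration
        }
        System.out.println(stars);                      // 8
        System.out.println(Math.log(N) / Math.log(2));  // ~8.97, i.e. log2(500)
        System.out.println(Math.log10(N));              // ~2.70, i.e. log10(500)
    }
}

The loop runs 8 times for N = 500 (i takes the values 500, 250, 125, 62, 31, 15, 7, 3), which matches ~log2 N, not log10 N.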

By the way, the reason the base is usually omitted in runtime discussions (and why your calculator probably doesn't have a log2 button) is the change-of-base rule: switching to a different base only multiplies the result by a constant factor, and constant factors are ignored in Big-O analysis anyway. The conversion is easy to compute:

loga(x) = logb(x) / logb(a)
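
As a minimal sketch of that rule in Java (logOfBase is a made-up helper name, not a standard library method; Math.log is the natural logarithm):

public class LogBase {
    // log_a(x) = ln(x) / ln(a), the change-of-base rule
    static double logOfBase(double base, double x) {
        return Math.log(x) / Math.log(base);
    }

    public static void main(String[] args) {
        System.out.println(logOfBase(2, 500));   // ~8.97
        System.out.println(logOfBase(10, 500));  // ~2.70
    }
}

The two results differ only by the constant factor 1 / log10(2) ≈ 3.32, which is exactly why the base doesn't matter when you write ~log N.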
