Algorithm Analysis – Understanding Time Complexity of 2^sqrt(n)

Tags: algorithm-analysis, big-o

I am solving an algorithm question, and my analysis is that it would run in $O(2^{\sqrt{n}})$. How big is that? Is it equivalent to $O(2^n)$? Is it still non-polynomial time?

Best Answer

This is an interesting question. Fortunately, once you know how to solve it, it is not particularly hard.

For functions $f\colon \mathbb{N} \to \mathbb{R}^+$ and $g\colon \mathbb{N} \to \mathbb{R}^+$, we have $f \in O(g)$ if and only if $\limsup_{n\to\infty} f(n)/g(n) \in \mathbb{R}$.
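For instance, $n^2 \in O(n^3)$ because $\limsup_{n\to\infty} n^2/n^3 = 0 \in \mathbb{R}$, whereas $n^3 \notin O(n^2)$ because the ratio $n^3/n^2 = n$ is unbounded.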

A function $f\colon \mathbb{N} \to \mathbb{R}^+$ has at most polynomial growth if and only if there exists a constant $k \in \mathbb{N}$ such that $f \in O(n \mapsto n^k)$. Let's work this out for an arbitrary but fixed $k \in \mathbb{N}$.

$$\limsup_{n\to\infty} \frac{2^{\sqrt{n}}}{n^k}
= \lim_{n\to\infty} \frac{2^{\sqrt{n}}}{n^k}
= \lim_{n\to\infty} \frac{e^{\sqrt{n}\log 2}}{e^{k \log n}}
= \lim_{n\to\infty} e^{\sqrt{n}\log 2 \,-\, k \log n}
= \infty \notin \mathbb{R}$$

The first equality holds because both the numerator and the denominator are monotonically increasing, continuous functions. The second equality uses the identity $x^y = e^{y \log x}$. The limit is not finite because the exponent in the final expression is not bounded above: $\sqrt{n}$ dominates $\log n$ asymptotically, which a short computation (below) makes precise. Therefore, the function in question exceeds polynomial growth.
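Concretely, one application of L'Hôpital's rule to the ratio $\sqrt{n}/\log n$ suffices:

$$\lim_{n\to\infty} \frac{\sqrt{n}}{\log n}
= \lim_{n\to\infty} \frac{1/(2\sqrt{n})}{1/n}
= \lim_{n\to\infty} \frac{\sqrt{n}}{2}
= \infty,$$

so the exponent $\sqrt{n}\log 2 - k \log n$ tends to $\infty$ for every fixed $k$.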

However, its growth is strictly less than exponential, where exponential is defined (by me, for this purpose) as $O(n \mapsto 2^{cn})$ for $c > 0$. Showing this is even more straightforward.

$$\limsup_{n\to\infty} \frac{2^{cn}}{2^{\sqrt{n}}}
= \lim_{n\to\infty} 2^{cn - \sqrt{n}}
= \infty \notin \mathbb{R}$$

for any fixed $c > 0$. Therefore, the function's growth lies strictly between polynomial and exponential.
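To get a feel for the sizes involved, here is a small numerical sketch of both ratios; the constants $k = 10$ and $c = 1/2$ are arbitrary illustrative choices, not values from the argument above. It evaluates the base-2 logarithms of the ratios so that nothing overflows:

```python
import math

# Pit 2^sqrt(n) against a fixed polynomial n^k and a fixed exponential 2^(c*n).
# k and c are arbitrary illustrative constants.
k = 10
c = 0.5

for n in [10**2, 10**4, 10**6, 10**8]:
    # Work with base-2 logarithms of the ratios to avoid overflow.
    vs_poly = math.sqrt(n) - k * math.log2(n)  # log2( 2^sqrt(n) / n^k )
    vs_exp = c * n - math.sqrt(n)              # log2( 2^(c*n) / 2^sqrt(n) )
    print(f"n = {n:>9}: log2(2^sqrt(n) / n^k) = {vs_poly:10.1f}, "
          f"log2(2^(c n) / 2^sqrt(n)) = {vs_exp:14.1f}")
```

Both columns eventually grow without bound, matching the two limits: $2^{\sqrt{n}}$ overtakes every polynomial, and every exponential $2^{cn}$ overtakes $2^{\sqrt{n}}$. Note that the first column is negative for small $n$ before diverging, which is consistent with asymptotic (rather than pointwise) domination.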
