Big Oh notation does not mention the constant value

algorithm-analysis big-o

I am a programmer and have just started reading Algorithms. I am not completely convinced by the notations, namely Big Oh, Big Omega and Big Theta. The reason is that, by the definition of Big Oh, there should be a function g(n) and a constant c such that c·g(n) is always greater than or equal to f(n) for large enough n; that is, f(n) <= c·g(n) for all n >= n0.

Why don't we mention the constant value in the definition? For example, take the function 6n + 4; we denote it as O(n). But the definition does not hold for every constant: it holds with c >= 10 for all n >= 1, while for smaller values of c (say c = 7) the required n0 increases, and for c <= 6 there is no n0 that works at all. So why don't we mention the constant value as part of the definition?
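
To make the point concrete, here is a small Python check (the function and the candidate constants are just my example):

```python
# f(n) = 6n + 4 from the example above.
def f(n):
    return 6 * n + 4

# For each candidate c > 6, the smallest n0 with 6n + 4 <= c*n
# is n0 = ceil(4 / (c - 6)); smaller c forces a larger n0.
for c in [7, 8, 10, 12]:
    n0 = -(-4 // (c - 6))  # ceiling division
    assert all(f(n) <= c * n for n in range(n0, n0 + 1000))
    print(f"c = {c:2d} works from n0 = {n0}")

# For c = 6 (or anything smaller) the inequality 6n + 4 <= c*n never holds.
assert not any(f(n) <= 6 * n for n in range(1, 10_000))
```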

Best Answer

There are several reasons, but probably the most important one is that constants are a function of the implementation of the algorithm, not the algorithm itself. The order of an algorithm is useful for comparing algorithms irrespective of their implementation.

The actual runtime of a quicksort will differ depending on whether it's implemented in C, Python, Scala, or PostScript. The same applies to bubble sort: the runtime will vary widely based on the implementation.

However, what will not change is the fact that, all else being equal, as the dataset grows the time required to run a bubble sort will increase faster than the time required to run a quicksort in the typical case, no matter what language or machine they're implemented on, assuming reasonably correct implementations. This simple fact lets you make intelligent inferences about the algorithms themselves when concrete details are not available.
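
As a rough sketch of that claim, here is a toy benchmark in Python. The absolute numbers will vary by machine and interpreter, which is exactly the point; what stays stable is that doubling n roughly quadruples the bubble sort time while the quicksort time grows only slightly faster than linearly:

```python
import random
import time

def bubble_sort(xs):
    """O(n^2): repeatedly swap adjacent out-of-order pairs."""
    xs = list(xs)
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs

def quicksort(xs):
    """O(n log n) expected: partition around a random pivot."""
    if len(xs) <= 1:
        return xs
    pivot = random.choice(xs)
    return (quicksort([x for x in xs if x < pivot])
            + [x for x in xs if x == pivot]
            + quicksort([x for x in xs if x > pivot]))

for n in [500, 1000, 2000, 4000]:
    data = [random.random() for _ in range(n)]
    t0 = time.perf_counter(); bubble_sort(data)
    t1 = time.perf_counter(); quicksort(data)
    t2 = time.perf_counter()
    print(f"n = {n:5d}  bubble: {t1 - t0:6.3f}s  quick: {t2 - t1:6.3f}s")
```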

The order of an algorithm filters out factors that, while important in actual real-world measurements, tend to just be noise when comparing algorithms in the abstract.
