What notation do you use for the average time complexity of an algorithm? It occurs to me that the proper way would be to use big-theta to refer to a set of results (even though a specific run may differ). For example, average array search would be Θ(n+1)/2.
Is this right?
Explanation of the average search time being Θ(n+1)/2: Given that the sum of access times over the first n positions is n(n+1)/2, the average time over a large number of random accesses to the array will be (n(n+1)/2)/n = (n+1)/2. In this case I would say that Θ(n+1)/2 is the average access time. It makes sense (to me), but since I'm new to asymptotic notation (and programming in general), I'd like to know if this is common practice.
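As a sanity check of the arithmetic above, here is a minimal sketch (the helper names are my own, not from any library) that counts comparisons for a successful linear search at every position and averages them:

```python
def comparisons_to_find(arr, target):
    """Count comparisons until target is found (assumes target is present)."""
    count = 0
    for value in arr:
        count += 1
        if value == target:
            return count
    return count

def average_comparisons(n):
    """Average cost of a successful search, each position equally likely."""
    arr = list(range(n))
    total = sum(comparisons_to_find(arr, t) for t in arr)
    return total / n

n = 1000
print(average_comparisons(n))  # 500.5, i.e. exactly (n + 1) / 2
print((n + 1) / 2)             # 500.5
```

The sum of costs 1 + 2 + … + n is n(n+1)/2, so dividing by n gives (n+1)/2, matching the derivation.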
I'm asking because I'm confused to see big-O being used everywhere on pages like http://bigocheatsheet.com, e.g. the average case of array search given as O(n).
Best Answer
f ∈ Θ(g) is the same as f ∈ Ο(g) ∧ f ∈ Ω(g). So, if you can prove that f ∈ Ο(g) ∧ f ∈ Ω(g), then by all means use Θ.

Note that this has absolutely nothing whatsoever to do with time complexity of algorithms, average or otherwise.
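Spelled out with the standard textbook definitions, the equivalence used above is:

```latex
f \in O(g) \iff \exists c > 0,\ n_0 : \forall n \ge n_0 : f(n) \le c \cdot g(n)
f \in \Omega(g) \iff \exists c > 0,\ n_0 : \forall n \ge n_0 : f(n) \ge c \cdot g(n)
f \in \Theta(g) \iff f \in O(g) \land f \in \Omega(g)
```

That is, Θ is exactly the conjunction of an upper bound (O) and a lower bound (Ω) on the same function.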
Θ, Ο, ο, Ω and ω are about the growth rate of functions; whether those functions describe average time/step/space complexity of an algorithm, worst-case time/step/space complexity of an algorithm, best-case time/step/space complexity of an algorithm, expected case time/step/space complexity of an algorithm, the population of China, the number of users on StackOverflow, or the size of Lara Croft's bra is completely irrelevant.