Difference between complexity and performance guarantee

algorithms, complexity

I'm a bit confused with the performance guarantee and complexity of selection sort.

I checked on the internet and the complexity of selection sort is O(n^2). This O(n^2) is in terms of time complexity, am I right?

So how about the performance guarantee? Is the performance guarantee in my case measured in terms of swaps, or in time complexity as well? If the performance guarantee is in terms of swaps, then the best case is zero swaps (the array is already sorted) and the worst case is n-1 swaps. The performance guarantee would then be (n-1)/0 = undefined, am I correct?

Please correct me if I'm wrong… or is the performance guarantee in terms of running time? Then the performance guarantee would be (n-1)/(n-1) = 1?

Can someone please clear my doubts?

Best Answer

"performance guarantee" is not a term typically used in analyzing algorithms.

Time complexity is measured in terms of whatever you want. You've stumbled on a dirty secret of CS: Big-O notation is often used rather sloppily, without specifying what exactly increases with the input size in that manner. It's generally the count of some primitive operation that is assumed to dominate the others in the implementation, and is itself assumed to take constant time. Both of these assumptions are often not true. For example, for sorting algorithms, time complexity is usually based on the number of comparisons. But comparing numbers is not actually a constant-time operation if the numbers get really big.
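
To make the "what are you counting?" point concrete, here is a small Python sketch (the function name `selection_sort_with_counts` is just for this example) that runs selection sort and tallies comparisons and swaps separately. The comparison count is always n(n-1)/2 regardless of the input order, while the swap count depends on the input: 0 for an already-sorted array and at most n-1 in general.

```python
def selection_sort_with_counts(a):
    """Selection sort that tallies comparisons and swaps separately."""
    a = list(a)               # work on a copy
    comparisons = 0
    swaps = 0
    n = len(a)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            comparisons += 1  # one comparison per inner-loop step
            if a[j] < a[min_idx]:
                min_idx = j
        if min_idx != i:      # only swap when the minimum actually moved
            a[i], a[min_idx] = a[min_idx], a[i]
            swaps += 1
    return a, comparisons, swaps

# Already sorted: n(n-1)/2 = 10 comparisons, 0 swaps
print(selection_sort_with_counts([1, 2, 3, 4, 5]))
# Any other order: still exactly 10 comparisons, but some swaps (never more than n-1)
print(selection_sort_with_counts([5, 4, 3, 2, 1]))
```

So whether you call selection sort O(n^2) (counting comparisons) or O(n) (counting swaps) depends entirely on which operation you chose to count; that choice is what a careful analysis should state explicitly.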