[Math] Finding best case, worst case, and average case for an unknown algorithm

algorithms, computer-science

I have been given a paragraph explaining what an unknown algorithm does:

An algorithm checks whether a list of integers is sorted. It starts at the beginning of the list and scans along until a successive pair of elements is out of order; in that case, the algorithm returns false. Otherwise (i.e. the list of integers is sorted), the algorithm returns true.
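
For concreteness, here is a minimal Python sketch of the algorithm as described; the function name `is_sorted` and the strict `>` test are my own assumptions, since the paragraph gives no code:

```python
def is_sorted(values):
    """Return False at the first out-of-order adjacent pair, True otherwise."""
    for i in range(len(values) - 1):
        if values[i] > values[i + 1]:
            return False       # a successive pair is out of order
    return True                # no pair was out of order: the list is sorted


print(is_sorted([1, 2, 3, 4]))  # True
print(is_sorted([1, 3, 2, 4]))  # False (stops at the pair 3, 2)
```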

I have worked out that the worst-case running time for this algorithm on an input list of n elements is O(n), and that the best-case running time on an input list of n elements is Ω(2). What would the average-case expected running time be for an input list that is a random permutation of the numbers 1, 2, …, n? I'm really not sure whether my answers are correct or not…
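
To make the best and worst cases concrete, the scan can be instrumented to count comparisons; the helper `count_comparisons` below is a hypothetical illustration, not part of the given algorithm. On a sorted list of n elements it performs n − 1 comparisons, while a list whose very first pair is out of order needs only one.

```python
def count_comparisons(values):
    """Count the adjacent comparisons made before the scan stops."""
    comparisons = 0
    for i in range(len(values) - 1):
        comparisons += 1
        if values[i] > values[i + 1]:
            break              # first out-of-order pair: the scan stops here
    return comparisons


n = 10
print(count_comparisons(list(range(1, n + 1))))           # sorted: n - 1 = 9 comparisons (worst case)
print(count_comparisons([2, 1] + list(range(3, n + 1))))  # first pair out of order: 1 comparison (best case)
```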

Best Answer

If the input list is a uniformly random permutation, its $k$ first elements are ordered increasingly with probability $1/k!$. The algorithm makes at least $k$ comparisons exactly when its first $k-1$ comparisons find no out-of-order pair, that is, when the $k$ first elements are increasing, hence the number $N$ of comparisons satisfies $P[N\geqslant k]=1/k!$ for every $2\leqslant k\leqslant n-1$ (and $N\geqslant1$ always). Therefore $E[N]=\sum_{k=1}^{n-1}P[N\geqslant k]=1+\sum_{k=2}^{n-1}\frac1{k!}$, and, for every $n\geqslant3$, one sees that $\tfrac32\leqslant E[N]\leqslant\mathrm e-1$; in particular, $E[N]=\Theta(1)$.
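
A quick Monte Carlo sketch can check this bound empirically; the function names and trial count below are arbitrary choices for illustration, not part of the answer:

```python
import math
import random


def comparisons_until_stop(values):
    """Count the adjacent comparisons the scan makes before it stops."""
    comparisons = 0
    for i in range(len(values) - 1):
        comparisons += 1
        if values[i] > values[i + 1]:
            break              # out-of-order pair found: the scan stops here
    return comparisons


def average_comparisons(n, trials=20_000):
    """Estimate E[N] for a uniformly random permutation of 1..n."""
    total = 0
    for _ in range(trials):
        perm = list(range(1, n + 1))
        random.shuffle(perm)
        total += comparisons_until_stop(perm)
    return total / trials


for n in (5, 50, 500):
    print(n, round(average_comparisons(n), 3))
# The estimates settle near e - 1 and do not grow with n, consistent with
# 3/2 <= E[N] <= e - 1, i.e. E[N] = Theta(1).
print("e - 1 =", round(math.e - 1, 3))
```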