[Math] Selection Sort Algorithm Analysis

In an algorithm analysis of the following C code snippet,

/* swap() is used but not defined in the book's snippet; a minimal version: */
void swap(int *a, int *b)
{
    int t = *a;
    *a = *b;
    *b = t;
}

void selection_sort(int s[], int n)
{
    int i, j, min;
    for (i = 0; i < n; i++)
    {
        min = i;                  /* index of the smallest element found so far */
        for (j = i + 1; j < n; j++)
            if (s[j] < s[min])    /* the if statement under analysis */
                min = j;
        swap(&s[i], &s[min]);     /* move that minimum into position i */
    }
}

the author of the book I am reading writes that the if statement is executed

$$
S(n)=\sum_{i=0}^{n-1}\sum_{j=i+1}^{n-1}1=\sum_{i=0}^{n-1}(n-i-1)
$$

times. To eventually express this using Big Oh notation, he first writes that

$$
S(n)\approx\frac{n(n+1)}{2},
$$

because "we are adding up $n-1$ terms whose average value is about $n/2$." How did he conclude this?

Ultimately, of course, he concludes that the running time is $\Theta(n^2)$.
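
For concreteness, here is a minimal self-contained sketch (the instrumented routine and the counter are additions for illustration, not the book's code) that verifies the if statement runs exactly $\sum_{i=0}^{n-1}(n-i-1)$ times:

#include <assert.h>
#include <stdio.h>

/* Instrumented variant of the book's routine (added for illustration):
   same control flow, but it tallies each execution of the if statement. */
static long count_if_executions(int s[], int n)
{
    long count = 0;
    for (int i = 0; i < n; i++)
    {
        int min = i;
        for (int j = i + 1; j < n; j++)
        {
            count++;                    /* the if statement executes here */
            if (s[j] < s[min])
                min = j;
        }
        int t = s[i]; s[i] = s[min]; s[min] = t;   /* inline swap */
    }
    return count;
}

int main(void)
{
    int buf[64] = {0};   /* the count depends only on n, not on the data */
    for (int n = 0; n <= 64; n++)
    {
        long expected = 0;
        for (int i = 0; i < n; i++)
            expected += n - i - 1;      /* the inner sum has n - i - 1 terms */
        assert(count_if_executions(buf, n) == expected);
    }
    printf("count matches the double sum for n = 0..64\n");
    return 0;
}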

Best Answer

The idea is that the $n - 1$ nonzero terms range from $n - 1$ down to $1$. They form an arithmetic progression, so their average is exactly $\frac{(n-1)+1}{2} = \frac{n}{2}$, and multiplying the number of terms by the average gives the sum. The actual sum is $\frac{n(n - 1)}{2}$, so I don't see why he didn't just state the exact formula; it's not hard to work out and is a standard result. In any case, it's easy to see that the average of $n - 1, n - 2, \ldots, 1$ is roughly $n/2$, which is all the $\Theta(n^2)$ bound needs.
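
For completeness, here is that standard result worked out: substituting $k = n - i - 1$ turns the count into a plain arithmetic series.

$$
S(n)=\sum_{i=0}^{n-1}(n-i-1)=\sum_{k=0}^{n-1}k=\frac{n(n-1)}{2}=\frac{n^2}{2}-\frac{n}{2}=\Theta(n^2).
$$

Dividing the sum by the $n-1$ nonzero terms recovers the book's average: $\frac{n(n-1)/2}{n-1}=\frac{n}{2}$.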
