Why is the time complexity of summing from 1 to n $O(n^2)$

asymptotics, summation

I was just recently introduced to Big-O notation and this confused me.

When summing from 1 to n, why is the time complexity $O(n^2)$?

In other words, why is $\sum_{i=1}^n i = 1 + 2 + … + n = \frac{n(n+1)}{2} = O(n^2)$?

(These equalities appeared in a screenshot from the course.)

Intuitively, I think it should be $O(n)$: $n$ is the dominant factor and the remaining factors look like constants, so I expected the result to come out to $O(n)$.
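One numerical way to see why that intuition fails (my own sketch, not from the course): if $T(n)=\frac{n(n+1)}{2}$ were $O(n)$, the ratio $T(n)/n$ would stay bounded as $n$ grows, but it doesn't, while $T(n)/n^2$ does:

```python
def T(n):
    # closed form of 1 + 2 + ... + n
    return n * (n + 1) // 2

for n in [10, 100, 1000, 10000]:
    # T(n)/n grows without bound, so T is not O(n);
    # T(n)/n**2 stays bounded (it approaches 1/2), so T is O(n**2)
    print(n, T(n) / n, T(n) / n**2)
```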

The following is from a lesson that illustrates this idea; the instructor's comments follow the code:

def sum_odd(L):
    M = L[:]
    total = 0
    while M != []:
        if M[0] % 2 == 1:
            total = total + M[0]
        M = M[1:]
    return total

"The runtime of the loop depends on the iteration number. The slowest line is line 7.
This line takes $n-1$ time on the first iteration, $n-2$ time on the second, and so on, until the second-last iteration, where it takes 1 operation to make a list of 1 element; on the last iteration it makes the empty list, which we can also say takes 1 operation.
The sum of these operation counts is $O(n^2)$.
Hence the runtime of the entire function is $O(n^2)$, since line 7 of the loop is the slowest part of this code."
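To make the instructor's count concrete, here is an instrumented variant (my own sketch, not from the lesson) that counts the elements copied by the slice on line 7 in each iteration:

```python
def sum_odd_counted(L):
    # variant of sum_odd that also counts how many elements
    # each `M = M[1:]` slice copies
    M = L[:]
    total = 0
    copies = 0
    while M != []:
        if M[0] % 2 == 1:
            total = total + M[0]
        # the slice copies len(M) - 1 elements; count the final
        # empty slice as 1 operation, as the instructor does
        copies += max(len(M) - 1, 1)
        M = M[1:]
    return total, copies

# For a list of length n, the copies are (n-1) + (n-2) + ... + 1 + 1,
# which is n(n-1)/2 + 1 and so grows quadratically in n.
total, copies = sum_odd_counted([1, 2, 3, 4, 5])
```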

Best Answer

You are confusing functions (such as the function describing the runtime of an algorithm) with algorithms themselves. $O$-notation applies to functions, not algorithms.

When analyzing algorithms, we consider the function $T(n)$ that gives the algorithm's runtime on inputs of size $n$. If $T(n)=1+2+\dots+n$, this might correspond to an algorithm that performs $1$ step, then $2$ steps, then $3$ steps, ..., then $n$ steps. (It has nothing to do with an algorithm for adding the numbers from $1$ to $n$.) This function grows as a quadratic function of $n$, i.e. $T\in O(n^2)$.
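Concretely, $T(n)=\frac{n(n+1)}{2}\le n^2$ for all $n\ge 1$ (since $n+1\le 2n$ once $n\ge 1$), so the constants $c=1$ and $n_0=1$ witness $T\in O(n^2)$. A quick sanity check of that inequality:

```python
def T(n):
    # T(n) = 1 + 2 + ... + n, computed directly
    return sum(range(1, n + 1))

# c = 1 and n0 = 1 witness T in O(n^2): T(n) <= 1 * n**2 for all n >= 1
assert all(T(n) <= n**2 for n in range(1, 200))
```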
