Complexity of algorithms using Big Theta notation: log n? n log n?

asymptotics, computational-complexity

I have a few algorithms and am trying to determine their complexity using Big Theta notation:

for i in 1..n
    for j in 1..i
        for k in 1..j
            O(1)

while n >= 1
    O(1)
    n /= 2

while n >= 1
    for i in 1..n
        O(1)
    n /= 2

I'm looking for their complexities in terms of $n$. To tackle the first, I'm working my way up from one loop to three (t means Big Theta):

for i in 1..n - t(1 + 1 + 1 + ... + 1) = t(n)
    O(1)

for i in 1..n - t(n)
    for j in 1..i - t(1 + 2 + 3 + ... + n) = t(0.5n(n+1))
        O(1)

And from here I'm not sure how to proceed, as the arithmetic gets more complicated.
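To sanity-check my counting so far, I brute-forced the number of times the innermost O(1) line runs for the two-loop version and compared it against the closed form (a rough Python sketch; `count_two_loops` is just a name I made up for the counter):

def count_two_loops(n):
    # Count how many times the O(1) body runs in the two nested loops.
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            count += 1
    return count

for n in (5, 10, 100):
    # Closed form for 1 + 2 + ... + n.
    print(n, count_two_loops(n), n * (n + 1) // 2)

For small $n$ the two columns agree, which matches my $0.5n(n+1)$ annotation above.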

The options for a solution are $\Theta(c)$, $c \in \{1, n, n^2, n^3, 2^n, n!, \log n, n \log n\}$ but already I feel like I'm outside that scope. I'm also not even sure what $\Theta(\log n)$ and $\Theta(n \log n)$ mean for an algorithm.

These questions are worth fairly few marks, so I wonder if I'm overthinking them or missing a simple methodology for determining complexity.

If all three loops ran strictly over 1..n, I know the result would be $\Theta(n^3)$, but I don't know what happens when the loop bounds depend on each other.
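If I've counted that independent-bounds case right, it just multiplies out:

$$\sum_{i=1}^{n}\sum_{j=1}^{n}\sum_{k=1}^{n}1=n\cdot n\cdot n=n^{3}$$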

From my loose understanding of complexity, I think $f(n) \in O(g(n))$ means $f(n)$ is dominated by $g(n)$, and $f(n) \in \Omega(g(n))$ means $g(n)$ is dominated by $f(n)$. I think $f(n) \in \Theta(g(n))$ means they dominate each other, i.e. grow at the same rate up to constant factors.
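For reference, the formal definitions as I understand them (apologies if these are slightly off):

$$f(n)\in O(g(n))\iff\exists\,c>0,\;n_0:\;f(n)\le c\,g(n)\ \text{for all } n\ge n_0$$
$$f(n)\in\Omega(g(n))\iff\exists\,c>0,\;n_0:\;f(n)\ge c\,g(n)\ \text{for all } n\ge n_0$$
$$f(n)\in\Theta(g(n))\iff f(n)\in O(g(n))\ \text{and}\ f(n)\in\Omega(g(n))$$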

Best Answer

First one:

$$\sum_{i=1}^{n}\sum_{j=1}^{i}\sum_{k=1}^{j}1=\sum_{i=1}^{n}\sum_{j=1}^{i}j=\sum_{i=1}^{n}\frac{i(i+1)}{2}$$
$$=\frac{1}{2}\sum_{i=1}^{n}i^2+\frac{1}{2}\sum_{i=1}^{n}i=\frac{1}{12}n(n+1)(2n+1)+\frac{1}{4}n(n+1)=\Theta(n^3)$$
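A quick brute-force check of that closed form (a sketch in Python; the helper name is just illustrative):

def count_three_loops(n):
    # Count how many times the O(1) body runs in the triple nested loop.
    count = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            for k in range(1, j + 1):
                count += 1
    return count

for n in (5, 10, 50):
    # n(n+1)(2n+1)/12 + n(n+1)/4, combined over the common denominator 12
    # so the division stays exact in integer arithmetic.
    closed_form = (n * (n + 1) * (2 * n + 1) + 3 * n * (n + 1)) // 12
    print(n, count_three_loops(n), closed_form)

The two columns agree, and the leading term $\frac{1}{6}n^3$ is what gives $\Theta(n^3)$.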

Second one:

The loop halves $n$ at each step, so the number of steps is $k+1$, where $2^k\le n<2^{k+1}$. Then $k \le \log_2 n < k+1$, so the running time is $\Theta(\log n)$.
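A quick check that the halving loop takes $k+1=\lfloor\log_2 n\rfloor+1$ steps (a sketch that assumes $n$ is halved with integer division; real-valued halving gives the same count):

import math

def halving_steps(n):
    # Count iterations of: while n >= 1: n //= 2
    steps = 0
    while n >= 1:
        steps += 1
        n //= 2
    return steps

for n in (1, 10, 1000, 10**6):
    # For 2**k <= n < 2**(k+1) the loop runs k + 1 times.
    print(n, halving_steps(n), math.floor(math.log2(n)) + 1)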

Third one:

$$n+\left\lfloor\frac{n}{2}\right\rfloor+\left\lfloor\frac{n}{4}\right\rfloor+\cdots\le n+\frac{n}{2}+\frac{n}{4}+\cdots=2n$$ Since the total number of inner-loop iterations is between $n$ and $2n$, it is $\Theta(n)$.
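And a check for the third one (again a sketch, assuming the same integer halving of $n$): the total inner-loop work stays between $n$ and $2n$.

def total_inner_iterations(n):
    # Sum the inner loop's n units of O(1) work across all halvings of n.
    total = 0
    while n >= 1:
        total += n
        n //= 2
    return total

for n in (10, 1000, 10**6):
    total = total_inner_iterations(n)
    print(n, total, n <= total <= 2 * n)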
