[Math] Rate of convergence of a sequence in $\mathbb{R}$ and Big O notation

asymptotics, optimization, real-analysis

From Wikipedia

$f(x) = O(g(x))$ if and only if there exists a positive real number
$M$ and a real number $x_0$ such that $|f(x)| \le M|g(x)|$ for all
$x > x_0$.
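
As a quick sanity check (my own example, not part of the quoted definition): $f(x) = 3x^2 + 5x$ satisfies $f(x) = O(x^2)$, since $|3x^2 + 5x| \le 4x^2$ for all $x > 5$, so $M = 4$ and $x_0 = 5$ work.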

Also from Wikipedia

Suppose that the sequence $\{x_k\}$ converges to the number $L$. We
say that this sequence converges linearly to $L$, if there exists a
number $\mu \in (0, 1)$ such that
$\lim_{k\to \infty} \frac{|x_{k+1}-L|}{|x_k-L|} = \mu$.

If the sequence converges, and

  • $\mu = 0$, then the sequence is said to converge superlinearly.
  • $\mu = 1$, then the sequence is said to converge sublinearly.
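
To make the three rates concrete, here is a small numerical sketch (my own illustration, not from the quoted article, taking $L = 0$): $x_k = 2^{-k}$ converges linearly with $\mu = \tfrac12$, $x_k = 2^{-k^2}$ converges superlinearly, and $x_k = 1/k$ converges sublinearly.

    # Successive error ratios |x_{k+1} - L| / |x_k - L| for three model sequences (L = 0).
    examples = {
        "linear (2^-k)":        lambda k: 2.0 ** -k,        # ratio -> 1/2
        "superlinear (2^-k^2)": lambda k: 2.0 ** -(k * k),  # ratio -> 0
        "sublinear (1/k)":      lambda k: 1.0 / k,          # ratio -> 1
    }
    for name, x in examples.items():
        print(name, [round(x(k + 1) / x(k), 4) for k in range(1, 8)])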

I was wondering

  1. Is it true that $\{x_n\}$ converges to $L$ linearly, superlinearly, or
    sublinearly only if $|x_{n+1}-L| = O(|x_n-L|)$? This is based on what I
    have understood from the definitions, viewing $\{ x_{n+1}-L \}$ and
    $\{ x_n-L \}$ as functions of $n$. Note that "only if" means the converse
    ("if") may fail, since $\mu$ may lie outside of $[0,1]$ and $\{x_n\}$ may
    not converge.
  2. Some optimization book says that the steepest descent algorithm has a
    linear rate of convergence, and writes $|x_{n+1}-L| = O(|x_n-L|)$. Does
    this usage of big O notation expand the meaning of linear rate of
    convergence?

Thanks and regards!

Best Answer

For (1), construct a sequence such that both $x_{2n}$ and $x_{2n+1}$ converge to $L$, but $x_{2n}$ converges much faster in such a way that $|x_{2n+1}-L|/|x_{2n}-L| \to \infty$.
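
For a concrete instance (my own choice of sequence, with $L = 0$): take $x_{2n} = 2^{-n^2}$ and $x_{2n+1} = 2^{-n}$. Both subsequences tend to $0$, so $x_k \to 0$, yet $|x_{2n+1}-L|/|x_{2n}-L| = 2^{n^2-n} \to \infty$, so no bound $|x_{k+1}-L| \le M|x_k-L|$ can hold for all large $k$. A small numerical check of this sketch:

    # Hypothetical counterexample sequence (L = 0): x_{2n} = 2^{-n^2}, x_{2n+1} = 2^{-n}.
    def x(k):
        n, odd = divmod(k, 2)
        return 2.0 ** -n if odd else 2.0 ** -(n * n)

    # The sequence converges to 0, yet the consecutive error ratio blows up along even k.
    for n in range(1, 6):
        k = 2 * n
        print(n, x(k), x(k + 1), x(k + 1) / x(k))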

For (2), note that $|x_{n+1}-L| = O(|x_n-L|)$ only means that there exists a positive constant $c$ such that $|x_{n+1}-L| < c|x_n-L|$ for large enough $n$. To get convergence from this bound, you additionally need $c < 1$.

More explicitly, if $|x_{n+1}-L| = 2|x_n-L|$ then $|x_{n+1}-L| = O(|x_n-L|)$, but $x_n$ certainly does not converge to $L$.
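
To see both halves of this point numerically (again my own sketch, with $L = 0$): an error recursion with constant $c = 2$ satisfies the big O bound but diverges, while $c = 1/2$ converges linearly with $\mu = \tfrac12$.

    # Error recursion |x_{n+1} - L| = c * |x_n - L| with L = 0 and |x_0 - L| = 1.
    for c in (2.0, 0.5):
        err = 1.0
        for _ in range(10):
            err = c * err
        print(f"c = {c}: error after 10 steps = {err}")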
