Convergence speed for iterative methods


I am a bit confused about convergence rates of iterative methods. Why do we say that a method converges linearly if the error decreases exponentially with each step, i.e.

$$d(x^{i}, x^*) \leq \lambda^i d(x^1, x^*), \quad \lambda \in (0,1)$$

Best Answer

You could also call that geometric convergence. The "linear" refers to the fact that the reference error on the right-hand side appears only in the first power. With quadratic convergence you would get

$$ d(x^{i+1}, x^*) \leq \lambda\, d(x^i, x^*)^2 \implies d(x^i, x^*) \leq \lambda^{2^i - 1}\, d(x^0, x^*)^{2^i}, $$

which converges to zero if $\lambda\, d(x^0, x^*) < 1$. This extends to similar formulas for higher-order convergence.
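The difference is easy to see numerically. Here is an illustrative sketch (the specific iterations are my own choice, not from the question): the fixed-point iteration $x_{k+1} = \cos x_k$ converges linearly with $\lambda \approx |\sin x^*| \approx 0.674$, while Newton's method for $x^2 - 2 = 0$ converges quadratically, roughly doubling the number of correct digits per step.

```python
import math

# Linear (geometric) convergence: fixed-point iteration x_{k+1} = cos(x_k).
# Its fixed point x* ~= 0.739085 satisfies |cos'(x*)| = sin(x*) ~= 0.674 < 1.
x_star = 0.7390851332151607
x = 1.0
linear_errors = []
for _ in range(10):
    x = math.cos(x)
    linear_errors.append(abs(x - x_star))

# Quadratic convergence: Newton's method for f(x) = x^2 - 2, root sqrt(2).
root = math.sqrt(2)
y = 1.0
quad_errors = []
for _ in range(5):
    y = y - (y * y - 2) / (2 * y)
    quad_errors.append(abs(y - root))

# Linear: successive error ratios settle near a constant lambda < 1.
ratios = [e2 / e1 for e1, e2 in zip(linear_errors, linear_errors[1:])]
print(ratios)

# Quadratic: the error is roughly squared each step.
print(quad_errors)
```

The printed ratios for the cosine iteration hover around $0.674$, matching the linear-convergence picture; the Newton errors collapse far faster, consistent with the $\lambda^{2^i-1} d(x^0,x^*)^{2^i}$ bound.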


Be careful about where you start the sequence: the linear convergence formula should be

$$ d(x^i, x^*) \leq \lambda^{i-1}\, d(x^1, x^*), $$

as you can check by comparing both sides at the initial index $i = 1$.
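The indexing is easy to confirm on a toy contraction. In this sketch (my own example, not from the answer) the affine map $g(x) = \tfrac12 x + 1$ has fixed point $x^* = 2$ and contraction factor $\lambda = \tfrac12$, so the bound $d(x^i, x^*) \leq \lambda^{i-1} d(x^1, x^*)$ holds with equality for every $i \geq 1$:

```python
# Check the corrected indexing for g(x) = 0.5*x + 1, fixed point x* = 2,
# lambda = 0.5. Unrolling d(x^{i+1}, x*) <= lambda * d(x^i, x*) from i = 1
# gives d(x^i, x*) <= lambda**(i-1) * d(x^1, x*).
lam, x_star = 0.5, 2.0
x = 0.0                       # x^1, the first iterate of the sequence
d1 = abs(x - x_star)          # d(x^1, x*)
for i in range(1, 11):
    # at i = 1 both sides equal d(x^1, x*), as the answer notes
    assert abs(x - x_star) <= lam ** (i - 1) * d1 + 1e-12
    x = lam * x + 1           # x^{i+1} = g(x^i)
```

Note that if the exponent were $i$ instead of $i-1$, the bound would already fail at $i = 1$, since $\lambda\, d(x^1, x^*) < d(x^1, x^*)$.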