Convergence of a sequence with strict inequality condition

calculus, real-analysis, sequences-and-series

Let $(u_n)$ be the sequence defined by $u_0 > 0$ and $u_{n+1} = \frac{1}{2}(u_n + v_n)$, where $(v_n)$ is another sequence such that $v_n < u_n$ for all $n \in \mathbb{N}$. This sequence is decreasing and bounded below, so it admits a limit. The limit is $0$ in every example I've tried, but I am struggling to show that this holds for an arbitrary such $(v_n)$. The result does hold under the additional assumption that $\sup_{n \in \mathbb{N}} \frac{v_n}{u_n} < 1$. Does it hold in general?
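For completeness, here is the argument I used under the extra assumption (writing $c := \sup_n \frac{v_n}{u_n} < 1$ and assuming every $u_n > 0$), which gives geometric decay:

$$u_{n+1} = \frac{u_n + v_n}{2} \le \frac{u_n + c\,u_n}{2} = \frac{1+c}{2}\,u_n \quad\Longrightarrow\quad 0 < u_n \le \left(\frac{1+c}{2}\right)^{\!n} u_0 \xrightarrow[n\to\infty]{} 0,$$

since $\frac{1+c}{2} < 1$.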

Best Answer

The conjecture is certainly not true: take any decreasing sequence $\{u_n\}$ and simply define $v_n = 2u_{n+1} - u_n$. Then the recurrence $u_{n+1} = \frac{1}{2}(u_n + v_n)$ holds by construction, and $v_n < u_n$ is exactly the statement $u_{n+1} < u_n$, which holds because $\{u_n\}$ is decreasing. For example, if $u_n = 1 + \frac1n$ (for $n \ge 1$) and correspondingly $v_n = 1 - \frac{1}{n} + \frac{2}{n+1}$, then $\lim_{n\to\infty} u_n = 1 \neq 0$.
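As a quick numerical sanity check (a sketch, not part of the argument), one can verify that this concrete pair of sequences satisfies both the strict inequality and the recurrence, while $u_n$ approaches $1$:

```python
# Counterexample from the answer: u_n = 1 + 1/n (n >= 1) and
# v_n = 2*u_{n+1} - u_n, which simplifies to 1 - 1/n + 2/(n+1).
u = lambda n: 1 + 1 / n
v = lambda n: 2 * u(n + 1) - u(n)

for n in range(1, 1_000_000, 99_999):
    # Strict inequality v_n < u_n holds at every index checked.
    assert v(n) < u(n)
    # The recurrence u_{n+1} = (u_n + v_n)/2 holds (up to rounding).
    assert abs(u(n + 1) - (u(n) + v(n)) / 2) < 1e-12

print(u(1_000_000))  # close to 1, the limit -- not 0
```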

For that matter, one can always add a constant to $u_n$ and $v_n$ simultaneously while preserving the assumptions, so the limit $0$ is not special at all.
