[Math] Limit of a Recursive Sequence

algebra-precalculus, calculus, sequences-and-series

I'm having a really hard time finding the limit of a recursive sequence:

$$ \begin{align*}
&a(1)=2,\\
&a(2)=5,\\
&a(n+2)=\frac12 \cdot \big(a(n)+a(n+1)\big).
\end{align*}$$

I proved that the sequence is made up of a monotonically increasing subsequence and a monotonically decreasing subsequence, and I proved that the limit of the difference of these subsequences is zero, so by Cantor's Lemma the sequence converges. By computing terms I found that it converges to $4$, but I can't seem to find any way to prove it.

Any help would be much appreciated!
Thank you.

Best Answer

It never hurts to look at some data. Calculate the first few terms:

$$2,5,\frac72,\frac{17}4,\frac{31}8,\frac{65}{16},\frac{127}{32}$$
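If you want more data without risking arithmetic slips, here is a minimal Python sketch (it assumes nothing beyond the recurrence itself) that generates these terms exactly:

```python
from fractions import Fraction

# a(1) = 2, a(2) = 5, a(n+2) = (a(n) + a(n+1)) / 2,
# computed with exact rational arithmetic.
terms = [Fraction(2), Fraction(5)]
while len(terms) < 7:
    terms.append((terms[-2] + terms[-1]) / 2)

print(", ".join(map(str, terms)))
# 2, 5, 7/2, 17/4, 31/8, 65/16, 127/32
```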

We can even fit the first two numbers into the apparent pattern:

$$\frac1{1/2},\frac51,\frac72,\frac{17}4,\frac{31}8,\frac{65}{16},\frac{127}{32}$$

The denominator of $a(n)$ appears to be $2^{n-2}$, and the numerators are alternately one less and one more than a power of $2$. Specifically, it appears that the numerator of $a(n)$ is $2^n-1$ when $n$ is odd and $2^n+1$ when $n$ is even, or, more simply, $2^n+(-1)^n$. We conjecture, therefore, that

$$a(n)=\frac{2^n+(-1)^n}{2^{n-2}}\;.$$

Now prove this by induction on $n$; once the formula is established, calculating the limit of the sequence is trivial.
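In case it helps, here is one way the inductive step can go (the base cases $n=1,2$ were already checked when we fit the first two terms into the pattern). Assuming the formula holds at $n$ and $n+1$,

$$a(n+2)=\frac12\left(\frac{2^n+(-1)^n}{2^{n-2}}+\frac{2^{n+1}+(-1)^{n+1}}{2^{n-1}}\right)=\frac{2^{n+1}+2(-1)^n+2^{n+1}-(-1)^n}{2^n}=\frac{2^{n+2}+(-1)^{n+2}}{2^{(n+2)-2}}\;,$$

which is the conjectured formula at $n+2$. The limit then falls out of the rewriting

$$a(n)=\frac{2^n+(-1)^n}{2^{n-2}}=4+\frac{(-1)^n}{2^{n-2}}\longrightarrow 4\quad\text{as }n\to\infty\;,$$

since $(-1)^n/2^{n-2}\to 0$.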
