Limit of a (rather general) recursive sequence

calculus, linear-algebra, recursion, sequences-and-series

So I am struggling with the following problem:

Let $F$ be the vector (sub)space of recursive sequences satisfying $x_{i+2} = bx_{i+1} + ax_{i}$ over a field $\mathbb{K}$, with the vector space endomorphism $L: F \to F,\ (x_{i})_{i} \mapsto (x_{i+1})_i$. Show that if $a>0$ and $b \neq 0$, then $\lim_{i \to \infty} \frac{x_{i+1}}{x_{i}}$ exists and is equal to one of the eigenvalues of the endomorphism $L$.
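
For concreteness, here is a small Python sketch of the setup, assuming illustrative values $a = 2$, $b = 1$ and starting terms $x_0 = x_1 = 1$ (these numbers are only illustrative, not part of the problem):

    # A minimal sketch of the setup, assuming illustrative values a = 2, b = 1
    # and starting terms x0 = x1 = 1 (not taken from the problem).
    a, b = 2.0, 1.0

    def sequence(x0, x1, n):
        """First n terms of a sequence satisfying x_{i+2} = b*x_{i+1} + a*x_i."""
        xs = [x0, x1]
        while len(xs) < n:
            xs.append(b * xs[-1] + a * xs[-2])
        return xs

    def shift(xs):
        """The endomorphism L: (x_i)_i -> (x_{i+1})_i, i.e. drop the first term."""
        return xs[1:]

    x = sequence(1.0, 1.0, 8)
    print(x)         # [1.0, 1.0, 3.0, 5.0, 11.0, 21.0, 43.0, 85.0]
    print(shift(x))  # the same sequence advanced by one index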

I have shown that $F$ is indeed a vector subspace of $\mathbb{K}^\mathbb{N}$ and that $L$ is an endomorphism of this subspace. I have also proven that $\dim F = 2$ and that we have a "natural" basis of two sequences $f_0 = (1, 0, …)$ and $f_1 = (0, 1, …)$, whose remaining terms are determined by the recursion. The endomorphism $L$ can then be described by a matrix:
\begin{equation*} A=
\begin{pmatrix}
0 & 1 \\
a & b
\end{pmatrix}
\end{equation*}

in this natural basis. I have then found the two eigenvalues $e_1 = \frac{b + \sqrt{b^2 + 4a}}{2}$ and $e_2 = \frac{b - \sqrt{b^2 + 4a}}{2}$. The corresponding eigenvectors are then
\begin{equation*} v_{1,2} =
\begin{pmatrix}
2 \\
2e_{1,2}
\end{pmatrix}
\end{equation*}
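
A quick numerical sanity check (same illustrative values $a = 2$, $b = 1$ as above) that these closed-form eigenvalues and eigenvectors agree with an eigendecomposition of $A$:

    # Compare the closed-form eigenvalues/eigenvectors with numpy's,
    # assuming the illustrative values a = 2, b = 1.
    import numpy as np

    a, b = 2.0, 1.0
    A = np.array([[0.0, 1.0],
                  [a,   b  ]])

    e1 = (b + np.sqrt(b**2 + 4*a)) / 2
    e2 = (b - np.sqrt(b**2 + 4*a)) / 2
    print(sorted(np.linalg.eigvals(A)), (e2, e1))  # both give -1 and 2

    for e in (e1, e2):
        v = np.array([2.0, 2.0 * e])
        print(np.allclose(A @ v, e * v))           # True, True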

We also see that:
\begin{equation*}
\begin{pmatrix}
x_i \\
x_{i+1}
\end{pmatrix} = A^i
\begin{pmatrix}
x_0 \\
x_1
\end{pmatrix}
\end{equation*}
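
This relation is easy to verify numerically, again with the illustrative values above:

    # Check (x_i, x_{i+1})^T = A^i (x_0, x_1)^T against the recursion itself
    # (illustrative values a = 2, b = 1, x0 = x1 = 1).
    import numpy as np

    a, b, x0, x1 = 2.0, 1.0, 1.0, 1.0
    A = np.array([[0.0, 1.0], [a, b]])

    xs = [x0, x1]
    for _ in range(12):
        xs.append(b * xs[-1] + a * xs[-2])

    for i in range(12):
        assert np.allclose(np.array([xs[i], xs[i + 1]]),
                           np.linalg.matrix_power(A, i) @ np.array([x0, x1]))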

Since $A$ is diagonalizable, we can write:
\begin{equation*}
\begin{pmatrix}
x_i \\
x_{i+1}
\end{pmatrix} = S D^i S^{-1}
\begin{pmatrix}
x_0 \\
x_1
\end{pmatrix}
\end{equation*}

with
\begin{equation*}
S =
\begin{pmatrix}
2 & 2\\
2e_1 & 2e_2
\end{pmatrix}, \; D =
\begin{pmatrix}
e_1 & 0\\
0 & e_2
\end{pmatrix}, \; S^{-1} = \frac{-1}{4\sqrt{b^2 + 4a}}\begin{pmatrix}
2e_2 & 2e_1\\
2 & 2
\end{pmatrix}
\end{equation*}
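
The factorization $A^i = S D^i S^{-1}$ can likewise be checked numerically for the illustrative values:

    # Verify A^i = S D^i S^{-1} numerically (illustrative values a = 2, b = 1).
    import numpy as np

    a, b = 2.0, 1.0
    A = np.array([[0.0, 1.0], [a, b]])
    r = np.sqrt(b**2 + 4*a)
    e1, e2 = (b + r) / 2, (b - r) / 2

    S = np.array([[2.0, 2.0], [2*e1, 2*e2]])
    D = np.diag([e1, e2])
    S_inv = (-1 / (4*r)) * np.array([[2*e2, -2.0], [-2*e1, 2.0]])

    assert np.allclose(S @ S_inv, np.eye(2))
    for i in range(10):
        assert np.allclose(np.linalg.matrix_power(A, i),
                           S @ np.linalg.matrix_power(D, i) @ S_inv)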

We can then find an explicit formula for $x_i$:
\begin{equation*}
x_i = \frac{-1}{4\sqrt{b^2 + 4a}}\left((4e_2 e_1^{i} - 4e_1 e_2^{i})x_0 + (4e_2^{i} - 4e_1^{i})x_1\right)
\end{equation*}
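
As a check, this closed form reproduces the recursion for the illustrative values:

    # Compare the closed form for x_i with the recursion
    # (illustrative values a = 2, b = 1, x0 = x1 = 1).
    import numpy as np

    a, b, x0, x1 = 2.0, 1.0, 1.0, 1.0
    r = np.sqrt(b**2 + 4*a)
    e1, e2 = (b + r) / 2, (b - r) / 2

    def x_closed(i):
        return (-1 / (4*r)) * ((4*e2*e1**i - 4*e1*e2**i) * x0
                               + (4*e2**i - 4*e1**i) * x1)

    xs = [x0, x1]
    for _ in range(15):
        xs.append(b * xs[-1] + a * xs[-2])

    assert all(np.isclose(x_closed(i), xs[i]) for i in range(len(xs)))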

And so
\begin{equation*}
\lim_{i \to \infty} \frac{x_{i+1}}{x_i} = \lim_{i \to \infty} \frac{(4e_2 e_1^{i+1} - 4e_1 e_2^{i+1})x_0 + (4e_2^{i+1} - 4e_1^{i+1})x_1}{(4e_2 e_1^{i} - 4e_1 e_2^{i})x_0 + (4e_2^{i} - 4e_1^{i})x_1}
\end{equation*}

And that is where I am stuck: if I could prove that this limit exists, I could simply use the fact that \begin{equation*}
\theta_{i+1} = \frac{x_{i+1}}{x_{i}} = \frac{bx_i + ax_{i-1}}{x_i} = b + a \frac{x_{i-1}}{x_i} = b + \frac{a}{\theta_{i}}
\end{equation*}

and then $\theta = \lim_{i \to \infty}\frac{x_{i+1}}{x_{i}}$ satisfies
\begin{equation*}
\theta = b + \frac{a}{\theta}
\end{equation*}

and so
\begin{equation*}
\theta = \frac{b \pm \sqrt{b^2 + 4a}}{2}
\end{equation*}

which coincides with the eigenvalues of $L$. But how do I prove that this limit exists in the first place? I tried to evaluate the explicit formula I obtained, but I did not find a way to do it. Could you please help?
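
For what it is worth, iterating this relation numerically (illustrative values $a = 2$, $b = 1$, arbitrary nonzero start) does appear to converge to an eigenvalue:

    # Iterate theta_{i+1} = b + a/theta_i from an arbitrary nonzero start
    # (illustrative values a = 2, b = 1).
    a, b = 2.0, 1.0
    theta = 5.0
    for _ in range(25):
        theta = b + a / theta
    print(theta)  # ~2.0, i.e. (b + (b**2 + 4*a)**0.5) / 2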

Best Answer

The limit in question can be computed by dividing both numerator and denominator by $e_{0}^i$, where $e_{0}$ is the eigenvalue with the larger absolute value. Such an eigenvalue exists: since $a > 0$ the eigenvalues are real with $e_1 e_2 = -a < 0$, and since $b \neq 0$ their sum $e_1 + e_2 = b$ is nonzero, so $|e_1| \neq |e_2|$. After dividing, every term involving the other eigenvalue tends to $0$, so the quotient tends to $e_0$ whenever the coefficient of $e_0^i$ in $x_i$ is nonzero; if that coefficient vanishes, the quotient is constantly equal to the other eigenvalue. In either case the limit exists and equals an eigenvalue of $L$, hence the point is proven.
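
A sketch making this concrete (illustrative values $a = 2$, $b = 1$, $x_0 = x_1 = 1$, not from the original post): after dividing by $e_0^i$, the remaining expression tends to a constant, so the quotient $x_{i+1}/x_i$ tends to $e_0$.

    # Illustrate the argument: x_i / e0^i tends to a constant, hence x_{i+1}/x_i tends to e0.
    # Illustrative values a = 2, b = 1, x0 = x1 = 1 (not from the original post).
    import numpy as np

    a, b, x0, x1 = 2.0, 1.0, 1.0, 1.0
    r = np.sqrt(b**2 + 4*a)
    e0 = max((b + r) / 2, (b - r) / 2, key=abs)  # dominant eigenvalue

    xs = [x0, x1]
    for _ in range(30):
        xs.append(b * xs[-1] + a * xs[-2])

    for i in (5, 10, 20, 30):
        print(i, xs[i] / e0**i, xs[i] / xs[i - 1])  # 2nd column stabilizes, 3rd tends to e0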
