Homogeneous linear difference equation with multiple roots

finite-differences, numerical-methods, recurrence-relations

Given a homogeneous linear difference equation
$$\sum_{j=0}^k \alpha_j y_{n+j} = 0$$
I want to show that if $r$ is a root of the characteristic polynomial
$$\rho(\xi) = \sum_{j=0}^k \alpha_j \xi^j$$
with multiplicity $q$, i.e.
$$\rho(r) = \rho'(r) = \cdots = \rho^{(q-1)}(r)=0$$
then the sequence
$$y_n = n^s r^n$$
is a solution of the difference equation for $s=0,1,\dots,q-1$.

That is, for $s=0,1,\dots,q-1$,
$$\sum_{j=0}^k \alpha_j (n+j)^s r^{n+j} = 0.$$

This is very much analogous to the behavior of homogeneous differential equations with repeated roots of their characteristic polynomial, although I believe the proof is somewhat different.

The proof for the case where $r$ is a root of multiplicity $1$ is rather trivial:
$$\sum_{j=0}^k \alpha_j (n+j)^0 r^{n+j} = r^n \sum_{j=0}^k \alpha_j r^j = r^n \rho(r) = 0$$
and the case where $r$ is of multiplicity $2$ is not too difficult either. (This post shows one way to go about it.)
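As a sanity check before tackling the general case, the claim is easy to verify numerically. The snippet below is only a sketch: the polynomial $\rho(\xi)=(\xi-2)^3(\xi-1)$ (so $r=2$ with multiplicity $q=3$) is a made-up example, not anything from the post above.

```python
def poly_mul(a, b):
    """Multiply polynomials given as coefficient lists [c_0, c_1, ...]."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

# Build alpha_0, ..., alpha_k from the factored form (xi - 2)^3 * (xi - 1).
alpha = [1]
for root, mult in [(2, 3), (1, 1)]:
    for _ in range(mult):
        alpha = poly_mul(alpha, [-root, 1])   # multiply in a factor (xi - root)

r, q = 2, 3
for s in range(q):                            # s = 0, 1, ..., q - 1
    for n in range(1, 6):
        total = sum(a * (n + j)**s * r**(n + j) for j, a in enumerate(alpha))
        assert total == 0, (s, n, total)
print("all checks passed")
```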

I've tried using a binomial expansion on the $(n+j)^s$ term to see where that gets me, but I keep getting stuck. Any inspiration is welcome.

Best Answer

A blast from the past! First we might remark that this formula doesn't apply if the root $r=0$: in that case $\alpha_0=\cdots=\alpha_{q-1}=0$, so $y_s$, $0\le s\le q-1$, are unrestricted and we may set $z_n=y_{n+q}$ and solve the reduced problem $$\sum_{j=0}^{k-q}\alpha_{j+q}z_{n+j}=0$$ Thus we will consider $k>0$, $\alpha_k\ne0$, and $r\ne0$.

We need a lemma: if $r$ is a root of the characteristic polynomial of multiplicity $q>0$, i.e. $$\alpha_0+\sum_{j=1}^k\alpha_j\xi^j=(\xi-r)^qQ_{k-q}^{(0)}(\xi)$$ where $Q_{k-q}^{(0)}(\xi)$ is a polynomial of degree $k-q$ and $Q_{k-q}^{(0)}(r)\ne0$, then $r$ is a root of $$\alpha_0\delta_{s0}+\sum_{j=1}^k\alpha_jj^s\xi^j=(\xi-r)^{q-s}Q_{k-q+s}^{(s)}(\xi)$$ with multiplicity $q-s>0$, where $Q_{k-q+s}^{(s)}(\xi)$ is a polynomial of degree $k-q+s$ and $Q_{k-q+s}^{(s)}(r)\ne0$, for $0\le s\le q-1$.

This follows simply by mathematical induction: it's true by hypothesis for $s=0$, and if it's true for some $0\le s<q-1$, then $$\begin{align}\xi\frac d{d\xi}\left(\alpha_0\delta_{s0}+\sum_{j=1}^k\alpha_jj^s\xi^j\right)&=\xi\sum_{j=1}^k\alpha_jj^{s+1}\xi^{j-1}=\alpha_0\delta_{s+1,0}+\sum_{j=1}^k\alpha_jj^{s+1}\xi^j\\ &=(q-s)(\xi-r)^{q-s-1}\xi Q_{k-q+s}^{(s)}(\xi)+(\xi-r)^{q-s}\xi Q_{k-q+s}^{(s)\prime}(\xi)\\ &=(\xi-r)^{q-s-1}\left((q-s)\xi Q_{k-q+s}^{(s)}(\xi)+(\xi-r)\xi Q_{k-q+s}^{(s)\prime}(\xi)\right)\\ &=(\xi-r)^{q-s-1}Q_{k-q+s+1}^{(s+1)}(\xi)\end{align}$$ where $Q_{k-q+s+1}^{(s+1)}(r)=(q-s)r\,Q_{k-q+s}^{(s)}(r)\ne0$, so it's true for $s+1$, $1\le s+1\le q-1$.
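One way to see what the induction is doing: the polynomial $\alpha_0\delta_{s0}+\sum_{j=1}^k\alpha_jj^s\xi^j$ is exactly $\left(\xi\frac{d}{d\xi}\right)^s$ applied to the characteristic polynomial, and each induction step applies that operator once more. The sympy sketch below checks the lemma's multiplicity claim on an example polynomial of my own choosing (the example, not the lemma, is the assumption here):

```python
import sympy as sp

xi = sp.symbols('xi')
r, q = 2, 3
p = sp.expand((xi - r)**q * (xi - 1))        # example rho(xi); Q(xi) = xi - 1 with Q(r) != 0

for s in range(q):                           # s = 0, 1, ..., q - 1
    # r should be a root of the current polynomial of multiplicity exactly q - s
    assert sp.rem(p, (xi - r)**(q - s), xi) == 0
    assert sp.rem(p, (xi - r)**(q - s + 1), xi) != 0
    p = sp.expand(xi * sp.diff(p, xi))       # apply xi * d/dxi to move from s to s + 1
print("lemma verified on this example")
```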

Given the above lemma, we plug in the proposed solution $y_n=n^sr^n$, $0\le s\le q-1$: $$\begin{align}\sum_{j=0}^k\alpha_jy_{n+j}&=\sum_{j=0}^k\alpha_j(n+j)^sr^{n+j}=\alpha_0n^sr^n+\sum_{j=1}^k\alpha_j\sum_{t=0}^s{s\choose t}n^{s-t}j^tr^{n+j}\\ &=\sum_{t=0}^s{s\choose t}n^{s-t}r^n\alpha_0\delta_{t0}+\sum_{t=0}^s{s\choose t}n^{s-t}r^n\sum_{j=1}^k\alpha_jj^tr^j\\ &=\sum_{t=0}^s{s\choose t}n^{s-t}r^n\left[\alpha_0\delta_{t0}+\sum_{j=1}^k\alpha_jj^tr^j\right]=0\end{align}$$ The expression in the square brackets is $0$ because $t\le s\le q-1$, so by the lemma $r$ is a root (of multiplicity $q-t>0$) of the polynomial $\alpha_0\delta_{t0}+\sum_{j=1}^k\alpha_jj^t\xi^j$.
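If it helps, the same computation can be run through sympy with $n$ kept symbolic; again the characteristic polynomial below is only an illustrative choice, not part of the argument:

```python
import sympy as sp

xi, n = sp.symbols('xi n')
r, q = 2, 3
rho = sp.Poly(sp.expand((xi - r)**q * (xi - 1)), xi)    # example polynomial, root r of multiplicity q
alpha = rho.all_coeffs()[::-1]                          # alpha[j] = coefficient of xi**j

for s in range(q):                                      # s = 0, 1, ..., q - 1
    total = sum(alpha[j] * (n + j)**s * sp.Integer(r)**(n + j) for j in range(len(alpha)))
    assert sp.expand(total) == 0, (s, sp.expand(total)) # every power of n times r**n cancels
print("identity verified symbolically in n")
```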

EDIT: Fixed a problem pointed out in the comments by @quevivasbien and one problem with indeterminate forms. There are still such issues if $n=0$. We can fix those by defining $y_0=\delta_{s0}$, and then when $n=0$ we get $$\sum_{j=0}^k\alpha_jy_j=\alpha_0\delta_{s0}+\sum_{j=1}^k\alpha_jj^sr^j=0$$ because that's exactly the statement of the lemma evaluated at $\xi=r$.