[Math] Solving Recurrences Using Annihilators

discrete-mathematics · recurrence-relations

I've recently learnt how to use "annihilators" to find closed form solutions to recurrence relations. For instance, if I have the recurrence:
$$\begin{align}
f(0) &= 10 \\
f(n) &= 4 \, f(n-1), \; n \geq 1
\end{align}$$
I can use the operator $(E-4)$, which transforms $f(n)$ into the zero function for all $n \geq 0$. I then look up the operator's form in a standard table to get the generic solution, which in this case is $\alpha \, 4^n$. Plugging this into the base case gives $\alpha=10$, which yields the closed form $f(n) = 10 \times 4^n$.
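As a quick sanity check (a minimal sketch, with illustrative function names), the closed form can be compared against the recurrence directly:

```python
# Check that the closed form 10 * 4**n agrees with the recurrence
# f(0) = 10, f(n) = 4 * f(n-1). Function names are illustrative.

def f_recursive(n):
    return 10 if n == 0 else 4 * f_recursive(n - 1)

def f_closed(n):
    return 10 * 4**n

for n in range(10):
    assert f_recursive(n) == f_closed(n)
print("closed form agrees on n = 0..9")
```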

This is great, but I have no idea why it works. Wikipedia turns this up, which is seemingly impenetrable to the uninitiated in abstract algebra. I can see that you treat the operator as a polynomial, and that its roots are what get plugged into the generic solution. But why?

Best Answer

Hidden underneath the "annihilator" concept in linear recurrences is the notion of power series and generating functions.

If you have a sequence $\{a_n\}$ you can define the power series:

$$a(z) = \sum_{n=0}^\infty a_n z^n$$

The abstract algebra part is that we are dealing here with "formal power series" - we don't need them to converge for any $z$. Given two power series $a(z)$ and $b(z)$, we can add them, and we can multiply them to get a new power series.
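A minimal sketch of this idea: represent a truncated formal power series as a list of coefficients (`coeffs[n]` is $a_n$), and addition and multiplication are purely coefficient operations, with no convergence in sight. The helper names here are my own, not standard.

```python
# Truncated formal power series as coefficient lists, coeffs[n] = a_n.
# Both inputs are assumed to have the same length (same truncation order).

def ps_add(a, b):
    return [x + y for x, y in zip(a, b)]

def ps_mul(a, b):
    # Cauchy product: coefficient k of a*b is sum_{i=0}^{k} a_i * b_{k-i}
    n = len(a)
    return [sum(a[i] * b[k - i] for i in range(k + 1)) for k in range(n)]

# (1 + z) * (1 - z) = 1 - z^2, truncated to 4 coefficients
print(ps_mul([1, 1, 0, 0], [1, -1, 0, 0]))  # [1, 0, -1, 0]
```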

Now given your $a_n=f(n)$, we can see that:

$$(1-4z)a(z) = a_0 + \sum_{n=1}^\infty (a_n-4a_{n-1})z^n = a_0$$

We then define $E(a(z)) = \frac{a(z)-a(0)}{z}$. This takes a power series and shifts its coefficients one place to the left. Then we see that:

$$\frac{a(z)-a(0)}{z} - 4a(z) = 0$$

Thus we get $(E-4)(a(z)) = 0$.

Working backwards, we also see that $a(z)=\frac{a_0}{1-4z}$. But we know that we can expand $\frac{1}{1-4z} =\sum_{n} 4^nz^n$. So we get that $a_n = 4^na_0$.

That might look like I've played a little sleight of hand, since how can I show $\frac{1}{1-4z} = \sum_{n} 4^nz^n$ without worrying about convergence? That's the magic of formal power series - they form a commutative ring, and we can work out multiplicative inverses for (some) elements in this ring. Two power series are considered equal if they have the same coefficients, and there are no zero divisors, so it just magically falls out.
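The inverse really can be computed purely algebraically: any series with nonzero constant term is invertible, and requiring coefficient $k$ of the product to be $0$ (for $k \geq 1$) determines each $b_k$ in turn. A sketch (the function name is my own):

```python
# Invert a power series coefficient list a (a[0] != 0), to n_terms terms.
# Solve: coefficient k of a*b is 1 if k == 0 else 0.

def ps_inverse(a, n_terms):
    coef = lambda i: a[i] if i < len(a) else 0
    b = [1 / coef(0)]
    for k in range(1, n_terms):
        # sum_{i=0}^{k} a_i b_{k-i} = 0  =>  solve for b_k
        b.append(-sum(coef(i) * b[k - i] for i in range(1, k + 1)) / coef(0))
    return b

print(ps_inverse([1, -4], 6))  # [1.0, 4.0, 16.0, 64.0, 256.0, 1024.0]
```

The inverse of $1-4z$ comes out with coefficients $4^n$, exactly as claimed, with no limit ever taken.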

More generally, if $R(E)$ is some polynomial in $E$ and $f$ satisfies the condition $R(E)f = 0$, then when we define $f(z)=\sum f(n)z^n$ we get a rather complicated expression, but it ultimately boils down to some $q(z)f(z) = p(z)$ where $q(z)$ and $p(z)$ are always (finite) polynomials, not power series. So we get $f(z)=\dfrac{p(z)}{q(z)}$. We can expand that by representing $p(z)/q(z)$ in terms of partial fractions.
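As a concrete second-order illustration of this (the Fibonacci numbers, not from the question above): with $F(0)=0$, $F(1)=1$ and annihilator $(E^2-E-1)$, the same manipulation gives
$$(1-z-z^2)F(z) = z, \qquad F(z) = \frac{z}{1-z-z^2}.$$
Writing $\varphi = \frac{1+\sqrt 5}{2}$ and $\psi = \frac{1-\sqrt 5}{2}$ for the roots of the characteristic polynomial $x^2 = x+1$, we can factor $1-z-z^2 = (1-\varphi z)(1-\psi z)$, and partial fractions give
$$F(z) = \frac{1}{\sqrt 5}\left(\frac{1}{1-\varphi z} - \frac{1}{1-\psi z}\right).$$
Expanding each geometric series yields Binet's formula $F(n) = \frac{\varphi^n - \psi^n}{\sqrt 5}$. This is exactly why the roots of the annihilator polynomial are what appear in the generic solution.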

Note, if $a(z)=\sum a_nz^n$ then $$(E^k)(a(z)) =\frac{a(z)-a_0-a_1z-\dots - a_{k-1}z^{k-1}}{z^k}$$
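In coefficient-list terms (a trivial but clarifying sketch), $E^k$ just drops the first $k$ coefficients:

```python
# E^k on a truncated coefficient list: drop the first k coefficients,
# matching (a(z) - a_0 - ... - a_{k-1} z^{k-1}) / z^k.

def shift(coeffs, k):
    return coeffs[k:]

a = [10 * 4**n for n in range(6)]  # [10, 40, 160, 640, 2560, 10240]
print(shift(a, 2))                 # [160, 640, 2560, 10240]
```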
