Worst case asymptotic growth of conditional linear recurrence relation.

asymptotics, recurrence-relations

If we have a linear recurrence relation on a sequence $\{x_n\}$, then I know how to find the worst-case asymptotic growth. We consider the largest absolute value $\alpha$ of any root of the characteristic polynomial. Then, independent of the initial values, the asymptotic growth is $x_n=\mathcal{O}(\alpha^n)$. I realize that repeated roots sometimes contribute an extra polynomial factor, but let us ignore all polynomial factors for convenience.

For example, if $x_n=x_{n-1}+x_{n-2}$, then $x_n=\mathcal{O}(\phi^n)$ where $\phi$ is the golden ratio, because that is the largest absolute value of any root of the characteristic polynomial $t^2-t-1$.
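As a quick sanity check of this standard fact, here is a small numerical experiment (a sketch in Python; the initial values $x_0=x_1=1$ are just an arbitrary choice of mine):

```python
import numpy as np

# Characteristic polynomial of x_n = x_{n-1} + x_{n-2} is t^2 - t - 1.
phi = max(abs(r) for r in np.roots([1, -1, -1]))   # golden ratio, ~1.618

# Generate the sequence (here with x_0 = x_1 = 1) and check that
# x_n / phi^n stays bounded.
x = [1, 1]
for n in range(2, 50):
    x.append(x[-1] + x[-2])

print(max(x[n] / phi**n for n in range(50)))   # 1.0 -- bounded, so x_n = O(phi^n)
```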

My question is about the following. Say we have two linear recurrence relations on a sequence $\{x_n\}$ such that the first holds for some $n$, but the second holds for other $n$. For example, say we know that for all $n$ we have either $x_n=x_{n-1}+x_{n-2}$, or $x_n=2x_{n-2}+4x_{n-3}$, but we do not know which of the two holds for which values of $n$.

My intuition tells me that, to find the worst-case asymptotic growth, I just have to take the worst asymptotic growth among the recurrences in question. In the example I just gave, we would hence have $x_n=\mathcal{O}(2^n)$ by the second recurrence. However, I cannot come up with a rigorous argument for why this should be the case. So my question is whether my intuition is correct.
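For whatever it is worth, a brute-force experiment over all possible assignments of the two recurrences to indices (a sketch in Python; the positive initial values and the cutoff $n\le 18$ are arbitrary choices of mine) is at least consistent with the intuition for this particular example:

```python
from itertools import product

def worst_ratio(n_max=18, init=(1, 1, 1)):
    """Try every assignment of the two recurrences to the indices
    3..n_max and return the largest x_n / 2^n that ever occurs."""
    worst = 0.0
    for choices in product((0, 1), repeat=n_max - len(init) + 1):
        x = list(init)
        for c in choices:
            if c == 0:
                x.append(x[-1] + x[-2])          # x_n = x_{n-1} + x_{n-2}
            else:
                x.append(2 * x[-2] + 4 * x[-3])  # x_n = 2 x_{n-2} + 4 x_{n-3}
            worst = max(worst, x[-1] / 2 ** (len(x) - 1))
    return worst

print(worst_ratio())   # never exceeds 1 for these initial values
```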

For some context, I am studying branching algorithms for $NP$-hard problems. Oftentimes you want to have multiple cases for how you should branch, which is where the multiple recurrences come from. However, there is no clear way to predict which case will pop up, and hence which recurrence holds for which index. Note that you can get arbitrarily many different linear recurrences.

Best Answer

It's not correct. A counterexample: $$x_n = \frac43x_{n-1}-\frac43x_{n-2}\quad\text{or}\quad x_n = -\frac43x_{n-1}-\frac43x_{n-2}$$ $$x_n = 1,2,-4,-8,16,32,-64,\dots$$ The sequence grows as $2^n$. The roots of the recurrences are $\dfrac{\pm 2\pm 2\sqrt{2}i}{3}$, each of absolute value $\dfrac{2}{\sqrt{3}} < 2$.
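To see this concretely, here is a short exact-arithmetic check (a sketch in Python; the alternation pattern of the two recurrences, second at even $n$ and first at odd $n$, is read off from the sequence above):

```python
from fractions import Fraction

c = Fraction(4, 3)
x = [Fraction(1), Fraction(2)]                 # x_0 = 1, x_1 = 2
for n in range(2, 20):
    if n % 2 == 0:
        x.append(-c * x[-1] - c * x[-2])       # x_n = -4/3 x_{n-1} - 4/3 x_{n-2}
    else:
        x.append(c * x[-1] - c * x[-2])        # x_n =  4/3 x_{n-1} - 4/3 x_{n-2}

print([int(v) for v in x[:8]])                        # [1, 2, -4, -8, 16, 32, -64, -128]
print(all(abs(x[n]) == 2 ** n for n in range(20)))    # True: |x_n| = 2^n exactly
# Yet each recurrence's roots, (±2 ± 2*sqrt(2)*i)/3, have modulus
# 2/sqrt(3) ≈ 1.155, well below 2.
```

So once sign changes are allowed, alternating the two recurrences pumps the magnitude up to $2^n$, even though each recurrence on its own only allows growth of about $1.155^n$.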

Now, your application to complexity problems should come with some positivity requirements, which would rule out this example. It might be enough to salvage things, but I can't say without further study.

OK, I've come back to it. Assume the sequence $x_n$ is positive (for $n\ge 0$) and the recurrences $x_n^{(j)}=c_1^{(j)}x_{n-1}+c_2^{(j)}x_{n-2}+\cdots+c_{k}^{(j)}x_{n-k}$ have nonnegative coefficients $c_i^{(j)}$.

In this form, each of the characteristic equations $t^k-c_1^{(j)}t^{k-1}-c_2^{(j)}t^{k-2}-\cdots - c_k^{(j)}$ has exactly one positive real root $b_j$, and this $b_j$ has the largest absolute value of any root. Let $B$ be the maximum of the $b_j$. Now, normalize: let $y_n = B^{-n}x_n$, and transform the recurrences to fit the new sequence. The new recurrences are \begin{align*}y_n^{(j)} &= B^{-1}c_1^{(j)}y_{n-1}+B^{-2}c_2^{(j)}y_{n-2}+\cdots+B^{-k}c_k^{(j)}y_{n-k}\\ y_n^{(j)} &= d_1^{(j)}y_{n-1}+d_2^{(j)}y_{n-2}+\cdots+d_k^{(j)}y_{n-k}\end{align*} where $d_i^{(j)} = B^{-i}c_i^{(j)}$, and at each $n$ we have $y_n = y_n^{(j)}$ for some $j$. What's the point? Since $t^k-c_1^{(j)}t^{k-1}-c_2^{(j)}t^{k-2}-\cdots-c_k^{(j)} \ge 0$ for $t\ge b_j$, it is nonnegative at $t = B$ for every $j$. Dividing by $B^k$ gives $$1-d_1^{(j)}-d_2^{(j)}-\cdots-d_k^{(j)}\ge 0$$ for all $j$. Consequently, if each of $y_{n-1},y_{n-2},\dots,y_{n-k}$ is $\le P$ for some constant $P$, so is $y_n^{(j)}$. If we take enough initial conditions to cover all of the recurrences and let $P$ be the maximum of the corresponding initial values $y_0,\dots,y_{k-1}$, then, by induction, $y_n\le P$ for all $n$.

Multiply by $B^n$ to go back to the $x_n$, and that's it: $x_n\le PB^n$, where $P$ is some constant depending on initial conditions and $B$ is the largest root of the characteristic equations for the recurrences.
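For the pair of recurrences from the question, this bound can be checked numerically. The sketch below is my own illustration, not part of the argument above: since all coefficients are nonnegative, the sequence `u` that takes the larger of the two possible updates at every step dominates every admissible $x_n$ with the same initial values, so it suffices to check the bound for `u`.

```python
import numpy as np

# Dominant positive roots of t^2 - t - 1 and t^3 - 2t - 4
# (the characteristic polynomials of the two recurrences in the question).
b1 = max(abs(r) for r in np.roots([1, -1, -1]))     # ~1.618 (golden ratio)
b2 = max(abs(r) for r in np.roots([1, 0, -2, -4]))  # 2, since t^3 - 2t - 4 = (t - 2)(t^2 + 2t + 2)
B = max(b1, b2)                                      # B = 2

# u_n takes the larger of the two possible updates at every step, so by
# induction (all coefficients are nonnegative) it dominates every
# admissible sequence x_n with the same initial values.
u = [1.0, 1.0, 1.0]                                  # positive initial values x_0 = x_1 = x_2 = 1
P = max(u[i] / B**i for i in range(3))               # P = 1
for n in range(3, 60):
    u.append(max(u[-1] + u[-2], 2 * u[-2] + 4 * u[-3]))

print(all(u[n] <= P * B**n for n in range(60)))      # True: x_n <= P * B^n
```

Here $B=2$ comes from the second recurrence, which matches the $\mathcal{O}(2^n)$ guess in the question.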
