Algorithms – Finding Constant Runtime Algorithm for Recursive Summation

algorithms, asymptotics, recursive-algorithms, sequences-and-series

Suppose we have a sequence of integers $A_n$ where $A_0, \ldots, A_{k-1} < 50$, and each subsequent term is given by $A_i = A_{i-1}b_1 + A_{i-2}b_2 + \cdots + A_{i-k}b_k$. ($A_0$ through $A_{k-1}$ and all $b_i$, for $i = 1, \ldots, k$, are givens.) I am to devise an algorithm that computes $A_n \bmod 50$ and runs in constant time (treating $k$ as a constant), but I'm having a bit of trouble solving this.

I get that since each $A_i$ is determined by the $k$ numbers preceding it, and every $A_j$ (whether one of the original givens or a subsequent term) has $50$ possible values mod $50$, there must be $50^k$ different possible ways to arrive at $A_i$; but since all $A_j$ are reduced mod $50$, $A_i$ itself should only have $50$ possible values. I don't quite understand this discrepancy, and I'm further unsure how to use this fact (if at all) to find an algorithm that runs in constant time.

*Assume arithmetic operations on numbers of size $O(\log n)$ are done in constant time.

Any help would be greatly appreciated!

Best Answer

There is no truly constant-time algorithm, because at the very least we need to read $n$, which already takes $O(\log n)$ time.

There is an algorithm that uses $O(1)$ arithmetic operations on numbers of order $O(n)$ and $O(1)$ other operations. Let $\vec B_i = \langle A_{i - k}, A_{i - k + 1}, \ldots, A_{i - 1}\rangle$, taken mod $50$. Note that we can find $A_i$ in constant time if we know $\vec B_i$.

There are only $50^k$ possible values for $\vec B_i$, and if $\vec B_i = \vec B_j$ then $\vec B_{i + x} = \vec B_{j + x}$ for any $x \geq 0$. So for some $p, q < 50^k$ we have $\vec B_p = \vec B_{p + q}$, and thus $\vec B_i = \vec B_{i + c\cdot q}$ for any $i \geq p$, $c > 0$. We can find such $p$ and $q$ in constant time by brute force. In other words, $A_i$ is eventually periodic, with both the period and the pre-period having length at most $50^k$, and we can find these lengths in constant time.
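A minimal Python sketch of this brute-force search (the function name `find_cycle` and the parameter layout are my own; the initial terms and coefficients in the example are illustrative assumptions, not from the question):

```python
# Sketch: find the pre-period p and period q of the state vectors
# B_i = (A_{i-k}, ..., A_{i-1}) mod 50 by brute force, as described above.
MOD = 50

def find_cycle(initial, coeffs):
    """initial = (A_0, ..., A_{k-1}), coeffs = (b_1, ..., b_k).
    Returns (p, q, states) where states[i] = (A_i, ..., A_{i+k-1}) mod 50
    and the state sequence first repeats with states[p] == states[p + q].
    """
    state = tuple(a % MOD for a in initial)
    seen = {}                 # state -> index of its first occurrence
    states = []
    while state not in seen:
        seen[state] = len(states)
        states.append(state)
        # A_{i+k} = b_1*A_{i+k-1} + b_2*A_{i+k-2} + ... + b_k*A_i  (mod 50)
        nxt = sum(b * a for b, a in zip(coeffs, reversed(state))) % MOD
        state = state[1:] + (nxt,)
    p = seen[state]           # pre-period length
    q = len(states) - p       # period length
    return p, q, states

# Example: the Fibonacci recurrence (k = 2, b_1 = b_2 = 1) mod 50
# is purely periodic: p = 0 and q = 300 (the Pisano period of 50).
```

The dictionary lookup makes the search terminate after at most $50^k$ steps, since a state must repeat by then.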

Now, for $n \geq p$ we can write $n = t + c\cdot q$ where $p \leq t < p + q$ and $c \geq 0$, and thus get $\vec B_n = \vec B_t$ (if $n < p$, we can simply compute $\vec B_n$ directly). We can find $\vec B_t$ in constant time, and from it find $A_n$ in constant time.

The only non-constant-time step is finding $t = p + ((n - p) \bmod q)$: to do it we need to read $n$ and perform some arithmetic on it, which takes $O(\log n)$ time. If we say that arithmetic on input-sized numbers is constant time, then this algorithm runs in $O(1)$.
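Putting the pieces together, a self-contained Python sketch (again, `a_mod_50` and the Fibonacci example parameters are illustrative choices of mine, not from the question):

```python
MOD = 50

def a_mod_50(n, initial, coeffs):
    """Compute A_n mod 50 for A_i = b_1*A_{i-1} + ... + b_k*A_{i-k},
    given initial = (A_0, ..., A_{k-1}) and coeffs = (b_1, ..., b_k).
    All work except the arithmetic on n itself is bounded by 50^k.
    """
    # Brute-force the pre-period p and period q of the state vectors.
    state = tuple(a % MOD for a in initial)
    seen, states = {}, []
    while state not in seen:
        seen[state] = len(states)
        states.append(state)
        nxt = sum(b * a for b, a in zip(coeffs, reversed(state))) % MOD
        state = state[1:] + (nxt,)
    p, q = seen[state], len(states) - seen[state]
    # Reduce n into the stored range; this is the only step that
    # needs arithmetic on n itself (O(log n) bit operations).
    t = n if n < p + q else p + (n - p) % q
    return states[t][0]       # states[t] = (A_t, ..., A_{t+k-1})

# Example with the Fibonacci recurrence: a_mod_50(n, (0, 1), (1, 1))
# gives F_n mod 50 for any n, however large.
```

Everything before the reduction of $n$ depends only on $k$ and the givens, so it can even be precomputed once and reused across queries.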

The discrepancy you found arises because $A_i = A_j$ doesn't imply $A_{i + 1} = A_{j + 1}$: to find the next element we need to know the previous $k$ elements, not just one, and those $k$-tuples can take more than $50$ different values. For instance, with $k = 2$, $b_1 = b_2 = 1$, $A_0 = 0$, $A_1 = 1$ (the Fibonacci sequence), we have $A_1 = A_2 = 1$ but $A_2 = 1 \neq A_3 = 2$.
