[Math] Divide and Conquer algorithm for polynomial multiplication

algorithms, computational-complexity, computer-science, interpolation, recurrence-relations

For our Algorithm and Complexity course we have to design a divide-and-conquer algorithm for finding a polynomial $P(x)$ of degree $n$ such that $P(x_i)=0$ for $i \in \{1,\dots,n\}$.

The polynomial I'm looking for is obviously $\prod_{i=1}^{n} (x-x_i)$. My current idea is an algorithm that takes the set of points $\{x_i\}$ as input and computes $(x-x_1)(x-x_2)$ directly if the input is just $2$ points; otherwise it splits the set into two halves and recurses on each half.
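In code, that recursive idea looks something like the following minimal Python sketch; the coefficient-list representation (lowest degree first) and the naive product are my own assumptions:

```python
def poly_mul(a, b):
    # Schoolbook product of two coefficient lists [c_0, c_1, ...].
    c = [0] * (len(a) + len(b) - 1)
    for j, aj in enumerate(a):
        for k, bk in enumerate(b):
            c[j + k] += aj * bk
    return c

def build_poly(roots):
    """Coefficients of prod_i (x - x_i), built by splitting the roots in half."""
    if len(roots) == 1:
        return [-roots[0], 1]                    # x - x_i
    mid = len(roots) // 2
    return poly_mul(build_poly(roots[:mid]), build_poly(roots[mid:]))
```

For example, `build_poly([1, 2])` gives `[2, -3, 1]`, i.e. $(x-1)(x-2) = x^2 - 3x + 2$.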

However, the complexity of this algorithm still seems very high (something like $O(2^n)$, I guess?), so there must be a quicker way. Any help with finding a faster algorithm would be great!

Thank you!

Best Answer

Let $A(x) = \sum_{j=0}^n a_jx^j$ and $B(x) = \sum_{j=0}^n b_jx^j$, and set $C(x) := A(x)B(x)$. The naive way to compute the product $A(x)B(x)$ is by convolution: $C(x) = \sum_{j=0}^{2n} c_jx^j$ where $$ c_j = \sum_{k=0}^j a_kb_{j-k},\quad 0\leqslant j\leqslant 2n. $$ This takes $\Theta(n^2)$ time, since we perform $j+1$ multiplications and $j$ additions to compute each $c_j$, and hence $\sum_{j=0}^{2n} (j+1) = (1+n)(1+2n)$ multiplications and $\sum_{j=0}^{2n} j = n(2n+1)$ additions in total.
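As a concrete illustration, the convolution above can be written as a short Python function; representing a polynomial as its coefficient list $[a_0, \dots, a_n]$ is my assumption:

```python
def poly_mul_naive(a, b):
    # Naive Theta(n^2) convolution: C = A * B has degree 2n.
    c = [0] * (len(a) + len(b) - 1)
    for j, aj in enumerate(a):
        for k, bk in enumerate(b):
            c[j + k] += aj * bk       # contributes a_j * b_k to c_{j+k}
    return c
```

For example, `poly_mul_naive([1, 1], [1, 1])` returns `[1, 2, 1]`, i.e. $(1+x)^2 = 1 + 2x + x^2$.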

The divide and conquer approach is to split $A$ and $B$ each into polynomials of degree $\sim\frac n2$, i.e.

$$ A_0(x) = \sum_{j=0}^{\left\lfloor\frac n2\right\rfloor-1}a_jx^j,\quad A_1(x) = \sum_{j=\left\lfloor\frac n2\right\rfloor}^{n}a_jx^{j-\left\lfloor\frac n2\right\rfloor}. $$ Then $A(x) = A_0(x) + A_1(x)x^{\left\lfloor\frac n2\right\rfloor}$. Define $B_0(x)$ and $B_1(x)$ similarly; then \begin{align} A(x)B(x) &= \left(A_0(x) + A_1(x)x^{\left\lfloor\frac n2\right\rfloor}\right)\left(B_0(x) + B_1(x)x^{\left\lfloor\frac n2\right\rfloor}\right)\\ &=A_0(x)B_0(x) + \left(A_0(x) B_1(x) + A_1(x)B_0(x)\right)x^{\left\lfloor\frac n2\right\rfloor} + A_1(x)B_1(x)x^{2\left\lfloor\frac n2\right\rfloor}. \end{align} This divides a problem of size $n$ into $4$ subproblems of size $\left\lfloor\frac n2\right\rfloor$, plus $\Theta(n)$ work for the additions and shifts. This yields the recurrence $T(n) = 4T\left(\left\lfloor\frac n2\right\rfloor\right) + Cn$, where $C$ is a constant. Assuming $T(1)=1$ and $n=2^m$, we have \begin{align} T(n) &= T\left(2^m\right)\\ &= 4^m + \sum_{j=0}^{m-1}2^jCn\\ &= 4^m + Cn(2^m-1)\\ &= n^2 +Cn(n-1), \end{align} so that $T(n)\in\Theta(n^2)$: no improvement!
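A quick numeric sanity check of that closed form (a throwaway Python sketch of my own, assuming $T(1)=1$ and $n$ a power of two):

```python
def T(n, C):
    # Unroll T(n) = 4 T(n/2) + C n with T(1) = 1.
    return 1 if n == 1 else 4 * T(n // 2, C) + C * n

# For n = 2^m this should agree with the closed form n^2 + C*n*(n - 1).
```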

The trick is that it actually only takes three recursive multiplications in each subproblem. To find $A_0B_0$, $A_0B_1$, $A_1B_0$, and $A_1B_1$, we only need to compute $$ A_0B_0,\quad A_0B_1 + A_1B_0,\quad A_1B_1, $$ since $$A_0B_1 + A_1B_0 = (A_0 + A_1)(B_0 + B_1) - A_0B_0 - A_1B_1. $$ Then we have the recurrence $T(n) = 3T\left(\left\lfloor\frac n2\right\rfloor\right) + Cn$, so that \begin{align} T(n) &= T(2^m)\\ &= 3^m + \sum_{j=0}^{m-1}\left(\frac32\right)^jCn\\ &= 3^m + 2Cn\left(\left(\frac32\right)^m-1\right)\\ &= 3^{\lg n} + 2Cn\left(\left(\frac32\right)^{\lg n}-1\right)\\ &= n^{\lg 3} + 2Cn\left(n^{\lg 3 -1}-1\right)\\ &= n^{\lg 3}(1 + 2C) - 2Cn, \end{align} and hence $T(n)\in\Theta\left(n^{\lg 3}\right)$.
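A minimal Python sketch of this three-multiplication (Karatsuba-style) scheme; the coefficient-list representation, helper names, and padding strategy are my own:

```python
def poly_add(a, b):
    n = max(len(a), len(b))
    return [(a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)
            for i in range(n)]

def poly_sub(a, b):
    n = max(len(a), len(b))
    return [(a[i] if i < len(a) else 0) - (b[i] if i < len(b) else 0)
            for i in range(n)]

def karatsuba(a, b):
    # Pad both coefficient lists to a common length n.
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))
    b = b + [0] * (n - len(b))
    if n == 1:
        return [a[0] * b[0]]
    m = n // 2
    a0, a1 = a[:m], a[m:]        # A = A0 + A1 * x^m
    b0, b1 = b[:m], b[m:]        # B = B0 + B1 * x^m
    p0 = karatsuba(a0, b0)                               # A0*B0
    p2 = karatsuba(a1, b1)                               # A1*B1
    p1 = karatsuba(poly_add(a0, a1), poly_add(b0, b1))   # (A0+A1)(B0+B1)
    mid = poly_sub(poly_sub(p1, p0), p2)                 # A0*B1 + A1*B0
    c = [0] * (2 * n - 1)
    for i, x in enumerate(p0):
        c[i] += x
    for i, x in enumerate(mid):
        c[i + m] += x
    for i, x in enumerate(p2):
        c[i + 2 * m] += x
    return c
```

For example, `karatsuba([1, 1], [1, 1])` returns `[1, 2, 1]`, i.e. $(1+x)^2 = 1 + 2x + x^2$ (the padding can leave trailing zero coefficients for odd lengths).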
