Polynomial Multiplication
Polynomial multiplication is a binary operation that takes two polynomials as inputs and produces their product, another polynomial.
$$
P(x)=a_0+a_1x+a_2x^2+…+a_{n-1}x^{n-1}
$$
Now, given
$$
A(x)=\sum_{i=0}^{n-1}a_ix^i
$$
$$
B(x)=\sum_{i=0}^{m-1}b_ix^i
$$
$$
C(x)=A(x)B(x)
$$
The coefficient sequence of $C$ is called the convolution of the coefficient sequences of $A$ and $B$: $c_k=\sum_{i+j=k}a_ib_j$.
Problem: How can we multiply two polynomials in $O(n\log n)$ time?
To solve the problem, we apply the Fast Fourier Transform (FFT) algorithm rather than the traditional method of multiplying each term of one polynomial by each term of the other, which costs $O(nm)$ time. It then suffices to answer: how do we multiply two polynomials using the FFT?
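For contrast, the traditional schoolbook method can be sketched as follows (a minimal Python sketch; the function name `multiply_naive` is illustrative):

```python
def multiply_naive(a, b):
    """Schoolbook polynomial multiplication in O(n*m) time.

    a, b: coefficient lists, where a[i] is the coefficient of x**i.
    Returns the coefficient list of the product polynomial.
    """
    c = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] += ai * bj  # x^i * x^j contributes to the x^(i+j) term
    return c

# (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2
print(multiply_naive([1, 2], [3, 4]))  # [3, 10, 8]
```

Each output coefficient is exactly the convolution sum $c_k=\sum_{i+j=k}a_ib_j$; the two nested loops are what the FFT will let us avoid.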
DFT
There are two forms of representation for a polynomial: the coefficient form and the point-value form. The point-value form lists the values of the polynomial at a set of points (a polynomial of degree $n-1$ is uniquely determined by its values at $n$ distinct points). The Discrete Fourier Transform (DFT) is the transformation that converts a polynomial from the coefficient form to the point-value form, with the points taken to be the roots of unity.
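A direct implementation of this transformation, evaluating the polynomial at all $n$ roots of unity in $O(n^2)$ time, might look like the following sketch (the name `dft_naive` is illustrative):

```python
import cmath

def dft_naive(coeffs):
    """Evaluate the polynomial at the n-th roots of unity in O(n^2) time.

    Returns [P(w^0), P(w^1), ..., P(w^(n-1))] where w = e^(2*pi*i/n).
    """
    n = len(coeffs)
    return [sum(c * cmath.exp(2j * cmath.pi * i * k / n)
                for i, c in enumerate(coeffs))
            for k in range(n)]

# P(x) = 1 + x at the square roots of unity 1 and -1: P(1)=2, P(-1)=0
print(dft_naive([1, 1]))
```

The FFT computes exactly the same output, only faster.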
IDFT
The inverse DFT (IDFT) converts a polynomial from the point-value form back to the coefficient form.
FFT
Since $C(x)=A(x)B(x)$ has degree $(n-1)+(m-1)=n+m-2$, we have to evaluate the polynomials at at least $n+m-1$ points. Evaluating directly at each of $n$ points costs $O(n^2)$ time in total. There is a better way to do this in $O(n\log n)$ time:
Root of unity
Let $\omega_n^k$ denote the $k^{th}$ root of unity of order $n$. Namely, $\omega_n^k=\cos \frac{2\pi k}{n}+i\sin \frac{2\pi k}{n}=e^{\frac{2\pi ik}{n}}$
It can be easily proved that
- $\omega_n^k$= $\omega_{2n}^{2k}$
- $\omega_n^k=-\omega_n^{k+\frac{n}{2}}$ (assuming $n$ is even)
- $\omega_n^k\cdot\omega_n^l=\omega_n^{k+l}$
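These identities can be checked numerically; a quick sketch using Python's standard `cmath` module (the helper name `omega` is illustrative):

```python
import cmath

def omega(n, k):
    """The k-th n-th root of unity, e^(2*pi*i*k/n)."""
    return cmath.exp(2j * cmath.pi * k / n)

n, k, l = 8, 3, 5
# w_n^k = w_{2n}^{2k}
assert abs(omega(n, k) - omega(2 * n, 2 * k)) < 1e-12
# w_n^k = -w_n^{k + n/2}  (n even)
assert abs(omega(n, k) + omega(n, k + n // 2)) < 1e-12
# w_n^k * w_n^l = w_n^{k+l}
assert abs(omega(n, k) * omega(n, l) - omega(n, k + l)) < 1e-12
print("all identities hold")
```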
We just need to evaluate the polynomial at $\omega_n^0,\omega_n^1,…,\omega_n^{n-1}$.
Assume $n$ is even; if $n$ is odd, we can pad the polynomial with a zero coefficient. (In practice $n$ is padded up to a power of two, so that every recursive subproblem also has even size.)
For $0\le k<n/2$, we have
$$
\begin{aligned}
P(\omega_n^k)&=a_0+a_1\omega_n^k+a_2(\omega_n^k)^2+…+a_{n-1}(\omega_n^k)^{n-1} \\
&=(a_0+a_2(\omega_n^k)^2+a_4(\omega_n^k)^4+…+a_{n-2}(\omega_n^k)^{n-2})+\omega_n^k(a_1+a_3(\omega_n^k)^2+…+a_{n-1}(\omega_n^k)^{n-2}) \\
&= (a_0+a_2\omega_{\frac{n}{2}}^k+a_4(\omega_{\frac{n}{2}}^k)^2+…+a_{n-2}(\omega_{\frac{n}{2}}^k)^{\frac{n}{2}-1})+\omega_n^k(a_1+a_3\omega_{\frac{n}{2}}^k+…+a_{n-1}(\omega_{\frac{n}{2}}^k)^{\frac{n}{2}-1}) \\
&= R(\omega_{\frac{n}{2}}^k)+\omega_n^kS(\omega_{\frac{n}{2}}^k)
\end{aligned}
$$
And for $P(\omega_n^{k+\frac{n}{2}})$ we have
$$
\begin{aligned}
P(\omega_n^{k+\frac{n}{2}})=P(-\omega_n^{k})=R(\omega_{\frac{n}{2}}^k)-\omega_n^kS(\omega_{\frac{n}{2}}^k)
\end{aligned}
$$
Here $R(y)=a_0+a_2y+…+a_{n-2}y^{\frac{n}{2}-1}$ collects the even-indexed coefficients and $S(y)=a_1+a_3y+…+a_{n-1}y^{\frac{n}{2}-1}$ the odd-indexed ones. The problem is thus divided into two subproblems of size $\frac{n}{2}$, which we can solve recursively.
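The recursion above can be written down directly; a minimal recursive sketch in Python, assuming the input length is a power of two (the function name `fft` is illustrative):

```python
import cmath

def fft(coeffs):
    """Recursive FFT: coefficient form -> point-value form at the n-th
    roots of unity. Assumes n = len(coeffs) is a power of two.
    """
    n = len(coeffs)
    if n == 1:
        return coeffs[:]
    r = fft(coeffs[0::2])  # R: even-indexed coefficients
    s = fft(coeffs[1::2])  # S: odd-indexed coefficients
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(2j * cmath.pi * k / n)  # w_n^k
        out[k] = r[k] + w * s[k]              # P(w_n^k)
        out[k + n // 2] = r[k] - w * s[k]     # P(w_n^{k + n/2})
    return out

# P(x) = 1 + x at the 4th roots of unity 1, i, -1, -i
print(fft([1, 1, 0, 0]))
```

Splitting into even- and odd-indexed coefficients produces exactly the $R$ and $S$ polynomials of the derivation, and each loop iteration computes the two paired values at once.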
Why does the FFT save time? At every level of the recursion, once we have $R(\omega_{\frac{n}{2}}^k)$ and $S(\omega_{\frac{n}{2}}^k)$, we obtain both $P(\omega_n^k)$ and $P(\omega_n^{k+\frac{n}{2}})$ at the same time. Even though computing the value at a single point would not need the full lower-level results, sharing them across all $n$ points is what makes the algorithm fast.
Time complexity
Computing the DFTs of the two halves $R$ and $S$ costs $2T(\frac{n}{2})$ time.
Combining them, for $k$ from $0$ to $\frac{n}{2}-1$, to obtain all $P(\omega_n^k)$ and $P(\omega_n^{k+\frac{n}{2}})$ costs $O(n)$ time. So
$$
T(n)=2T(\frac{n}{2})+O(n)
$$
Solving this recurrence (e.g. by the master theorem), the total time complexity is $O(n\log n)$.
Now we have the point-value form of the polynomial. Let $P(\omega_n^k)=v_k$.
Let $Q(x)=v_0+v_1x+v_2x^2+…+v_{n-1}x^{n-1}$. Then
$$
\begin{aligned}
Q(\omega_n^{-k})&=\sum_{i=0}^{n-1}v_i(\omega_n^{-k})^i \\
&=\sum_{i=0}^{n-1}v_i\omega_n^{-ik} \\
&=\sum_{i=0}^{n-1}\sum_{j=0}^{n-1}a_j\omega_n^{ij}\omega_n^{-ik} \\
&=\sum_{j=0}^{n-1}a_j\sum_{i=0}^{n-1}\omega_n^{i(j-k)} \\
&=\sum_{j=0}^{n-1}a_j\sum_{i=0}^{n-1}(\omega_n^{j-k})^{i} \\
&=\sum_{j=0}^{n-1}na_j\delta_{j,k} \\
&=na_k
\end{aligned}
$$
The second-to-last step uses the geometric-series identity: $\sum_{i=0}^{n-1}(\omega_n^{j-k})^i$ equals $n$ when $j=k$, and $0$ otherwise, since for $j\ne k$ the base $\omega_n^{j-k}\ne 1$ is an $n$-th root of unity, so the sum is $\frac{(\omega_n^{j-k})^n-1}{\omega_n^{j-k}-1}=0$. Hence $a_k=\frac{1}{n}Q(\omega_n^{-k})$.
How to calculate $Q(\omega_n^{-k})$
Note that $\omega_n^{-k}=\omega_n^{n-k}$, so evaluating $Q$ at $\omega_n^{-0},\omega_n^{-1},…,\omega_n^{-(n-1)}$ is just another FFT with $\omega_n$ replaced by $\omega_n^{-1}$. Running the FFT on the point values and dividing each result by $n$ recovers the coefficient form of the polynomial.
Time complexity
The same as DFT, $O(n\log n)$.
Now we can multiply two polynomials in $O(n\log n)$ time: convert both to point-value form with the FFT, multiply the values pointwise in $O(n)$ time, and convert the products back to coefficient form with the inverse FFT.
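The whole pipeline can be sketched in pure Python (the names `fft` and `multiply` are illustrative; the inverse transform reuses the same recursion with $\omega_n$ replaced by $\omega_n^{-1}$, followed by a division by $n$):

```python
import cmath

def fft(a, invert=False):
    """Recursive FFT; with invert=True it uses w_n^{-1} instead of w_n,
    giving the unscaled inverse transform. Assumes len(a) is a power of two.
    """
    n = len(a)
    if n == 1:
        return list(a)
    sign = -1 if invert else 1
    even = fft(a[0::2], invert)   # R: even-indexed coefficients
    odd = fft(a[1::2], invert)    # S: odd-indexed coefficients
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n)
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def multiply(a, b):
    """Multiply polynomials given as integer coefficient lists in O(n log n)."""
    m = len(a) + len(b) - 1          # number of product coefficients
    n = 1
    while n < m:
        n *= 2                       # pad length up to a power of two
    fa = fft(a + [0] * (n - len(a))) # point-value form of A
    fb = fft(b + [0] * (n - len(b))) # point-value form of B
    fc = [x * y for x, y in zip(fa, fb)]   # pointwise product: values of C
    c = fft(fc, invert=True)         # unscaled inverse FFT
    return [round((x / n).real) for x in c[:m]]  # divide by n, round real part

# (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2
print(multiply([1, 2], [3, 4]))  # [3, 10, 8]
```

Rounding at the end assumes integer coefficients; for floating-point inputs the raw real parts would be returned instead.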