The first few Legendre polynomials are:
$\displaystyle P_0(x) = 1$
$\displaystyle P_1(x) = x$
$\displaystyle P_2(x) = \frac{1}{2}(3x^2 - 1)$
$\displaystyle P_3(x) = \frac{1}{2}(5x^3 - 3x)$
$\displaystyle P_4(x) = \frac{1}{8}(35x^4 -30 x^2 + 3)$
$\displaystyle P_5(x) = \frac{1}{8} (63 x^5-70 x^3+15 x)$
$\displaystyle P_6(x) = \frac{1}{16}(231 x^6-315 x^4+105 x^2-5)$
$\displaystyle P_7(x) = \frac{1}{16}(429 x^7-693 x^5+315 x^3-35 x)$
$\ldots$
We want to expand $f(x)$ as an infinite series of Legendre polynomials $P_l(x)$, given:
$$
f(x)=\left\{\begin{matrix} +1
& 0<x<1\\
-1 & -1<x<0
\end{matrix}\right.
$$
To use the Legendre Series, we put:
$$\tag 1 f(x) = \sum_{i=0}^\infty c_iP_i(x)$$
To solve this, we solve a series of integrals given by:
To find the coefficients, multiply both sides of $(1)$ by $P_j(x)$ and integrate over $[-1,1]$:
$$\int_{-1}^1 f(x)P_j(x)~dx = \sum_{i=0}^\infty c_i \int_{-1}^1 P_i(x)P_j(x) ~dx$$
Because the Legendre polynomials are orthogonal, $\int_{-1}^1 P_i(x)P_j(x)~dx = 0$ for $i \ne j$, so every integral on the right vanishes except the one we care about:
$$\int_{-1}^1 f(x)P_j(x)~dx = c_j \int_{-1}^1 (P_j(x))^2 ~dx = \frac{2}{2j+1}\,c_j$$
So, let's crank out those coefficients using this approach.
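The orthogonality claim is easy to sanity-check numerically. A minimal Python sketch (standard library only; the function names are mine), evaluating $P_n$ by Bonnet's recursion and integrating with a midpoint rule:

```python
import math

def P(n, x):
    """Evaluate P_n(x) via Bonnet's recursion:
    (k+1) P_{k+1}(x) = (2k+1) x P_k(x) - k P_{k-1}(x)."""
    p_prev, p_cur = 1.0, x
    for k in range(1, n):
        p_prev, p_cur = p_cur, ((2*k + 1)*x*p_cur - k*p_prev)/(k + 1)
    return p_prev if n == 0 else p_cur

def inner(m, n, steps=20000):
    """Midpoint-rule approximation of the inner product
    \\int_{-1}^{1} P_m(x) P_n(x) dx."""
    h = 2.0/steps
    return h*sum(P(m, -1.0 + (i + 0.5)*h)*P(n, -1.0 + (i + 0.5)*h)
                 for i in range(steps))

print(inner(2, 5))   # ~0 (orthogonal)
print(inner(3, 3))   # ~2/7, i.e. 2/(2*3+1)
```

The diagonal values come out as $\frac{2}{2n+1}$, matching the normalization used below.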
$i = 0$
$$\int_{-1}^1 f(x)P_0(x)~dx = c_0 \int_{-1}^1 (P_0(x))^2 ~dx$$
Note that because $f$ is piecewise, we split the integral at $x=0$ (additivity of the integral over subintervals). Note also that $f$ is odd: when $P_i$ is even, the product $f(x)P_i(x)$ is odd and the left-hand integral vanishes, while when $P_i$ is odd, the product is even and we can double the integral over $[0,1]$. Here $P_0(x)=1$ is even, so:
$$\int_{-1}^1 f(x)(1)~dx = \int_{-1}^0 (-1)(1)~dx + \int_{0}^1 (1)(1)~dx = -1 + 1 = 0 = c_0 \int_{-1}^1 (1)^2~dx$$
This yields $0 = c_0 \cdot 2 \rightarrow c_0 = 0$
$i = 1$
$$\int_{-1}^1 f(x)P_1(x)~dx = c_1 \int_{-1}^1 (P_1(x))^2 ~dx$$
Here $P_1(x)=x$ is odd, so the integrand on the left is even and we double:
$$2\int_{0}^1 (1)(x)~dx = c_1 \int_{-1}^1 x^2~dx$$
This yields $1 = c_1 \cdot \frac{2}{3} \rightarrow c_1 = \frac{3}{2}$
$i = 2$
$$\int_{-1}^1 f(x)P_2(x)~dx = c_2 \int_{-1}^1 (P_2(x))^2 ~dx$$
$P_2$ is even, so the integrand on the left is odd and the integral vanishes, while $\int_{-1}^1 (P_2(x))^2~dx = \frac{2}{5}$:
$$0 = c_2 \cdot \frac{2}{5} \rightarrow c_2 = 0$$
(The same parity argument makes every even-index coefficient zero.)
If we continue this process, we find:
- $c_0 = 0$
- $\displaystyle c_1 = \frac{3}{2}$
- $c_2 = 0$
- $\displaystyle c_3 = -\frac{7}{8}$
- $c_4 = 0$
- $\displaystyle c_5 = \frac{11}{16}$
- $c_6 = 0$
- $\displaystyle c_7 = -\frac{75}{128}$
- $\ldots$
Thus,
$\displaystyle f(x)=\left\{\begin{matrix} +1 & 0 <x<1\\-1 & -1<x<0\end{matrix}\right. = c_1P_1(x) + c_3P_3(x)+ c_5P_5(x) + c_7P_7(x) + \ldots$
$$\therefore ~ f(x) = \frac{3}{2}P_1(x) - \frac{7}{8}P_3(x) + \frac{11}{16}P_5(x) - \frac{75}{128}P_7(x) + \ldots$$
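The coefficients above can be verified mechanically. A minimal Python sketch (standard library only; helper names are mine), building the $P_n$ with Bonnet's recursion in exact rational arithmetic and using $c_n=\frac{2n+1}{2}\int_{-1}^1 f(x)P_n(x)\,dx$, which follows from $\int_{-1}^1 P_n^2\,dx=\frac{2}{2n+1}$:

```python
from fractions import Fraction

def legendre(n_max):
    """Coefficient lists (ascending powers of x) for P_0..P_{n_max},
    built with Bonnet's recursion (k+1)P_{k+1} = (2k+1) x P_k - k P_{k-1}."""
    P = [[Fraction(1)], [Fraction(0), Fraction(1)]]
    for k in range(1, n_max):
        nxt = [Fraction(0)] + [Fraction(2*k + 1)*c for c in P[k]]  # (2k+1) x P_k
        for i, c in enumerate(P[k - 1]):
            nxt[i] -= Fraction(k)*c                                # - k P_{k-1}
        P.append([c/(k + 1) for c in nxt])
    return P

def coefficient(n, P):
    """c_n = (2n+1)/2 * \\int_{-1}^{1} f P_n dx for the step function f.
    Since f is odd, \\int_{-1}^{1} f P_n dx = (1 - (-1)^n) \\int_0^1 P_n dx."""
    int_0_1 = sum(c/Fraction(k + 1) for k, c in enumerate(P[n]))   # int_0^1 x^k = 1/(k+1)
    return Fraction(2*n + 1, 2) * (1 - (-1)**n) * int_0_1

P = legendre(7)
c = [coefficient(n, P) for n in range(8)]
print([str(x) for x in c])   # ['0', '3/2', '0', '-7/8', '0', '11/16', '0', '-75/128']
```

Because the arithmetic is exact, this reproduces the fractions listed above, with every even-index coefficient exactly zero.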
The coefficients can also be computed analytically. For the related expansion of $\sin\pi x$, we have $\sin\pi x=\sum_{n=1}^\infty (4n-1)a_n P_{2n-1}(x)$ with $a_n=\int_0^1 P_{2n-1}(x)\sin\pi x\,dx$.
Using $(2n+1)P_n(x)=\big[P_{n+1}(x)-P_{n-1}(x)\big]'$ and integration by parts, we get
$$\left.\begin{aligned}
C_n&:=\int_{-1}^1 P_n(x)\cos\pi x\,dx
\\S_n&:=\int_{-1}^1 P_n(x)\sin\pi x\,dx
\end{aligned}\right\}
\implies
\left\{\begin{aligned}
(2n+1)C_n&=\pi(S_{n+1}-S_{n-1})
\\(2n+1)S_n&=\pi(C_{n-1}-C_{n+1})
\end{aligned}\right.$$
which gives a computation of $a_n=S_{2n-1}/2$, with $C_0=C_1=S_0=0$ and $S_1=2/\pi$.
Alternatively, we have the following explicit expression:
$$a_n=\sum_{k=0}^{n-1}\frac{(-1)^k}{(2k)!}\frac{(2n+2k-1)!}{(2n-2k-1)!}\frac{1}{2^{2k}\pi^{2k+1}}.$$
A (long) way to get it: write $a_n=\frac12\int_{-1}^1 P_{2n-1}(x)\sin\pi x\,dx$ and use Rodrigues' formula for $P_{2n-1}(x)$, then integrate by parts $2n-1$ times, then use Poisson's integral for Bessel functions to get $$a_n=\frac{(-1)^{n-1}}{\sqrt2}J_{2n-\frac12}(\pi)=(-1)^{n-1}j_{2n-1}(\pi),$$ and finally use the explicit expression for the spherical Bessel function.
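Both routes are easy to check against each other. A Python sketch (standard library only; function names are mine) that runs the coupled recursion and compares it with the explicit sum:

```python
import math

def a_recursive(N):
    """a_n = S_{2n-1}/2 for n = 1..N, using the coupled recursions
    (2n+1)C_n = pi (S_{n+1} - S_{n-1}) and (2n+1)S_n = pi (C_{n-1} - C_{n+1}),
    solved forward and seeded with C_0 = C_1 = S_0 = 0, S_1 = 2/pi."""
    C, S = [0.0, 0.0], [0.0, 2.0/math.pi]
    for n in range(1, 2*N):
        S.append(S[n - 1] + (2*n + 1)*C[n]/math.pi)   # first relation solved for S_{n+1}
        C.append(C[n - 1] - (2*n + 1)*S[n]/math.pi)   # second relation solved for C_{n+1}
    return [S[2*n - 1]/2 for n in range(1, N + 1)]

def a_closed(n):
    """The explicit alternating sum for a_n."""
    f = math.factorial
    return sum((-1)**k / f(2*k) * f(2*n + 2*k - 1) / f(2*n - 2*k - 1)
               / (4**k * math.pi**(2*k + 1)) for k in range(n))

print(a_recursive(3))
print([a_closed(n) for n in range(1, 4)])
```

One caveat on the design: these integrals decay like spherical Bessel functions, so the forward recursion is mildly unstable for large $n$; for the first handful of coefficients both routes agree to near machine precision.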
Best Answer
If the terms of an infinite series $\,s_1 + s_2 + \dots + s_n + \dots\,$ are all equal to zero after $\,s_n\,$, then the series is said to terminate; its sum is the finite sum $\,s_1 + s_2 + \dots + s_n\,$, to which the series trivially converges.
In the common case of a power series $\,a_0 + a_1 x + a_2 x^2 + \dots + a_n x^n + \dots\,$ the same applies: a terminating power series is $\,a_0 + a_1 x + a_2 x^2 + \dots + a_nx^n\,$, i.e. a polynomial, which has infinite radius of convergence.
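A standard concrete instance ties back to the Legendre polynomials above. Seeking a power-series solution $y=\sum_k a_k x^k$ of Legendre's equation
$$(1-x^2)y'' - 2xy' + l(l+1)y = 0$$
gives the recurrence
$$a_{k+2} = \frac{k(k+1) - l(l+1)}{(k+1)(k+2)}\,a_k.$$
When $l$ is a non-negative integer, the numerator vanishes at $k=l$, so $a_{l+2}=a_{l+4}=\dots=0$: one of the two series solutions terminates, and the resulting polynomial, suitably normalized, is $P_l(x)$.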
You can look at MSE question 2573694 "Validity of terminating series solution of differential equation" for a similar situation.
Note that I think this terminology is not a good one, but it is commonly used, likely for lack of a better one.