Basis of the Dual Space of Polynomial Spaces

dual-spaces, linear-algebra, linear-transformations

I have the following problem:

Let $V=\mathcal{P}_n(\mathbb{R})$ be the vector space of polynomials of degree $\leq n$. Define $\alpha_k : V\to\mathbb{R}$ by

$$
\alpha_k(p)=\int_{-1}^{1}t^kp(t)dt,\qquad p\in V.
$$

Show that $\{\alpha_0,\dots,\alpha_n\}$ is a basis for the dual space $V^*$ of $V$.

Here is my proof attempt:


We just need to show linear independence: since $\operatorname{dim}(V^*)=\operatorname{dim}(V)=n+1$, a linearly independent set of $n+1$ vectors in $V^*$ is automatically a basis.

We proceed by induction on $n$. The base case is trivial since when $n=0$ we have a set of only one vector and is thus linearly independent.

For the inductive step, suppose the result holds for $n-1$, i.e., the set $\{\alpha_0,\dots,\alpha_{n-1}\}$ is linearly independent: whenever $c_0,c_1,\dots,c_{n-1}$ satisfy

$$
\sum_{k=0}^{n-1}c_k\alpha_k(p)=0\qquad\forall p\in\mathcal{P}_{n-1}(\mathbb{R}),
$$

we have $c_0=c_1=\cdots=c_{n-1}=0$. Let $c_0,\dots,c_n$ satisfy

$$
\sum_{k=0}^{n}c_k\alpha_k(p)=0\qquad\forall p\in\mathcal{P}_n(\mathbb{R}).
$$

Suppose $n$ is even. The case where $n$ is odd is similar, and is omitted.

Let $p(t)=t$. Then we have
\begin{equation*}
\begin{split}
0=\sum_{k=0}^nc_k\alpha_k(t) &= \sum_{k=0}^{n-1}c_k\int_{-1}^{1}t^{k+1}dt+c_n\int_{-1}^{1}t^{n+1}dt \\
&= \sum_{k=0}^{n-1}c_k\int_{-1}^{1}t^{k+1}dt \\
&\implies c_0=c_2=\cdots=c_{n-2}=0.\\
\end{split}
\end{equation*}

Now let $p(t)=1$. Then
\begin{equation*}
\begin{split}
0 &= c_1\int_{-1}^{1}tdt+c_3\int_{-1}^{1}t^3dt+\cdots +c_{n-1}\int_{-1}^{1}t^{n-1}dt+c_n\int_{-1}^{1}t^ndt \\
&= c_n\int_{-1}^{1}t^ndt \\
&\implies c_n=0.
\end{split}
\end{equation*}

Finally, with $p(t)=t^3$, we have

\begin{equation*}
\begin{split}
c_1\int_{-1}^{1}t^4dt+c_3\int_{-1}^{1}t^6dt+\cdots +c_{n-1}\int_{-1}^{1}t^{n+2}dt &= 0 \\
\implies c_1=c_3=\cdots=c_{n-1} &= 0. \\
\end{split}
\end{equation*}

Hence the set $\{\alpha_0,\dots,\alpha_n\}$ is linearly independent. $\Box$
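Though not part of the proof, the claim can be sanity-checked for small $n$: in the monomial basis $1,t,\dots,t^n$, each $\alpha_k$ is represented by the row vector $\big(\alpha_k(t^j)\big)_{j=0}^{n}$, and the $\alpha_k$ are linearly independent exactly when this moment matrix is nonsingular. A minimal sketch in Python, using exact arithmetic via `fractions` (the helper names `moment_matrix` and `det` are my own):

```python
from fractions import Fraction

def moment_matrix(n):
    """Matrix M with M[k][j] = alpha_k(t^j) = integral of t^(k+j) over [-1, 1]."""
    def moment(m):
        # integral of t^m over [-1, 1]: 0 for odd m, 2/(m+1) for even m
        return Fraction(2, m + 1) if m % 2 == 0 else Fraction(0)
    return [[moment(k + j) for j in range(n + 1)] for k in range(n + 1)]

def det(mat):
    """Exact determinant via fraction-based Gaussian elimination."""
    a = [row[:] for row in mat]
    size = len(a)
    d = Fraction(1)
    for i in range(size):
        pivot = next((r for r in range(i, size) if a[r][i] != 0), None)
        if pivot is None:
            return Fraction(0)
        if pivot != i:
            a[i], a[pivot] = a[pivot], a[i]
            d = -d  # row swap flips the sign
        d *= a[i][i]
        for r in range(i + 1, size):
            f = a[r][i] / a[i][i]
            for c in range(i, size):
                a[r][c] -= f * a[i][c]
    return d

for n in range(6):
    print(n, det(moment_matrix(n)) != 0)  # nonsingular for each tested n
```

Nonsingularity of this matrix for every $n$ is exactly what the problem asserts; the script only confirms it for the listed values.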


Does this work? Is there anywhere I left out too much detail or anything that needs to be modified? Any help is appreciated.

Best Answer

Another proof of linear independence is this:

Let $\lambda_0,\dots,\lambda_n$ be such that $\sum_{i=0}^n \lambda_i\alpha_i=0$. Then, for all $p \in V$, $$ \int_{-1}^1q(t)p(t)dt=0 $$

where $q(t)=\sum_{i=0}^n\lambda_it^i$. As $q\in V$, we may take $p=q$ and deduce $$ \int_{-1}^1q^2(t)dt=0. $$

Since $q^2$ is continuous and nonnegative, this forces $q(t)=0$ for all $t\in [-1,1]$. If $q$ were not the zero polynomial, it would have at most $n$ roots. But every $t \in [-1,1]$ is a root of $q$, so $q$ must be the zero polynomial. Hence $\lambda_i=0$ for all $i\in \{0,\dots,n\}$.
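The key step here, $\int_{-1}^{1} q^2\,dt = 0 \implies q = 0$, amounts to the integral being a positive-definite quadratic form in the coefficients $\lambda_i$. A small Python sketch checks this exhaustively over small integer coefficient vectors for degree $\leq 2$ (the function name is my own):

```python
from fractions import Fraction
from itertools import product

def integral_q_squared(coeffs):
    """Exact integral of q(t)^2 over [-1, 1] for q(t) = sum_i coeffs[i] * t^i."""
    total = Fraction(0)
    for i, ci in enumerate(coeffs):
        for j, cj in enumerate(coeffs):
            if (i + j) % 2 == 0:  # odd powers of t integrate to 0 over [-1, 1]
                total += Fraction(2 * ci * cj, i + j + 1)
    return total

# The form vanishes only at the zero coefficient vector.
for c in product(range(-3, 4), repeat=3):
    val = integral_q_squared(c)
    assert (val == 0) if not any(c) else (val > 0)
print("ok")
```

For example, $q(t)=1-t^2$ gives $\int_{-1}^{1}(1-t^2)^2dt = 16/15 > 0$, consistent with the assertion above.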
