[Math] Extend a linearly independent set of polynomials to form a basis

linear-algebra, polynomials, vector-spaces

I have the following set of polynomials:
$$Z=\{1+x+x^2,1-x+2x^2\}.$$

I have to show that the set is linearly independent (which I've already done), and now I have to extend the set to form a basis of $P_2[\mathbf{R}]$.

This is where I'm stuck. I know that $\dim(P_2[\mathbf{R}])=3$, so I only need to find one more polynomial, but how do I do this?

I tried:

$(a_1+a_2)+(a_1-a_2)x+(a_1+2a_2)x^2 \in \operatorname{span}(Z)$

because I'm trying to solve for an arbitrary polynomial $r_1+r_2x+r_3x^2 \in P_2[\mathbf{R}]$:
$$a_1 + a_2 = r_1$$
$$a_1 - a_2 = r_2$$
$$a_1 + 2a_2 = r_3$$

From which I got the matrix:
$$\begin{bmatrix} 1 & 1 & -1 & 0 & 0 \\ 1 & -1 & 0 & -1 & 0 \\ 1 & 2 & 0 & 0 & -1 \end{bmatrix}$$

I then reduced this matrix, found a basis for its column space, and read off the missing polynomial from the coefficients of my last result. So I now have
$$\beta=\{1+x+x^2,\;1-x+2x^2,\;-1\}$$

as a basis of $P_2[\mathbf{R}]$.
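For reference, the reduced row echelon form of the matrix above (assuming I carried out the Gauss–Jordan elimination correctly) is
$$\begin{bmatrix} 1 & 0 & 0 & -2/3 & -1/3 \\ 0 & 1 & 0 & 1/3 & -1/3 \\ 0 & 0 & 1 & -1/3 & -2/3 \end{bmatrix},$$
so the pivots sit in the first three columns. The corresponding columns of the original matrix, $(1,1,1)^T$, $(1,-1,2)^T$ and $(-1,0,0)^T$, form a basis of its column space, and the third of these, read as coefficients of $1, x, x^2$, is the constant polynomial $-1$.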

The thing is, I'm not sure whether my result is correct, and if it is, why it worked out that way…

Also, does anyone have an easier suggestion to attack this problem?

P.S.: Sorry if I put in too much information; this is my first question here.

Thanks!

Best Answer

Your result seems correct to me. Assume that there exist constants $A$, $B$ and $C$ such that $$A(1+x+x^2)+B(1-x+2x^2)+C(-1)=0$$ for every $x$, which gives $$(A+2B)x^2+(A-B)x+(A+B-C)=0,$$ hence $$A+2B=0,\hspace{1cm} A-B=0, \hspace{1cm} A+B-C=0.$$ The first two equations clearly give $A=B=0$, so that $C=A+B=0$ as well. Hence the three polynomials are linearly independent, and since $\dim(P_2[\mathbf{R}])=3$, they therefore form a basis.

For another approach: we're looking for a polynomial $P(x)=a+bx+cx^2$ which is linearly independent from $1+x+x^2$ and $1-x+2x^2$. In essence this is no different from finding a vector $(a,b,c)\in\mathbf{R}^3$ which is linearly independent from $(1,1,1)$ and $(1,-1,2)$, the coordinate vectors of the two given polynomials with respect to the basis $1, x, x^2$. In turn this means that the determinant of the matrix whose rows are these three vectors is non-zero.
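For instance, your candidate $-1$ has coordinate vector $(a,b,c)=(-1,0,0)$, and expanding along the last row gives
$$\det\begin{pmatrix} 1 & 1 & 1 \\ 1 & -1 & 2 \\ -1 & 0 & 0 \end{pmatrix} = -1\cdot\det\begin{pmatrix} 1 & 1 \\ -1 & 2 \end{pmatrix} = -1\cdot 3 = -3 \neq 0,$$
which confirms that $\{1+x+x^2,\;1-x+2x^2,\;-1\}$ is a basis. Any choice making the determinant non-zero works just as well; for example $x^2$, i.e. $(a,b,c)=(0,0,1)$, gives determinant $-2$.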
