[Math] Finding an approximate function using orthonormal basis

approximation, linear-algebra, orthonormal, vector-spaces

I'm trying to take a function $f(x)$ in the space $C_0[0,1]$ and find its best approximation $p(x)$ in the space $P_2[0,1]$. Note that $P_2[0,1]$ is a subspace of $C_0[0,1]$ and that $f(x)$ is not an element of $P_2[0,1]$.

The first thing I did was find an orthonormal basis of $P_2[0,1]$ using the Gram-Schmidt process. I know that taking the inner product of the function with the orthonormal basis vectors will help me find the best approximation when the value of the inner product is at a minimum, but I don't quite understand this part. How do I find $p(x)$ in the form $ax^2+bx+c$?

Would anyone like to explain the process to me?

Best Answer

You have a hidden inner product in your phrase "best approximation". Presumably your best approximation is the one that minimizes $\int_0^1(f(x)-p(x))^2w(x)\,dx$, where $w(x)$ is a weight function; this is the norm induced by the inner product $\langle f,g\rangle=\int_0^1 f(x)g(x)w(x)\,dx$. Each choice of $w(x)$ gives a different family of orthogonal polynomials. If you choose $w(x)=1$ you get the Legendre polynomials; on the interval $[-1,1]$ the first three are $1,\ x,\ \frac 12(3x^2-1)$.

Once you have an orthonormal basis $e_0,e_1,e_2$ of $P_2[0,1]$, the best approximation is the orthogonal projection of $f$ onto that subspace:
$$p(x)=\langle f,e_0\rangle e_0(x)+\langle f,e_1\rangle e_1(x)+\langle f,e_2\rangle e_2(x).$$
Note that you are not minimizing the inner products: the inner products $\langle f,e_k\rangle$ are exactly the coefficients of $p$ in the orthonormal basis, and expanding the sum in powers of $x$ gives the form $ax^2+bx+c$.
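Here is a minimal numerical sketch of the whole procedure, under my own assumptions (weight $w(x)=1$, and a sample target $f(x)=e^x$, which is not from the question): Gram-Schmidt on the monomials $\{1, x, x^2\}$ to get an orthonormal basis of $P_2[0,1]$, then projection via the three inner products.

```python
import math

def integrate(h, n=2000):
    # Composite Simpson's rule on [0, 1]; n must be even.
    s = h(0.0) + h(1.0)
    for i in range(1, n):
        s += (4.0 if i % 2 else 2.0) * h(i / n)
    return s / (3.0 * n)

def inner(f, g):
    # Inner product <f, g> = ∫_0^1 f(x) g(x) dx  (weight w(x) = 1).
    return integrate(lambda x: f(x) * g(x))

# Gram-Schmidt on the monomials {1, x, x^2} -> orthonormal basis of P_2[0,1].
basis = []
for v in (lambda x: 1.0, lambda x: x, lambda x: x * x):
    cs = [inner(v, e) for e in basis]   # components along the earlier e_k
    es = tuple(basis)                   # freeze the current basis for the closure
    w = lambda x, v=v, cs=cs, es=es: v(x) - sum(c * e(x) for c, e in zip(cs, es))
    norm = math.sqrt(inner(w, w))
    basis.append(lambda x, w=w, norm=norm: w(x) / norm)

# Project a sample f (f(x) = e^x, my own choice) onto P_2[0,1]:
# p = <f,e0> e0 + <f,e1> e1 + <f,e2> e2.
f = math.exp
coeffs = [inner(f, e) for e in basis]

def p(x):
    return sum(c * e(x) for c, e in zip(coeffs, basis))

for x in (0.0, 0.5, 1.0):
    print(f"x={x:.1f}  f(x)={f(x):.4f}  p(x)={p(x):.4f}")
```

With $w(x)=1$ on $[0,1]$, the basis this produces is the orthonormalized *shifted* Legendre polynomials $1,\ \sqrt3(2x-1),\ \sqrt5(6x^2-6x+1)$, and expanding $p$ in powers of $x$ gives the requested $ax^2+bx+c$.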