[Math] approximating a function by orthogonal projection

Tags: approximation, estimation, linear-algebra, MATLAB, vector-spaces

I am trying to understand how the coefficients of the 5th-degree polynomial have been calculated in https://www.math.wustl.edu/~freiwald/309projapp.pdf for approximating the function sin(t) on [-pi, pi] using orthogonal projection. Basically, I don't get how e0, …, e5 (these are coefficients, right?) are calculated.

I am new to the topic of orthogonal projection for function approximation, so I would really appreciate a step-by-step explanation.

Some of the main things I want to know:

1) How do I find each coefficient of the 5th-degree polynomial in $t$?

2) How can I calculate them, or use MATLAB to calculate them? The author mentions that MATLAB was used, and I want to do the same, but no code is provided.

Best Answer

You want to find coefficients $\alpha_0,\alpha_1,\alpha_2,\alpha_3,\alpha_4,\alpha_5$ such that $$ \int_{-\pi}^{\pi}\{\sin(t)-\alpha_0-\alpha_1 t -\alpha_2 t^2 -\alpha_3 t^3-\alpha_4 t^4-\alpha_5 t^{5}\}\, t^{n}\,dt=0, \;\;\; 0 \le n \le 5. $$ In other words, the residual $\sin(t)-p(t)$ must be orthogonal to each of $1,t,\dots,t^5$ in the inner product $\langle f,g\rangle=\int_{-\pi}^{\pi}f(t)g(t)\,dt$. This is a $6\times 6$ linear system of equations, and it has a unique solution because the monomials $1,t,\dots,t^5$ are linearly independent. Alternatively, the Legendre polynomials are orthogonal on $[-1,1]$ with respect to this kind of inner product, so you could rescale the independent variable (set $t=\pi x$) in order to use them. Up to normalization, they are given by Rodrigues' formula $$ P_n(x)=C_n\frac{d^{n}}{dx^n}(x^2-1)^n. $$ https://en.wikipedia.org/wiki/Legendre_polynomials
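The handout does not include its MATLAB code, but the computation above is easy to reproduce. Here is a dependency-free Python sketch of the same idea (the helper names `simpson` and `solve` are my own, not from the handout): build the $6\times 6$ Gram system $\sum_k \alpha_k\langle t^k,t^n\rangle=\langle\sin t,t^n\rangle$ by numerical quadrature and solve it.

```python
import math

def simpson(f, a, b, n=2000):
    """Composite Simpson quadrature of f on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

def solve(A, b):
    """Solve the linear system A x = b by Gaussian elimination with pivoting."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

a, b = -math.pi, math.pi
deg = 5
# Orthogonality conditions: sum_k alpha_k <t^k, t^n> = <sin t, t^n>, n = 0..5.
G = [[simpson(lambda t, m=m, n=n: t ** (m + n), a, b) for m in range(deg + 1)]
     for n in range(deg + 1)]
rhs = [simpson(lambda t, n=n: math.sin(t) * t**n, a, b) for n in range(deg + 1)]
alpha = solve(G, rhs)
print(alpha)  # alpha[0], ..., alpha[5]
```

The even-index coefficients come out numerically zero because $\sin$ is odd; the odd ones should come out near $0.987862$, $-0.155271$, and $0.00564312$, the values usually quoted for this example (it also appears in Axler's *Linear Algebra Done Right*). In MATLAB, once `G` and `rhs` are assembled the same way, the system is solved with the backslash operator, `alpha = G \ rhs`.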

Definition [Projection] Let $M$ be a subspace of an inner product space $X$, and let $x \in X$. The orthogonal projection of $x$ onto $M$ is the unique $m\in M$ such that $(x-m)\perp M$. The closest-point projection of $x$ onto $M$ is the unique $m \in M$ such that $\|x-m\|=\inf_{m'\in M}\|x-m'\|$.

Theorem [Projection]: Let $M$ be a subspace of an inner product space $X$, and let $x\in X$. Then there exists a closest-point projection of $x$ onto $M$ iff there exists an orthogonal projection of $x$ onto $M$, and these two projections are the same if they exist.

Suppose $X$ is an inner product space and $M$ is a finite-dimensional subspace of $X$. If $\{ e_1,\cdots,e_n \}$ is an orthonormal basis of $M$, then the orthogonal projection of $x\in X$ onto $M$ is $$ P_M x = \sum_{k=1}^{n}\langle x,e_k\rangle e_k, $$ which is the unique vector $\sum_{k=1}^{n}\alpha_ke_k\in M$ such that $$ \Big\langle x-\sum_{k=1}^{n}\alpha_k e_k,\; e_l\Big\rangle = 0,\;\; l=1,2,\cdots,n. $$

The Gram-Schmidt process builds an orthonormal sequence by subtracting from each new vector its orthogonal projection onto the span of the previous ones, and then normalizing the remainder to get a unit vector orthogonal to that span. In other words, starting from a linearly independent sequence $\{ x_1,x_2,x_3,\cdots\}$, an orthonormal sequence $\{ e_1,e_2,e_3,\cdots\}$ is constructed by using the orthogonal projections $P_{M_{n-1}}$ onto the subspaces $M_{n-1}$ spanned by $\{x_1,x_2,\cdots,x_{n-1}\}$. The general element of Gram-Schmidt is $$ e_n = \frac{1}{\|x_n-P_{M_{n-1}}x_n\|}(x_n - P_{M_{n-1}}x_n). $$
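To make these formulas concrete, here is a Python sketch (my own illustration, not code from the handout) that runs Gram-Schmidt on the monomials $\{1,t,\dots,t^5\}$ under $\langle f,g\rangle=\int_{-\pi}^{\pi}f(t)g(t)\,dt$ and then assembles $P_M\sin=\sum_k\langle\sin,e_k\rangle e_k$. Polynomials are stored as coefficient lists, and the monomial inner products are evaluated exactly.

```python
import math

def polyval(c, t):
    """Evaluate the polynomial c[0] + c[1]*t + ... + c[-1]*t^(len(c)-1)."""
    return sum(ck * t**k for k, ck in enumerate(c))

def simpson(f, a, b, n=2000):
    """Composite Simpson quadrature of f on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

N = 6  # dim M, where M = span{1, t, ..., t^5}

def mom(p):
    """Exact moment: integral of t^p over [-pi, pi]."""
    return 0.0 if p % 2 else 2 * math.pi ** (p + 1) / (p + 1)

def ip(c, d):
    """<p, q> = integral of p(t) q(t) over [-pi, pi], for coefficient lists."""
    return sum(c[m] * d[n] * mom(m + n) for m in range(N) for n in range(N))

# Gram-Schmidt: e_j = (x_j - P_{M_{j-1}} x_j) / ||x_j - P_{M_{j-1}} x_j||.
basis = []
for j in range(N):
    x = [0.0] * N
    x[j] = 1.0                      # x_j = t^j
    for e in basis:                 # subtract P_{M_{j-1}} x_j = sum <x_j,e_k> e_k
        c = ip(x, e)
        x = [xi - c * ei for xi, ei in zip(x, e)]
    norm = math.sqrt(ip(x, x))
    basis.append([xi / norm for xi in x])

# P_M sin = sum_k <sin, e_k> e_k, with <sin, e_k> computed by quadrature.
proj = [0.0] * N
for e in basis:
    c = simpson(lambda t, e=e: math.sin(t) * polyval(e, t), -math.pi, math.pi)
    proj = [p + c * ei for p, ei in zip(proj, e)]

print(proj)  # monomial coefficients of the projection
```

This reproduces the same quintic as solving the $6\times 6$ system directly, which is exactly the content of the projection theorem: the orthonormal-basis formula and the orthogonality conditions determine the same vector of $M$.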
