[Math] What does the Gibbs phenomenon show about the nature of Fourier series?

ca.classical-analysis-and-odes, fourier-analysis

As the title suggests, we know that there are some points at which the series does not converge to the function.

Now take the convergence theorem into consideration: even when the function has jump discontinuities (discontinuities of the first kind), the series is still convergent. And the Gibbs phenomenon always takes place at these jump discontinuities.

Why does the Gibbs phenomenon take place? What does it reveal about the nature of Fourier series?

Best Answer

A Fourier series truncated to order $n$ is the best approximation to the given function in the $L^2$ sense among trigonometric polynomials of order $n$. In that sense, small, rapid deviations don't matter much. Since there is a limit to how large the derivative of a trigonometric polynomial of fixed order can be (without the coefficients being large), in order to fit such a polynomial to a discontinuity it pays to overshoot a bit on each side of the discontinuity in order to “gather speed” and get from one value to the other quickly. When I say it “pays”, I mean that what you lose by approximating the function poorly at the overshoot, you more than gain back by making the jump faster.
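This trade-off can be seen numerically. Below is a minimal sketch (using NumPy; the helper name `square_wave_partial_sum` is my own) that computes partial Fourier sums of the square wave $\operatorname{sign}(\sin x)$, namely $S_N(x) = \frac{4}{\pi}\sum_{k \text{ odd}}^{2N-1} \frac{\sin kx}{k}$, and measures the peak near the jump at $x = 0$. The peak tends to $\frac{2}{\pi}\operatorname{Si}(\pi)\approx 1.179$ rather than $1$: the overshoot is about $9\%$ of the jump, and it does not shrink as the order grows — it only moves closer to the discontinuity.

```python
import numpy as np

def square_wave_partial_sum(x, n_terms):
    """Fourier partial sum of the square wave sign(sin x):
    S_N(x) = (4/pi) * sum over odd k < 2*n_terms of sin(k x)/k."""
    s = np.zeros_like(x)
    for k in range(1, 2 * n_terms, 2):
        s += np.sin(k * x) / k
    return 4.0 / np.pi * s

# Sample densely just to the right of the jump at x = 0.
x = np.linspace(1e-4, np.pi / 2, 200000)
for n in (10, 100, 1000):
    # The maximum stays near (2/pi) * Si(pi) ~ 1.179 for every order;
    # only the location of the peak drifts toward the discontinuity.
    print(n, square_wave_partial_sum(x, n).max())
```

Increasing `n` narrows the oscillations but leaves the overshoot height essentially unchanged, which is exactly the Gibbs phenomenon: $L^2$ convergence tolerates a fixed-height spike of shrinking width, since its contribution to the squared error tends to zero.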