[Math] Numerical Convergence of Trapezoidal Rule

numerical-methods, quadrature

So I am implementing the trapezoidal rule as my quadrature method. Most functions show second-order convergence, except for two:

$$ f(x) = e^{\cos(x)} - 0.1\cos(x), \quad x \in [-7, -2] $$

The log-log plot of error vs. step size $\Delta x$ for $f(x)$ is

[log-log plot: error vs. $\Delta x$ for $f$]

where the red dashed line is a line with slope 2. I imagine the plot looks this way because a smaller step size should give better results, but I am wondering: $\textbf{why does the step size need to reach a certain size to get second-order convergence?}$
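For concreteness, a minimal sketch of the kind of composite trapezoidal rule I mean (not my exact code; names are mine):

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n uniform subintervals of [a, b]."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))      # endpoint weights are 1/2
    for i in range(1, n):
        total += f(a + i * h)        # interior weights are 1
    return h * total

# Sanity check on an integral with a known value: ∫_0^π sin(x) dx = 2.
print(trapezoid(math.sin, 0.0, math.pi, 1000))   # ≈ 2.0, error O(h^2)
```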

The second function giving me weird behavior is

$$ g(x) = \cos(e^x) - x^2\sin(x) - x, \quad x \in [-5, 5] $$
The log-log plot of error vs. step size $\Delta x$ for $g(x)$ is

[log-log plot: error vs. $\Delta x$ for $g$]

where the red dashed line is again a line with slope 2. For this one, I assume there is some inequality that the derivatives of $g$ and the step size must satisfy to maintain second-order convergence, but I am left wondering: $\textbf{why does the order of convergence "disappear" as the step size increases?}$

Best Answer

The first graph shows a slowdown of convergence at small step sizes. This is a typical feature of floating-point arithmetic in computations like definite integrals. At even smaller step sizes you will start to see the error increase as the step size decreases. Mathematically this can be understood by estimating the total error as a function of $\Delta x$: there is a roundoff term, arising from floating point, that grows as $\Delta x$ shrinks, and a discretization term, proportional to $\Delta x^2$, that shrinks with it. I find it a bit peculiar to see this effect when the number of subintervals is only 5000, however.
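A standard back-of-the-envelope form of that estimate (a sketch with the usual worst-case constants, not a sharp bound) is

$$ E(\Delta x) \;\approx\; \underbrace{\frac{(b-a)\,\Delta x^{2}}{12}\max_{[a,b]}\lvert f''\rvert}_{\text{discretization}} \;+\; \underbrace{\frac{(b-a)\,\varepsilon_{\mathrm{mach}}}{\Delta x}\max_{[a,b]}\lvert f\rvert}_{\text{roundoff}}. $$

Minimizing over $\Delta x$ gives an optimal step $\Delta x^{*} \sim \varepsilon_{\mathrm{mach}}^{1/3}$ and a best achievable error of order $\varepsilon_{\mathrm{mach}}^{2/3} \approx 10^{-11}$ in double precision, which for an interval of length $5$ corresponds to hundreds of thousands of subintervals — hence the surprise at seeing a floor already at 5000.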

The second graph is showing erratic behavior at larger step sizes and the predicted second order convergence at small step sizes. This is basically what the theory predicts: for sufficiently small step sizes you should see second order behavior, but for larger step sizes the correction terms (which are, for smooth functions, proportional to 3rd and higher derivatives) are too large to be neglected. In this example the $\cos(e^x)$ term has huge higher derivatives in the range, say, $[3,5]$.
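A quick numerical check of that claim (my own sketch in Python, not the asker's code; the "exact" value is replaced by a much finer trapezoid sum):

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n uniform subintervals."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return h * total

f = lambda x: math.cos(math.exp(x))
a, b = 3.0, 5.0
ref = trapezoid(f, a, b, 64_000)          # stand-in for the exact value

def err(n):
    return abs(trapezoid(f, a, b, n) - ref)

# Once the step resolves the oscillation of cos(e^x) (period ~ 2*pi/e^5 ~ 0.04
# near x = 5), halving the step divides the error by about 2^2 = 4:
print(err(2000) / err(4000))              # ~ 4: second-order regime
# At coarse steps (h = 0.1, larger than that period) the h^2 law has not
# kicked in yet, so the ratio is erratic and generally far from 4:
print(err(20) / err(40))
```

The crossover step size tracks where $\Delta x$ becomes small compared with the local period $2\pi e^{-x}$ of $\cos(e^x)$, which is exactly the "sufficiently small step size" condition above.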