[Math] the difference between “actual error” and “relative error”

error-propagation, numerical-methods

I know that there is a difference between absolute error and relative error, but what about the term actual error?

This came up in the notes for an example problem, but I haven't been able to find any clear answers. My intuition is that actual is synonymous with absolute, but I'm not sure. It's also possible that the author made a mistake and meant one of those two terms (or even a different one).

Please note that I'm not asking about (or confused by) "actual value", which I know is used in the definition of absolute error.

Here is the passage in context:


Consider the error bound formula:

$E_t \leq\frac{K(b-a)^3}{12n^2}$

Find a value for K. Then find an upper bound for the error
when calculating $T_4$. How does this upper bound for the error compare with the actual error between $\int_0^\pi\cos (e^x)dx$ and $T_4$?

Best Answer

Absolute error is the difference between a measured or computed value and the true value. If our approximate value for $\pi$ is $3.14$, the absolute error is about $0.0016$. Relative error is the absolute error divided by the true value of the quantity. In this example the relative error is about $\frac {0.0016}{\pi} \approx 0.0005 = 0.05\%$. Each is important in certain cases. When quoting errors, you need to figure out which is more useful, then make sure the error you quote covers the possibilities.
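For the $\pi$ example above, the arithmetic can be checked with a few lines of Python (a minimal sketch; the variable names are mine, not from the original):

```python
import math

# Illustrating the definitions above, using pi as the true value.
true_value = math.pi
approx = 3.14

absolute_error = abs(true_value - approx)          # about 0.0016
relative_error = absolute_error / abs(true_value)  # about 0.0005, i.e. 0.05%

print(f"absolute error: {absolute_error:.6f}")
print(f"relative error: {relative_error:.6f} ({relative_error:.2%})")
```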

Added: in your example, $K$ is a bound on the second derivative of the integrand. You are expected to find the maximum of $\left|\frac{d^2}{dx^2}\cos (e^x)\right|$ on the interval $[0,\pi]$, take $K$ at least that large, then plug it into the formula to get an error bound. After that, compute $T_4$ and a more accurate value for the integral. The actual error is the difference between these. You may find that the error bound is rather loose because the second derivative becomes large over only a small part of the interval.
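Here is a minimal Python sketch of that procedure (not the answerer's code). It assumes the trapezoidal rule error bound with $K \geq \max_{[0,\pi]}|f''(x)|$, estimates $K$ by dense sampling rather than calculus, and uses a very fine trapezoidal sum as the "more accurate value"; the function names are illustrative.

```python
import math

# f(x) = cos(e^x); its second derivative is
#   f''(x) = -e^x sin(e^x) - e^{2x} cos(e^x).

def f(x):
    return math.cos(math.exp(x))

def f2(x):
    ex = math.exp(x)
    return -ex * math.sin(ex) - ex * ex * math.cos(ex)

def trapezoid(func, a, b, n):
    # Composite trapezoidal rule with n subintervals.
    h = (b - a) / n
    total = 0.5 * (func(a) + func(b)) + sum(func(a + i * h) for i in range(1, n))
    return h * total

a, b, n = 0.0, math.pi, 4

# Estimate K = max |f''(x)| on [0, pi] by sampling
# (a rigorous bound would need more care than this).
K = max(abs(f2(a + (b - a) * i / 10000)) for i in range(10001))
bound = K * (b - a) ** 3 / (12 * n ** 2)

T4 = trapezoid(f, a, b, n)
accurate = trapezoid(f, a, b, 100000)   # stand-in for the "more accurate value"
actual_error = abs(accurate - T4)

print(f"K (sampled max |f''|): {K:.3f}")
print(f"error bound for T_4  : {bound:.4f}")
print(f"T_4                  : {T4:.6f}")
print(f"actual error         : {actual_error:.6f}")
```

Running something along these lines should show the actual error of $T_4$ coming out much smaller than the bound, which is the looseness mentioned above.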
