[Math] Estimating error of linear approximation

calculus, linear-approximation, taylor-expansion

I have this question on my assignment which I just cannot seem to wrap my brain around. I've been reluctant to post it to any forum because I don't like having the answer handed to me, but I seriously do need some help with this:

[Image of the assignment question: bound the error of the linear approximation to $f$ at $a$ on the interval $[a, b]$; the hint says to use the MVT.]

I'm not sure how I am meant to use the MVT here, but I have been thinking about this question for a while. I've assumed the second derivative is positive, and since the approximation is taken at $a$ (the start of the interval), the worst approximation occurs when $x$ is the furthest away it can be on the interval, i.e. $x = b$. Here is my working so far:

$$|t(x) - f(x)|$$

$$t(x) = f(a) + f'(a)(x - a)$$

MVT: $f(b) = f(a) + f'(c)(b - a)$ for some $c \in (a, b)$.

The worst approximation, which will give the maximum error, is when $x = b$:

$$|f(a) + f'(a)(b - a) - f(b)|$$

$$= \left|f(a) + f'(a)(b - a) - \bigl(f(a) + f'(c)(b - a)\bigr)\right|$$

$$= |f'(a)(b - a) - f'(c)(b - a)|$$

$$= |(f'(a) - f'(c))(b - a)|$$
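(As a sanity check, I tried a quick numerical experiment with a toy example of my own, $f(x) = e^x$ at $a = 0$. It isn't from the assignment, but the tangent-line error does appear to shrink like $(b - a)^2$, which is the kind of bound I'm guessing this argument should eventually produce.)

```python
import numpy as np

# Toy example (my own, not from the assignment): f(x) = exp(x), a = 0,
# tangent line t(x) = f(a) + f'(a)(x - a) = 1 + x.
# Guess: the error |t(b) - f(b)| is bounded by M * (b - a)^2,
# where M = max |f''| on [a, b].
f = np.exp
a = 0.0

for b in [1.0, 0.5, 0.25, 0.125]:
    t_b = f(a) + f(a) * (b - a)   # f'(a) = exp(a) for this particular f
    error = abs(t_b - f(b))
    M = f(b)                      # f'' = exp is increasing, so its max on [a, b] is f(b)
    print(f"b = {b:5.3f}   error = {error:.6f}   M*(b-a)^2 = {M * (b - a)**2:.6f}")
```

Halving $b$ roughly quarters the error, which seems consistent with a quadratic bound.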

I am very lost, and I'm not sure how to apply the MVT twice. Have I even applied it once with my $f(b) = \dots$? I would appreciate any help, but would prefer a push in the right direction over being told how to do it straight up. Thanks in advance. 🙂

Also, I believe it can be solved using a Taylor series, but seeing as we haven't learnt about it yet, and the hint specifically says to use the MVT, I don't think I should be using it.

Best Answer

First, $f(x) - T(x) = f(x) - f(a) - f'(a)(x - a)$. By MVT, $f(x) - f(a) = f'(c)(x - a)$ for some $c\in (a,x)$. Then $f(x) - T(x) = [f'(c) - f'(a)](x - a)$. Apply the mean value theorem again to $f'(c) - f'(a)$ and try to complete the argument.
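For reference, one possible completion (a sketch, under the extra assumption that $f''$ exists on $(a, x)$ with $|f''(t)| \le M$ there): applying the MVT to $f'$ on $[a, c]$ gives $f'(c) - f'(a) = f''(d)(c - a)$ for some $d \in (a, c)$, hence

$$|f(x) - T(x)| = |f''(d)|\,|c - a|\,|x - a| \le M (x - a)^2,$$

since $|c - a| < |x - a|$.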
