Show that if $g: \mathbb{R} \rightarrow \mathbb{R}$ is twice continuously differentiable then, given $\epsilon > 0$, we can find some constant $L$ and $\delta (\epsilon) >0$ such that:
$$|g(t) - g(\alpha) - g'(\alpha)(t-\alpha)| \leq L|t-\alpha|^{2}$$
for all $|t-\alpha| < \delta(\epsilon)$.
This seems to be begging for the use of the definition of continuity on the second derivative, and then somehow applying the definition of the derivative, but I can't make any progress.
I can get the LHS of the inequality by using the fact that
$$g'(t) = \frac{g(t) - g(\alpha)}{t-\alpha} + \frac{o(t - \alpha)}{t-\alpha}$$
but I can't get this into any sort of inequality, and besides, it doesn't make use of the continuity of $g''(x)$. I've also tried the mean value theorem, but that didn't seem to help much either. Any hints would be greatly appreciated, cheers.
Best Answer
By the mean value theorem, $g(t) - g(\alpha) = g'(\beta)(t - \alpha)$ for some $\beta$ between $t$ and $\alpha$, so you're really trying to bound $$(g'(\beta) - g'(\alpha))(t - \alpha).$$ It should be easy from here.
You can actually get a bound of $C(t - \alpha)^2$ by applying the mean value theorem again, this time on $g'(x)$.
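For completeness, here is one way the double application of the mean value theorem can be written out; the point $\gamma$ and the bound $L$ below are my notation, with $L$ taken as a bound on $|g''|$ near $\alpha$ (such a bound exists by continuity of $g''$ on a compact interval):

```latex
% First MVT: g(t) - g(alpha) = g'(beta)(t - alpha) for some beta between alpha and t.
% Second MVT, applied to g': g'(beta) - g'(alpha) = g''(gamma)(beta - alpha)
% for some gamma between alpha and beta.
\[
|g(t) - g(\alpha) - g'(\alpha)(t-\alpha)|
  = |g'(\beta) - g'(\alpha)|\,|t-\alpha|
  = |g''(\gamma)|\,|\beta-\alpha|\,|t-\alpha|
  \leq L\,|t-\alpha|^{2},
\]
% since |beta - alpha| <= |t - alpha|, where
\[
L = \sup_{|s-\alpha| \leq \delta} |g''(s)| < \infty,
\]
% because g'' is continuous on the compact interval [alpha - delta, alpha + delta].
```

Note that with this argument $L$ depends only on a bound for $|g''|$ near $\alpha$, so the estimate holds for all $|t-\alpha| < \delta$ with any fixed $\delta > 0$.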