[Math] Proving that $\lim\limits_{h\to 0}\frac{f(x+h)-2f(x)+f(x-h)}{h^2}=f''(x)$

calculus, derivatives

Prove that $$\lim\limits_{h\to 0}\frac{f(x+h)-2f(x)+f(x-h)}{h^2}=f''(x)$$

Is the following a correct proof:

$$f''(x)=\lim\limits_{h\to 0}\frac{f'(x)-f'(x-h)}{h}=\lim\limits_{h\to 0}\frac{\frac{f(x+h)-f(x)}{h}-\frac{f(x)-f(x-h)}{h}}{h}=\lim\limits_{h\to 0}\frac{f(x+h)-2f(x)+f(x-h)}{h^2}$$

I would really love input on this proof. The book "Berkeley Problems in Mathematics" solves it differently.

Best Answer

By Taylor's theorem with the Peano remainder (assuming $f''(x)$ exists), $$f(x+h) - f(x) = h f'(x) + \frac{h^2}{2}f''(x) + o(h^2)$$ $$f(x-h) - f(x) = - h f'(x) + \frac{h^2}{2}f''(x) + o(h^2).$$ Summing these, you get $$f(x+h) + f(x-h) - 2f(x) = h^2 f''(x) + o(h^2),$$ and dividing by $h^2$ and letting $h \to 0$ gives $$\lim_{h \to 0} \frac{f(x+h) + f(x-h) -2f(x)}{h^2} = f''(x).$$

However, I suspect this is the proof your book gives, since it is quite standard.
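Not part of the proof, but as a quick numerical sanity check (a minimal Python sketch; the choice of $f=\sin$, the point $x=1$, and the step sizes are just illustrative assumptions), the central second difference $\frac{f(x+h)-2f(x)+f(x-h)}{h^2}$ does approach $f''(x)$, with the error shrinking roughly like $h^2$ for this smooth $f$:

```python
import math

def second_central_difference(f, x, h):
    """Central second difference quotient: (f(x+h) - 2 f(x) + f(x-h)) / h^2."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

# f = sin, so f''(x) = -sin(x); compare the quotient with the exact value.
x = 1.0
exact = -math.sin(x)
for h in (1e-1, 1e-2, 1e-3):
    approx = second_central_difference(math.sin, x, h)
    print(f"h = {h:.0e}   quotient = {approx:+.10f}   error = {abs(approx - exact):.2e}")
```

The roughly quadratic decay of the error is why this quotient is a popular finite-difference approximation of the second derivative.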
