Real Analysis – Uniform Convergence of Difference Quotients to Partial Derivative

Tags: partial-differential-equations, real-analysis, uniform-convergence

I'm currently reading Evans' PDE book. In it he claims that for $f \in C^2_c(\mathbb{R}^n)$
$$\frac{f(x + he_i) - f(x)}{h} \to \frac{\partial}{\partial x_i}f(x)$$
and
$$\frac{\frac{\partial}{\partial x_i}f(x + he_j) - \frac{\partial}{\partial x_i}f(x)}{h} \to \frac{\partial^2}{\partial x_j \partial x_i}f(x)$$
uniformly as $h \to 0$.

My question is why must the convergence be uniform?

Thanks in advance.

Best Answer

I know this is an old question, but one can improve the answer to the case mentioned in Evans' book. In particular, one can show uniform convergence of the difference quotients to the derivative for $C_c^1(\mathbb{R}^n)$ functions; the statement for the second derivative then follows from this case.

The key idea is that for $f\in C_c^1(\mathbb{R}^n)$ the gradient $\nabla f$ (and also $f$ itself) is uniformly continuous. Hence the mean value theorem gives

$$\left| \frac{f(x+he_i)-f(x)}{h} -\partial_i f(x)\right| = |\partial_i f(y) -\partial_i f(x)| $$ for some $y$ on the line segment from $x$ to $x+he_i$, and consequently for some $y\in B_{|h|}(x)$.
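
To make the mean value theorem step explicit (a standard one-variable reduction, not spelled out above): for fixed $x$ and $h \neq 0$, set $g(t) = f(x + t e_i)$, which is $C^1$ with $g'(t) = \partial_i f(x + t e_i)$. Then
$$\frac{f(x+he_i)-f(x)}{h} = \frac{g(h)-g(0)}{h} = g'(\theta h) = \partial_i f(x + \theta h\, e_i) \quad \text{for some } \theta \in (0,1),$$
so the identity above holds with $y = x + \theta h\, e_i \in B_{|h|}(x)$.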

Now let $\varepsilon >0$ be given and let $x\in \mathbb{R}^n$ be arbitrary. By uniform continuity there is a $\delta >0$ (independent of $x$) such that $|\partial_i f(x) -\partial_i f(y)|< \varepsilon$ for all $y\in B_\delta(x)$. Choosing $|h|\leq\delta$ proves the statement.
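
For the second difference quotient in the question, the same argument applies verbatim: since $f \in C_c^2(\mathbb{R}^n)$, each $\partial_i f$ lies in $C_c^1(\mathbb{R}^n)$, so
$$\left|\frac{\partial_i f(x+he_j)-\partial_i f(x)}{h} - \partial_j \partial_i f(x)\right| = |\partial_j \partial_i f(y) - \partial_j \partial_i f(x)|$$
for some $y \in B_{|h|}(x)$, and the uniform continuity of $\partial_j \partial_i f$ yields uniform convergence in exactly the same way.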
