Real Analysis – A Multivariate Function with Bounded Partial Derivatives is Lipschitz

analysis, lipschitz-functions, multivariable-calculus, proof-verification, real-analysis

I'm curious if I've done this correctly — please offer suggestions/corrections if not! I'm new to working in $\Bbb R^n$, so clear explanations would be appreciated.

The problem:

Let $f:\Bbb R^2 \to \Bbb R$ be such that $D_1f$ and $D_2f$ are both defined everywhere and are bounded functions. Prove that $f$ is Lipschitz.

My attempt:

The definition of Lipschitz that I'm working with is that there exists
some $L > 0$ such that $|f(x) - f(y)| \leq L||x-y||$ for all $x, y \in \Bbb R^2$.

Since $D_1f$ and $D_2f$ are bounded (and real-valued), the following suprema exist:

  • $S_1 = \sup \{|D_1f(x)| : x \in \Bbb R^2\}$
  • $S_2 = \sup \{|D_2f(x)| : x \in \Bbb R^2\}$

Now let $a = (a_1,a_2)$ and $b = (b_1,b_2) \in \Bbb R^2$. Then we have:
\begin{align*}
f(a) - f(b) &= f(a_1,a_2) - f(b_1,b_2)
\\
&= f(a_1,a_2) - f(a_1,b_2) + f(a_1,b_2) - f(b_1,b_2)
\end{align*}
Then, by the triangle inequality:
$$|f(a)-f(b)| \leq |f(a_1,a_2) - f(a_1,b_2)| + |f(a_1,b_2) - f(b_1,b_2)|$$
And since the partial derivatives exist everywhere in $\Bbb R^2$, when $a_2 \neq b_2$ we can use the one-dimensional Mean Value Theorem to show that there exists some $c$ between $a_2$ and $b_2$ such that:
$$\frac{f(a_1,a_2)-f(a_1,b_2)}{a_2-b_2} = D_2f(a_1,c)$$
And noting how we defined $S_2$, it follows that
$$|f(a_1,a_2) - f(a_1,b_2)| \leq S_2 |a_2 - b_2|$$
(a bound that also holds trivially when $a_2 = b_2$).
And similarly
$$|f(a_1,b_2) – f(b_1,b_2)| \leq S_1 |a_1-b_1|$$
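As a quick numerical sanity check of these two coordinate-wise bounds, here is a sketch using an example function of my own choosing (not from the problem): $f(x,y) = \sin x + \cos 2y$, whose partials $D_1f = \cos x$ and $D_2f = -2\sin 2y$ give $S_1 = 1$ and $S_2 = 2$.

```python
import math
import random

# Hypothetical example (not from the post): f(x, y) = sin(x) + cos(2y),
# with D1 f = cos(x) and D2 f = -2 sin(2y), so S1 = 1 and S2 = 2.
def f(x, y):
    return math.sin(x) + math.cos(2 * y)

S1, S2 = 1.0, 2.0

random.seed(0)
for _ in range(10_000):
    a1, a2, b1, b2 = (random.uniform(-10, 10) for _ in range(4))
    # MVT bound in the second coordinate: |f(a1,a2) - f(a1,b2)| <= S2 |a2 - b2|
    assert abs(f(a1, a2) - f(a1, b2)) <= S2 * abs(a2 - b2) + 1e-12
    # MVT bound in the first coordinate: |f(a1,b2) - f(b1,b2)| <= S1 |a1 - b1|
    assert abs(f(a1, b2) - f(b1, b2)) <= S1 * abs(a1 - b1) + 1e-12
```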
And using the statement we got from the triangle inequality, we have that
$$|f(a)-f(b)| \leq S_1|a_1-b_1| + S_2|a_2-b_2|$$
And by the Cauchy-Schwarz inequality, we have that
$$S_1|a_1-b_1| + S_2|a_2-b_2|
\leq
\sqrt{S_1^2 + S_2^2}\cdot
\sqrt{(a_1-b_1)^2+(a_2-b_2)^2}
=
\sqrt{S_1^2 + S_2^2} \cdot
||a-b||$$
Putting these together,
$$|f(a)-f(b)| \leq \sqrt{S_1^2 + S_2^2} \cdot ||a-b||$$
So $f$ is Lipschitz with $L = \sqrt{S_1^2 + S_2^2}$. (If $S_1 = S_2 = 0$, then $f$ is constant and any $L > 0$ works.)
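To see the final constant in action, the following sketch estimates the worst ratio $|f(a)-f(b)|/||a-b||$ over random pairs, for an assumed example function $f(x,y) = \sin x + \cos 2y$ (mine, not from the problem), where the proof's constant is $L = \sqrt{1^2 + 2^2} = \sqrt 5$:

```python
import math
import random

# Assumed example (not from the post): f(x, y) = sin(x) + cos(2y),
# for which the proof gives Lipschitz constant L = sqrt(1 + 4) = sqrt(5).
def f(x, y):
    return math.sin(x) + math.cos(2 * y)

L = math.sqrt(1.0**2 + 2.0**2)

random.seed(1)
worst_ratio = 0.0
for _ in range(100_000):
    a = (random.uniform(-10, 10), random.uniform(-10, 10))
    b = (random.uniform(-10, 10), random.uniform(-10, 10))
    dist = math.hypot(a[0] - b[0], a[1] - b[1])
    if dist > 0:
        worst_ratio = max(worst_ratio, abs(f(*a) - f(*b)) / dist)

# The observed ratio never exceeds the proven constant sqrt(5).
assert worst_ratio <= L
print(f"worst observed ratio {worst_ratio:.3f} <= L = {L:.3f}")
```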

Best Answer

The proof is correct and sufficiently detailed. My personal preference is to express it in a more "modular" form by isolating an important fact:

Lemma. A function $f:\mathbb{R}^n\to \mathbb R$ is Lipschitz if and only if there exists a constant $L$ such that the restriction of $f$ to every line parallel to a coordinate axis is Lipschitz with constant $L$.

Notice that the lemma has nothing to do with partial derivatives. One direction is trivial; the other is just the triangle inequality. E.g., $$ |f(a_1,a_2)-f(b_1,b_2)| \le |f(a_1,a_2)-f(b_1,a_2)|+|f(b_1,a_2)-f(b_1,b_2)| \\ \le L|a_1-b_1|+L|a_2-b_2| \le L\sqrt{2}\,\|a-b\| $$ and similarly for general $n$, with constant $L\sqrt{n}$. $\quad\Box$

Once you have the lemma, the proof of the claim in your post boils down to setting $L = \max_i(\sup |D_if|)$ and using the one-dimensional Mean Value Theorem to show that the hypothesis of the lemma is satisfied.
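As a sketch of this modular route, again under the assumption of an example function $f(x,y) = \sin x + \cos 2y$ (not from the post): the lemma's constant is $L = \max(1, 2) = 2$, which yields the slightly looser Lipschitz bound $L\sqrt 2 \approx 2.83$, versus $\sqrt 5 \approx 2.24$ from the Cauchy-Schwarz argument in the question.

```python
import math
import random

# Assumed example (not from the post): f(x, y) = sin(x) + cos(2y).
# The lemma's constant is L = max(sup|D1 f|, sup|D2 f|) = max(1, 2) = 2,
# and the lemma yields the Lipschitz bound L * sqrt(2) ~= 2.83.
def f(x, y):
    return math.sin(x) + math.cos(2 * y)

L = max(1.0, 2.0)
bound = L * math.sqrt(2)

random.seed(2)
for _ in range(50_000):
    a = (random.uniform(-10, 10), random.uniform(-10, 10))
    b = (random.uniform(-10, 10), random.uniform(-10, 10))
    dist = math.hypot(a[0] - b[0], a[1] - b[1])
    # |f(a) - f(b)| <= L * sqrt(2) * ||a - b|| for every sampled pair
    assert abs(f(*a) - f(*b)) <= bound * dist + 1e-12
```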
