How to prove that a multivariable real vector valued function is continuous iff its component functions are continuous

Tags: continuity, epsilon-delta, multivariable-calculus

Suppose we have a multivariable, real, vector-valued function $f:\mathbb{R}^n \to \mathbb{R}^m$ where $f = (f_1, f_2, …, f_m)$. How can I prove that $f$ is continuous iff all of its component functions $f_i$ are continuous? Here each $f_i$ is a multivariable, real-valued function $f_i:\mathbb{R}^n \to \mathbb{R}$.

FORWARDS:

suppose that $f$ is continuous and $a = (a_1, a_2, …, a_n) \in \mathbb{R}^n$

$\implies \forall \epsilon > 0, \exists \delta > 0$ s.t. $d(a, x) < \delta \implies d(f(a), f(x)) < \epsilon$

$\implies ||f(x) - f(a)|| < \epsilon$

$\implies \sqrt{\sum_{i = 1}^{m}(f_i(x) - f_i(a))^2} < \epsilon$

$\implies \sum_{i = 1}^{m}(f_i(x) - f_i(a))^2 < \epsilon^2$

$\implies (f_i(x) - f_i(a))^2 < \epsilon^2$ for each $i$, since each term is at most the whole sum

$\implies |f_i(x) - f_i(a)| < \epsilon$
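The forward direction ultimately rests on the inequality $|y_i| \le \lVert y \rVert_2$ applied to $y = f(x) - f(a)$. As a quick numerical sanity check (a minimal sketch in plain Python; `euclidean_norm` is an illustrative helper, not a library function):

```python
import math

def euclidean_norm(y):
    """Euclidean (l2) norm of a vector given as a list of floats."""
    return math.sqrt(sum(t * t for t in y))

# Each component is bounded by the full norm: |y_i| <= ||y||_2.
# This is exactly the step that turns ||f(x) - f(a)|| < eps
# into |f_i(x) - f_i(a)| < eps for every i.
y = [3.0, -4.0, 12.0]
assert all(abs(t) <= euclidean_norm(y) for t in y)
```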

BACKWARDS:

suppose that each $f_i$ is continuous and $a = (a_1, a_2, …, a_n) \in \mathbb{R}^n$

$\implies \forall \epsilon > 0, \exists \delta_i > 0$ s.t. $d(a, x) < \delta_i \implies d(f_i(a), f_i(x)) < \epsilon$

if we let $\delta = \min\{\delta_1, \delta_2, …, \delta_m\}$ (one $\delta_i$ for each of the $m$ component functions), we can say that…

$\implies \forall \epsilon > 0, \exists \delta > 0$ s.t. $d(a, x) < \delta \implies d(f_i(a), f_i(x)) < \epsilon$ for every $i$

Now, because the above statement holds for every $\epsilon > 0$, we may apply it with $\frac{\epsilon}{\sqrt{m}}$ in place of $\epsilon$…

$\implies \forall \epsilon > 0, \exists \delta > 0$ s.t. $d(a, x) < \delta \implies d(f_i(a), f_i(x)) < \frac{\epsilon}{\sqrt{m}}$

$\implies |f_i(x) - f_i(a)| < \frac{\epsilon}{\sqrt{m}}$

$\implies (f_i(x) - f_i(a))^2 < (\frac{\epsilon}{\sqrt{m}})^2$

$\implies \sum_{i = 1}^{m}(f_i(x) - f_i(a))^2 < m(\frac{\epsilon}{\sqrt{m}})^2 = \epsilon^2$

$\implies \sqrt{\sum_{i = 1}^{m}(f_i(x) - f_i(a))^2} < \epsilon$

$\implies ||f(x) - f(a)|| < \epsilon$
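The backwards direction is driven by the converse bound: once every component difference is below $\frac{\epsilon}{\sqrt{m}}$, the full Euclidean distance is below $\epsilon$. A small numerical check of that final step (plain Python sketch; the vector and the value of $\epsilon$ are arbitrary illustrative choices):

```python
import math

def euclidean_norm(y):
    """Euclidean (l2) norm of a vector given as a list of floats."""
    return math.sqrt(sum(t * t for t in y))

m = 4
eps = 1.0
# Components chosen strictly below eps / sqrt(m) ...
y = [0.49 * eps / math.sqrt(m) for _ in range(m)]
assert all(abs(t) < eps / math.sqrt(m) for t in y)
# ... force the Euclidean norm below eps, as in the last step above:
# sum of m terms, each < eps^2 / m, is < eps^2.
assert euclidean_norm(y) < eps
```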

Also, does this proof only hold when $d$ is the standard Euclidean distance on $\mathbb{R}^n$ and $\mathbb{R}^m$? Why would other norms or metrics suffice? I've seen a few other posts online about this proof, but I'm still unsure about a few things.

Best Answer

Your proof is correct - but yes, it is based on the Euclidean distance $d(a,b) = \lVert a - b \rVert_2$. Note that I write $\lVert -\rVert_2$ instead of $\lVert -\rVert$; the latter symbol will be used to denote an arbitrary norm. Your proof only requires knowing that for $y = (y_1,\ldots,y_m)$ $$\lvert y_i \rvert =\sqrt{y_i^2} \le \sqrt{\sum_{i=1}^m y_i^2} = \lVert y \rVert_2 \tag{1} ,$$ $$\lVert y \rVert_2 = \sqrt{\sum_{i=1}^m y_i^2} \le \sqrt{m \max_{i=1}^m y_i^2} = \sqrt m \sqrt{ \max_{i=1}^m y_i^2} = \sqrt m \max_{i=1}^m\sqrt{y_i^2} = \sqrt m \max_{i=1}^m \lvert y_i \rvert \tag{2} .$$

The expression $$\lVert y \rVert_\infty = \max_{i=1}^m \lvert y_i \rvert$$ is known as the maximum norm on $\mathbb R^m$. From $(1)$ and $(2)$ we get $$\lVert y \rVert_\infty \le \lVert y \rVert_2 \le \sqrt m \lVert y \rVert_\infty \tag{3} .$$ There are many other norms on $\mathbb R^m$, e.g. the taxicab norm $\lVert y \rVert_1 = \sum_{i=1}^m \lvert y_i \rvert$.

Norms $\lVert - \rVert$ and $\lVert - \rVert'$ on a vector space $V$ are called equivalent if there exist $a, b >0$ such that $\lVert y \rVert \le a \lVert y \rVert'$ and $\lVert y \rVert' \le b \lVert y \rVert$ for all $y$. Clearly this is equivalent to the existence of $A, B > 0$ such that $A\lVert y \rVert \le \lVert y \rVert' \le B \lVert y \rVert$ for all $y$.

Therefore $(3)$ says that the Euclidean norm and the maximum norm are equivalent.

It is easy to verify that two norms are equivalent if and only if they induce the same topology on $V$. A more interesting fact is that all norms on a finite-dimensional $V$ are equivalent. See Equivalence of Norms on Finite-Dimensional Spaces: Proof.

In many special cases one can verify the equivalence of norms by simple individual proofs; see $(1)$ and $(2)$ above. As an easy exercise we can prove that $\lVert - \rVert_\infty$ and $\lVert - \rVert_1$ are equivalent: $$\lVert y \rVert_\infty = \max_{i=1}^m \lvert y_i \rvert \le \sum_{i=1}^m \lvert y_i \rvert = \lVert y \rVert_1 \le m \max_{i=1}^m \lvert y_i \rvert = m \lVert y \rVert_\infty . \tag{4}$$
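Both sandwich inequalities $(3)$ and $(4)$ are easy to check numerically as well; a plain Python sketch (the helper names `norm_1`, `norm_2`, `norm_inf` are illustrative, not from any library):

```python
import math

def norm_1(y):
    """Taxicab (l1) norm: sum of absolute values."""
    return sum(abs(t) for t in y)

def norm_2(y):
    """Euclidean (l2) norm."""
    return math.sqrt(sum(t * t for t in y))

def norm_inf(y):
    """Maximum (l-infinity) norm: largest absolute component."""
    return max(abs(t) for t in y)

y = [1.0, -2.0, 2.0]
m = len(y)
# (3): ||y||_inf <= ||y||_2 <= sqrt(m) * ||y||_inf
assert norm_inf(y) <= norm_2(y) <= math.sqrt(m) * norm_inf(y)
# (4): ||y||_inf <= ||y||_1 <= m * ||y||_inf
assert norm_inf(y) <= norm_1(y) <= m * norm_inf(y)
```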

Thus your result is true for any norm on $\mathbb R^m$, but the general proof requires knowing that all norms are equivalent.

Anyway, your original proof can easily be adapted to work for $\lVert - \rVert_1$ and $\lVert - \rVert_\infty$.