[Math] Multi-variable function is continuously differentiable iff its partial derivatives exist and are continuous

derivatives, multivariable-calculus

I am wondering about a detail in the proof of the equivalence between a function being continuously differentiable and its having continuous partial derivatives. The theorem appears in Rudin's Principles of Mathematical Analysis and goes as follows:

Suppose $\textbf{f}$ maps an open set $E$ of $\mathbb{R}^n$ into $\mathbb{R}^m$. Then $\textbf{f}\in\mathcal{C}^1(E)$ if and only if the partial derivatives $D_jf_i$ exist and are continuous on $E$ for $1\leq i\leq m$ and $1\leq j\leq n$.

For the "if" direction of the proof, he claims it suffices to consider the case $m=1$. My question is: why does that suffice?

Best Answer

Because $$\left(\frac{\partial}{\partial \nu} f(x)\right)_i = \langle \nu, \nabla f_i\rangle = \sum_{j=1}^n \nu_j D_jf_i,$$ each component of any directional derivative of $\textbf{f}$ depends only on the corresponding component $f_i$. Moreover, a map into $\mathbb{R}^m$ is differentiable at a point if and only if each of its $m$ component functions is, so proving the $m=1$ case and applying it to each $f_i$ separately yields the general result.
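To make the reduction explicit, here is a sketch of how the $m=1$ case assembles into the general one (the notation $A$ for the candidate total derivative is my own, not Rudin's):

$$
\text{Suppose each } f_i : E \to \mathbb{R} \text{ is differentiable at } x \text{ with } f_i(x+h) = f_i(x) + \nabla f_i(x)\cdot h + r_i(h), \quad \frac{|r_i(h)|}{|h|} \to 0.
$$

$$
\text{Let } A \text{ be the } m \times n \text{ matrix whose } i\text{-th row is } \nabla f_i(x). \text{ Then } \textbf{f}(x+h) = \textbf{f}(x) + Ah + r(h),
$$

$$
\text{where } r(h) = (r_1(h), \dots, r_m(h)), \text{ and } \frac{|r(h)|}{|h|} \leq \frac{1}{|h|}\sum_{i=1}^m |r_i(h)| \to 0,
$$

so $\textbf{f}$ is differentiable at $x$ with derivative $A$ (the Jacobian). Thus the $m=1$ case, applied row by row, gives the theorem for arbitrary $m$.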