Questions about real differentiability

calculus, derivatives, jacobian, multivariable-calculus, real-analysis

Let $f$ be $f: \mathbb R^2 \to \mathbb R$.

What I recall/understand:

  1. That $f$ is differentiable at a point $(a,b)$ is defined as this weird thing from James Stewart, Calculus (similar for $v$)

[Image: Stewart's definition of differentiability at a point $(a,b)$]

  2. Sufficient condition: there is a sufficient condition for $f: \mathbb R^2 \to \mathbb R$ to be differentiable at a point $(a,b)$, given by this theorem, also from James Stewart, Calculus

[Image: Stewart's theorem giving a sufficient condition for differentiability]

which I read as

If $f_x$ and $f_y$ exist in an open disc containing $(a,b)$ and are continuous at $(a,b)$, then $f$ is differentiable at $(a,b)$.


Questions:

  1. Does $f$ differentiable at $(a,b)$ imply $f_x$ and $f_y$ exist at $(a,b)$?

  2. For $f: \mathbb R^2 \to \mathbb R$ differentiable at $(a,b)$, what exactly is its 'derivative' anyway?

  • Update: It's the Jacobian. The thing is, Wikipedia doesn't say the Jacobian of $f$ is the derivative of $f$. Anyway, I'll tag this question with jacobian.

    • What I understand is that for just $g: \mathbb R \to \mathbb R$, we have $g$ 'differentiable' at $a$ if $\lim_{x \to a} \frac{g(x)-g(a)}{x-a}$ exists, and then we define the 'derivative' $g'(a)$ as that limit. Similarly even for the complex derivative, for example.

So what about $f$? What is the 'derivative' of $f$? You might argue that $f$ doesn't really have 'a'/'the' derivative but rather infinitely many derivatives, e.g. the directional derivatives, or the gradient built from $f_x$ and $f_y$.

But even $f_x$ and $f_y$ themselves exist by virtue of a limit existing. For instance, you could define 'differentiable in the $x$-direction at $(a,b)$' to mean that $\lim_{x \to a} \frac{f(x,b)-f(a,b)}{x-a}$ exists, and then define 'the derivative in the $x$-direction at $(a,b)$' as that limit.
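For concreteness, here is a minimal numerical sketch of that 'derivative in the $x$-direction' limit. The function $f(x,y) = x^2 y + y$ and the point $(1,2)$ are made-up illustrative choices, not anything from the book.

    # Difference quotients (f(a+h, b) - f(a, b)) / h for shrinking h,
    # using the made-up example f(x, y) = x**2 * y + y at (a, b) = (1, 2).
    def f(x, y):
        return x**2 * y + y

    a, b = 1.0, 2.0
    for h in [1e-1, 1e-3, 1e-5, 1e-7]:
        print(h, (f(a + h, b) - f(a, b)) / h)

    # The printed quotients approach 2*a*b = 4.0, i.e. the value f_x(1, 2).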

So, what, a differentiable $f: \mathbb R^2 \to \mathbb R$ doesn't really have 'a derivative'?

Best Answer

The usual definition of derivative of a function $f$ from $\mathbf{R}^m$ to $\mathbf{R}^n$ at a point $x \in \mathbf{R}^m$ is that it is the linear map that approximates $f$ to order greater than one near $x$. Precisely, a linear map $L:\mathbf{R}^m \to \mathbf{R}^n$ is the derivative of $f$ at $x$ provided

$$\lim_{h \to 0} \frac{f(x+h)-f(x)-L(h)}{|h|}=0,$$ where $h$ is a vector in $\mathbf{R}^m$ tending towards $0$ and $|h|$ denotes its length. If you define the quantity in the limit as $\epsilon(h)$, then this is the same as saying that $$f(x+h)=f(x)+L(h)+|h| \epsilon(h),$$ where $\epsilon(h)$ goes to $0$ with $h$ (in this sense $L$ approximates $f$ to order greater than $1$). Moreover if a linear map with this property exists, it is unique.
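If it helps to see the definition in action, here is a small numerical sketch (not part of the formal argument), again with a made-up example $f(x,y) = x^2 y + y$ at $x_0 = (1,2)$ and $L(h) = \nabla f(x_0) \cdot h$; the function and the point are illustrative assumptions only.

    import numpy as np

    # Check the defining limit numerically for f(x, y) = x**2 * y + y at x0 = (1, 2),
    # taking L(h) = grad f(x0) . h with grad f = (2*x*y, x**2 + 1).
    def f(p):
        x, y = p
        return x**2 * y + y

    x0 = np.array([1.0, 2.0])
    grad = np.array([2 * x0[0] * x0[1], x0[0]**2 + 1.0])   # (f_x, f_y) at x0

    rng = np.random.default_rng(0)
    for scale in [1e-1, 1e-3, 1e-5]:
        h = scale * rng.normal(size=2)                     # a small increment in R^2
        remainder = f(x0 + h) - f(x0) - grad @ h
        print(scale, abs(remainder) / np.linalg.norm(h))

    # The ratio |f(x0+h) - f(x0) - L(h)| / |h| shrinks with |h|, as the definition requires.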

This is what the first definition you mention is saying (in coordinates, and with $m=2$ and $n=1$). But you are right that you could also define directional derivatives: for a vector $v$ you could put $$\partial_v(f)(x)=\lim_{h \to 0} \frac{f(x+hv)-f(x)}{h}$$ where $h$ now runs over $\mathbf{R}$, when this limit exists and call it the derivative of $f$ in the direction $v$. The relationship between these is that if $f$ has a derivative in the first sense, it has derivatives in all directions $v$ and $$\partial_v(f)(x)=L(v).$$
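Continuing the same sketch (same made-up $f$ and point, purely illustrative), one can check numerically that the directional derivative agrees with $L(v) = \nabla f(x_0)\cdot v$.

    import numpy as np

    # Directional difference quotients (f(x0 + h*v) - f(x0)) / h for shrinking h,
    # compared against L(v) = grad f(x0) . v, for the made-up f(x, y) = x**2 * y + y.
    def f(p):
        x, y = p
        return x**2 * y + y

    x0 = np.array([1.0, 2.0])
    grad = np.array([4.0, 2.0])      # (f_x, f_y) at (1, 2)
    v = np.array([0.6, 0.8])         # an arbitrary direction

    print("L(v) =", grad @ v)        # 4*0.6 + 2*0.8 = 4.0
    for h in [1e-1, 1e-3, 1e-5]:
        print(h, (f(x0 + h * v) - f(x0)) / h)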

Conversely, the theorem you quote asserts that if the derivatives in the directions $v_1,\dots,v_m$ exist in a neighborhood of $x$ and are continuous at $x$, where $v_1,\dots,v_m$ is a basis of $\mathbf{R}^m$, then $f$ has a derivative $L$ in the first sense. All of this is completely standard and can be found, for instance, in Spivak's nice book Calculus on Manifolds.
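As an aside (not something the theorem itself states, but a standard illustration of why its continuity hypothesis matters), the usual counterexample $f(x,y) = xy/(x^2+y^2)$ with $f(0,0)=0$ has both partial derivatives at the origin yet is not differentiable there; a quick numerical check of the defining ratio:

    import numpy as np

    # Standard counterexample: f(x, y) = x*y / (x**2 + y**2) with f(0, 0) = 0.
    # Both partials exist at the origin (f_x(0,0) = f_y(0,0) = 0), but they are not
    # continuous there.  If f were differentiable at (0, 0), the only candidate for L
    # would be the zero map, and the ratio below would have to go to 0.
    def f(x, y):
        return 0.0 if x == 0.0 and y == 0.0 else x * y / (x**2 + y**2)

    for t in [1e-1, 1e-3, 1e-5]:
        h = np.array([t, t])
        ratio = abs(f(*h) - f(0.0, 0.0)) / np.linalg.norm(h)
        # The ratio grows without bound along h = (t, t), so f is not differentiable at (0, 0).
        print(t, ratio)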
