In the $f:\mathbb{R}^2\rightarrow \mathbb{R}$ case it is helpful to consider the direction vector $u=(\cos\theta, \sin\theta)$ in order to quickly check all directional derivatives by simply adjusting the angle. For example, for the function $f(x,y)=\dfrac{x^2y}{\sqrt{x^2+y^2}}$, switching to polar coordinates gives $f(\cos\theta,\sin\theta)=\cos^2\theta\sin\theta$. How can we generalize this to higher dimensions? I had to determine the differentiability of the function I just gave as an example, and this simplification made the problem a lot easier. I ask because, in cases where I have to prove that a function is not differentiable, I would like to look for contradictions to the theorem "If $f$ is differentiable, then the directional derivative at a point $a$ in direction $v$ equals the gradient computed at $a$ dotted with $v$."
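The polar substitution above can be sanity-checked numerically. A minimal sketch (the function `f` is the one from the question; the loop angles are arbitrary test values):

```python
import math

# f(x, y) = x^2 y / sqrt(x^2 + y^2), the function from the question
def f(x, y):
    return x**2 * y / math.sqrt(x**2 + y**2)

# On the unit circle (r = 1) the polar form should reduce to cos^2(t) sin(t)
for t in [0.3, 1.1, 2.5, 4.0]:
    lhs = f(math.cos(t), math.sin(t))
    rhs = math.cos(t)**2 * math.sin(t)
    assert abs(lhs - rhs) < 1e-12
print("polar form matches on the unit circle")
```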
Checking all directional derivatives for higher dimensions using proof similar to polar coordinate method
analysis, calculus, multivariable-calculus, proof-writing, real-analysis
Related Solutions
Note: this answer stems from the (now removed) comments left under my answer to your other question about polar coordinates. As requested, I have edited to add the proofs.
Before proceeding, let us define (possibly non-standard notation) $$\lim_{(r,\phi)\to\{0\}\times\mathbb R}g(r,\phi)=L\tag{$\diamond$}$$ to mean that for each $\epsilon>0$, there is an open neighborhood $U$ of $\{0\}\times\mathbb R$ in $[0,\infty)\times\mathbb R$ such that $(r,\phi)\in U$ and $r\neq0$ implies $|g(r,\phi)-L|<\epsilon.$
In the case $g(r,\phi)=f(r\cos\phi,r\sin\phi)$, which is $2\pi$-periodic in the second argument, this can be simplified, by requiring instead that for each $\epsilon>0$, there is a $\delta>0$ such that $0<r<\delta$ implies $|g(r,\phi)-L|<\epsilon$. (Intuitively, in this case we do not have to worry about $\delta$ getting smaller and smaller as $\phi\to\pm\infty$.) This can be done e.g. by using the tube lemma from elementary topology.
We may now state:
Proposition. The following are equivalent for a function $f:\mathbb R^2\setminus\{(0,0)\}\to\mathbb R$:
- We have $$\lim_{(x,y)\to(0,0)}f(x,y)=L.$$
- The function $\tilde f:\mathbb R^2\to\mathbb R$, defined by $$\tilde f(x,y)=\begin{cases}f(x,y);&(x,y)\neq(0,0),\\L;&(x,y)=(0,0),\end{cases}$$ is continuous at $(0,0)$.
- For all $\phi_0\in\mathbb R$ we have $$\lim_{(r,\phi)\to(0,\phi_0)}f(r\cos\phi,r\sin\phi)=L.\tag{$\star$}$$
- We have $$\lim_{(r,\phi)\to\{0\}\times\mathbb R}f(r\cos\phi,r\sin\phi)=L.$$
- The function $g:[0,\infty)\times\mathbb R\to\mathbb R$, defined by $$g(r,\phi)=\begin{cases}f(r\cos\phi,r\sin\phi);&r>0,\\L;&r=0,\end{cases}$$ is continuous at $(0,\phi_0)$ for each $\phi_0\in\mathbb R$.
Proof. The equivalence of 1 and 2 is standard and follows directly from the definitions.
To see that 2 implies 5, let $p:[0,\infty)\times\mathbb R\to\mathbb R^2$ be defined by $p(r,\phi)=(r\cos\phi,r\sin\phi).$ This is obviously continuous, so if $\tilde f$ is continuous, $g=\tilde f\circ p$ must be continuous as well.
Next, we show that 5 implies 3. To see this, let $\epsilon>0$ and $\phi_0\in\mathbb R$. By $5$, there is a $\delta>0$ such that $\|(r,\phi)-(0,\phi_0)\|<\delta$ implies $|g(r,\phi)-g(0,\phi_0)|<\epsilon$. In particular, if $r>0$, this means that $\|(r,\phi)-(0,\phi_0)\|<\delta$ implies $|f(r\cos\phi,r\sin\phi)-L|<\epsilon$, which is precisely 3.
Next, we show that 3 implies 4. Let $\epsilon>0$. For each $\phi_0\in\mathbb R$ we have a $\delta(\phi_0)>0$ such that $\|(r,\phi)-(0,\phi_0)\|<\delta(\phi_0)$ implies $|f(r\cos\phi,r\sin\phi)-L|<\epsilon$. Let $$U(\phi_0)=\{(r,\phi)\in[0,\infty)\times\mathbb R\mid \|(r,\phi)-(0,\phi_0)\|<\delta(\phi_0)\}.$$ Then $U=\bigcup_{\phi_0\in\mathbb R}U(\phi_0)$ is an open neighborhood of $\{0\}\times\mathbb R$ in $[0,\infty)\times\mathbb R$ and for each $r\neq 0$ the condition $(r,\phi)\in U$ implies $|f(r\cos\phi,r\sin\phi)- L|<\epsilon$, because every such $(r,\phi)$ is an element of some $U(\phi_0)$ and therefore satisfies $\|(r,\phi)-(0,\phi_0)\|<\delta(\phi_0)$.
Finally, we show that 4 implies 1, thus concluding the circle of implications and establishing the desired equivalence. To do this, let $\epsilon>0$. By 4 and the fact that $f(r\cos\phi,r\sin\phi)$ is a $2\pi$-periodic function of $\phi$, there is a $\delta>0$ such that $0<r<\delta$ implies $|f(r\cos\phi,r\sin\phi)-L|<\epsilon$. Therefore, if $0<\|(x,y)\|<\delta$, we have $|f(x,y)-L|<\epsilon$, because $(x,y)=(r\cos\phi,r\sin\phi)$ for some $r\in(0,\delta)$ and $\phi\in\mathbb R$. This concludes the proof. $\qquad\square$
Remark. Note that the limit $(\star)$ is the limit of a two-variable function. Requiring that we have the single-variable limit $$\lim_{r\to0}f(r\cos\phi_0,r\sin\phi_0)=L\tag{$\ast$}$$ for each $\phi_0\in\mathbb R$ is not (!) equivalent. In fact, this latter condition is not sufficient for the limit to exist, as shown by the standard counterexample $f(x,y)=\frac{x^2y}{x^4+y^2}.$ The point is that the limit $(\star)$ takes into account points $(r,\phi)$ with different values of $\phi$, whereas in $(\ast)$ the value $\phi=\phi_0$ is fixed.
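The failure described in the remark can be seen numerically. A short sketch using the standard counterexample $f(x,y)=\frac{x^2y}{x^4+y^2}$ (the sample slopes and radii are arbitrary test values):

```python
# Standard counterexample from the remark: every straight-line limit at the
# origin is 0, yet the two-variable limit does not exist.
def f(x, y):
    return x**2 * y / (x**4 + y**2)

# Along any line y = m x, f(x, mx) = m x / (x^2 + m^2) -> 0 as x -> 0
for m in [0.0, 1.0, -3.0]:
    x = 1e-6
    assert abs(f(x, m * x)) < 1e-3
# But along the parabola y = x^2 the value is constantly 1/2
for x in [0.1, 1e-3, 1e-6]:
    assert abs(f(x, x**2) - 0.5) < 1e-12
print("line limits are 0, parabola limit is 1/2")
```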
Added: here are the details regarding the part with the tube lemma, but I will do it without actually using the tube lemma, which would allow us to skip a few steps. (In fact, what follows basically includes a proof of the tube lemma in this special case.)
The definition of $(\diamond)$ mentions a neighborhood $U$ of $\{0\}\times\mathbb R$ in $[0,\infty)\times\mathbb R$. We wish to show that, if $g(r,\phi)$ is $2\pi$-periodic in $\phi$, we can always take a neighborhood of the form $V=[0,\delta)\times\mathbb R$ for some $\delta>0$, instead of this possibly more complicated neighborhood $U$.
Assume we have a neighborhood $U$ of $\{0\}\times\mathbb R$, as in the definition of $(\diamond)$, not necessarily of the form $[0,\delta)\times\mathbb R$. Using $U$, we are going to construct a neighborhood $V=[0,\delta)\times\mathbb R$ satisfying the same properties. Note that since $U$ is a neighborhood of every point $(0,\phi_0)\in\{0\}\times[0,2\pi]$, it contains a ball $$B((0,\phi_0),\eta(\phi_0))=\{(r,\phi)\in[0,\infty)\times\mathbb R\mid \|(r,\phi)-(0,\phi_0)\|<\eta(\phi_0)\}$$ of radius $\eta(\phi_0)>0$ around every such point. This ball contains a set of the form $$V(\phi_0)=[0,\delta(\phi_0))\times(\phi_0-\delta(\phi_0),\phi_0+\delta(\phi_0))$$ which still contains $(0,\phi_0)$. The sets $V(\phi_0)$ form a cover of the interval $\{0\}\times[0,2\pi]$, which is compact, so there is a finite subcover $V(\phi_1),V(\phi_2),\ldots, V(\phi_n)$, where $\phi_1,\phi_2,\ldots,\phi_n\in[0,2\pi]$. Now take $\delta=\min\{\delta(\phi_1),\delta(\phi_2),\ldots,\delta(\phi_n)\}$. Observe that $$[0,\delta)\times[0,2\pi]\subseteq V(\phi_1)\cup V(\phi_2)\cup\ldots\cup V(\phi_n)\subseteq U,$$ so $(r,\phi)\in[0,\delta)\times [0,2\pi]$ implies $|g(r,\phi)-L|<\epsilon$. But since $g$ is $2\pi$-periodic in $\phi$, this means that $(r,\phi)\in[0,\delta)\times\mathbb R$ also implies $|g(r,\phi)-L|<\epsilon$. Therefore $V=[0,\delta)\times\mathbb R$ has the desired properties.
I'll show you my method, which I prefer over @Niki Di Giano's.
You can obtain the partial derivatives without solving for $r$ and $\theta$ in terms of $x,y$.
First, obtain the partial derivatives with respect to $r$ and $\theta$. Using the multivariable chain rule,
\begin{align} \frac{\partial u}{\partial r} &= \frac{\partial u}{\partial x}\frac{\partial x}{\partial r} + \frac{\partial u}{\partial y}\frac{\partial y}{\partial r} = \cos\theta\frac{\partial u}{\partial x} + \sin\theta \frac{\partial u}{\partial y} \\ \frac{\partial u}{\partial \theta} &= \frac{\partial u}{\partial x}\frac{\partial x}{\partial \theta} + \frac{\partial u}{\partial y}\frac{\partial y}{\partial \theta} = -r\sin\theta\frac{\partial u}{\partial x} + r\cos\theta\frac{\partial u}{\partial y} \end{align}
You can treat this as a system of linear equations
\begin{align} \cos\theta\frac{\partial u}{\partial x} + \sin\theta \frac{\partial u}{\partial y} &= \frac{\partial u}{\partial r} \\ -\sin\theta\frac{\partial u}{\partial x} + \cos\theta\frac{\partial u}{\partial y} &= \frac{1}{r}\frac{\partial u}{\partial \theta} \end{align}
where $\frac{\partial u}{\partial x}$ and $\frac{\partial u}{\partial y}$ are the unknowns. Use whatever method you want to solve for the $x$ and $y$ partials, and obtain
\begin{align} \frac{\partial u}{\partial x} &= \cos\theta\frac{\partial u}{\partial r} - \frac{\sin\theta}{r}\frac{\partial u}{\partial \theta} \\ \frac{\partial u}{\partial y} &= \sin\theta\frac{\partial u}{\partial r} + \frac{\cos\theta}{r}\frac{\partial u}{\partial \theta} \end{align}
I prefer to do this when the inverse coordinate change is less straightforward and it's more convenient to solve the linear system. Otherwise, you should get the same result.
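The solved formulas can be verified numerically with finite differences. A minimal sketch (the test function $u(x,y)=x^2y$ and the sample point $(r_0,\theta_0)$ are arbitrary illustrative choices):

```python
import math

# Numerically check the polar formulas
#   u_x = cos(t) u_r - sin(t)/r * u_t
#   u_y = sin(t) u_r + cos(t)/r * u_t
# on the illustrative test function u(x, y) = x^2 * y.

def u(x, y):
    return x**2 * y

def g(r, t):                      # u expressed in polar coordinates
    return u(r * math.cos(t), r * math.sin(t))

def central(fun, a, h=1e-6):      # central finite difference
    return (fun(a + h) - fun(a - h)) / (2 * h)

r0, t0 = 1.3, 0.7
u_r = central(lambda r: g(r, t0), r0)
u_t = central(lambda t: g(r0, t), t0)

u_x = math.cos(t0) * u_r - math.sin(t0) / r0 * u_t
u_y = math.sin(t0) * u_r + math.cos(t0) / r0 * u_t

# Exact Cartesian partials of u: u_x = 2xy, u_y = x^2
x0, y0 = r0 * math.cos(t0), r0 * math.sin(t0)
assert abs(u_x - 2 * x0 * y0) < 1e-6
assert abs(u_y - x0**2) < 1e-6
print("polar formulas agree with the Cartesian partials")
```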
You can treat $\frac{\partial }{\partial x}$ as an operator and apply it twice, i.e.
\begin{align} \frac{\partial^2 u}{\partial x^2} &= \left(\cos\theta\frac{\partial }{\partial r} - \frac{\sin\theta}{r}\frac{\partial }{\partial \theta} \right)\left(\cos\theta\frac{\partial u}{\partial r} - \frac{\sin\theta}{r}\frac{\partial u}{\partial \theta} \right) \\ &= \cos\theta\frac{\partial }{\partial r}\left(\cos\theta\frac{\partial u}{\partial r}\right) - \frac{\sin\theta}{r}\frac{\partial }{\partial \theta}\left(\cos\theta\frac{\partial u}{\partial r}\right) \\ &\quad - \cos\theta\frac{\partial }{\partial r}\left(\frac{\sin\theta}{r}\frac{\partial u}{\partial \theta}\right) + \frac{\sin\theta}{r}\frac{\partial }{\partial \theta}\left(\frac{\sin\theta}{r}\frac{\partial u}{\partial \theta}\right) \\ &= \dots \end{align}
and likewise for $u_{yy}$ to prove the Laplacian identity.
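The resulting Laplacian identity $u_{xx}+u_{yy}=u_{rr}+\frac1r u_r+\frac1{r^2}u_{\theta\theta}$ can likewise be spot-checked numerically. A sketch (the test function $u(x,y)=x^2y$, whose Laplacian is $2y$, and the sample point are arbitrary choices):

```python
import math

# Numerically check the polar Laplacian identity
#   u_xx + u_yy = u_rr + (1/r) u_r + (1/r^2) u_tt
# on the test function u(x, y) = x^2 * y, whose Laplacian is 2y.

def g(r, t):
    x, y = r * math.cos(t), r * math.sin(t)
    return x**2 * y

def d2(fun, a, h=1e-4):           # second central difference
    return (fun(a + h) - 2 * fun(a) + fun(a - h)) / h**2

def d1(fun, a, h=1e-6):           # first central difference
    return (fun(a + h) - fun(a - h)) / (2 * h)

r0, t0 = 1.3, 0.7
polar_lap = (d2(lambda r: g(r, t0), r0)
             + d1(lambda r: g(r, t0), r0) / r0
             + d2(lambda t: g(r0, t), t0) / r0**2)

y0 = r0 * math.sin(t0)
assert abs(polar_lap - 2 * y0) < 1e-4
print("polar Laplacian matches u_xx + u_yy")
```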
Best Answer
In two variables it is fine to use polar coordinates, i.e. the direction vector $(\cos\theta, \sin\theta)$. You can also manage three-dimensional spherical coordinates, but for higher dimensions (the $n$-sphere) things get messy. For more variables you can use the definition of the derivative, which requires
$$\frac{\left|f(a+h,b+k)-f(a,b)-\nabla f\cdot(h,k)\right|}{\|(h,k)\|}\to 0$$ This is for two variables (checking differentiability at $(a,b)$, where $(h,k)$ is a small increment so that $(a+h,b+k)$ lies in a ball around $(a,b)$), but you can generalize.
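For the function from the question, this quotient can be checked numerically at the origin. A sketch (assuming, as one can compute, that every directional derivative there is $0$, so the candidate gradient is $(0,0)$; the sampled radii and angles are arbitrary):

```python
import math

# For f(x, y) = x^2 y / sqrt(x^2 + y^2) with f(0,0) = 0, every directional
# derivative at the origin is 0, so the candidate gradient is (0, 0).
# Differentiability requires |f(h,k) - 0| / ||(h,k)|| -> 0.
def f(h, k):
    return h**2 * k / math.sqrt(h**2 + k**2)

def quotient(h, k):
    return abs(f(h, k)) / math.hypot(h, k)

# In polar form the quotient is r cos^2(t) |sin(t)| <= r, so it tends to 0
# uniformly in the angle; numerically it shrinks with r in every direction:
for r in [1e-1, 1e-3, 1e-5]:
    worst = max(quotient(r * math.cos(0.1 * i), r * math.sin(0.1 * i))
                for i in range(1, 63))
    assert worst <= r + 1e-12
print("difference quotient tends to 0: f is differentiable at the origin")
```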
To prove that a function is not differentiable, your observation is right: look for a direction $v$ in which the directional derivative at $a$ differs from $\nabla f(a)\cdot v$.
Further, there is a theorem: "If all partial derivatives exist and are continuous in a neighborhood of $x$, then $f$ is differentiable at $x$."