[Math] the difference between exact and partial differentiation


My understanding of the partial $\left( \frac{\partial}{\partial} \right)$ and total
$\left( \frac{d}{d} \right)$ derivative is that, assuming $f(x_1, x_2, \dots, x_n)$ where the $x_i$ are not necessarily independent:

$$\frac{d f}{d x_i}=\sum^n_{j=1} \left(\frac{\partial f}{\partial x_j}\frac{d x_j}{d x_i} \right)$$

Here $\frac{\partial f}{\partial x_i}$ is the symbolic derivative of $f(x_1, x_2, \dots, x_n)$ computed as if all $x_j$ with $j \neq i$ were constants. Of course, when the $x_i$ are independent:

$$\frac{\partial f}{\partial x_i}=\frac{d f}{ dx_i}$$
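The distinction above can be checked symbolically. Here is a minimal SymPy sketch (the concrete function and the dependence $x_2 = x_2(x_1)$ are my own choices, purely for illustration): the partial derivative treats $x_2$ as a constant, while the total derivative picks up the extra chain-rule term.

```python
# Sketch: partial vs total derivative when the variables are dependent.
import sympy as sp

x1 = sp.Symbol('x1')
x2 = sp.Function('x2')(x1)      # x2 is NOT independent: x2 = x2(x1)
f = x1**2 * x2                  # an arbitrary example function

# Partial derivative: replace x2(x1) by a plain symbol, i.e. hold it constant.
x2s = sp.Symbol('x2')
partial = sp.diff(f.subs(x2, x2s), x1)        # ∂f/∂x1 = 2*x1*x2

# Total derivative: differentiate through the dependence x2(x1).
total = sp.diff(f, x1)          # df/dx1 = 2*x1*x2 + x1**2 * x2'(x1)

# The two differ exactly by the chain-rule term (∂f/∂x2)·(dx2/dx1).
extra = total - partial.subs(x2s, x2)
assert sp.simplify(extra - x1**2 * sp.Derivative(x2, x1)) == 0
```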

But in thermodynamics I see this "exact differential" notation

$$\left(\frac{\partial f}{\partial x_i} \right)_{x_j}$$

which to me looks exactly the same as the partial derivative. For example, see these videos of thermodynamics lectures from MIT. I find this concept/notation redundant and confusing. I would appreciate it if you could explain the difference between partial and exact differentials and give me a tangible example where they are not the same.

P.S.1. This post also supports my point:

In fact, the constancy of the other variables is implicit in the partial differential notation (∂/∂x) but it is customary to write the variables that are constant under the derivative when discussing thermodynamics, just to keep track of what other variables we were considering in that particular case.

Which, if true, is an awful idea. Partial differential equations are already long and confusing enough without these redundant notations. Why on earth should we make them even more difficult?

Best Answer

If you consider $f$ simply as a function of $n$ variables, there is no such thing as "exact" differentiation. The point is that, sometimes, the variables depend on each other through "external" variables. Let me clarify with an example. Consider functions $f:\mathbb{R}^2\to\mathbb{R}$ and $u:\mathbb{R}\to \mathbb{R}$. $f$ has two partial derivatives, $f_x$ and $f_y$. Suppose we form the composition $$ \Big( f\circ (\cdot,u(\cdot))\Big)(z) = f(z,u(z)). $$ We can still talk about the partial derivatives of $f$, which are $$ f_x(z,u(z)) \quad \text{and} \quad f_y(z,u(z)). $$ We speak of the "total derivative" of $f$ when we differentiate with respect to the parameter on which all the variables depend, namely $z$: $$ \big(f(z,u(z))\big)' = f_x(z,u(z))+f_y(z,u(z))\,u'(z). $$ A nice example where both concepts come into action is the Euler–Lagrange equation.
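The composition above can be verified with SymPy. This is a sketch with concrete choices of my own (the answer leaves $f$ and $u$ generic): differentiating $f(z, u(z))$ directly agrees with the chain-rule expression $f_x + f_y\,u'(z)$.

```python
# Sketch: total derivative of the composition f(z, u(z)) via the chain rule.
import sympy as sp

x, y, z = sp.symbols('x y z')
f = x * y**2          # example f : R^2 -> R (my choice)
u = sp.sin(z)         # example u : R -> R   (my choice)

# Partial derivatives of f, evaluated along the curve z |-> (z, u(z)).
f_x = sp.diff(f, x).subs({x: z, y: u})
f_y = sp.diff(f, y).subs({x: z, y: u})

# Total derivative: differentiate the composition directly...
total = sp.diff(f.subs({x: z, y: u}), z)
# ...and via the chain rule f_x(z,u(z)) + f_y(z,u(z)) * u'(z).
chain = f_x + f_y * sp.diff(u, z)

assert sp.simplify(total - chain) == 0   # the two expressions agree
```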

The connection with your linked page (differential forms) is the following. Suppose you have a map $F:\mathbb{R}^2\to\mathbb{R}^2$. For each $(x,y)\in \mathbb{R}^2$, $F(x,y)$ is an element of the dual of $\mathbb{R}^2$, i.e. a linear form. The theory developed to decide whether $F$ is also the gradient of a function $f:\mathbb{R}^2\to \mathbb{R}$ requires you to integrate over curves in $\mathbb{R}^2$. Curves are functions $t\mapsto(x(t),y(t))\in \mathbb{R}^2$, $t\in [t_0,t_1]$. If the potential $f$ exists, its partial derivatives are $$ f_x = F_1, \quad f_y = F_2. $$ On the other hand, the derivative of $f$ along a curve $(x(t),y(t))$ is $$ \big(f(x(t),y(t))\big)' = f_x(x(t),y(t))\,x'(t)+f_y(x(t),y(t))\,y'(t) = F(x(t),y(t))\cdot (x'(t),y'(t)). $$ Therefore you can define the total differential of $f$ to be $$ df = f_x\,dx + f_y\,dy, $$ where $dx$ and $dy$ must be understood as "small" variations of the space variables "in time".
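The potential test sketched above can also be run symbolically. Here is a minimal SymPy example (the field $F$ is my own choice, not from the answer): check the necessary condition $\partial F_1/\partial y = \partial F_2/\partial x$, then recover a potential $f$ with $f_x = F_1$ and verify $f_y = F_2$, so $F_1\,dx + F_2\,dy$ is an exact differential.

```python
# Sketch: checking that F1*dx + F2*dy is exact and recovering a potential.
import sympy as sp

x, y = sp.symbols('x y')
F1, F2 = 2*x*y, x**2      # example field F = (F1, F2) (my choice)

# Necessary condition for F to be a gradient: dF1/dy == dF2/dx.
assert sp.diff(F1, y) == sp.diff(F2, x)

# Recover a potential f with f_x = F1 (up to a function of y alone),
# then verify that f_y = F2 as well.
f = sp.integrate(F1, x)                  # gives x**2 * y
assert sp.diff(f, x) == F1 and sp.diff(f, y) == F2
```

With this $F$, the potential is $f(x,y) = x^2 y$, and $df = 2xy\,dx + x^2\,dy$ is exact; a field failing the cross-derivative check (e.g. swapping the components) would admit no such $f$.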
