Should the differential of a function be zero when it has a zero value

derivatives, differential, optimization

Suppose $x_1, x_2$ are two different variables, and the function $g$ satisfies $$ g(x_1,x_2)=0. $$

Then, how can we prove the following formula:
$$ dg = \frac{\partial g}{\partial x_1}dx_1 + \frac{\partial g}{\partial x_2}dx_2 = 0 .$$
If the formula is not always true, under what conditions does it hold?

Edit:

Now I write my question as it appears in my book:

Consider the extrema of a function $$ f(x_1,x_2) $$ of two interdependent variables $x_1, x_2$, subject to the condition
$$ g(x_1,x_2)=0. \tag{1}$$
As a necessary condition for extrema, we have
$$ df = \frac{\partial f}{\partial x_1}dx_1 + \frac{\partial f}{\partial x_2}dx_2 = 0. \tag{2}$$
However, $dx_1$ and $dx_2$ are not arbitrary, but are related by the condition
$$ dg = \frac{\partial g}{\partial x_1}dx_1 + \frac{\partial g}{\partial x_2}dx_2 = 0. \tag{3}$$

How did the author conclude the last formula (3)?

Best Answer

If $g$ is the constant function $0$, then $dg = 0$ because the partial derivatives are zero.
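Spelled out: if $g(x_1,x_2) \equiv 0$ on the whole domain, both partial derivatives vanish identically, so for arbitrary displacements
$$ dg = \frac{\partial g}{\partial x_1}dx_1 + \frac{\partial g}{\partial x_2}dx_2 = 0\cdot dx_1 + 0\cdot dx_2 = 0 .$$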

If $g = 0$ holds only as a constraint and the partial derivatives are not both zero (say $\partial g/\partial x_2 \neq 0$), then by the Implicit Function Theorem $x_2$ is locally an implicit function of $x_1$. Then $dg = 0$ exactly when

$$ \frac{\partial g}{\partial x_1}dx_1 = - \frac{\partial g}{\partial x_2}dx_2 $$
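A sketch of why this is exactly the relation the constraint imposes (writing $x_2 = h(x_1)$ for the implicit function, a label not used above): along the constraint, $g\bigl(x_1, h(x_1)\bigr) \equiv 0$, so differentiating with respect to $x_1$ gives
$$ \frac{d}{dx_1}\,g\bigl(x_1, h(x_1)\bigr) = \frac{\partial g}{\partial x_1} + \frac{\partial g}{\partial x_2}\,h'(x_1) = 0 .$$
Multiplying by $dx_1$ and using $dx_2 = h'(x_1)\,dx_1$ for displacements that stay on the constraint recovers formula (3). The differential $dg$ vanishes not because $g$ is identically zero, but because only displacements tangent to the constraint surface $g=0$ are allowed.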
