Why does the method of Lagrange multipliers fail

Tags: lagrange-multiplier, multivariable-calculus, optimization

Question

Let $\mathbf{x}$ and $\mathbf{y}$ be two vectors in $\mathbb{R}^2$, and let $f$ be the function defined by $f(\mathbf{x},\mathbf{y}) := \mathbf{x} \cdot \mathbf{y}$. Can this function be minimised subject to the constraints $\parallel \mathbf{x} \parallel = \parallel \mathbf{y} \parallel = 1$ with the method of Lagrange multipliers?

Attempt

I failed to use Lagrange multipliers to solve this question. The primary reason is that, when I express $f$ as a function of four variables $x_1, x_2, x_3, x_4$, the gradient of $g_1$ has its last two entries equal to $0$ and the gradient of $g_2$ has its first two entries equal to $0$, while the gradient of $f$ has no zero entries. Thus it seems there can be no $\lambda_1$ and $\lambda_2$ satisfying the Lagrange condition. Am I correct? Or am I grossly oversimplifying the question by expressing $f$ as a function of four variables?

Best Answer

Let $f,g_1,g_2:\mathbb R^{2d}\to\mathbb R$ be given by $$ f(x,y)=x\cdot y \qquad g_1(x,y)=\|x\|^2 \qquad g_2(x,y)=\|y\|^2. $$ (In your question $d=2$.) You want to minimize $f$ subject to $g_1=g_2=1$. Then $$ \nabla f(x,y) = (y,x) \qquad \nabla g_1(x,y) = (2x, 0) \qquad \nabla g_2(x,y) = (0, 2y). $$ From Lagrange you get $$ (y,x) = \lambda_1(2x,0) + \lambda_2(0,2y) = 2(\lambda_1 x, \lambda_2 y). $$ Comparing components gives $y=2\lambda_1 x$ and $x=2\lambda_2 y$, so $x$ and $y$ must be linearly dependent, hence $y=\pm x$ because of the constraints. $y=-x$ is a global minimizer, $y=x$ is a global maximizer.
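As a quick numerical sanity check (my own sketch, not part of the answer above), one can verify that the candidate minimizer $y=-x$ with, say, $x = e_1$ does satisfy the stationarity condition with $\lambda_1 = \lambda_2 = -\tfrac12$, and that $f$ attains $-1$ there:

```python
import numpy as np

d = 2
x = np.array([1.0, 0.0])  # a unit vector, so g1 = 1
y = -x                    # candidate minimizer y = -x, so g2 = 1

# Gradients as derived above: grad f = (y, x), grad g1 = (2x, 0), grad g2 = (0, 2y)
grad_f = np.concatenate([y, x])
grad_g1 = np.concatenate([2 * x, np.zeros(d)])
grad_g2 = np.concatenate([np.zeros(d), 2 * y])

# With lambda1 = lambda2 = -1/2 the Lagrange condition holds exactly
lam1 = lam2 = -0.5
residual = grad_f - lam1 * grad_g1 - lam2 * grad_g2

print(np.allclose(residual, 0))  # True: stationarity holds
print(x @ y)                     # -1.0: the global minimum value
```

Swapping in `y = x` with `lam1 = lam2 = 0.5` verifies the maximizer the same way.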