[Math] Lagrangian Multipliers

convex optimization, nonlinear optimization

I have a fundamental question about Lagrange multipliers. Here it is:

I have a function to maximize with respect to a parameter, say $\theta$, subject to two constraints. Let's assume that the first constraint is multiplied by the Lagrange multiplier $\alpha_1$ and the second constraint by the Lagrange multiplier $\alpha_2$. The function to be maximized is concave, and the constraints are either convex or concave in $\theta$.

Question: When we get the two equations for $\alpha_1$ and $\alpha_2$, can we say that there is a unique solution for these two parameters? That is, will the parameters be monotone in these equations?

Best Answer

The number of constraints does not matter here, as long as there are fewer constraints than variables. If the objective function is strictly concave and the constraints are strictly convex in the variable $\theta$, then there is an easy uniqueness result. (There is also a more complicated result, where you only look at the behavior of these functions along the admissible set, and which involves what is called the bordered Hessian.)

Let $f(\theta)$ be the function you want to maximize, where $\theta$ has components $\theta=(\theta_1,\dots,\theta_n)$, and let the constraints be $g_1(\theta)=b_1$ and $g_2(\theta)=b_2$. Introduce the Lagrangian function $$L=L(\theta,\alpha)=f(\theta)-\alpha_1(g_1(\theta)-b_1)-\alpha_2(g_2(\theta)-b_2).$$ The Lagrange conditions are $${\partial L \over \partial \theta_i}=0,\qquad i=1,\dots,n, \\ {\partial L\over \partial \alpha_j}=0, \qquad j=1,2,$$ which means that the optimal solution $(\theta^*,\alpha^*)$ should be a stationary point of $L$.
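
To make this concrete, here is a minimal SymPy sketch on an invented example (the particular $f$, $g_1$, $g_2$, $b_1$, $b_2$ below are my own illustration, not from the question): it builds the Lagrangian above and solves the stationarity conditions symbolically.

```python
import sympy as sp

# Hypothetical example (not from the original post): a strictly concave
# objective with two linear equality constraints.
t1, t2, t3, a1, a2 = sp.symbols('theta1 theta2 theta3 alpha1 alpha2', real=True)

f = -(t1**2 + t2**2 + t3**2)           # strictly concave objective f(theta)
g1, b1 = t1 + t2 + t3, 1               # first constraint  g1(theta) = b1
g2, b2 = t1 - t2, 0                    # second constraint g2(theta) = b2

# Lagrangian L(theta, alpha) = f - alpha1*(g1 - b1) - alpha2*(g2 - b2)
L = f - a1*(g1 - b1) - a2*(g2 - b2)

# Stationarity: dL/dtheta_i = 0 (i = 1..3) and dL/dalpha_j = 0 (j = 1, 2)
eqs = [sp.diff(L, v) for v in (t1, t2, t3, a1, a2)]
print(sp.solve(eqs, (t1, t2, t3, a1, a2), dict=True))
# -> one solution: theta* = (1/3, 1/3, 1/3), alpha* = (-2/3, 0)
```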

Cases where the gradient vectors $\nabla g_1(\theta)$ and $\nabla g_2(\theta)$ are linearly dependent at some admissible point have to be checked separately, as Lagrange's method is not guaranteed to find the optimum there.

If $\nabla g_1(\theta)$ and $\nabla g_2(\theta)$ are linearly independent at all admissible points (this is often the case), then the Lagrange conditions above are necessary at an optimal point, and $\alpha^*$ will be uniquely determined.
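
As a small numerical check (same invented example as above), the stationarity conditions in $\theta$ say $\nabla f(\theta^*)=\alpha_1\nabla g_1(\theta^*)+\alpha_2\nabla g_2(\theta^*)$; when the two constraint gradients are linearly independent, this linear system determines $\alpha^*$ uniquely:

```python
import numpy as np

# Same hypothetical example as above: at theta* = (1/3, 1/3, 1/3) the multipliers
# solve grad f(theta*) = alpha1*grad g1(theta*) + alpha2*grad g2(theta*).
theta_star = np.array([1/3, 1/3, 1/3])

grad_f  = -2 * theta_star               # gradient of -(theta1^2+theta2^2+theta3^2)
grad_g1 = np.array([1.0, 1.0, 1.0])     # gradient of theta1+theta2+theta3
grad_g2 = np.array([1.0, -1.0, 0.0])    # gradient of theta1-theta2

A = np.column_stack([grad_g1, grad_g2]) # 3x2 system; rank 2 <=> independent gradients
alpha, *_ = np.linalg.lstsq(A, grad_f, rcond=None)
print(np.linalg.matrix_rank(A), alpha)  # -> 2, [-2/3, 0]: alpha* is pinned down uniquely
```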

Now, if $f(\theta)$ is strictly concave and each $g_j(\theta)$ is strictly convex (with the multipliers $\alpha_j^*$ nonnegative, so that each $-\alpha_j^* g_j$ is concave), then $L(\cdot,\alpha^*)$ will be strictly concave in $\theta$, which means that it can have at most one stationary point $\theta^*$. Hence the solution, if it exists, is unique.
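
To illustrate the uniqueness claim numerically (again on the invented example, using SciPy's generic SLSQP solver rather than anything from this answer), maximizing $f$ under both constraints from several starting points always returns the same $\theta^*$:

```python
import numpy as np
from scipy.optimize import minimize

# Maximize the same hypothetical strictly concave f under both constraints,
# starting from several different points; every run reaches the same theta*.
neg_f = lambda th: th[0]**2 + th[1]**2 + th[2]**2      # minimize -f(theta)
cons = [{'type': 'eq', 'fun': lambda th: th[0] + th[1] + th[2] - 1},
        {'type': 'eq', 'fun': lambda th: th[0] - th[1]}]

for x0 in ([0., 0., 0.], [5., -3., 2.], [-1., 4., 0.]):
    res = minimize(neg_f, x0, method='SLSQP', constraints=cons)
    print(res.x)   # -> approximately (1/3, 1/3, 1/3) each time
```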
