[Math] How to maximize scalar product using Lagrangian methods

lagrange-multiplier, optimization

maximize $U(x) = u \cdot x$ subject to $p \cdot x = w$

given that $u, p, x \in \mathbb{R}^L_+$, $w \in \mathbb{R}_+$.

I can solve it classically:

\begin{align}
u \cdot x &= \sum u_i x_i \\
&= \sum \frac{u_i}{p_i} p_i x_i\\
&\le \left(\max_i \frac{u_i}{p_i}\right) \sum_j p_j x_j\\
&= \left(\max_i \frac{u_i}{p_i}\right) w
\end{align}

with equality when $x = \frac{w}{p_i} e_i$ where $i = \text{argmax}_j \frac{u_j}{p_j}$ and $e_i$ is the $i$-th basis vector.
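Since the objective and constraint are both linear, this is a linear program, and the closed-form answer can be cross-checked numerically. A minimal sketch (the vectors `u`, `p` and budget `w` below are made-up illustrative values, and `scipy.optimize.linprog` is used as a generic LP solver):

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data (hypothetical values)
u = np.array([3.0, 5.0, 2.0])   # utility weights
p = np.array([2.0, 4.0, 1.0])   # prices, all positive
w = 10.0                        # budget

# Closed-form maximizer: spend the whole budget on the good
# with the largest ratio u_i / p_i
i_star = np.argmax(u / p)
x_star = np.zeros_like(u)
x_star[i_star] = w / p[i_star]

# Cross-check with an LP solver: linprog minimizes, so negate
# the objective; p . x = w enters as an equality constraint.
res = linprog(-u, A_eq=p.reshape(1, -1), b_eq=[w],
              bounds=[(0, None)] * len(u))

print(u @ x_star)   # closed-form optimum
print(-res.fun)     # LP optimum, should agree
```

Both values coincide, as the inequality chain above predicts.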

But I would like to see a proof using Lagrange multipliers.

Best Answer

Let $f(x_1,\ldots,x_n)=u_1x_1+\ldots+u_nx_n$ and $g(x_1,\ldots,x_n)=p_1x_1+\ldots+p_nx_n$, where $\textbf{u}\ge 0$ and $\textbf{p}> 0$ (coordinate-wise).

Solve

\begin{align}
\max_x \,\,\, &f(x_1,\ldots,x_n) \\
\text{subject to}\,\,\, &g(x_1,\ldots,x_n) = a \\
&x_i \ge 0, \text{ for } i=1,\ldots,n
\end{align}
This is a constrained optimization problem with inequality constraints, which is solved by the Kuhn-Tucker method. We use $\lambda$ as the multiplier on the equality constraint and $\mu_i$ as the multipliers on the non-negativity constraints. The Kuhn-Tucker conditions are:

\begin{align}
u_i &= \lambda p_i + \mu_i, \text{ for } i=1,\ldots,n \\
a &= p_1x_1+\ldots+p_nx_n \\
\mu_i &\le 0, \text{ for } i=1,\ldots,n \\
\mu_i x_i &= 0, \text{ for } i=1,\ldots,n
\end{align}
Since $a>0$, not all $x_i$ can equal $0$. Let $i^*$ denote an index for which $x_{i^*}>0$. Then $\mu_{i^*}=0$ and $\lambda = \frac{u_{i^*}}{p_{i^*}}$. It then follows that for indices $i\ne i^*$ we must have $\mu_i = u_i - \frac{u_{i^*}}{p_{i^*}}p_i$, i.e. $\frac{\mu_i}{p_i} = \frac{u_i}{p_i} - \frac{u_{i^*}}{p_{i^*}}$, which must be non-positive (since $p_i>0$ by assumption).

Thus, $i^*$ must be an index at which $\frac{u_i}{p_i}$ is largest. For every other index $\mu_i < 0$, so complementary slackness forces $x_i=0$, and therefore $x_{i^*} = \frac{a}{p_{i^*}}$.
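The multipliers derived above can be checked directly on a small example. A sketch, reusing made-up data (the values of `u`, `p`, `a` are assumptions for illustration only): given the candidate corner solution, $\lambda$ and $\mu$ follow from the stationarity condition, and each Kuhn-Tucker condition can be verified numerically.

```python
import numpy as np

# Illustrative data (hypothetical values)
u = np.array([3.0, 5.0, 2.0])
p = np.array([2.0, 4.0, 1.0])
a = 10.0

# Candidate from the KKT analysis: all budget on the best ratio
i_star = np.argmax(u / p)
x = np.zeros_like(u)
x[i_star] = a / p[i_star]

# Multipliers implied by stationarity u_i = lam * p_i + mu_i
lam = u[i_star] / p[i_star]
mu = u - lam * p

# Verify each Kuhn-Tucker condition
assert np.allclose(u, lam * p + mu)   # stationarity
assert np.isclose(p @ x, a)           # budget constraint binds
assert np.all(mu <= 1e-12)            # mu_i <= 0 for all i
assert np.allclose(mu * x, 0)         # complementary slackness

print(lam, mu)
```

Here $\mu_{i^*}=0$ and every other $\mu_i$ is strictly negative, matching the argument that $x_i=0$ for $i \ne i^*$.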
