Optimization problem with Lagrange multipliers

calculus, lagrange-multiplier, multivariable-calculus, optimization

I'm trying to solve the following optimization problem:
\begin{align}
\underset{\{x_i\},\{y_j\}}{\operatorname{argmax}} \quad & \frac{1}{2}\left( \sum_{i=1}^{N} \log_2\left(1+\alpha_i x_i\right) + \sum_{j=1}^{M} \log_2\left(1+ \beta_j y_j\right) \right) \\
\text{subject to: } & \sum_{i=1}^N x_i + \sum_{j=1}^M y_j \leq A \\
& \sum_{j=1}^{M} \log_2\left(1+ \beta_j y_j\right) \leq B
\end{align}

To this end, I tried to use the method of Lagrange multipliers. I defined the Lagrangian function
\begin{align}
\mathcal{L}(x_1,\ldots,x_N,y_1,\ldots,y_M,\lambda_1,\lambda_2) &= \left( \sum_{i=1}^{N} \log_2\left(1+\alpha_i x_i\right) + \sum_{j=1}^{M} \log_2\left(1+ \beta_j y_j\right) \right) \\
& - \lambda_1 \left(\sum_{i=1}^N x_i + \sum_{j=1}^M y_j - A\right) \\
& -\lambda_2 \left(\sum_{j=1}^{M} \log_2\left(1+ \beta_j y_j\right) -B\right)
\end{align}

and I computed the partial derivatives of the Lagrangian function with respect to generic $x_i$ and $y_j$. Setting them to zero led to
\begin{align}
& \frac{\partial \mathcal{L}}{\partial x_i} = \frac{\alpha_i}{\alpha_i x_i \log(2) + \log(2)} - \lambda_1 = 0, \\
& \frac{\partial \mathcal{L}}{\partial y_j} = \frac{\beta_j}{\beta_j y_j \log(2) + \log(2)} - \lambda_1 -\lambda_2 \frac{\beta_j}{\beta_j y_j \log(2) + \log(2)} = 0,
\end{align}

which resulted in
\begin{align}
& x_i = \frac{1}{\log(2) \lambda_1} - \frac{1}{\alpha_i}, \\
& y_j = \frac{1-\lambda_2}{\log(2) \lambda_1} - \frac{1}{\beta_j}.
\end{align}

I don't know how to proceed from here onwards. Any help?

Best Answer

Hint.

I will use $\ln$ instead of $\log_2$ to simplify the notation. First, turn the inequality constraints into equalities by introducing slack variables $s_1, s_2$, so that the Lagrangian reads

\begin{align} \mathcal{L}(x_1,\ldots,x_N,y_1,\ldots,y_M,\lambda_1,\lambda_2,s_1,s_2) &= \frac 12\left( \sum_{i=1}^{N} \ln\left(1+\alpha_i x_i\right) + \sum_{j=1}^{M} \ln\left(1+ \beta_j y_j\right) \right) \\ & - \lambda_1 \left(\sum_{i=1}^N x_i + \sum_{j=1}^M y_j - A+s_1^2\right) \\ & -\lambda_2 \left(\sum_{j=1}^{M} \ln\left(1+ \beta_j y_j\right) -B+s_2^2\right) \end{align}
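To see where the system below comes from, differentiate $\mathcal{L}$ with respect to $x_i$, $y_j$ and the slack variables and set each derivative to zero:

\begin{align}
\frac{\partial \mathcal{L}}{\partial x_i} &= \frac{\alpha_i}{2\left(1+\alpha_i x_i\right)} - \lambda_1 = 0, \\
\frac{\partial \mathcal{L}}{\partial y_j} &= \frac{\beta_j}{2\left(1+\beta_j y_j\right)} - \lambda_1 - \lambda_2\,\frac{\beta_j}{1+\beta_j y_j} = 0, \\
\frac{\partial \mathcal{L}}{\partial s_1} &= -2\lambda_1 s_1 = 0, \qquad \frac{\partial \mathcal{L}}{\partial s_2} = -2\lambda_2 s_2 = 0.
\end{align}

Differentiating with respect to $\lambda_1,\lambda_2$ recovers the two slack-augmented constraints, and solving the first two equations above for $x_i$ and $y_j$ gives the closed forms in the system;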

now, the stationary points are the solutions of the system

\begin{align} & x_i = \frac{1}{2 \lambda_1} - \frac{1}{\alpha_i}, \\ & y_j = \frac{1-2\lambda_2}{2\lambda_1} - \frac{1}{\beta_j},\\ & \sum_{i=1}^N x_i + \sum_{j=1}^M y_j - A+s_1^2= 0,\\ & \sum_{j=1}^{M} \ln\left(1+ \beta_j y_j\right) -B+s_2^2=0,\\ & \lambda_1 s_1 = 0,\\ & \lambda_2 s_2 = 0. \end{align}

We can proceed by substituting the expressions found for $x_i, y_j$ into the constraints, which gives

\begin{align} &\frac{N}{2 \lambda_1} - \sum_{i=1}^N\frac{1}{\alpha_i}+M\frac{1-2\lambda_2}{2\lambda_1}- \sum_{j=1}^M\frac{1}{\beta_j}-A+s_1^2=0,\\ &\left(\frac{1-2\lambda_2}{2\lambda_1}\right)^M\prod_{j=1}^M\beta_j=e^{B-s_2^2} \end{align}
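To make the second equation explicit: at a stationary point $1+\beta_j y_j = \beta_j\,\frac{1-2\lambda_2}{2\lambda_1}$, so the constraint $\sum_{j}\ln\left(1+\beta_j y_j\right) = B - s_2^2$ exponentiates to

\begin{align}
\sum_{j=1}^{M}\ln\!\left(\beta_j\,\frac{1-2\lambda_2}{2\lambda_1}\right) = B - s_2^2
\quad\Longleftrightarrow\quad
\left(\frac{1-2\lambda_2}{2\lambda_1}\right)^{M}\prod_{j=1}^{M}\beta_j = e^{B-s_2^2}.
\end{align}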

Now, using the complementary slackness conditions $\lambda_i s_i = 0$, we can (in principle) solve for the $\lambda_i$ and finally obtain the stationary points. The solutions obtained with $s_i = 0$ lie on the boundary of the feasible region.
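For instance (one illustrative case, not the full case analysis): if the second constraint is inactive, complementary slackness forces $\lambda_2 = 0$, and if the budget constraint is tight ($s_1 = 0$), the first equation above gives

\begin{align}
\lambda_1 = \frac{N+M}{2\left(A + \sum_{i=1}^N \frac{1}{\alpha_i} + \sum_{j=1}^M \frac{1}{\beta_j}\right)},
\qquad
x_i = \frac{1}{2\lambda_1}-\frac{1}{\alpha_i},
\qquad
y_j = \frac{1}{2\lambda_1}-\frac{1}{\beta_j},
\end{align}

which is valid provided the resulting $y_j$ also satisfy $\sum_j \ln\left(1+\beta_j y_j\right) \le B$; otherwise the second constraint is active, $s_2 = 0$, and $\lambda_2$ must be determined from the second equation as well.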
