[Math] Lagrange multiplier equal to zero


I have the following analytical cost function: $$ J(\mathbf{w}) = k + \mathbf{q}^{T}\mathbf{B}\mathbf{w} + \mathbf{w}^{T}\mathbf{C}\mathbf{w} - \lambda\,\mathbf{w}^{T}\mathbf{D}\mathbf{w} $$
where $k$ is a constant, bold lowercase variables are vectors (dimension $N \times 1$) and bold uppercase variables are matrices (dimension $N \times N$). The only variable is $\mathbf{w}$ (all others are constant). The constraint is $\mathbf{w}^{T}\mathbf{D}\mathbf{w} \geq 0$ and $\lambda$ is the Lagrange multiplier. The derivatives are:

$$
\begin{array}{l}
\dfrac{\partial J(\mathbf{w})}{\partial \mathbf{w}} = \mathbf{B}^{T}\mathbf{q} + \left(\mathbf{C} + \mathbf{C}^{T}\right)\mathbf{w} - \lambda\left(\mathbf{D} + \mathbf{D}^{T}\right)\mathbf{w} = \mathbf{0}\\[4pt]
\dfrac{\partial J(\mathbf{w})}{\partial \lambda} = -\,\mathbf{w}^{T}\mathbf{D}\mathbf{w} = 0\\[4pt]
\lambda \ge 0
\end{array}
$$
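
Here the symmetrized factors $\mathbf{C}+\mathbf{C}^{T}$ and $\mathbf{D}+\mathbf{D}^{T}$ come from the standard gradient identity for a quadratic form, which can be checked componentwise:

$$
\frac{\partial}{\partial w_k}\left(\mathbf{w}^{T}\mathbf{C}\mathbf{w}\right)
= \frac{\partial}{\partial w_k}\sum_{i,j} w_i C_{ij} w_j
= \sum_j C_{kj} w_j + \sum_i w_i C_{ik},
\qquad\text{i.e.}\qquad
\frac{\partial}{\partial \mathbf{w}}\left(\mathbf{w}^{T}\mathbf{C}\mathbf{w}\right) = \left(\mathbf{C} + \mathbf{C}^{T}\right)\mathbf{w}.
$$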

Pre-multiplying the first derivative by $\hat{\mathbf{w}}^{T}$ and using the second condition $\hat{\mathbf{w}}^{T}\mathbf{D}\hat{\mathbf{w}} = 0$, which makes the $\lambda$ term vanish, results in

$\hat{\mathbf{w}}^{T}\left[\mathbf{B}^{T}\mathbf{q} + \left(\mathbf{C} + \mathbf{C}^{T}\right)\hat{\mathbf{w}}\right] = 0$

As a result

$\hat{\mathbf{w}} = \left(-\mathbf{C} - \mathbf{C}^{T}\right)^{-1}\mathbf{B}^{T}\mathbf{q}$

The constraint disappeared! What happened to it? Was it intrinsically satisfied? For any $\mathbf{D}$?

Best Answer

Your last equation doesn't follow from anything you'd written above; certainly not from the equation above it, which is a scalar equation and thus can only determine at most one degree of freedom of $\hat{\mathbf w}$. If you want to get a vector equation such as the last equation, you have to solve the equation $\partial J(\mathbf w)/\partial\mathbf w=\mathbf 0$ without projecting it along a single direction, and then $\mathbf D+\mathbf D^T$ does appear in the matrix whose inverse you need to take.
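
Concretely, assuming $\lambda\left(\mathbf{D} + \mathbf{D}^{T}\right) - \mathbf{C} - \mathbf{C}^{T}$ is invertible, solving the full vector equation gives

$$
\mathbf{B}^{T}\mathbf{q} + \left(\mathbf{C} + \mathbf{C}^{T}\right)\hat{\mathbf{w}} - \lambda\left(\mathbf{D} + \mathbf{D}^{T}\right)\hat{\mathbf{w}} = \mathbf{0}
\quad\Longrightarrow\quad
\hat{\mathbf{w}} = \left[\lambda\left(\mathbf{D} + \mathbf{D}^{T}\right) - \mathbf{C} - \mathbf{C}^{T}\right]^{-1}\mathbf{B}^{T}\mathbf{q},
$$

so $\mathbf{D}$ (and hence the constraint) only drops out of the solution in the special case $\lambda = 0$, i.e. when the constraint is inactive.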
