The steepest descent direction constrained to non-negative variables

Tags: gradient-descent, optimization, subgradient

Recently, another user asked about the steepest descent direction constrained to non-negative variables, but using the $L_1$ norm to avoid null directions; see this link. Their ideas led to the following optimization problem (here $c$ denotes the gradient of the objective at the current point):
$$
\begin{align} \tag{*}
\min_{d \in \mathbb{R}^{n}} &\ c^{T} d \\
\text{subject to } & \text{e}^{T} d = 1 \\
& d\geq 0,
\end{align}
$$
in which $\text{e} = (1,1,1,\cdots,1)^{T}$. However, this descent direction might not be a good one, since it tends to be sparse: a linear program over the simplex attains its minimum at a vertex, so (*) is solved by a standard basis vector whose single nonzero entry sits at an index $j \in \arg\min_i c_i$. Hence, if one wants to minimize over the non-negative orthant using a descent algorithm, it would be preferable to look for non-zero descent directions whose Euclidean norm is 1. That is, instead of solving (*), one should look for directions $d$ solving the problem
$$
\begin{align} \tag{**}
\min_{d \in \mathbb{R}^{n}} &\ c^{T} d \\
\text{subject to } & \dfrac{1}{2}\|d\|^2_{2} = \dfrac{1}{2} \\
& d\geq 0,
\end{align}
$$

where $\|\cdot\|_2$ denotes the Euclidean norm. The question is:

What are the directions $d$ that solve problem (**)?

It does not seem hard to find good candidates for a minimizer, but after some thought the problem does not seem trivial either.
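For concreteness, a natural candidate (stated here as an assumption to be checked, not a claim from the original post) is the normalized projection of $-c$ onto the non-negative orthant, $d = \max(-c,0)/\|\max(-c,0)\|_2$, which is well defined whenever $c$ has at least one negative entry. A minimal numerical check in Python, comparing this candidate against a general-purpose local solver (SciPy's SLSQP) applied to (**):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: c plays the role of the gradient at the current point.
rng = np.random.default_rng(0)
c = rng.standard_normal(5)

# Candidate direction: project -c onto the non-negative orthant, normalize.
# Assumes c has at least one negative entry; otherwise the projection is
# the zero vector and no feasible descent direction exists.
p = np.maximum(-c, 0.0)
d_candidate = p / np.linalg.norm(p)

# Solve (**) numerically with a general-purpose local solver for comparison.
res = minimize(
    lambda d: c @ d,
    x0=np.full(5, 1.0) / np.sqrt(5.0),  # feasible start: non-negative, unit norm
    constraints=[{"type": "eq", "fun": lambda d: 0.5 * (d @ d) - 0.5}],
    bounds=[(0.0, None)] * 5,
    method="SLSQP",
)

print("candidate objective:", c @ d_candidate)
print("solver    objective:", res.fun)
```

If the two printed values agree to solver tolerance, that supports (but of course does not prove) the candidate.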

Best Answer

The Non-Negative Least-Squares (NNLS) problem is $$\min_x\;\|Ax-b\|^2_2 \qquad \text{s.t.}\;\;x_k\ge 0,$$ and remapping the variables to $\left(A=c^{T},\; b=[1],\; x=d\right)$ recovers the initial problem.

The standard solver for this problem is Matlab's lsqnonneg which has been ported to many other languages (R, Octave, Scilab, Julia, etc).
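For instance, SciPy ships such a port as scipy.optimize.nnls. A minimal sketch of the remapping above (the data is illustrative):

```python
import numpy as np
from scipy.optimize import nnls

# Illustrative gradient vector.
c = np.array([1.0, -2.0, 0.5, -0.3])

# The remapping from the answer: A = c^T (a single row), b = [1], x = d.
A = c.reshape(1, -1)
b = np.array([1.0])

# nnls solves  min ||A d - b||_2  subject to  d >= 0.
d, residual = nnls(A, b)
print(d, residual)
```

If a unit-norm direction is wanted, the result can be rescaled afterwards (an extra step, not part of the answer's remapping).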

An algorithm for very large problems is described in this paper by Dhillon, Kim & Sra.

Another possibility is the Sequential Coordinate-wise algorithm by Franc, Navara & Hlavac.
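A minimal sketch of the sequential coordinate-wise idea for NNLS (variable names are illustrative and the details are simplified relative to the paper):

```python
import numpy as np

def nnls_coordinatewise(A, b, iters=200):
    # Minimizes 0.5*||A x - b||^2 subject to x >= 0 by cycling through
    # the coordinates; assumes A has no zero columns, so H[k, k] > 0.
    H = A.T @ A        # quadratic term of the objective
    f = -(A.T @ b)     # linear term of the objective
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        for k in range(x.size):
            grad_k = f[k] + H[k] @ x                  # partial derivative at x
            x[k] = max(0.0, x[k] - grad_k / H[k, k])  # exact 1-D min, clipped at 0
    return x
```

Each inner step minimizes the quadratic objective exactly along one coordinate and clips the result at zero, so every iterate remains feasible.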
