Solve Matrix Linear Least Squares with Frobenius Norm Regularization and Linear Equality Constraints


How can I solve the following constrained minimization problem:
$$
\begin{aligned}
\arg \min_{S} \quad & \frac{1}{2} \left\{ \left\| K_2 S K_1^T - M \right\|_F^2 + \lambda \left\| S \right\|_F^2 \right\} \\
\text{s.t.} \quad & \boldsymbol{1}^T S = \text{Sum1} \\
& S \boldsymbol{1} = \text{Sum2}
\end{aligned}
$$

where $K_1$, $K_2$, $M$, and $S$ are 2-D matrices, and only $S$ is unknown. In the constraints, $\text{Sum1}$ is the sum of $S$ along its columns (i.e., $\boldsymbol{1}^T S$, a row vector), and $\text{Sum2}$ is the sum of $S$ along its rows (i.e., $S \boldsymbol{1}$, a column vector).

Here is the data, stored in MAT format. How can this kind of problem be solved?

    load('matlab.mat');
    % min  norm(K2*X*K1' - M, 'fro')^2 + lambda*norm(X, 'fro')^2
    % s.t. sum(X, 1) = Sum1;  sum(X, 2) = Sum2;

Best Answer

The idea is to bring the problem into the form:

$$\begin{aligned} \arg \min_{ \boldsymbol{s} } \quad & \frac{1}{2} {\left\| K \boldsymbol{s} - \boldsymbol{m} \right\|}_{2}^{2} + \frac{\lambda}{2} {\left\| \boldsymbol{s} \right\|}_{2}^{2} \\ \text{subject to} \quad & A \boldsymbol{s} = \boldsymbol{u} \\ \quad & B \boldsymbol{s} = \boldsymbol{v} \end{aligned}$$

Using the Kronecker product identity $ \operatorname{vec} \left( {K}_{2} S {K}_{1}^{T} \right) = \left( {K}_{1} \otimes {K}_{2} \right) \operatorname{vec} \left( S \right) $ we can see that:

  • $ K = {K}_{1} \otimes {K}_{2} $.
  • $ \boldsymbol{s} = \operatorname{vec} \left( S \right) $ where $ \operatorname{vec} \left( \cdot \right) $ is the vectorization operator.
  • $ \boldsymbol{m} = \operatorname{vec} \left( M \right) $.

The matrices $ A $ and $ B $ simply select and sum the corresponding elements of $ \boldsymbol{s} $: each row of $ A $ sums the entries of one column of $ S $, and each row of $ B $ sums the entries of one row of $ S $.
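
For concreteness, here is a minimal MATLAB sketch of this construction (assuming `matlab.mat` provides `K1`, `K2`, `M`, `Sum1`, and `Sum2` as in the question; the identities behind `K`, `A`, and `B` are checked on a random stand-in for $ S $):

    load('matlab.mat');            % assumed to provide K1, K2, M, Sum1, Sum2
    m = size(K2, 2);               % S is m-by-n, since K2*S*K1' must be defined
    n = size(K1, 2);

    K    = kron(K1, K2);           % vec(K2*S*K1') == K * vec(S)
    mVec = M(:);                   % vec(M)

    A = kron(eye(n), ones(1, m));  % A * vec(S) == sum(S, 1)' (column sums)
    B = kron(ones(1, n), eye(m));  % B * vec(S) == sum(S, 2)  (row sums)

    % Sanity check of the identities on a random matrix of the right size:
    T = rand(m, n);
    assert(norm(K * T(:) - reshape(K2 * T * K1', [], 1)) < 1e-8);
    assert(norm(A * T(:) - sum(T, 1)') < 1e-8);
    assert(norm(B * T(:) - sum(T, 2))  < 1e-8);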

Remark
Pay attention that if $ A $ and $ B $ each select every element of $ \boldsymbol{s} $ exactly once, then $ \sum_{i} {u}_{i} = \sum_{i} {v}_{i} $ must hold, since both sides represent the sum of $ \boldsymbol{s} $. Namely $ \boldsymbol{1}^{T} A \boldsymbol{s} = \boldsymbol{1}^{T} B \boldsymbol{s} = \sum_{i} {s}_{i} $. This is the case for your constraints, so the data must satisfy this identity for a feasible solution to exist.
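
In MATLAB terms this is a one-line sanity check on the data (again assuming `Sum1` and `Sum2` come from `matlab.mat`):

    % Both totals equal the sum of all entries of S, so they must agree:
    assert(abs(sum(Sum1) - sum(Sum2)) < 1e-8 * max(1, abs(sum(Sum1))));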

Now the above is a basic convex problem which can be solved by Projected Gradient Descent, where we project onto the intersection of the two equality constraints.
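
A minimal Projected Gradient Descent sketch under these assumptions (reusing `K`, `A`, `B`, and `mVec` from above; `lambda` is whatever regularization weight you choose, and the two constraints are stacked into one system, as described just below; `pinv` is used because the stacked constraint matrix has a redundant row):

    C    = [A; B];                     % stacked equality constraints
    w    = [Sum1(:); Sum2(:)];         % stacked right-hand sides
    Cp   = pinv(C);                    % pseudoinverse handles the redundant row
    proj = @(x) x - Cp * (C * x - w);  % orthogonal projection onto {x : C*x = w}

    stepSize = 1 / (norm(K)^2 + lambda);   % 1 / Lipschitz constant of the gradient
    s = proj(zeros(m * n, 1));             % feasible starting point
    for it = 1:1000
        g = K' * (K * s - mVec) + lambda * s;  % gradient of the objective
        s = proj(s - stepSize * g);            % gradient step, then project back
    end
    S = reshape(s, m, n);              % recover the matrix solution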

You could even do something simpler by concatenating the matrices and vectors:

$$ C \boldsymbol{s} = \begin{bmatrix} A \\ B \end{bmatrix} \boldsymbol{s} = \boldsymbol{w} = \begin{bmatrix} \boldsymbol{u} \\ \boldsymbol{v} \end{bmatrix} $$

Then the problem is a standard Linear Least Squares with Equality Constraints, which can be solved directly from its KKT system.
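
Since $ \lambda > 0 $ makes the objective strictly convex, the minimizer can be read off the KKT system in closed form. A sketch with the same names as above (`pinv` again absorbs the redundant constraint row, which makes the KKT matrix singular even though the primal solution is unique):

    nVar = m * n;                      % number of unknowns
    nCon = size(C, 1);                 % number of stacked constraints
    KKT  = [K' * K + lambda * eye(nVar), C'; C, zeros(nCon)];
    rhs  = [K' * mVec; w];
    z    = pinv(KKT) * rhs;            % singular but consistent system
    S    = reshape(z(1:nVar), m, n);   % primal part is the unique minimizer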

An interesting resource in that regard is Robert M. Freund - Projection Methods for Linear Equality Constrained Problems.