Show an objective function is convex with respect to multiple matrix variables

convex-analysis, kronecker-product, matrices, matrix-calculus, optimization

Before going through the details, you can first consider the toy example $(m,n,N,d_1,d_2)=(2,3,4,1,1)$, which is the simplest case.
I'm solving an optimization problem numerically (probably with Newton's method or BFGS) and proving convergence:
\begin{equation}
\begin{aligned}
& \underset{F\in \mathbb{R}^{m\times d_1},\ G\in \mathbb{R}^{n\times d_2}}{\text{minimize}}
& & \phi(F,G) =\|(L\otimes F)^TVG\|_F^2\\
& \text{subject to}
& & \sum_{i=1}^N \|A_i^Tf_j\|_2^2=1 , \ j = 1, \ldots, d_1.\\
&&& \sum_{i=1}^N \|A_ig_j\|_2^2=1 , \ j = 1, \ldots, d_2.
\end{aligned}
\end{equation}

where the $A_i$'s are known $m \times n$ full-rank matrices (rank $= \min\{m,n\}$), and $f_j$, $g_j$ are the columns of $F$ and $G$ respectively.
$L$ is a known $N \times N$ matrix, and $V = [A_1^T\ A_2^T\ \ldots \ A_N^T ]^T$.
$\otimes$ denotes the Kronecker product (tensor product).
Define $H=[A_1\ A_2\ \ldots \ A_N ]^T$.
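To make the setup concrete, here is a minimal NumPy sketch of the toy example above, building $V$, $H$, and evaluating $\phi(F,G)$ directly from the definitions (the random $A_i$ and $L$ are placeholders for the known matrices):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, N, d1, d2 = 2, 3, 4, 1, 1               # the toy example from the question

A = [rng.standard_normal((m, n)) for _ in range(N)]  # stand-ins for the known full-rank A_i
L = rng.standard_normal((N, N))                      # stand-in for the known N-by-N matrix
V = np.vstack(A)                              # V = [A_1^T ... A_N^T]^T, shape (N*m, n)
H = np.vstack([Ai.T for Ai in A])             # H = [A_1 ... A_N]^T,     shape (N*n, m)

def phi(F, G):
    """phi(F, G) = ||(L kron F)^T V G||_F^2."""
    return np.linalg.norm(np.kron(L, F).T @ V @ G, 'fro') ** 2
```

Note the shapes: $L\otimes F$ is $Nm \times Nd_1$, so $(L\otimes F)^T V G$ is $Nd_1 \times d_2$ and the Frobenius norm is well defined.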
I want to show $\phi$ is convex first. I know it is convex w.r.t. $F$ and $G$ separately, but how do I show $\phi$ is convex w.r.t. $F$ and $G$ jointly (or nonconvex)?
My thought is to define $x = [\operatorname{vec}(F)^T\ \operatorname{vec}(G)^T]^T$ and show that the Hessian matrix (the second-order derivative w.r.t. $x$) is $\succ 0$ (or at least $\succeq 0$).
After some computation, the Hessian is
$$\begin{bmatrix}P_{11}& P_{12}\\ P_{12}^T & P_{22} \end{bmatrix}$$
where $P_{11}=I\otimes(H(M\otimes GG^T)H^T)$, $P_{22}=I\otimes(V^T(M\otimes FF^T)V)$, and
$$P_{12}=\sum_{i}\sum_j M_{ij}\left\{(F^TA_jG)\otimes A_i+ \left[(F^TA_i)\otimes(A_jG)\right]K_{n,d_2}\right\},$$
with $M=L^TL$ and $K_{m,n}$ the $mn \times mn$ commutation matrix.
The next step would be to show $P_{11}-P_{12}P_{22}^{-1}P_{12}^T\succ 0$ (by Theorem 1 in the link).
Is there any way to simplify $P_{12}$ (i.e., replace the double sum with some function of $H$, $V$, and $M$)?
Or is there another way to prove convexity or nonconvexity?

Best Answer

Consider the simplest case, $m = n = N = d_1 = d_2 = 1$. Then $\phi = L^2V^2F^2G^2$, which is neither jointly convex nor concave in $F$ and $G$ (unless $L$ and/or $V$ is zero, which can't happen due to the full-rank assumption). Things do not become convex as any of these dimensions increase.
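The scalar counterexample is easy to verify directly: for $\phi = cF^2G^2$ with $c = L^2V^2 > 0$, the Hessian in $(F,G)$ is $\begin{bmatrix} 2cG^2 & 4cFG \\ 4cFG & 2cF^2 \end{bmatrix}$, which is indefinite whenever $FG \neq 0$. A quick check at $F = G = c = 1$:

```python
import numpy as np

# Scalar case: phi(F, G) = c * F^2 * G^2 with c = L^2 * V^2 > 0.
c, F, G = 1.0, 1.0, 1.0

# Hessian of phi in (F, G):
#   d^2/dF^2 = 2c G^2,  d^2/dG^2 = 2c F^2,  d^2/dFdG = 4c F G
Hess = np.array([[2*c*G**2, 4*c*F*G],
                 [4*c*F*G,  2*c*F**2]])

eigvals = np.linalg.eigvalsh(Hess)      # ascending order
print(eigvals)                          # one negative, one positive -> indefinite
```

One negative eigenvalue is enough: $\phi$ has a descent direction in curvature at a generic point, so it cannot be jointly convex.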

Furthermore, the constraints are nonlinear equality constraints, hence non-convex.
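The non-convexity of the constraint set is also easy to exhibit: each constraint is $g_j^T S g_j = 1$ with $S = \sum_i A_i^T A_i$, an ellipsoid *surface*, and the midpoint of two feasible points generally leaves it. A sketch with placeholder $A_i$:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, N = 2, 3, 4
A = [rng.standard_normal((m, n)) for _ in range(N)]
S = sum(Ai.T @ Ai for Ai in A)          # constraint: g^T S g = sum_i ||A_i g||^2 = 1

def feasible(g):
    return np.isclose(g @ S @ g, 1.0)

g = rng.standard_normal(n)
g /= np.sqrt(g @ S @ g)                  # scale onto the constraint surface
mid = 0.5 * (g + (-g))                   # midpoint of two feasible points g and -g
print(feasible(g), feasible(-g), feasible(mid))  # True True False
```

Since both $g$ and $-g$ are feasible but their midpoint (the origin) is not, the feasible set fails the definition of convexity.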

Verdict: This is not a convex optimization problem.