Proof for a Rank-One Decomposition Theorem of Positive (Semi)Definite Matrices

linear-algebra, matrices

Consider the following result, which I recently came across in a research paper in my area (Signal Processing):

Let $X$ be an $N\times N$ positive semidefinite (psd) matrix whose rank
is $r$. Let $A$ be any symmetric $N\times N$ matrix. Then there exists
a set of vectors $x_1,\dots,x_r$ such that \begin{align} X & =
\sum_{i=1}^{r}x_ix_i^T \\ x_i^TAx_i &=
\frac{\mbox{trace}\{AX\}}{r},~~~\forall i \end{align}

The following is the proof given there, part of which I can't verify.

Proof:
Consider the following stepwise procedure, whose inputs are $X$ and $A$.

  0. Inputs are $X$ (given $X\geq 0$, $\mathrm{rank}(X)=r$) and $A$ (symmetric).

  1. Decompose $X=RR^T$, where $R$ is $N\times r$ (a full-rank factorization).

  2. Generate the eigendecomposition of the $r\times r$ matrix $R^TAR=U\Lambda U^T$.

  3. Let $h$ be any $r\times 1$ vector such that $\lvert h_i\rvert=1$ for each entry of $h$. Generate the vector $x_1$ and the matrix $X_1$ as
    \begin{align}
    x_1&=\frac{1}{\sqrt{r}}RUh \\
    X_1&=X-x_1x_1^T
    \end{align}

  4. Outputs are $X_1$ and $x_1$.

The paper then claims that

  • $X_1$ is psd and has rank $r-1$
  • $x_1^TAx_1=\frac{1}{r}trace(AX)$

While I am able to verify the second claim, I am not able to verify the first one. How is it true? If this can be done, the rest of the proof is straightforward. I am looking for a rigorous proof.
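
For what it's worth, here is a minimal NumPy sketch of the procedure (my own transcription, not from the paper), taking $R$ as an $N\times r$ full-rank factor so that $h$ is $r\times 1$, with random test data of my choosing. Numerically both claims appear to hold; it is only the rigorous argument for the first that I am missing:

```python
import numpy as np

rng = np.random.default_rng(0)
N, r = 5, 3

# Random test data (my choice): a psd X of rank r and a symmetric A.
B = rng.standard_normal((N, r))
X = B @ B.T                        # psd, rank r
A = rng.standard_normal((N, N))
A = (A + A.T) / 2                  # symmetric

# Step 1: full-rank factorization X = R R^T with R of size N x r.
w, V = np.linalg.eigh(X)
R = V[:, -r:] * np.sqrt(w[-r:])    # keep the r positive eigenpairs

# Step 2: eigendecomposition of the r x r matrix R^T A R.
lam, U = np.linalg.eigh(R.T @ A @ R)

# Step 3: any h with |h_i| = 1; take the all-ones vector.
h = np.ones(r)
x1 = R @ U @ h / np.sqrt(r)
X1 = X - np.outer(x1, x1)

# Claim 2: x1^T A x1 equals trace(A X) / r (agrees to machine precision).
print(x1 @ A @ x1, np.trace(A @ X) / r)

# Claim 1: X1 should be psd of rank r - 1, i.e. r - 1 positive
# eigenvalues and the rest numerically zero.
print(np.linalg.eigvalsh(X1))
```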

Read this if you are interested in where this proof leads. Now apply the stepwise algorithm above with inputs $X_1$ and $A$ to get $x_2$ and $X_2$ such that \begin{align}X_2&=X_1-x_2x_2^T\\&=X-x_1x_1^T-x_2x_2^T\end{align} and $$x_2^TAx_2=\frac{1}{r-1}trace(AX_1)=\frac{1}{r}trace(AX),$$ where the last equality holds since $trace(AX_1)=trace(AX)-x_1^TAx_1=\frac{r-1}{r}trace(AX)$. Then the result of the paper is that you can repeat this procedure $r$ times and get a rank-one decomposition $$X=\sum_{i=1}^{r}x_ix_i^T$$ with the property $$x_i^TAx_i\,=\,\frac{1}{r}trace(AX),~\forall i$$ for any given psd $X$ with rank $r$ and any symmetric $A$.
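
Assuming the rank claim holds, the full decomposition is just the one-step sketch above run in a loop. Here is how I would write it (again my own code and naming, not the paper's):

```python
import numpy as np

def equal_pairing_decomposition(X, A, r):
    """Apply the one-step procedure r times: returns vectors x_1, ..., x_r
    with X = sum_i x_i x_i^T and x_i^T A x_i = trace(A X) / r for all i."""
    xs = []
    for k in range(r, 0, -1):                      # current rank k = r, ..., 1
        w, V = np.linalg.eigh(X)
        R = V[:, -k:] * np.sqrt(np.maximum(w[-k:], 0.0))   # N x k factor
        lam, U = np.linalg.eigh(R.T @ A @ R)
        h = np.ones(k)                             # any h with |h_i| = 1
        x = R @ U @ h / np.sqrt(k)
        xs.append(x)
        X = X - np.outer(x, x)                     # psd of rank k - 1, per claim 1
    return xs

# Usage with X, A, r as in the previous snippet (hypothetical check):
# xs = equal_pairing_decomposition(X, A, r)
# every xs[i] @ A @ xs[i] equals trace(A X) / r, and
# np.sum([np.outer(x, x) for x in xs], axis=0) recovers X.
```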

Best Answer

I don't know what's going on in the paper, but here is an argument for the existence of such decompositions.

Given a rank one decomposition $$X = \sum_{i=1}^R x_ix_i^T,$$ one has $\sum_{i=1}^R x_i^TAx_i = {\rm Trace}(AX)$, so the question is how to make the pairings $x_i^TAx_i$ equal to each other. Consider the minimum of the function $$ \sum_{i} \Big(x_i^TAx_i-\frac 1R{\rm Trace}(AX)\Big)^2 $$ over the set of all possible rank one decompositions of $X$ with $R$ terms (a compact set, since each term satisfies $x_ix_i^T\leq X$ and hence $\lVert x_i\rVert^2\leq {\rm Trace}(X)$). By compactness the minimum is attained at some $\{x_i\}$; suppose, for contradiction, that it is positive.

By reindexing, we may assume that $x_1^TAx_1$ is the smallest of the pairings, so that $x_1^TAx_1<\frac 1R {\rm Trace}(AX)$ (strictly, since the pairings are not all equal), and that $x_2^TAx_2>x_1^TAx_1$. Consider a small rotation by an angle $\alpha$: $$y_1(\alpha)=x_1 \cos \alpha + x_2 \sin \alpha,\qquad y_2(\alpha) = -x_1\sin\alpha + x_2\cos\alpha,$$ so that $y_1(\alpha)y_1(\alpha)^T+y_2(\alpha)y_2(\alpha)^T=x_1x_1^T+x_2x_2^T$. By minimality, $$ y_1^T(\alpha)Ay_1(\alpha) \leq x_1^TAx_1 $$ for all $\alpha$ sufficiently close to $0$ (of either sign): otherwise we could move the pairings closer together by replacing $x_1$ and $x_2$ with $y_1(\alpha)$ and $y_2(\alpha)$ for some small $\alpha$, keeping the other $x_i$ unchanged, which would contradict the minimality. Differentiating at $\alpha=0$, this first of all implies that $$ x_1^TAx_2=0. $$ Using this, $$ y_1^T(\alpha)Ay_1(\alpha) = x_1^TAx_1\cos^2\alpha + x_2^TAx_2\sin^2\alpha =x_1^TAx_1 + (x_2^TAx_2-x_1^TAx_1)\sin^2\alpha > x_1^TAx_1 $$ for all small $\alpha\neq 0$, which is a contradiction.
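
The same rotation can also be used constructively rather than by contradiction: whenever some pairing lies strictly below the average $m=\frac 1R{\rm Trace}(AX)$, another lies strictly above it, and since $y_1^T(\alpha)Ay_1(\alpha)$ moves continuously from $x_1^TAx_1$ to $x_2^TAx_2$ as $\alpha$ runs from $0$ to $\pi/2$, the intermediate value theorem gives an angle at which the first pairing equals $m$ exactly. Each such rotation fixes one vector for good, so at most $R-1$ rotations balance everything. A sketch of this idea (my own code, not from the answer above):

```python
import numpy as np

def balance_pairings(xs, A, iters=100):
    """Given X = sum_i x_i x_i^T, rotate pairs of vectors (preserving the
    sum of outer products) until every pairing x_i^T A x_i equals the
    average m = trace(A X) / R."""
    xs = [np.asarray(x, dtype=float).copy() for x in xs]
    m = sum(x @ A @ x for x in xs) / len(xs)
    for i in range(len(xs) - 1):          # fix one vector at value m per pass
        vals = [x @ A @ x for x in xs]
        lo = min(range(i, len(xs)), key=lambda j: vals[j])
        hi = max(range(i, len(xs)), key=lambda j: vals[j])
        if vals[hi] - vals[lo] < 1e-12:   # remaining pairings already equal
            break
        x1, x2 = xs[lo], xs[hi]
        # g(alpha) moves continuously from vals[lo] (< m) to vals[hi] (> m);
        # bisect for the angle where g(alpha) = m.
        def g(a):
            y = np.cos(a) * x1 + np.sin(a) * x2
            return y @ A @ y
        a, b = 0.0, np.pi / 2
        for _ in range(iters):
            c = 0.5 * (a + b)
            if g(c) < m:
                a = c
            else:
                b = c
        y1 = np.cos(a) * x1 + np.sin(a) * x2       # pairing equals m
        y2 = -np.sin(a) * x1 + np.cos(a) * x2      # y1 y1^T + y2 y2^T unchanged
        xs[lo], xs[hi] = y1, y2
        xs[i], xs[lo] = xs[lo], xs[i]              # set the balanced vector aside
    return xs
```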
