[Math] The orthogonal complement of the space of row-null and column-null matrices

linear algebra

I propose the following lemma and its proof. It concerns row-null and column-null matrices, i.e. matrices whose rows and columns each sum to zero. Could you please give your opinion on the plausibility of the lemma and the validity of the proof?

Lemma: Let $Z\in M(n,\mathbb{R})$ be an arbitrary $n\times n$ real matrix, and let $Y\in\mathcal{S}(n,\mathbb{R})$, where $\mathcal{S}(n,\mathbb{R})$ is the space of row-null, column-null $n\times n$ real matrices. Then $\text{Tr}(ZY)=0$ for all $Y$ in $\mathcal{S}(n,\mathbb{R})$ if and only if $Z$ has the form $$Z_{ij}=\left(p_{j}-p_{i}\right)+\left(q_{j}+q_{i}\right)$$ for some vectors $p,q\in\mathbb{R}^{n}$.

Proof:
Consider the space of row-null and column-null matrices

$$\mathcal{S}(n,\mathbb{R})= \left\{ Y\in M(n,\mathbb{R}):\sum_{i}Y_{ij}=0\text{ for all }j,\ \sum_{j}Y_{ij}=0\text{ for all }i \right\} $$

Its dimension is
$$\text{dim}(\mathcal{S}(n,\mathbb{R}))=n^{2}-2n+1=(n-1)^{2}$$
since the row-nullness and column-nullness are defined by $2n$ equations, only $2n-1$ of which are linearly independent (the sum of the $n$ row-sum constraints equals the sum of the $n$ column-sum constraints).
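As a quick numerical sanity check on this count (a sketch of mine, not part of the proof; the NumPy code and variable names below are my own), one can stack the $2n$ constraints into a single matrix acting on the vectorized $Y$ and confirm that its rank is $2n-1$:

```python
# Sanity check: the 2n row/column constraints have rank 2n - 1,
# so dim S = n^2 - (2n - 1) = (n - 1)^2.
import numpy as np

n = 5
I = np.eye(n)
ones = np.ones((1, n))
# Each row of C is one linear constraint on vec(Y) (columns of Y stacked).
C = np.vstack([np.kron(ones, I),   # row sums:    Y @ 1 = 0
               np.kron(I, ones)])  # column sums: 1^T @ Y = 0
rank = np.linalg.matrix_rank(C)
print(rank)                        # 9  == 2n - 1
print(n**2 - rank, (n - 1)**2)     # 16 16
```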
Consider the following space

$$\mathcal{G}(n,\mathbb{R})=\left\{ Z\in M(n,\mathbb{R}):Z_{ij}=\left(p_{j}-p_{i}\right)+\left(q_{j}+q_{i}\right)\text{ for some }p,q\in\mathbb{R}^{n}\right\}$$

Its dimension is

$$\text{dim}(\mathcal{G}(n,\mathbb{R}))=2n-1$$

where $n-1$ is the contribution from the antisymmetric part (since $p$ enters only through differences, it is determined only up to an additive constant) and $n$ is from the symmetric part.
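This count can also be checked numerically (again a sketch of my own): build the matrix of the linear map $(p,q)\mapsto Z$ column by column and confirm its rank is $2n-1$:

```python
# The map (p, q) -> Z with Z_ij = (p_j - p_i) + (q_j + q_i) is linear
# from R^{2n} into the n x n matrices; its rank is 2n - 1 because
# shifting every p_i by the same constant leaves Z unchanged.
import numpy as np

n = 5
cols = []
for k in range(n):
    e = np.zeros(n)
    e[k] = 1.0
    cols.append((e[None, :] - e[:, None]).ravel())  # p = e_k, q = 0
    cols.append((e[None, :] + e[:, None]).ravel())  # p = 0,  q = e_k
M = np.column_stack(cols)          # n^2 x 2n
print(np.linalg.matrix_rank(M))    # 9  == 2n - 1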

Assume $Y\in\mathcal{S}$ and $Z\in\mathcal{G}$. Then the Frobenius inner product of two such elements is
$$
\text{Tr}(ZY) =\sum_{ij}\left[\left(p_{j}-p_{i}\right)+\left(q_{j}+q_{i}\right)\right]Y_{ji}
=\sum_{j}(q_{j}+p_{j})\sum_{i}Y_{ji}+\sum_{i}(q_{i}-p_{i})\sum_{j}Y_{ji}=0
$$
since $\sum_{i}Y_{ji}$ is a row sum of $Y$ and $\sum_{j}Y_{ji}$ is a column sum, both of which vanish.
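The orthogonality is easy to spot-check numerically. The sketch below is my own (it uses the double-centering projection $P=I-\mathbf{1}\mathbf{1}^{T}/n$ to manufacture an element of $\mathcal{S}$; this construction is not part of the proof):

```python
# Numerical spot check of Tr(ZY) = 0 for random Y in S and Z in G.
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Double centering projects any matrix onto the row-null, column-null space S.
P = np.eye(n) - np.ones((n, n)) / n
Y = P @ rng.standard_normal((n, n)) @ P          # random element of S
p, q = rng.standard_normal(n), rng.standard_normal(n)
Z = (p[None, :] - p[:, None]) + (q[None, :] + q[:, None])
print(np.trace(Z @ Y))                           # ~1e-16, zero up to round-off
```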
Since $\mathcal{G}\perp\mathcal{S}$ and $\text{dim}(\mathcal{G})+\text{dim}(\mathcal{S})=n^{2}=\text{dim}(M(n,\mathbb{R}))$, the spaces $\mathcal{G}$ and $\mathcal{S}$ must be orthogonal complements in $M(n,\mathbb{R})$. Therefore, if $Z$ is orthogonal to all the matrices in $\mathcal{S}$, it must lie in $\mathcal{G}$.
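The complementarity can be illustrated the same way (again my own sketch, under the same conventions as above): split a random matrix $M$ into its double-centered part, which lies in $\mathcal{S}$, and a remainder, then check that the remainder lies in $\mathcal{G}$, i.e. that its own double-centered part vanishes:

```python
# Complementarity check: any M splits as M = PMP + R, with PMP in S;
# R lies in G exactly when its double-centered part is zero
# (the kernel of M -> PMP is the set of matrices of the form a_i + b_j).
import numpy as np

rng = np.random.default_rng(1)
n = 5
P = np.eye(n) - np.ones((n, n)) / n
M = rng.standard_normal((n, n))
Y = P @ M @ P                      # component in S
R = M - Y                          # remainder, claimed to lie in G
print(np.linalg.norm(P @ R @ P))   # ~1e-16: R is of the form a_i + b_j
```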

PS: How can I get the curly brackets {} to render in latex mode?

Best Answer

Here is an alternate way of proving your Lemma. I'm not sure it's any simpler than your proof, but it's different and hopefully interesting to some.

Let $S$ be the set of $n\times n$ matrices which are row-null and column-null. We can write this set as: $$ S = \left\{ Y\in \mathbb{R}^{n\times n} \,\mid\, Y\mathbf{1} = 0 \text{ and }\mathbf{1}^TY=0\right\} $$ where $\mathbf{1}$ is the $n\times 1$ vector of all ones. The objective is to characterize the set $S^\perp$ of matrices orthogonal to every matrix in $S$, using the Frobenius inner product.
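For concreteness, here is a small illustration of this definition (my own sketch, not part of the argument): double centering a random matrix yields a member of $S$, as the membership tests $Y\mathbf{1}=0$ and $\mathbf{1}^TY=0$ confirm.

```python
# Construct an element of S by double centering and verify membership.
import numpy as np

rng = np.random.default_rng(2)
n = 4
one = np.ones(n)
P = np.eye(n) - np.outer(one, one) / n
Y = P @ rng.standard_normal((n, n)) @ P
print(np.allclose(Y @ one, 0), np.allclose(one @ Y, 0))   # True True
```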

One approach is to vectorize. If $Y$ is any matrix in $S$, we can turn it into a vector by stacking its columns into one long vector, which now lives in $\mathbb{R}^{n^2\times 1}$. Then $\mathop{\mathrm{vec}}(S)$ is also a subspace, satisfying: $$ \mathop{\mathrm{vec}}(S) = \left\{ y \in \mathbb{R}^{n^2\times 1} \,\mid\, (\mathbf{1}^T\otimes I)y = 0 \text{ and } (I \otimes \mathbf{1}^T)y = 0 \right\} $$ where $\otimes$ denotes the Kronecker product. In other words, $$ \mathop{\mathrm{vec}}(S) = \mathop{\mathrm{Null}}(A),\qquad\text{where: } A = \left[ \begin{array}{c} \mathbf{1}^T\otimes I \\ I \otimes \mathbf{1}^T \end{array}\right] $$

Note that vectorization turns the Frobenius inner product into the standard Euclidean inner product. Namely: $\mathop{\mathrm{Trace}}(A^T B) = \mathop{\mathrm{vec}}(A)^T \mathop{\mathrm{vec}}(B)$. Therefore, we can apply range-nullspace duality and obtain: $$ \mathop{\mathrm{vec}}(S^\perp) = \mathop{\mathrm{vec}}(S)^\perp = \mathop{\mathrm{Null}}(A)^\perp = \mathop{\mathrm{Range}}(A^T) $$

So every vector in $\mathop{\mathrm{vec}}(S^\perp)$ is of the form $(\mathbf{1}\otimes I)a + (I\otimes \mathbf{1})b$ for some vectors $a$ and $b$ in $\mathbb{R}^{n\times 1}$. It follows that every matrix in $S^\perp$ is of the form $a\mathbf{1}^T + \mathbf{1}b^T$. This parametrization is equivalent to the one you presented if you set $a_i = q_i-p_i$ and $b_j = q_j + p_j$.
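The whole duality argument is easy to check numerically. The sketch below is my own code (variable names follow the answer where possible): it builds $A$ from Kronecker products, draws a generic element of $\mathop{\mathrm{Range}}(A^T)$, and confirms that it is orthogonal to $\mathop{\mathrm{Null}}(A)$ and un-vectorizes to $a\mathbf{1}^T+\mathbf{1}b^T$.

```python
# Numerical check of vec(S)^perp = Range(A^T) and the a 1^T + 1 b^T form.
import numpy as np

rng = np.random.default_rng(3)
n = 4
I, one = np.eye(n), np.ones((1, n))
A = np.vstack([np.kron(one, I),    # 1^T kron I : row-sum constraints
               np.kron(I, one)])   # I kron 1^T : column-sum constraints

# An orthonormal basis for Null(A) = vec(S), from the SVD of A.
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[np.sum(s > 1e-10):]

# A generic element of Range(A^T) ...
a, b = rng.standard_normal(n), rng.standard_normal(n)
z = A.T @ np.concatenate([a, b])
# ... is orthogonal to vec(S):
print(np.allclose(null_basis @ z, 0))                         # True
# ... and un-vectorizes to a 1^T + 1 b^T:
Z = z.reshape(n, n, order="F")     # undo the column stacking
print(np.allclose(Z, np.outer(a, one) + np.outer(one, b)))    # True
```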