[Math] Prescribing areas of parallelograms (or 2×2 principal minors)

linear-algebra, mg.metric-geometry

Let $(a_{ij})$ be an $n\times n$ symmetric matrix such that $a_{ij}\geq 0$ for all $i,j$ and $a_{ii}=0$ for all $i$. Under which conditions on the $a_{ij}$'s can one find $n$ vectors $v_1,\ldots,v_n\in{\mathbb R}^n$ such that for all $i,j$ the area of the parallelogram spanned by $v_i$ and $v_j$ equals $a_{ij}$:

$\forall i,j:\quad\|v_i\|^2\|v_j\|^2-\langle v_i,v_j\rangle^2=a_{ij}^2$ ?

Here is the only (and obvious) necessary condition I know of: if $a_{ij}=0$ for some $i\neq j$, then $a_{ik}a_{jl}=a_{il}a_{jk}$ for all $k,l$. (Indeed, $a_{ij}=0$ forces $v_i$ and $v_j$ to be linearly dependent, so rows $i$ and $j$ of the matrix are proportional.)
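For concreteness, here is a small numerical check of this condition (a sketch in NumPy; the helper `parallelogram_areas` and the specific vectors are only for illustration):

```python
import numpy as np

def parallelogram_areas(V):
    """Matrix of areas a_ij of the parallelograms spanned by rows v_i, v_j of V."""
    G = V @ V.T                                   # Gram matrix <v_i, v_j>
    d = np.diag(G)                                # squared norms ||v_i||^2
    return np.sqrt(np.maximum(np.outer(d, d) - G**2, 0.0))

rng = np.random.default_rng(0)
v1 = rng.standard_normal(4)
V = np.vstack([v1, 2.0 * v1, rng.standard_normal((2, 4))])  # v_2 parallel to v_1, so a_12 = 0

a = parallelogram_areas(V)
M = np.outer(a[0], a[1])      # M[k, l] = a_1k * a_2l
print(np.allclose(M, M.T))    # True: a_1k a_2l = a_1l a_2k for all k, l
```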

What if $a_{ij}>0$ for all $i\neq j$ ?

Thank you.

Edit. As Noah Stein suggested, a useful reformulation of the question is: can one prescribe the $2\times2$ principal minors of a symmetric positive semidefinite matrix?

Edit 2. See also George Lowther's comment. It is always possible (and easy!) to prescribe the $2\times2$ principal minors of a symmetric $n\times n$ matrix. If the $a_{ij}$'s, $1 \leq i < j \leq n$, are to be those minors, we simply need to choose $n$ numbers $g_{ii}$ such that $g_{ii}g_{jj}\geq a_{ij}$ for all $i\neq j$. Then we are done with the symmetric matrix $G=(g_{ij})$ whose off-diagonal entries are given by $g_{ij}=\epsilon_{ij}\sqrt{g_{ii}g_{jj}-a_{ij}}$, where $\epsilon_{ij}=\pm 1$.

So the initial question becomes: under what conditions on the $a_{ij}$'s can one find $n$ real numbers $g_{ii}\geq 0$ and signs $\epsilon_{ij}=\pm 1$ so that the matrix $G$ defined above is positive semidefinite?
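As an illustration, here is a minimal numerical sketch of the construction above (NumPy; one arbitrary choice of diagonal and all $\epsilon_{ij}=+1$, chosen only for this example):

```python
import numpy as np

# Prescribed 2x2 principal minors a_ij (i != j); the values are arbitrary examples.
a = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 3.0],
              [2.0, 3.0, 0.0]])

g_diag = np.full(3, np.sqrt(a.max()) + 1.0)     # any g_ii with g_ii * g_jj >= a_ij works
G = np.sqrt(np.outer(g_diag, g_diag) - a)       # g_ij = +sqrt(g_ii g_jj - a_ij), i.e. all eps_ij = +1
np.fill_diagonal(G, g_diag)

# The 2x2 principal minors of G are exactly the prescribed a_ij:
i, j = 0, 2
print(G[i, i] * G[j, j] - G[i, j]**2, a[i, j])  # both equal 2.0
```

Whether the $g_{ii}$ and $\epsilon_{ij}$ can be chosen so that $G$ is moreover positive semidefinite is exactly the question above.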

Best Answer

In the case where the matrix $(a^2_{ij})_{i,j=1,\ldots,n}$ is nonsingular, the problem reduces to the condition that it has a single positive eigenvalue. In fact, we have the following for any $n\times n$ nonzero symmetric matrix $A$ with nonnegative components and zero diagonal (I'll remove the square from $a_{ij}$, as it doesn't seem to help).

If there exist $\nu_i\in \mathbb{R}^n$ such that $$ \begin{align} A_{ij}=\lVert\nu_i\rVert^2\lVert\nu_j\rVert^2-\langle\nu_i,\nu_j\rangle^2&&{\rm(1)} \end{align} $$ then $A$ has a single positive eigenvalue (counting multiplicities).

Conversely, if $A$ is nonsingular and has a single positive eigenvalue, then there exist $\nu_i\in\mathbb{R}^n$ satisfying (1).

First, suppose that (1) holds. Then, there exist nonnegative reals $\lambda_i$ and a positive semidefinite matrix $S$ such that $A_{ij}=\lambda_i\lambda_j(1-S_{ij}^2)$, simply by taking $\lambda_i=\lVert\nu_i\rVert^2$ and $S_{ij}=\langle\hat\nu_i,\hat\nu_j\rangle$, where $\hat\nu_i=1_{\lbrace\nu_i\not=0\rbrace}\nu_i/\lVert\nu_i\rVert$. Let $\lambda=(\lambda_i)_{i=1,\ldots,n}\in\mathbb{R}^n$. Using the fact that the componentwise square of a positive semidefinite matrix is itself positive semidefinite, $$ x^{\rm T} A x= \langle x,\lambda\rangle^2-\sum_{ij}(\lambda_ix_i)S_{ij}^2(\lambda_jx_j)\le\langle x,\lambda\rangle^2. $$ In particular, $x^{\rm T}A x\le0$ for all vectors orthogonal to $\lambda$. So, the space generated by eigenvectors with positive eigenvalues cannot contain any nonzero members orthogonal to $\lambda$, and has dimension at most one. But, as $A$ is nonzero with zero trace, it must have at least one positive eigenvalue.
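This direction is easy to test numerically; a small sketch (NumPy) with randomly chosen vectors $\nu_i$:

```python
import numpy as np

rng = np.random.default_rng(1)
V = rng.standard_normal((6, 6))       # rows are the vectors nu_1, ..., nu_6

Gram = V @ V.T
d = np.diag(Gram)
A = np.outer(d, d) - Gram**2          # A_ij = ||nu_i||^2 ||nu_j||^2 - <nu_i, nu_j>^2

eig = np.linalg.eigvalsh(A)
print(np.sum(eig > 1e-9 * abs(eig).max()))   # 1: exactly one positive eigenvalue
```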

Conversely, suppose that $A$ has a single positive eigenvalue and is nonsingular. Diagonalization gives $$ A=u u^{\rm T}-\sum_{\alpha=1}^{n-1} v_{\alpha}v_{\alpha}^{\rm T} $$ for nonzero orthogonal $u,v_\alpha\in\mathbb{R}^n$. As the diagonal of $A$ is zero, $$ u_i^2=\sum_\alpha v_{\alpha,i}^2. $$ Using Cauchy–Schwarz, $$ A_{ij}\le u_iu_j+\sqrt{\sum_\alpha v_{\alpha,i}^2\sum_\beta v_{\beta,j}^2} =u_iu_j+\vert u_iu_j\vert. $$ So, the $u_i$ are all nonzero: otherwise row $i$ of $A$ would contain no positive elements, hence would vanish (its entries are nonnegative), contradicting nonsingularity. Moreover, $A_{ij}=0$ whenever $u_iu_j<0$. Writing $P=\lbrace i=1,2,\ldots,n\colon u_i > 0\rbrace$, we therefore have $A_{ij}=0$ for $i\in P$ and $j\not\in P$. Breaking $A$ down into two blocks on which the row and column indices are respectively in $P$ and not in $P$, we can reduce the problem to the case where the $u_i$ are all of the same sign. W.l.o.g., take $u_i > 0$.

For $0 < \epsilon < 1$, define the matrix $$ \begin{align} S_{ij}&=\sqrt{1-\epsilon u_i^{-1}u_j^{-1} A_{ij}}\cr &=1-\frac12\epsilon u_i^{-1}u_j^{-1} A_{ij}+O(\epsilon^2)\cr &=1-\epsilon/2+\frac\epsilon2\sum_{\alpha}u_i^{-1}u_j^{-1}v_{\alpha,i}v_{\alpha,j}+O(\epsilon^2). \end{align} $$ As the vectors $u,v_{\alpha}$ are linearly independent, the same is true of the vectors $\tilde u=(1,1,\ldots,1)$ and $\tilde v_\alpha=(u_1^{-1}v_{\alpha,1},\ldots,u_n^{-1}v_{\alpha,n})$. Then, $$ x^{\rm T}Sx=(1-\epsilon)\langle\tilde u,x\rangle^2+\frac\epsilon2\left(\langle\tilde u,x\rangle^2+\sum_\alpha\langle\tilde v_\alpha,x\rangle^2\right)+O(\epsilon^2\lVert x\rVert^2). $$ Since $\tilde u,\tilde v_1,\ldots,\tilde v_{n-1}$ form a basis of $\mathbb{R}^n$, the quantity $\langle\tilde u,x\rangle^2+\sum_\alpha\langle\tilde v_\alpha,x\rangle^2$ is bounded below by $c\lVert x\rVert^2$ for some $c>0$, so $x^{\rm T}Sx>0$ for all nonzero $x$ once $\epsilon$ is small enough. In that case $S$ is positive definite and (by Gram–Schmidt, for example) there are $\hat\nu_i\in\mathbb{R}^n$ with $S_{ij}=\langle\hat\nu_i,\hat\nu_j\rangle$. Setting $\nu_i=\epsilon^{-1/4}u_i^{1/2}\hat\nu_i$ gives (1): indeed, $\lVert\nu_i\rVert^2=\epsilon^{-1/2}u_iS_{ii}=\epsilon^{-1/2}u_i$ (as $A_{ii}=0$), while $\langle\nu_i,\nu_j\rangle^2=\epsilon^{-1}u_iu_jS_{ij}^2=\epsilon^{-1}u_iu_j-A_{ij}$.
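The construction above can also be carried out numerically. The following sketch (NumPy) starts from a matrix $A$ that is known to be admissible (built from random vectors, so it is generically nonsingular with one positive eigenvalue and all $u_i$ of one sign), shrinks $\epsilon$ until $S$ is positive definite, and obtains the $\hat\nu_i$ from a Cholesky factorization rather than Gram–Schmidt:

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.standard_normal((5, 5))
Gram = W @ W.T
d = np.diag(Gram)
A = np.outer(d, d) - Gram**2               # a target matrix known to satisfy the hypotheses

# A = u u^T - sum_alpha v_alpha v_alpha^T, with u coming from the single positive eigenvalue.
w, Q = np.linalg.eigh(A)                   # eigenvalues in ascending order
u = np.sqrt(w[-1]) * Q[:, -1]
if u.sum() < 0:
    u = -u                                 # here all u_i > 0 (the same-sign case of the proof)

# Shrink eps until S_ij = sqrt(1 - eps * A_ij / (u_i u_j)) is positive definite.
eps = 1.0
while True:
    eps /= 10.0
    M = 1.0 - eps * A / np.outer(u, u)
    if M.min() <= 0:
        continue
    S = np.sqrt(M)
    if np.linalg.eigvalsh(S).min() > 0:
        break

L = np.linalg.cholesky(S)                        # rows of L are the hat_nu_i: S_ij = <hat_nu_i, hat_nu_j>
nu = (eps ** -0.25) * np.sqrt(u)[:, None] * L    # nu_i = eps^{-1/4} * u_i^{1/2} * hat_nu_i

Gram_nu = nu @ nu.T
dn = np.diag(Gram_nu)
print(np.allclose(np.outer(dn, dn) - Gram_nu**2, A))  # True: the nu_i realize the prescribed A
```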


That concludes the case where $A$ is nonsingular. The singular case is, I think, considerably more complicated.
