[Math] Reconstruction of a matrix given its eigenvalues and eigenvectors dot products

Tags: eigenvalues-eigenvectors, linear-algebra, matrices

This question is connected with the previous one.

Suppose that for an unknown matrix $A_{{n}\times{n}}$ we know $n$ distinct eigenvalues $\lambda_1,\lambda_2,\dots,\lambda_n$ and the dot products ${v_i}^T v_j$ for every pair of unit-length eigenvectors $v_i, v_j$ (these are the cosines of the angles between the unit vectors; we can assume, if needed, that they are all non-negative).

Question:

  • how can we reconstruct from these data a matrix $A$ with the given properties?

Of course, there are plenty of such matrices, and presumably they are all similar to one another, so we can choose a basis for the sought representation of the matrix; for example, the eigenvector $v_1$ might be taken to be $[ 1 \ 0 \ 0 \ \dots \ 0]^T$, with the other vectors calculated relative to this starting point. As in the previous question, it is relatively easy to do the calculation for dimension $n=2$. For higher dimensions the problem seems more complicated and harder to deal with, but maybe some method exists.

Best Answer

Let $V$ be the matrix whose columns are the unit eigenvectors, so that $A = V\Lambda V^{-1}$ with $\Lambda = \operatorname{diag}(\lambda_1,\dots,\lambda_n)$, and note that $V^TV$ is exactly the given Gram matrix of dot products. Every nonsingular matrix has a unique polar decomposition $V = UP$, where $U$ is real orthogonal and $P$ is the unique positive definite square root of $V^TV$. It follows that $A = V\Lambda V^{-1} = U(P\Lambda P^{-1})U^T$ for some real orthogonal matrix $U$. Since inner products are preserved under changes of orthonormal bases, there is not enough information to determine $U$, and you can only determine $A$ up to orthogonal similarity; one concrete representative is $A = P\Lambda P^{-1}$ (the choice $U = I$).
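As a quick numerical sketch of this recipe (with made-up example data, not from the question), one can compute $P$ as the symmetric square root of the Gram matrix $G = V^TV$ via its spectral decomposition and take the representative $A = P\Lambda P^{-1}$; the columns of $P$ are then unit eigenvectors of $A$ whose Gram matrix is exactly $G$:

```python
import numpy as np

# Hypothetical input data for n = 3: distinct eigenvalues and a Gram
# matrix G of unit-length eigenvectors (here generated from a random V
# purely so that G is a valid Gram matrix).
rng = np.random.default_rng(0)
lam = np.array([3.0, 1.0, -2.0])           # distinct eigenvalues
V = rng.standard_normal((3, 3))
V /= np.linalg.norm(V, axis=0)             # unit-length eigenvector columns
G = V.T @ V                                # the given dot products v_i^T v_j

# P = unique positive definite square root of G, via the spectral
# decomposition of the symmetric matrix G.
w, Q = np.linalg.eigh(G)
P = Q @ np.diag(np.sqrt(w)) @ Q.T

# One representative of the similarity class: A = P Lambda P^{-1}
# (this corresponds to the choice U = I in A = U (P Lambda P^{-1}) U^T).
A = P @ np.diag(lam) @ np.linalg.inv(P)

# Checks: the columns of P are eigenvectors of A with eigenvalues lam,
# and their Gram matrix P^T P = P^2 reproduces the prescribed G.
print(np.allclose(A @ P, P @ np.diag(lam)))
print(np.allclose(P.T @ P, G))
```

Any other solution is obtained by replacing $A$ with $UAU^T$ for a real orthogonal $U$, which changes neither the eigenvalues nor the pairwise dot products of the eigenvectors.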
