Is every symmetric matrix whose diagonal equals its eigenvalues a diagonal matrix?

diagonalization, eigenvalues-eigenvectors, linear-algebra

Let $A$ be an $n \times n$ real, symmetric, positive semidefinite matrix. Suppose we know that the diagonal of this matrix equals its eigenvalues, i.e. if the eigenvalues are $\lambda_1, \dots, \lambda_n$, then $\operatorname{diag}(A)$ is a permutation of $(\lambda_1, \dots, \lambda_n)$.

Can we prove from these assumptions that $A$ is a diagonal matrix, i.e. that all off-diagonal entries are $0$? And if so, how?

This is similar to this question, except that the counterexample given there is not symmetric.


Ideas so far:

Proving the converse is easy: if an $n \times n$ matrix is diagonal, then its diagonal entries are exactly its eigenvalues. But that doesn't help with this direction.

I know that since $A$ is real and symmetric, it has an eigendecomposition $A = VBV^{T}$ with $V$ an orthogonal matrix and $B$ a diagonal matrix with the eigenvalues on its diagonal. From there I've tried to prove that $\operatorname{diag}(A)$ being a permutation of $\operatorname{diag}(B)$ forces $V$ to be a signed permutation matrix, i.e. that each orthonormal row of $V$ consists of $n-1$ zeros and a single $\pm 1$, but I couldn't get the math to work when $B$ has a repeated eigenvalue on its diagonal.
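To make the setup concrete, here is a minimal NumPy sketch of that decomposition (the particular matrix size and random seed are arbitrary, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random real, symmetric, positive semidefinite matrix: A = M M^T.
M = rng.standard_normal((4, 4))
A = M @ M.T

# Spectral decomposition A = V B V^T with V orthogonal and B diagonal.
eigenvalues, V = np.linalg.eigh(A)
B = np.diag(eigenvalues)

print(np.allclose(A, V @ B @ V.T))  # True: A = V B V^T
print(np.round(np.diag(A), 3))      # diagonal of A ...
print(np.round(eigenvalues, 3))     # ... in general NOT a permutation of the eigenvalues
```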

Best Answer

The underlying ideas come from majorization, but all you need to do is check the Frobenius norm:

$$\big\Vert A\big\Vert_F^2 \;=\; \sum_{k=1}^n\lambda_k^2 \;=\; \sum_{k=1}^n a_{k,k}^2 \;\le\; \Big(\sum_{k=1}^n a_{k,k}^2\Big) + \Big(\sum_{k=1}^n\sum_{j\neq k} a_{j,k}^2 \Big)\;=\;\big\Vert A\big\Vert_F^2,$$

where the first equality is the spectral theorem for the real symmetric matrix $A$, the second is the hypothesis that $\operatorname{diag}(A)$ is a permutation of the eigenvalues, and the last is the definition of the Frobenius norm. Since the chain starts and ends with $\big\Vert A\big\Vert_F^2$, the inequality must be an equality, so $\sum_{k}\sum_{j\neq k} a_{j,k}^2 = 0$ and all off-diagonal entries are zero.
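A quick numerical sanity check of the key identity $\Vert A\Vert_F^2 = \sum_k \lambda_k^2$ for real symmetric $A$ (a sketch assuming NumPy; the random matrix is only for illustration and does not satisfy the diagonal-equals-eigenvalues hypothesis):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M @ M.T                                  # random real symmetric PSD matrix

eigenvalues = np.linalg.eigvalsh(A)

frob_sq     = np.sum(A ** 2)                 # ||A||_F^2
sum_eig_sq  = np.sum(eigenvalues ** 2)       # sum of lambda_k^2
sum_diag_sq = np.sum(np.diag(A) ** 2)        # sum of a_{k,k}^2
off_diag_sq = frob_sq - sum_diag_sq          # sum of squared off-diagonal entries

print(np.isclose(frob_sq, sum_eig_sq))       # True for any real symmetric A
# Under the hypothesis that diag(A) is a permutation of the eigenvalues,
# sum_diag_sq would equal sum_eig_sq, forcing off_diag_sq = 0, i.e. A diagonal.
print(off_diag_sq)                           # positive here, since diag(A) != eigenvalues
```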

With a little care, you can also do this with the Hadamard determinant inequality.
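For instance, one possible way to supply that care for the semidefinite case: for any $\varepsilon > 0$ the matrix $A + \varepsilon I$ is positive definite, and its diagonal entries $a_{k,k} + \varepsilon$ are a permutation of its eigenvalues $\lambda_k + \varepsilon$, so

$$\det(A + \varepsilon I) \;=\; \prod_{k=1}^n (\lambda_k + \varepsilon) \;=\; \prod_{k=1}^n (a_{k,k} + \varepsilon),$$

which is the equality case of Hadamard's inequality for positive definite matrices. Hence $A + \varepsilon I$ is diagonal, and therefore so is $A$.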