Is it possible to find an orthogonal matrix $V\in M_n(\Bbb R)$ such that $A=VDV^T$ with a column not proportional to any column of $U$?

Tags: linear-algebra, matrices, orthogonal-matrices, symmetric-matrices

Let $A\in M_n(\Bbb R)$ be a symmetric matrix with (strictly) less than $n$ distinct eigenvalues. Since $A$ is diagonalizable, we can write it as $A=UDU^T$ where $U\in M_n(\Bbb R)$ is orthogonal and $D\in M_n(\Bbb R)$ is diagonal.

Question:

Is it possible to find an orthogonal matrix $V\in M_n(\Bbb R)$ such that $A=VDV^T$, under the condition that at least one column of $V$ isn't proportional to any column of $U$?


My thoughts:

I think the fact that there are fewer than $n$ distinct eigenvalues guarantees that such a $V$ exists; if all $n$ eigenvalues were distinct, it would be impossible.

Since there are fewer than $n$ distinct eigenvalues, there is an eigenspace $E_{\lambda'}$ corresponding to some eigenvalue $\lambda'$ such that $\dim\left(E_{\lambda'}\right)=k\geqslant 2$.

Let $\{e_1,\ldots,e_k\}$ be an orthonormal basis for the eigenspace $E_{\lambda'}$ and consider the plane $M=\operatorname{span}\{e_1,e_2\}\subseteq\Bbb R^n$.

Let $f_1=\frac{e_1+e_2}{\left\|e_1+e_2\right\|}$ and let $f_2\in M$ be another unit vector in the same plane such that $f_1\perp f_2$ (for instance $f_2=\frac{e_1-e_2}{\left\|e_1-e_2\right\|}$).

Actually, we could apply Gram–Schmidt to an arbitrary basis of $M$ of the form $\{\alpha e_1+\beta e_2,\ \gamma e_1+\delta e_2\}$ with $\alpha,\beta,\gamma,\delta\in\Bbb R$ and $\alpha\delta-\beta\gamma\neq 0$.

I thought I could also reach the same result by rotating $e_1$ and $e_2$ in the plane $M$ by some angle $\varphi$ that is not an integer multiple of $\pi/2$ (so that neither rotated vector is proportional to $e_1$ or $e_2$).

If this part of my statement holds, then, of course, $\{f_1,f_2,e_3,\ldots,e_k\}$ is also an orthonormal basis for $E_{\lambda'}$. I believe the same argument extends to any subspace $M\leqslant E_{\lambda'}$ with $2\leqslant\dim M\leqslant\dim E_{\lambda'}$.
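To make the rotation idea concrete, here is a minimal numerical sketch; the $3\times 3$ matrix, the seed, and the rotation angle below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a symmetric matrix with a repeated eigenvalue: A = Q diag(1, 1, 2) Q^T
# for some orthogonal Q (illustrative choice).
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag([1.0, 1.0, 2.0]) @ Q.T

# eigh returns eigenvalues in ascending order, so the first two columns of U
# form an orthonormal basis {e1, e2} of the eigenspace for the eigenvalue 1.
w, U = np.linalg.eigh(A)
e1, e2 = U[:, 0], U[:, 1]

# Rotate e1, e2 inside the plane M = span{e1, e2} by an angle phi.
phi = 0.7
f1 = np.cos(phi) * e1 + np.sin(phi) * e2
f2 = -np.sin(phi) * e1 + np.cos(phi) * e2

# f1, f2 are still orthonormal eigenvectors of A for the eigenvalue 1.
print(np.allclose(A @ f1, f1), np.allclose(A @ f2, f2))  # True True
print(np.isclose(f1 @ f2, 0.0))                          # True

# Replacing e1, e2 by f1, f2 gives another orthogonal V with A = V D V^T.
V = U.copy()
V[:, 0], V[:, 1] = f1, f2
print(np.allclose(A, V @ np.diag(w) @ V.T))              # True
```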


May I ask for verification of the statement and advice on how to concisely (dis)prove it?

Thank you in advance!

Best Answer

The answer is yes.

I recommend the following approach. First, note that $$ A = VDV^T = UDU^T \implies\\ VDV^T = UDU^T \implies\\ U^TVDV^TU = U^TUDU^TU \implies\\ (U^TV) D(U^TV)^T = D. $$ With that in mind, let $W$ denote the orthogonal matrix $W = U^TV$. We have $$ WDW^T = D \implies WD = DW. $$ In other words, $W$ is an orthogonal matrix for which $WD = DW$. Keep in mind that once we have $W$, we have $W = U^TV \implies V = UW$.

Now, $A$ has a repeated eigenvalue; call this eigenvalue $\lambda$. Without loss of generality, suppose that $\lambda$ comes first among the diagonal entries of $D$, and write $$ D = \pmatrix{\lambda I_k & 0\\0 & D'} $$ where $I_k$ is a size $k$ identity matrix (with $k \geq 2$) and $D'$ is also diagonal. I claim that if $W_1$ is any $k \times k$ orthogonal matrix and $W_2$ is a diagonal matrix with $\pm1$ entries, then the block matrix $$ W = \pmatrix{W_1 & 0\\0 & W_2} $$ will be orthogonal and satisfy $WD = DW$. Let's stipulate that for our choice of $W$, $W_1$ has no zero entries.
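For a concrete instance of such a $W$ (an illustrative choice with $n = 3$ and $k = 2$): take $W_1$ to be a rotation by an angle $\theta$ that is not a multiple of $\pi/2$ and $W_2 = (1)$, so that $$ W = \pmatrix{\cos\theta & -\sin\theta & 0\\ \sin\theta & \cos\theta & 0\\ 0 & 0 & 1}. $$ This $W$ is orthogonal, commutes with $D = \pmatrix{\lambda & 0 & 0\\ 0 & \lambda & 0\\ 0 & 0 & \mu}$, and its upper-left $2\times 2$ block has no zero entries.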

Now, note that the entries of $W = U^TV$ are the dot products of columns of $U$ with columns of $V$: the coordinates of the first column of $V$ in the orthonormal basis formed by the columns of $U$ are exactly the entries of the first column of $W$. If that column of $V$ were a multiple of a single column of $U$, the first column of $W$ would have at most one nonzero entry. With that in mind, conclude that because the first column of $W$ has $k \geq 2$ nonzero entries, the first column of $V$ is not a multiple of any of the columns of $U$.
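Here is a short numerical sketch of the whole construction; the size, eigenvalues, seed, and rotation angle are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Symmetric A with a repeated eigenvalue: A = U diag(3, 3, 7) U^T.
D = np.diag([3.0, 3.0, 7.0])
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = U @ D @ U.T

# Block matrix W: a 2x2 rotation (no zero entries) on the repeated block, 1 elsewhere.
theta = 0.9
W = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

# W is orthogonal and commutes with D, so V = U W also satisfies A = V D V^T.
print(np.allclose(W @ W.T, np.eye(3)), np.allclose(W @ D, D @ W))  # True True
V = U @ W
print(np.allclose(A, V @ D @ V.T))                                 # True

# Coordinates of the first column of V in the basis of columns of U
# are the first column of W, which has two nonzero entries.
coords = U.T @ V[:, 0]
print(coords)                          # ~ [cos(theta), sin(theta), 0]
print(np.sum(~np.isclose(coords, 0)))  # 2 -> not a multiple of any single column of U
```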
