Linear Algebra – Finding a Unitary Matrix That Diagonalizes a Given Matrix


Let $$T=\begin{pmatrix}5 & 0 & 0 \\ 0 & 2 & i\\ 0 & -i & 2 \end{pmatrix}$$ be a Hermitian matrix.

I found the eigenvalues and eigenvectors already and they are $1,3,5$ and $\begin{pmatrix}0\\-i\\1\end{pmatrix}$,$\begin{pmatrix}0\\i\\1\end{pmatrix}$, and $\begin{pmatrix}1\\0\\0\end{pmatrix}$.

Normalizing each vector, I get $(1,0,0)$, $\left(0,-\frac{i}{\sqrt{2}},\frac{1}{\sqrt{2}}\right)$, and $\left(0,\frac{i}{\sqrt{2}},\frac{1}{\sqrt{2}}\right)$.
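(As a numerical sanity check — not part of my original work — numpy's `eigh` routine for Hermitian matrices confirms these eigenvalues:)

```python
import numpy as np

# The Hermitian matrix T from the question
T = np.array([[5, 0, 0],
              [0, 2, 1j],
              [0, -1j, 2]])

# eigh is meant for Hermitian matrices: it returns real eigenvalues in
# ascending order and orthonormal eigenvectors as the columns of the
# second return value
eigenvalues, eigenvectors = np.linalg.eigh(T)
print(eigenvalues)   # approximately [1. 3. 5.]
```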

I need to find a matrix $P$ such that $P^*TP$ is diagonal. My first idea was $P=\begin{pmatrix}1 & 0 & 0\\ 0 & -i & i\\ 0 & 1 & 1 \end{pmatrix}$, which should satisfy $P^*=P^{-1}$, but I find that $P^*P\neq I$.

I'm having some trouble finding a $P$ that will let me diagonalize $T$.

Best Answer

Let's check your idea:

$$P=\begin{pmatrix}1 & 0 & 0\\ 0 & \!\!-i & i\\ 0 & 1 & 1 \end{pmatrix}\;,\;\;P^*=\begin{pmatrix}1 & 0 & 0\\ 0 & i & 1\\ 0 & \!\!-i & 1 \end{pmatrix}$$

and since

$$PP^*\neq I\;\;\text{then}\;\;P^*\neq P^{-1}$$
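Indeed, a quick computation (sketched here in numpy, as an addition to the original answer) shows that $PP^*=\operatorname{diag}(1,2,2)\neq I$:

```python
import numpy as np

# The proposed P, with the (unnormalized) eigenvectors as columns
P = np.array([[1, 0, 0],
              [0, -1j, 1j],
              [0, 1, 1]])

# The off-diagonal entries vanish (the vectors are orthogonal), but the
# diagonal entries are the squared norms: diag(1, 2, 2), not the identity
print(P @ P.conj().T)
```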

The problem is that this matrix's columns (and rows) aren't orthonormal, though they are orthogonal. We must apply Gram-Schmidt (I assume the usual Hermitian inner product $\langle u,v\rangle=\sum_k u_k\overline{v_k}$, and write all vectors as rows for simplicity):

$$u_1=(1,0,0)$$

$$w_2=(0,-i,1)-\langle\;(0,-i,1),(1,0,0)\;\rangle\,(1,0,0)=(0,-i,1)-0\cdot(1,0,0)=(0,-i,1)$$

and since

$$||w_2||=\sqrt 2\implies u_2=\frac1{\sqrt 2}(0,-i,1)$$

$$w_3:=(0,i,1)-\langle\;(0,i,1),(1,0,0)\;\rangle\,(1,0,0)-\left\langle\;(0,i,1),\frac1{\sqrt 2}(0,-i,1)\;\right\rangle\,\frac1{\sqrt 2}(0,-i,1)=$$

$$=(0,i,1)-0\cdot(1,0,0)-0\cdot\frac1{\sqrt 2}(0,-i,1)=(0,i,1)$$

(both inner products vanish: $\langle(0,i,1),(1,0,0)\rangle=0$ and $\langle(0,i,1),(0,-i,1)\rangle=i\cdot\overline{(-i)}+1\cdot 1=i\cdot i+1=0$)

and since

$$||w_3||=\sqrt 2\implies u_3=\frac1{\sqrt2}(0,i,1)$$
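The steps above can be sketched in code (a numpy illustration, not part of the original answer; `np.vdot(u, w)` computes $\sum_k\overline{u_k}\,w_k=\langle w,u\rangle$ in the convention used here):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize complex vectors under the Hermitian inner
    product <u, v> = sum(u_k * conj(v_k))."""
    basis = []
    for v in vectors:
        w = v.astype(complex)
        for u in basis:
            w = w - np.vdot(u, w) * u   # subtract projection of w onto u
        basis.append(w / np.linalg.norm(w))
    return basis

# The eigenvectors from the answer, written as rows
u1, u2, u3 = gram_schmidt([np.array([1, 0, 0]),
                           np.array([0, -1j, 1]),
                           np.array([0, 1j, 1])])
```

Since the inputs were already orthogonal, every projection term is zero and the routine reduces to dividing each vector by its norm, exactly as in the computation above.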

Thus, our $\,P\,$ is (well, let's call it $\,Q\,$ to avoid confusion):

$$Q=\begin{pmatrix}1&0&0\\ 0&-\frac{i}{\sqrt 2}&\frac1{\sqrt 2}\\ 0&\frac i{\sqrt 2}&\frac1{\sqrt 2}\end{pmatrix}\;\implies\;Q^*=\begin{pmatrix}1&0&0\\ 0&\frac{i}{\sqrt 2}&-\frac i{\sqrt 2}\\ 0&\frac 1{\sqrt 2}&\frac1{\sqrt 2}\end{pmatrix}$$

It is now a simple exercise to verify that indeed $\,QQ^*=I\,$ and, thus, $\,Q^{-1}=Q^*\,$. One caveat: since we wrote the eigenvectors as the *rows* of $\,Q\,$, it is $\,QTQ^*=\operatorname{diag}(5,3,1)\,$ that comes out diagonal; the matrix the question asks for, with $\,P^*TP\,$ diagonal, is therefore $\,P=Q^*\,$, whose *columns* are the eigenvectors.
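A quick numerical check (a numpy sketch added to the answer) confirms both the unitarity of $Q$ and the diagonalization; note that with the eigenvectors in the rows of $Q$, the diagonal product is $QTQ^*$:

```python
import numpy as np

s = 1 / np.sqrt(2)
T = np.array([[5, 0, 0],
              [0, 2, 1j],
              [0, -1j, 2]])

# Q with the orthonormal eigenvectors as its *rows*, as in the answer
Q = np.array([[1, 0, 0],
              [0, -1j * s, s],
              [0, 1j * s, s]])

print(np.allclose(Q @ Q.conj().T, np.eye(3)))   # Q is unitary
D = Q @ T @ Q.conj().T                          # diag(5, 3, 1)
print(np.round(D).real)
```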

Note: The above Gram-Schmidt process was almost trivial because the columns of $\,P\,$ were already orthogonal. The computation above shows how to carry out the process in general, though in this case it was enough to divide each vector by its norm.