[Math] Finding unitary matrix

eigenvalues-eigenvectors, matrices, matrix-decomposition

Find unitary matrix $U\in\mathbb{C}^{2\times2}$ so that $D=UAU^*$ is diagonal where
$$A=\begin{bmatrix} 3 & -4\\4 &3 \end{bmatrix} \in \mathbb{R}^{2\times 2}$$

I know that for a unitary matrix it holds that $U^* U = U U^* = I$, and if $D = U^* A U$ is diagonal, then multiplying by $U$ on the left and by $U^*$ on the right gives

\begin{align}
D&= U^* A U\\
UDU^*&=A
\end{align}

Do I start by finding the eigenvalues and eigenvectors of $A$? If $A$ is diagonalizable, then $A=PDP^{-1}$ holds, and because $U$ is unitary, $P=U$ in this case? So first the characteristic polynomial:

\begin{align}
f(\lambda) &= \operatorname{det}(A-\lambda I)=\lambda ^2-6 \lambda +25\\
f(\lambda) &=0\to\lambda=3-4 i\lor \lambda =3+4 i
\end{align}
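As a quick numerical sanity check of the characteristic polynomial (a sketch using NumPy, not part of the original hand computation):

```python
import numpy as np

# The matrix A from the problem statement.
A = np.array([[3.0, -4.0],
              [4.0, 3.0]])

# A is real but not symmetric, so its eigenvalues may be complex;
# here they form the conjugate pair 3 - 4i and 3 + 4i.
eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)
```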
Corresponding eigenvectors: for $\lambda = 3-4i$,
\begin{align}
Ax&=\lambda x \\
\begin{bmatrix} 3 & -4\\4 &3 \end{bmatrix}\begin{bmatrix}x_1 \\ x_2\end{bmatrix} &=\begin{bmatrix} 3x_1-4ix_1\\ 3x_2-4ix_2\end{bmatrix} \\
3x_1-4x_2 &= 3x_1-4ix_1 \\
x_2 &=ix_1\to x=\begin{bmatrix}1 \\ i \end{bmatrix}
\end{align}
For $\lambda=3+4i \to x=\begin{bmatrix}1 \\ -i\end{bmatrix}$. (Mathematica gave me $\begin{pmatrix} i & 1 \\ -i & 1 \end{pmatrix}$ for some reason…)
So according to the decomposition $A=PDP^{-1}$ it holds that $D=\begin{bmatrix}3-4i & 0 \\ 0 & 3+4i\end{bmatrix}$ and $P=\begin{bmatrix}1 & 1 \\ i & -i \end{bmatrix}= U$? Is this correct, or did I fumble somewhere?

Test:

\begin{align}
\begin{bmatrix}1 & 1 \\ i & -i \end{bmatrix}\begin{bmatrix}3-4i & 0 \\ 0 & 3+4i\end{bmatrix}\begin{bmatrix}1 & 1 \\ i & -i \end{bmatrix}^*&\overset{?}{=}A \\
\begin{bmatrix}3-4i & 3+4i \\ 4+3i & 4-3i\end{bmatrix}\begin{bmatrix}1 & -i\\ 1 & i\end{bmatrix}&=\begin{bmatrix} 6 & -8 \\ 8 & 6 \end{bmatrix} \neq A
\end{align}

So I fumbled. Hints?
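The failed test can be reproduced numerically (a sketch with NumPy; the variable names are mine): $P$ does diagonalize $A$ in the $P^{-1}$ sense, but $PP^* = 2I$ rather than $I$, so $PDP^*$ comes out as $2A$.

```python
import numpy as np

A = np.array([[3.0, -4.0],
              [4.0, 3.0]])

# Unnormalized eigenvector matrix P and diagonal matrix of eigenvalues D.
P = np.array([[1, 1],
              [1j, -1j]])
D = np.diag([3 - 4j, 3 + 4j])

# P does diagonalize A in the ordinary sense ...
assert np.allclose(np.linalg.inv(P) @ A @ P, D)

# ... but P is not unitary: each column has norm sqrt(2), so P P* = 2I,
# and therefore P D P* = 2A instead of A.
Pstar = P.conj().T
print(P @ Pstar)
print(P @ D @ Pstar)
```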

Best Answer

As we are looking for a unitary matrix $U$ which diagonalizes $A= \begin{pmatrix} 3& -4\\4&3\end{pmatrix}$ such that $U^*AU =D$, it is a good idea to look for the eigenvalues: we know that $P^{-1}A P = D$ is a diagonal matrix containing the eigenvalues whenever $P$ is an invertible matrix with the eigenvectors as columns. This looks similar to the required equation, but is not quite the same.

A unitary matrix satisfies $U^* U =U U^*=I$. If we denote by $u_1, \dots, u_n$ the columns of $U$, this can also be rephrased as

$$(u_i,u_i)= \overline u_i ^T u_i = 1$$ and $$(u_i,u_j)= \overline u_i ^T u_j = 0,$$

i.e. the column vectors of $U$ have norm $1$ and are orthogonal to each other. If this is the case, then $U^* = U^{-1}.$

In order to diagonalize $A$ with a unitary matrix, we need not just any matrix $P$ which diagonalizes $A$, but a unitary one.

Calculate the eigenvalues of $A$ as above: $\lambda_\pm = 3 \pm 4i$.

Solve $(A - \lambda_\pm I)v_\pm = \begin{pmatrix} \mp 4i& -4\\4&\mp 4i\end{pmatrix} v_\pm=0$ and get eigenvectors $v_+ = \begin{pmatrix} 1 \\-i \end{pmatrix}, v_- = \begin{pmatrix} 1 \\i \end{pmatrix}$.

If we put these as columns of a matrix, will the matrix be unitary? We check with the scalar product, which for complex vector spaces involves a complex conjugate (the lack of this might be the reason you end up with a norm of $0$).

$(v_+,v_-)= \overline{\begin{pmatrix} 1 & -i \end{pmatrix}}\begin{pmatrix} 1 \\i \end{pmatrix}= 1\cdot 1 + i\cdot i=1-1=0,$ which shows that $v_+$ and $v_-$ are indeed orthogonal.

But what is the norm of these vectors? We have $(v_+,v_+)= \overline{\begin{pmatrix} 1 & -i \end{pmatrix}}\begin{pmatrix} 1 \\-i \end{pmatrix}= 1\cdot 1 + i\cdot (-i)= 1 + 1 = 2$ and analogously $(v_-,v_-)=2$, so neither vector has norm $1$: both have norm $\sqrt{2}$. We can fix this by taking $u_1 = \frac{1}{\sqrt{2}}v_+, \ u_2 = \frac{1}{\sqrt{2}}v_-$ and defining $U$ as the matrix which contains these as columns.
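These inner-product computations for the two eigenvectors $(1,i)$ and $(1,-i)$ can be checked numerically (a sketch; note that `np.vdot` conjugates its first argument, matching the complex scalar product used here):

```python
import numpy as np

# The two eigenvectors of A found above.
v1 = np.array([1, 1j])
v2 = np.array([1, -1j])

# np.vdot conjugates its first argument: vdot(x, y) = conj(x)^T y.
orthogonality = np.vdot(v1, v2)   # zero: the vectors are orthogonal
norm_sq_1 = np.vdot(v1, v1)       # 2: the norm is sqrt(2), not 1
norm_sq_2 = np.vdot(v2, v2)       # also 2

print(orthogonality, norm_sq_1, norm_sq_2)
```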

This is a special choice of $P$ which diagonalizes $A$, but which is also unitary, so that we can use $P^*$ instead of $P^{-1}$.
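Putting it together, one can verify numerically that the normalized matrix is unitary and diagonalizes $A$ (a sketch; I take the columns in the order $(1,i)/\sqrt{2}$, $(1,-i)/\sqrt{2}$, which puts $3-4i$ first on the diagonal):

```python
import numpy as np

A = np.array([[3.0, -4.0],
              [4.0, 3.0]])

# Normalized eigenvectors as columns: (1, i)/sqrt(2) and (1, -i)/sqrt(2).
U = np.array([[1, 1],
              [1j, -1j]]) / np.sqrt(2)

# U is unitary: U* U = I ...
assert np.allclose(U.conj().T @ U, np.eye(2))

# ... and U* A U is diagonal, with the eigenvalues on the diagonal.
D = U.conj().T @ A @ U
print(D)
```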