[Math] Determining whether matrix is self-adjoint

linear algebra

I have three matrices and I am trying to determine whether each of them represents a self-adjoint linear transformation in some basis on an inner product space. The matrices are:

$$A_1 = \left( \begin{array}{ccc}
2 & 0 & 0 \\
0 & 1 & 1 \\
0 & 0 & 0 \end{array} \right)
A_2 = \left( \begin{array}{ccc}
2 & 0 & 0 \\
0 & 1 & 1 \\
0 & 0 & 1 \end{array} \right)
A_3 = \left( \begin{array}{ccc}
2 & 0 & 0 \\
0 & 1 & 1 \\
0 & 0 & 2 \end{array} \right)
$$

Now for the second one, $A_2$, I can see immediately that its eigenvectors span only a two-dimensional subspace, so it cannot be self-adjoint. For the other two I am quite lost, though.
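Concretely, for the repeated eigenvalue $1$ of $A_2$:

$$A_2 - I = \left( \begin{array}{ccc}
1 & 0 & 0 \\
0 & 0 & 1 \\
0 & 0 & 0 \end{array} \right), \qquad \ker(A_2 - I) = \operatorname{span}\{e_2\},$$

so the eigenvalue $1$ has algebraic multiplicity $2$ but geometric multiplicity $1$; together with $\operatorname{span}\{e_1\}$ for the eigenvalue $2$, the eigenvectors span only a two-dimensional subspace.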

I know that in an orthonormal basis the matrix of a self-adjoint transformation equals its own conjugate transpose, but here we have no idea what the basis or the inner product is, so I'm not sure how we can tell.

Any help is very much appreciated!

Best Answer

You want to use the following fact:

A matrix $A$ represents a self-adjoint linear transformation in some basis on an inner product space if and only if $A$ is real diagonalizable, i.e. diagonalizable with all eigenvalues real.
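As a quick sanity check, here is a minimal SymPy sketch (the matrix names are just the ones from the question; `is_diagonalizable(reals_only=True)` tests precisely this condition):

```python
from sympy import Matrix

# The three matrices from the question
A1 = Matrix([[2, 0, 0], [0, 1, 1], [0, 0, 0]])
A2 = Matrix([[2, 0, 0], [0, 1, 1], [0, 0, 1]])
A3 = Matrix([[2, 0, 0], [0, 1, 1], [0, 0, 2]])

for name, A in [("A1", A1), ("A2", A2), ("A3", A3)]:
    # reals_only=True requires diagonalizability with real eigenvalues,
    # which is exactly the condition in the fact above
    print(name, A.is_diagonalizable(reals_only=True))
```

This should print `True` for $A_1$ and $A_3$ (distinct eigenvalues $2, 1, 0$, and a full two-dimensional eigenspace for the eigenvalue $2$, respectively) and `False` for $A_2$.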

To prove the forward direction, notice that if $A$ represents a self-adjoint linear transformation, then $A = X B X^{-1}$ for some invertible $X$, where $B$ is the matrix of the transformation in an orthonormal basis and is therefore self-adjoint with respect to the standard inner product. By the spectral theorem, $B$ is real diagonalizable, and hence so is $A$.
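Spelling this out in the real case: $B = B^T$, so the spectral theorem gives an orthogonal $Q$ and a real diagonal $\Lambda$ with $B = Q \Lambda Q^T$, and therefore

$$A = X B X^{-1} = (XQ) \Lambda (XQ)^{-1},$$

which exhibits $A$ as real diagonalizable.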

For the reverse direction, if $A = X \Lambda X^{-1}$, where $\Lambda$ is a real diagonal matrix, then $\Lambda$ is self-adjoint with respect to the standard inner product, and $\Lambda$ is precisely the matrix of the transformation $x \mapsto Ax$ in the basis given by the columns of $X$.
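Equivalently, one can write down an inner product that makes $A$ itself self-adjoint: define $\langle u, v \rangle := (X^{-1}u)^T (X^{-1}v)$. Since $X^{-1}A = \Lambda X^{-1}$ and $\Lambda^T = \Lambda$,

$$\langle Au, v \rangle = (\Lambda X^{-1}u)^T (X^{-1}v) = (X^{-1}u)^T \Lambda (X^{-1}v) = (X^{-1}u)^T (X^{-1}Av) = \langle u, Av \rangle,$$

so $A$ is self-adjoint with respect to $\langle \cdot, \cdot \rangle$.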
