[Math] Geometric interpretation of normal and anti-Hermitian matrices

linear-algebra, matrices

How do I interpret the following types of matrices as special types of transformations?
I mean, what are the transformation properties of the following types of matrices, viewed as maps from $\mathbb{R}^n$ to $\mathbb{R}^n$ or from $\mathbb{C}^n$ to $\mathbb{C}^n$?

Normal and anti-Hermitian matrices?

ADDED

I expect an answer something like the following one for orthogonal matrices. Quoting Wikipedia:
As a linear transformation, an orthogonal matrix preserves the dot product of vectors, and therefore acts as an isometry of Euclidean space, such as a rotation or reflection. In other words, it is a unitary transformation.

ADDED

I managed to collect some more information:
Unitary matrices, as linear maps, preserve inner products, though perhaps not only over the reals, as orthogonal ones do (???).

Best Answer

Let $V$ be a finite-dimensional complex inner product space.

  1. By the finite-dimensional spectral theorem, one has that $T \in L(V) := L(V,V)$ is normal if and only if there exists an orthonormal basis for $V$ consisting of eigenvectors of $T$. Geometrically, this means that $T$ acts on each individual coordinate axis (with respect to this orthonormal basis) by rescaling by a complex scaling factor (i.e., the eigenvalue for the eigenvector spanning that axis). In other words, $T$ is normal if and only if there exists some orthonormal coordinate system for $V$ such that $T$ fixes each coordinate axis, i.e., for each coordinate axis $\mathbb{C} v_k$, $T(\mathbb{C} v_k) \subseteq \mathbb{C} v_k$; see the first worked example following this list.

  2. A $1$-parameter group of unitaries is a map $U : \mathbb{R} \to U(V)$, the group of unitaries in $L(V)$, such that $U(0) = I$ and $U(t_1+t_2) = U(t_1)U(t_2)$ for any $t_1$, $t_2$; geometrically, given a fixed orthonormal basis defining your initial orthonormal coordinate system, you can view $U$ as defining a time-dependent orthonormal coordinate system on $V$.

    So, let $U : \mathbb{R} \to U(V)$ be a $1$-parameter group of unitaries. Observe that for $t \in \mathbb{R}$ and $h \neq 0$, $$ \frac{1}{h}(U(t+h)-U(t)) = \left(\frac{1}{h}\left(U(h)-I\right)\right)U(t), $$ so that $U$ is differentiable at $t$ if and only if it is differentiable at $0$. So, suppose that $U$ is differentiable at $0$. Then, since each $U(h)$ is unitary, so that $U(h)^\ast = U(h)^{-1} = U(-h)$, $$ (U^\prime(0))^\ast = \left(\lim_{h \to 0}\frac{1}{h}\left(U(h)-I\right)\right)^\ast = \lim_{h\to 0}\frac{1}{h}\left(U(-h)-I\right) = -\lim_{k\to 0} \frac{1}{k} \left(U(k)-I\right) = -U^\prime(0), $$ so that $U(t)$ satisfies the ODE $$ U^\prime(t) = A U(t), $$ where $A := U^\prime(0)$ is skew-adjoint. Thus, from a geometric standpoint, skew-adjoint operators are precisely the infinitesimal generators of differentiable time-dependent orthonormal coordinate systems; see the second worked example following this list.

    In fact, bearing in mind that $T$ is skew-adjoint if and only if $iT$ is self-adjoint (Hermitian), and that in mathematical and theoretical physics one generally writes $e^{iA}$ instead of $e^A$, this is essentially why self-adjoint operators (on infinite-dimensional Hilbert spaces!) become so important in quantum mechanics: there, time evolution (in the so-called Heisenberg picture) is effected by a $1$-parameter group of unitaries whose infinitesimal generator is the all-important Hamiltonian of the quantum mechanical system.
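
For a small concrete illustration of point 1, take for instance the matrix $$ T = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \in L(\mathbb{C}^2). $$ One checks directly that $T^\ast T = T T^\ast = I$, so $T$ is normal, and that $$ v_\pm = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ \mp i \end{pmatrix} $$ form an orthonormal basis of eigenvectors with $T v_\pm = \pm i\, v_\pm$. So, in the coordinate system given by $(v_+, v_-)$, the action of $T$ is simply to rescale the axis $\mathbb{C} v_+$ by $i$ and the axis $\mathbb{C} v_-$ by $-i$.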
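
Similarly, for point 2, perhaps the simplest $1$-parameter group of unitaries on $\mathbb{C}^2$ is $$ U(t) = \begin{pmatrix} e^{i\theta_1 t} & 0 \\ 0 & e^{i\theta_2 t} \end{pmatrix}, \qquad \theta_1, \theta_2 \in \mathbb{R}, $$ which indeed satisfies $U(0) = I$ and $U(t_1+t_2) = U(t_1)U(t_2)$. Its infinitesimal generator is the skew-adjoint matrix $$ A = U^\prime(0) = \begin{pmatrix} i\theta_1 & 0 \\ 0 & i\theta_2 \end{pmatrix}, $$ so that $U(t) = e^{tA}$, and each coordinate axis $\mathbb{C} e_k$ rotates within itself at angular speed $\theta_k$.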

Now, suppose that $V$ is a finite-dimensional real inner product space.

  1. By the finite-dimensional spectral theorem, one has that $T \in L(V)$ is symmetric (i.e., self-adjoint) if and only if there exists an orthonormal basis for $V$ consisting of eigenvectors of $T$. In other words, $T$ is symmetric if and only if there exists some orthonormal coordinate system for $V$ such that $T$ fixes each coordinate axis, i.e., for each coordinate axis $\mathbb{R}v_k$, $T(\mathbb{R}v_k) \subseteq \mathbb{R} v_k$; see the first example at the end of this answer. Observe that the notion of normal operator is no longer nearly as useful as in the complex case, since the finite-dimensional spectral theorem now characterizes precisely the symmetric operators. In particular, even though orthogonal operators are normal, examples of orthogonal operators with non-real complex eigenvalues are plentiful.

  2. When it comes to skew-symmetric (i.e., skew-adjoint) operators, everything carries over exactly as before, except that we now find that they are precisely the infinitesimal generators of $1$-parameter groups of orthogonal operators, viz., time-dependent orthonormal coordinate systems; see the second example at the end of this answer. Note, however, that skew-symmetric and symmetric operators are now fundamentally different, since we can no longer multiply by $i$ to pass between skew-adjoint and self-adjoint operators, as we could in the complex case. In particular, symmetric operators necessarily have real eigenvalues, whilst examples of skew-symmetric operators with non-real complex eigenvalues are absolutely plentiful.
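
For a small concrete illustration of point 1 in the real case, the symmetric matrix $$ T = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} $$ has the orthonormal eigenbasis $v_1 = \tfrac{1}{\sqrt{2}}(1,1)^T$, $v_2 = \tfrac{1}{\sqrt{2}}(1,-1)^T$ with $T v_1 = 3 v_1$ and $T v_2 = v_2$: it stretches the axis $\mathbb{R} v_1$ by a factor of $3$ and leaves the axis $\mathbb{R} v_2$ pointwise fixed. By contrast, the orthogonal (hence normal) matrix $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$, rotation by $\pi/2$, has eigenvalues $\pm i$ and therefore fixes no real coordinate axis.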
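
And for point 2, the skew-symmetric matrix $$ A = \begin{pmatrix} 0 & -\theta \\ \theta & 0 \end{pmatrix}, \qquad \theta \in \mathbb{R}, $$ generates the $1$-parameter group of orthogonal operators $$ e^{tA} = \begin{pmatrix} \cos \theta t & -\sin \theta t \\ \sin \theta t & \cos \theta t \end{pmatrix}, $$ i.e., rotation of the plane through the angle $\theta t$; note also that $A$ itself has the non-real eigenvalues $\pm i\theta$ whenever $\theta \neq 0$, in line with the last remark above.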