Well, one trivial connection is that if you look at $1\times 1$ matrices (which have only a single complex entry), you'll find that such a matrix is real iff it is Hermitian, that its complex conjugate is its conjugate transpose, and that its polar decomposition is exactly the polar form of the complex number.
Also, just like a complex number can be uniquely decomposed into a real and an imaginary part ($z = a+\mathrm ib$ with real $a,b$), a complex matrix can be uniquely decomposed into a Hermitian and an "anti-Hermitian" part, i.e., $M = A + \mathrm iB$ with $A$ and $B$ Hermitian. And just like $\Re(z)=\frac12(z+\bar z)$ and $\Im(z)=\frac1{2\mathrm i}(z-\bar z)$, the Hermitian part of a matrix is $\frac12(M+M^*)$ and the "anti-Hermitian" part is $\frac1{2\mathrm i}(M-M^*)$.
Moreover, just like $\bar zz$ is a non-negative real number, $M^*M$ is a positive semidefinite matrix.
Another point: Hermitian matrices have real eigenvalues, and unitary matrices have eigenvalues of the form $\mathrm e^{\mathrm i\phi}$.
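All of these mirrored identities are easy to check numerically. Here is a quick numpy sanity check (my own illustration) of the decomposition, the positive semidefiniteness of $M^*M$, and the two eigenvalue facts:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

# Hermitian and "anti-Hermitian" parts: M = A + iB with A, B Hermitian
A = (M + M.conj().T) / 2
B = (M - M.conj().T) / (2j)
assert np.allclose(A, A.conj().T) and np.allclose(B, B.conj().T)
assert np.allclose(M, A + 1j * B)

# M*M is positive semidefinite: Hermitian with non-negative eigenvalues
P = M.conj().T @ M
assert np.allclose(P, P.conj().T)
assert np.all(np.linalg.eigvalsh(P) >= -1e-12)

# Hermitian matrices have real eigenvalues
assert np.allclose(np.linalg.eigvals(A).imag, 0, atol=1e-10)

# Unitary matrices (here obtained from a QR factorization) have
# eigenvalues of the form e^{i phi}, i.e. of modulus 1
Q, _ = np.linalg.qr(M)
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1)
```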
About the usefulness of the analogy:
In classical physics, observables should be real. In quantum physics, observables are represented by Hermitian matrices. Also, the quantum analogue to probability densities, which are non-negative functions with integral $1$, are density operators, which are positive semidefinite matrices with trace $1$. So there's indeed some connection.
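As a small illustration of that last point (my own example, not part of the answer above): a convex mixture of pure states gives a density operator, and the three defining properties can be checked directly with numpy:

```python
import numpy as np

rng = np.random.default_rng(1)

# A density operator: convex mixture of rank-one projectors |psi><psi|
psis = [rng.normal(size=3) + 1j * rng.normal(size=3) for _ in range(2)]
psis = [p / np.linalg.norm(p) for p in psis]
probs = [0.7, 0.3]
rho = sum(p * np.outer(psi, psi.conj()) for p, psi in zip(probs, psis))

assert np.allclose(rho, rho.conj().T)             # Hermitian
assert np.all(np.linalg.eigvalsh(rho) >= -1e-12)  # positive semidefinite
assert np.isclose(np.trace(rho).real, 1.0)        # trace 1, like integral 1
```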
Hint: here is how it works for a $2 \times 2$ matrix.
Let $A = \left(
\begin{array}{cc}
a & 0 \\
0 & b \\
\end{array}
\right)$
be a diagonal matrix with complex entries. Its eigenvalues are precisely $a$ and $b$. Because $A$ is Hermitian, they must be real; because $A$ is also unitary, they must each have absolute value $1$. There are exactly four matrices satisfying these conditions:
Let $A_1 = \left(
\begin{array}{cc}
1 & 0 \\
0 & 1 \\
\end{array}
\right)$, $A_2 = \left(
\begin{array}{cc}
1 & 0 \\
0 & -1 \\
\end{array}
\right)$, $A_3 = \left(
\begin{array}{cc}
-1 & 0 \\
0 & 1 \\
\end{array}
\right)$, $A_4 = \left(
\begin{array}{cc}
-1 & 0 \\
0 & -1 \\
\end{array}
\right)$
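A quick brute-force check of this count (sampling unit-modulus diagonal entries from the 8th roots of unity, my own choice for illustration): only the real entries $\pm1$ survive both conditions, giving exactly these four matrices.

```python
import numpy as np
from itertools import product

# Candidate diagonal entries: the 8th roots of unity (all have modulus 1)
candidates = [np.exp(1j * np.pi * k / 4) for k in range(8)]

solutions = []
for a, b in product(candidates, repeat=2):
    A = np.diag([a, b])
    hermitian = np.allclose(A, A.conj().T)            # forces real entries
    unitary = np.allclose(A @ A.conj().T, np.eye(2))  # forces modulus 1
    if hermitian and unitary:
        solutions.append((a, b))

# Exactly the four sign patterns (+1,+1), (+1,-1), (-1,+1), (-1,-1) survive
assert len(solutions) == 4
assert all(np.isclose(abs(x.real), 1) and np.isclose(x.imag, 0)
           for ab in solutions for x in ab)
```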
I hope this may help you.
Okay, I think I have managed to solve these questions.
We can have a skew-Hermitian matrix of any rank (thanks Jyrki); we just take a nonzero element $\alpha \in \mathbb{F}_{q^{2}}$ such that the trace $T(\alpha) = \alpha+\alpha^{q} = 0$ (the trace map is a surjective $\mathbb{F}_{q}$-linear map onto $\mathbb{F}_{q}$, so its kernel has $q$ elements and in particular contains nonzero ones). Then if we want an $n \times n$ skew-Hermitian matrix with rank $k$ we just take $$C = \alpha \begin{bmatrix} I_{k} & 0\\0&0 \end{bmatrix}.$$
If the characteristic of the field is $2$, we can take $\alpha = 1$.
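To make this concrete, here is a small Python model of the smallest odd-characteristic case, $\mathbb{F}_9 = \mathbb{F}_3[i]$ with $i^2 = -1$ (the specific field and the choice $\alpha = i$ are mine, purely for illustration). It checks that $T(\alpha) = 0$, that $N(\alpha) = 1$, and that $C = \alpha\,\mathrm{diag}(I_k, 0)$ is skew-Hermitian of rank $k$:

```python
# Minimal arithmetic in F_9 = F_3[i], i^2 = -1; elements are pairs (a, b) ~ a + b*i.
# The map x -> x^q (q = 3) is the Frobenius automorphism, sending a + b*i to a - b*i.
P = 3

def mul(x, y):
    a, b = x; c, d = y
    return ((a * c - b * d) % P, (a * d + b * c) % P)

def conj(x):          # x^q, the Frobenius automorphism of F_{q^2} over F_q
    a, b = x
    return (a, (-b) % P)

def trace(x):         # T(x) = x + x^q
    cx = conj(x)
    return ((x[0] + cx[0]) % P, (x[1] + cx[1]) % P)

def norm(x):          # N(x) = x^{q+1} = x * x^q
    return mul(x, conj(x))

alpha = (0, 1)        # alpha = i: trace is i + i^3 = i - i = 0, norm is i^4 = 1
assert trace(alpha) == (0, 0)
assert norm(alpha) == (1, 0)

# C = alpha * diag(I_k, 0) is skew-Hermitian of rank k: conjugate-transposing the
# block-diagonal matrix just conjugates alpha, and conj(alpha) = -alpha.
n, k = 4, 2
zero = (0, 0)
C = [[alpha if (r == c and r < k) else zero for c in range(n)] for r in range(n)]
Cstar = [[conj(C[c][r]) for c in range(n)] for r in range(n)]
negC = [[((-a) % P, (-b) % P) for (a, b) in row] for row in C]
assert Cstar == negC  # C^* = -C
```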
I'm going to cop out and ignore my question about what happens when the diagonal is required to be zero. I mentioned this because, in dealing with bilinear forms, a zero diagonal is automatic for a skew-symmetric matrix in odd characteristic, but has to be required explicitly in even characteristic for the bilinear form to define a symplectic geometry. Nothing like this comes up in the skew-Hermitian situation.
I will approach the second problem by first considering Hermitian matrices. Let $C$ be an $n \times n$ Hermitian matrix with entries in $\mathbb{F}_{q^{2}}$ having rank $k$ (WLOG assume $k \geq 1$). I want to show that there is an invertible matrix $A$ such that $$ ACA^{*} = \begin{bmatrix} I_{k} & 0\\0 & 0 \end{bmatrix}$$ (block matrix representation where the bottom right block is $(n-k)\times (n-k)$). The key here (and the motivation for the question) is to think of $C$ as representing a Hermitian bilinear form $B(u,v) = u C v^{*}$; then the matrix $A C A^{*}$ represents this bilinear form with respect to a different basis (the basis vectors being the rows of $A$). The collection of vectors $u$ such that $u C v^{*} = 0$ for all $v \in V = \mathbb{F}_{q^{2}}^{n}$ is a subspace called the radical, which we call $W$. It is clear that $W = \mathrm{null}(C)$, so $\dim(W) = n-k$; let $\{w_{1}, \ldots, w_{n-k}\}$ be a basis for $W$. Then $V = V_{0} \oplus W$ for some subspace $V_{0} \leq V$ for which the Hermitian form restricted to $V_{0}$ is nondegenerate (and $\dim(V_{0}) = k$).
Now our bilinear form is nondegenerate on $V_{0}$, so we can find a vector $v \in V_{0}$ with $vCv^{*} = c \neq 0$. It can also be seen that $c = c^{q}$ (because $(vCv^{*})^{*} = vCv^{*}$), therefore $c \in \mathbb{F}_{q}$. The norm map $N:\mathbb{F}_{q^{2}} \to \mathbb{F}_{q}$ is surjective, so there exists an $\omega \in \mathbb{F}_{q^{2}}$ with $N(\omega) = \omega^{q+1} = c$, and so putting $v_{1} = \frac{1}{\omega}v$ we have $v_{1}Cv_{1}^{*} = 1$, and $V_{0} = \langle v \rangle \oplus \langle v \rangle^{\perp}$. Using induction on $\dim(V_{0})=k$ lets us find a basis $\{v_{1}, \ldots, v_{k}\}$ with $v_{i}Cv_{i}^{*} = 1$ for all $i$, and $v_{i} C v_{j}^{*} = 0$ for all $i \neq j$. Now finally we can put $$A = \begin{bmatrix} \phantom{\ldots \ldots} v_{1} \phantom{\ldots \ldots} \\ \vdots \\ v_{k} \\ w_{1} \\ \vdots \\ w_{n-k} \end{bmatrix}$$ and we have $$ACA^{*} = \begin{bmatrix} I_{k} & 0 \\ 0 & 0 \end{bmatrix}.$$
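As a sanity check on this diagonalization (my own toy example, not part of the argument above), the following Python sketch works in $\mathbb{F}_9 = \mathbb{F}_3[i]$, $i^2 = -1$, $q = 3$, where conjugation $x \mapsto x^q$ sends $a+bi$ to $a-bi$. It brute-forces a change of basis $A$ with $ACA^{*} = I_2$ for a rank-$2$ Hermitian matrix $C$:

```python
from itertools import product

# Arithmetic in F_9 = F_3[i], i^2 = -1; elements are pairs (a, b) ~ a + b*i.
P = 3
ELEMS = [(a, b) for a in range(P) for b in range(P)]

def add(x, y): return ((x[0]+y[0]) % P, (x[1]+y[1]) % P)
def mul(x, y): return ((x[0]*y[0]-x[1]*y[1]) % P, (x[0]*y[1]+x[1]*y[0]) % P)
def conj(x):   return (x[0], (-x[1]) % P)   # Frobenius x -> x^3

def mat_mul(X, Y):
    return [[add(mul(X[r][0], Y[0][c]), mul(X[r][1], Y[1][c]))
             for c in range(2)] for r in range(2)]

def ctrans(X):  # conjugate transpose
    return [[conj(X[c][r]) for c in range(2)] for r in range(2)]

one, zero, i, neg_i = (1, 0), (0, 0), (0, 1), (0, 2)

# A rank-2 Hermitian matrix over F_9: C = [[1, i], [-i, 2]], det = 1
C = [[one, i], [neg_i, (2, 0)]]
assert ctrans(C) == C

# Brute-force a basis change A with A C A^* = I_2 (A is then automatically
# invertible, since the right-hand side is)
I2 = [[one, zero], [zero, one]]
found = None
for entries in product(ELEMS, repeat=4):
    A = [list(entries[:2]), list(entries[2:])]
    if mat_mul(mat_mul(A, C), ctrans(A)) == I2:
        found = A
        break

assert found is not None  # C is congruent to the identity, as the argument predicts
```

Brute force is feasible only because the field and the matrices are tiny; the inductive construction above is what scales.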
We will now prove the analogous result for skew-Hermitian matrices. Take a nonzero $\alpha \in \mathbb{F}_{q^{2}}$ such that $T(\alpha) = \alpha^{q}+\alpha = 0$. (We cannot in general normalize $N(\alpha) = \alpha^{q+1} = 1$: when $q \equiv 1 \pmod 4$, no nonzero trace-zero element has norm $1$, so we keep track of $c = N(\alpha) \in \mathbb{F}_{q}^{\times}$ instead.) Let $C$ be skew-Hermitian, so $C^{*} = -C$. We want to show that there exists an invertible matrix $A$ such that $$ACA^{*} = \alpha \begin{bmatrix} I_{k} & 0\\0&0\end{bmatrix},$$ where $k$ is the rank of $C$. Now $(\alpha^{q} C)^{*} = \alpha^{q^{2}} C^{*} = \alpha C^{*} = -\alpha C = \alpha^{q} C$ (the last equality because $\alpha^{q} = -\alpha$), that is, $\alpha^{q} C$ is Hermitian, so by the result above there exists an invertible matrix $B$ such that $$B (\alpha^{q} C) B^{*} = \alpha^{q} B C B^{*} = \begin{bmatrix} I_{k} & 0 \\ 0 & 0 \end{bmatrix}.$$ Multiplying both sides by $\alpha$ we have $$c \, B C B^{*} = \alpha \begin{bmatrix} I_{k} & 0 \\ 0 & 0 \end{bmatrix}, \qquad \text{i.e.} \qquad B C B^{*} = \frac{\alpha}{c} \begin{bmatrix} I_{k} & 0 \\ 0 & 0 \end{bmatrix}.$$ Finally, since the norm map is surjective there is a $\lambda \in \mathbb{F}_{q^{2}}$ with $N(\lambda) = \lambda^{q+1} = c$, and setting $A = \lambda B$ gives $$ACA^{*} = \lambda \, B C B^{*} \lambda^{q} = N(\lambda)\,\frac{\alpha}{c} \begin{bmatrix} I_{k} & 0 \\ 0 & 0 \end{bmatrix} = \alpha \begin{bmatrix} I_{k} & 0 \\ 0 & 0 \end{bmatrix}.$$
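The reduction above can be checked on a small case (again my own example, in $\mathbb{F}_9 = \mathbb{F}_3[i]$ with $\alpha = i$, which has trace $0$ and norm $1$): the sketch below verifies that $\alpha^{q} C$ is Hermitian for a concrete rank-$1$ skew-Hermitian $C$, then brute-forces an invertible $A$ with $ACA^{*} = \alpha\,\mathrm{diag}(1, 0)$.

```python
from itertools import product

# Arithmetic in F_9 = F_3[i], i^2 = -1; elements are pairs (a, b) ~ a + b*i, q = 3.
P = 3
ELEMS = [(a, b) for a in range(P) for b in range(P)]

def add(x, y): return ((x[0]+y[0]) % P, (x[1]+y[1]) % P)
def neg(x):    return ((-x[0]) % P, (-x[1]) % P)
def mul(x, y): return ((x[0]*y[0]-x[1]*y[1]) % P, (x[0]*y[1]+x[1]*y[0]) % P)
def conj(x):   return (x[0], (-x[1]) % P)   # Frobenius x -> x^3

def mat_mul(X, Y):
    return [[add(mul(X[r][0], Y[0][c]), mul(X[r][1], Y[1][c]))
             for c in range(2)] for r in range(2)]

def ctrans(X):
    return [[conj(X[c][r]) for c in range(2)] for r in range(2)]

def scale(s, X):
    return [[mul(s, e) for e in row] for row in X]

def det(X):
    return add(mul(X[0][0], X[1][1]), neg(mul(X[0][1], X[1][0])))

one, zero, alpha = (1, 0), (0, 0), (0, 1)

# A rank-1 skew-Hermitian matrix: C = [[i, 1], [-1, i]]
C = [[alpha, one], [neg(one), alpha]]
assert ctrans(C) == [[neg(e) for e in row] for row in C]  # C^* = -C

# alpha^q * C = -i * C is Hermitian, as claimed
H = scale(conj(alpha), C)   # alpha^q = conj(alpha) = -i
assert ctrans(H) == H

# Brute-force an invertible A with A C A^* = alpha * diag(1, 0)
target = [[alpha, zero], [zero, zero]]
found = None
for entries in product(ELEMS, repeat=4):
    A = [list(entries[:2]), list(entries[2:])]
    if det(A) != zero and mat_mul(mat_mul(A, C), ctrans(A)) == target:
        found = A
        break

assert found is not None  # C is congruent to alpha * diag(I_k, 0) with k = 1
```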