Unitary matrices are precisely the matrices admitting a complete set of orthonormal eigenvectors such that the corresponding eigenvalues are on the unit circle. Hermitian matrices are precisely the matrices admitting a complete set of orthonormal eigenvectors such that the corresponding eigenvalues are real. So unitary Hermitian matrices are precisely the matrices admitting a complete set of orthonormal eigenvectors such that the corresponding eigenvalues are $\pm 1$.
This is a very strong condition. As George Lowther observes, any such matrix $M$ has the property that $P = \frac{M+I}{2}$ admits a complete set of orthonormal eigenvectors whose corresponding eigenvalues are $0$ and $1$; thus $P$ is a Hermitian idempotent, i.e. an orthogonal projection. Of course such matrices are interesting and appear naturally in mathematics, but it seems to me that in general it's more natural to start from the idempotence condition.
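The correspondence above is easy to check numerically. The sketch below (a Householder reflection is used as the sample Hermitian unitary; the choice is mine, not from the discussion) verifies that $M$ has eigenvalues $\pm 1$ and that $P = \frac{M+I}{2}$ is a Hermitian idempotent:

```python
import numpy as np

# A sample Hermitian unitary matrix: the Householder reflection
# M = I - 2 v v* for a unit vector v.
v = np.array([1.0, 2.0, 2.0]) / 3.0            # unit vector
M = np.eye(3) - 2.0 * np.outer(v, v.conj())

assert np.allclose(M, M.conj().T)               # Hermitian
assert np.allclose(M @ M.conj().T, np.eye(3))   # unitary
assert np.allclose(np.linalg.eigvalsh(M), [-1, 1, 1])  # eigenvalues are +-1

# P = (M + I)/2 is then a Hermitian idempotent, i.e. an orthogonal projection.
P = (M + np.eye(3)) / 2
assert np.allclose(P, P.conj().T)   # Hermitian
assert np.allclose(P @ P, P)        # idempotent
```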
I suppose one could say that Hermitian unitary matrices precisely describe unitary representations of the cyclic group $C_2$, but from this perspective the fact that such matrices happen to be Hermitian is an accident coming from the fact that $2$ is too small.
Start with the facts you know, i.e. that you have a 2-by-2 complex matrix $\begin{pmatrix}w&z\\c&d\end{pmatrix}$, such that when you multiply it by its adjoint $\begin{pmatrix}w^*&c^*\\z^*&d^*\end{pmatrix}$ you get $\begin{pmatrix}1&0\\0&1\end{pmatrix}$. That means you have $ww^*+zz^* = 1$, $cc^*+dd^* = 1$ and $cw^*+dz^*=0$. Don't forget the other way, so you get $ww^*+cc^* = 1$, $zz^*+dd^* = 1$ and $w^*z+c^*d=0$. With these equations you should notice something immediately about both $cc^*$ and $dd^*$. You can work from there.
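If you want to see these six relations in action before deriving anything, here is a quick numerical check on a random $2\times 2$ unitary matrix (built via a QR decomposition, which is just a convenient way to manufacture one), including the observation the hint is pointing at:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
Q, _ = np.linalg.qr(A)          # Q is unitary
(w, z), (c, d) = Q              # entries named as in the text

# Rows are orthonormal (from Q Q* = I) ...
assert np.isclose(w * w.conjugate() + z * z.conjugate(), 1)
assert np.isclose(c * c.conjugate() + d * d.conjugate(), 1)
assert np.isclose(c * w.conjugate() + d * z.conjugate(), 0)
# ... and so are the columns (from Q* Q = I).
assert np.isclose(w * w.conjugate() + c * c.conjugate(), 1)
assert np.isclose(z * z.conjugate() + d * d.conjugate(), 1)
assert np.isclose(w.conjugate() * z + c.conjugate() * d, 0)

# The immediate consequence: cc* = zz* and dd* = ww*.
assert np.isclose(abs(c) ** 2, abs(z) ** 2)
assert np.isclose(abs(d) ** 2, abs(w) ** 2)
```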
Best Answer
Any unitary matrix $\bf{U}$ can be written as $\exp(i\bf{H})$ where $\bf{H}$ is Hermitian. A brief explanation, where I use $\mathbf{A}^*$ to denote the conjugate transpose of a matrix $\bf{A}$.
$$\begin{align*}
\mathbf{U}\mathbf{U}^* &= \mathbf{I}\\
\exp(i\mathbf{H})\exp(i\mathbf{H})^* &= \exp(i\mathbf{H})\exp(-i\mathbf{H}^*) = \exp(i\mathbf{H})\exp(-i\mathbf{H})\\
\exp(i\mathbf{H})\exp(-i\mathbf{H}) &= \exp(i\mathbf{H}-i\mathbf{H}) = \exp(\mathbf{0}) = \mathbf{I}\\
\therefore\ \mathbf{U}\mathbf{U}^* &= \exp(i\mathbf{H})\exp(i\mathbf{H})^* \implies \exists\,\mathbf{H}:\mathbf{U}=\exp(i\mathbf{H})
\end{align*}$$
Now the last step, the "implies" part, is weak: this is not a rigorous proof, only a demonstration that the claim is consistent. It is in fact true for every unitary matrix (see Wikipedia), and the rigorous route is the spectral theorem: a unitary $\mathbf{U}$ has an orthonormal eigenbasis with eigenvalues $e^{i\theta_k}$ on the unit circle, and the Hermitian matrix $\mathbf{H}$ with the same eigenvectors and real eigenvalues $\theta_k$ satisfies $\mathbf{U}=\exp(i\mathbf{H})$.
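That spectral-theorem construction can be carried out numerically with nothing but numpy. A sketch (the random unitary and variable names are my own; for a generic unitary the eigenvalues are distinct, so `np.linalg.eig` returns an essentially orthonormal eigenbasis):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
U, _ = np.linalg.qr(A)              # a random unitary matrix

eigvals, V = np.linalg.eig(U)       # eigenvalues e^{i theta} on the unit circle
assert np.allclose(np.abs(eigvals), 1)

theta = np.angle(eigvals)
H = V @ np.diag(theta) @ V.conj().T  # same eigenvectors, real eigenvalues theta
assert np.allclose(H, H.conj().T)    # H is Hermitian

# exp(iH) computed through the same eigenbasis recovers U:
expiH = V @ np.diag(np.exp(1j * theta)) @ V.conj().T
assert np.allclose(expiH, U)
```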
The conjugate transpose of $\bf{H}$ is by definition just $\bf{H}$ itself, which explains one step. The first step holds because, applying the conjugate transpose term by term to the power series of the matrix exponential, $\exp(i\mathbf{H})^*=\exp((i\mathbf{H})^*)=\exp(-i\mathbf{H}^*)$, using properties of the conjugate such as $\overline{a+b}=\overline{a}+\overline{b}$.
VERY importantly, note that I can only combine the product of two exponentials into a single exponential in the second line because $i\bf{H}$ and $-i\bf{H}$ commute with one another. In general, $\exp(\mathbf{A}+\mathbf{B})\neq\exp(\mathbf{A})\exp(\mathbf{B})$ when $\bf A$ and $\bf B$ don't commute.
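A concrete non-commuting counterexample is easy to exhibit. The sketch below uses a truncated Taylor series for the matrix exponential (a simplification that is perfectly adequate for these small matrices) and the standard pair $\mathbf{A}=\begin{pmatrix}0&1\\0&0\end{pmatrix}$, $\mathbf{B}=\begin{pmatrix}0&0\\1&0\end{pmatrix}$:

```python
import numpy as np

def expm_taylor(X, terms=30):
    """Matrix exponential via truncated Taylor series (fine for small matrices)."""
    out = np.eye(X.shape[0], dtype=complex)
    term = np.eye(X.shape[0], dtype=complex)
    for k in range(1, terms):
        term = term @ X / k
        out = out + term
    return out

A = np.array([[0, 1], [0, 0]], dtype=complex)
B = np.array([[0, 0], [1, 0]], dtype=complex)
assert not np.allclose(A @ B, B @ A)   # A and B do not commute

lhs = expm_taylor(A + B)               # exp(A + B)
rhs = expm_taylor(A) @ expm_taylor(B)  # exp(A) exp(B)
assert not np.allclose(lhs, rhs)       # the two differ here
```

For these particular matrices one can check by hand that $\exp(\mathbf{A})\exp(\mathbf{B})=\begin{pmatrix}2&1\\1&1\end{pmatrix}$ while $\exp(\mathbf{A}+\mathbf{B})$ has entries built from $\cosh 1$ and $\sinh 1$.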