This answer addresses the first question.
A real-valued, symmetric matrix is positive-semidefinite if and only if all its eigenvalues are nonnegative; $0$ counts, no matter how many times it occurs.
Also, positive-semidefiniteness of the Hessian everywhere is equivalent to convexity; see this Math.SE question. Hessian matrices of real-valued $C^2$ functions are symmetric. So $0$ eigenvalues count in this equivalence too; if your computation were correct, the first function would be convex.
However, your first eigenvalue computation is incorrect. Indeed, we see that
$$H \begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} -1 \\ 1 \end{pmatrix} = (-1) \begin{pmatrix} 1 \\ -1 \end{pmatrix},$$
so there is an eigenvalue of $-1$. Also, there is an eigenvalue of $1$, since
$$H \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
Since a $2 \times 2$ matrix has at most two distinct eigenvalues, this is the complete list: the eigenvalues are $\pm 1$.
[How did I magically create these eigenvectors? Well, I saw that $H^2 = I$, so I figured the constants were nice, and since the matrix looks like a reflection over the axis $x_1 = x_2$ (i.e., changing the roles of $x_1$ and $x_2$), I figured the lines $x_2 = x_1$ and $x_2 = -x_1$ would be relevant.]
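The eigenvector checks above can be verified numerically. This sketch assumes the Hessian in question is $H = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$, which is consistent with the products shown (it swaps $x_1$ and $x_2$, and $H^2 = I$):

```python
import numpy as np

# Hessian implied by the products above (assumption: H swaps x1 and x2)
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# H (1, -1)^T = (-1, 1)^T = (-1) (1, -1)^T  ->  eigenvalue -1
v_minus = np.array([1.0, -1.0])
assert np.allclose(H @ v_minus, -1.0 * v_minus)

# H (1, 1)^T = (1, 1)^T  ->  eigenvalue +1
v_plus = np.array([1.0, 1.0])
assert np.allclose(H @ v_plus, 1.0 * v_plus)

# numpy agrees: the eigenvalues are -1 and +1
assert np.allclose(np.linalg.eigvalsh(H), [-1.0, 1.0])
```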
Regarding the "usual" method of finding the eigenvalues via the characteristic polynomial: the equation $\det (\lambda I - H ) = 0$ becomes
$$\lambda^2 - 1 = 0.$$
Did you make a computational error here?
In any event, with the correct computation, the first function is not convex. [Geometrically, if you are familiar with convexity of one-variable functions: setting $x_2 = -x_1$ gives $g(x_1) \overset{\text{def}}{=} f(x_1, -x_1) = - x_1^2$, which is "concave down."]
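A minimal numerical check of this one-dimensional restriction, assuming the first function is $f(x_1, x_2) = x_1 x_2$ (my guess, consistent with $g(x_1) = -x_1^2$ above):

```python
import numpy as np

# Assumed first function, consistent with f(x1, -x1) = -x1^2 above
def f(x1, x2):
    return x1 * x2

t = np.linspace(-2.0, 2.0, 9)
g = f(t, -t)               # restriction to the line x2 = -x1
assert np.allclose(g, -t**2)

# Midpoint convexity fails: a convex f would satisfy
# f(midpoint) <= average of the endpoint values.
a, b = np.array([1.0, -1.0]), np.array([-1.0, 1.0])
mid = 0.5 * (a + b)
lhs = f(*mid)                    # f(0, 0) = 0
rhs = 0.5 * (f(*a) + f(*b))      # average of f(1,-1) and f(-1,1), i.e. -1
assert lhs > rhs                 # convexity inequality violated
```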
P.S. For diagonalizable matrices (including real symmetric matrices), the determinant is the product of the eigenvalues. Since the determinant of your second matrix is $0$ (I think), it must have at least one zero eigenvalue. You should double-check that the other eigenvalue is positive.
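To illustrate that determinant argument on a stand-in matrix (the actual second Hessian isn't reproduced here, so the matrix below is hypothetical), take the rank-one symmetric matrix $\begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$: its determinant is $0$, forcing a zero eigenvalue, and the other eigenvalue equals the trace, $2$:

```python
import numpy as np

# Hypothetical singular symmetric matrix (stand-in for the second Hessian)
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])

vals = np.linalg.eigvalsh(A)
# determinant = product of eigenvalues (A is symmetric, hence diagonalizable)
assert np.isclose(np.linalg.det(A), np.prod(vals))
# det is 0, so one eigenvalue is 0; the other (= trace) is positive
assert np.isclose(vals.min(), 0.0) and vals.max() > 0
```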
I will answer some of your questions.
The entries of a Hessian matrix $H$ of $f$ are second partials $H_{ij}=\partial_i\partial_jf$ and it is a standard result in multivariable calculus that $\partial_i\partial_jf=\partial_j\partial_if$ provided both second partials are continuous functions. In your case, the entries of the Hessian are constants so are continuous functions. This means that it is not at all accidental that the matrix is symmetric.
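That standard result (Clairaut's theorem) can be sanity-checked numerically. The function below is an illustrative choice of mine, not one from the question; a central-difference estimate of the mixed second partial matches the analytic value regardless of differentiation order:

```python
import math

def f(x, y):
    # sample C^2 function (illustrative choice, not from the question)
    return x**2 * math.sin(y)

def d2_mixed(f, x, y, h=1e-4):
    # symmetric central-difference estimate of the mixed partial at (x, y);
    # the formula is symmetric in x and y, mirroring Clairaut's theorem
    return (f(x+h, y+h) - f(x+h, y-h) - f(x-h, y+h) + f(x-h, y-h)) / (4*h*h)

x0, y0 = 1.3, 0.7
numeric = d2_mixed(f, x0, y0)
analytic = 2 * x0 * math.cos(y0)   # d/dy d/dx (x^2 sin y) = 2x cos y
assert abs(numeric - analytic) < 1e-5
```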
By the spectral theorem, symmetric matrices can always be diagonalised, so the symmetry of the matrix does play a role in allowing one to determine whether the Hessian is positive/negative (semi) definite. Provided you found the eigenvalues correctly, you have drawn the correct conclusion about $H_1$ and $H_2$.
Finally, if the Hessian is everywhere positive/negative definite, then yes, the function is strictly convex/concave.
Best Answer
Actually, for the Hessian matrix $H_f$ that you defined, the leading principal minors form a monotone increasing sequence. If we denote them by $\Delta_1$, $\Delta_2$, $\ldots$, then
$$0 < \Delta_1 < \Delta_2 < \Delta_3 < \cdots < \Delta_k < \cdots$$
In particular, $\Delta_k > 0$ for all values of $k$.
Hence, by Sylvester's criterion (for a symmetric matrix, positivity of all leading principal minors is equivalent to positive definiteness), we conclude that the Hessian matrix $H_f$ is positive definite. $\blacksquare$
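Sylvester's criterion can be checked mechanically. Since $H_f$ itself isn't written out above, the matrix below is a hypothetical positive-definite example of my choosing; the computation of the leading principal minors $\Delta_k$ is the same for any symmetric matrix:

```python
import numpy as np

# Hypothetical symmetric matrix (the actual H_f isn't reproduced here)
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Leading principal minors Delta_1, Delta_2, Delta_3
minors = [np.linalg.det(M[:k, :k]) for k in range(1, M.shape[0] + 1)]
assert all(d > 0 for d in minors)          # Sylvester's criterion holds

# Consistency check: all eigenvalues positive <=> positive definite
assert np.all(np.linalg.eigvalsh(M) > 0)
```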