This answer addresses the first question.
A real-valued symmetric matrix is positive semidefinite if and only if all its eigenvalues are nonnegative; $0$ counts, no matter how many times it occurs.
Also, positive semidefiniteness of the Hessian everywhere is equivalent to convexity; see this Math.SE question. Hessian matrices of real-valued $C^2$ functions are symmetric. So $0$ eigenvalues count in this equivalence too; if your computation were correct, then the first function would be convex.
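If it helps, this eigenvalue test is easy to run numerically. Here is a minimal sketch using NumPy (the function name and tolerance are my own choices, purely illustrative):

```python
import numpy as np

def is_positive_semidefinite(H, tol=1e-10):
    """Test a real symmetric matrix for positive semidefiniteness."""
    # eigvalsh is meant for symmetric matrices and returns real eigenvalues
    eigenvalues = np.linalg.eigvalsh(np.asarray(H, dtype=float))
    # A small tolerance absorbs floating-point error; zero eigenvalues count
    return bool(np.all(eigenvalues >= -tol))

print(is_positive_semidefinite([[1, 0], [0, 0]]))  # True: eigenvalues 1 and 0
```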
However, your first eigenvalue computation is incorrect. Indeed, we see that
$$H \begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} -1 \\ 1 \end{pmatrix} = (-1) \begin{pmatrix} 1 \\ -1 \end{pmatrix},$$
so there is an eigenvalue of $-1$. Also, there is an eigenvalue of $1$, since
$$H \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
Since the number of distinct eigenvalues is at most the dimension, which is $2$ here, the eigenvalues are exactly $\pm 1$.
[How did I magically create these eigenvectors? Well, I saw that $H^2 = I$, so I figured the constants were nice, and since the matrix looks like a reflection over the axis $x_1 = x_2$ (i.e., changing the roles of $x_1$ and $x_2$), I figured the lines $x_2 = x_1$ and $x_2 = -x_1$ would be relevant.]
Regarding the "usual" method of finding the eigenvalues via the characteristic polynomial, I get the equation $\det (\lambda I - H ) = 0$, i.e.,
$$\lambda^2 - 1 = 0.$$
Did you make a computational error here?
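If my reading of the matrix is right (the two eigenvector relations above force $H = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$), a quick NumPy check confirms the eigenvalues:

```python
import numpy as np

# The Hessian deduced from the two eigenvector relations above
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])

print(np.linalg.eigvalsh(H))  # [-1.  1.]
```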
In any event, with the correct computation, the first function is not convex. [Geometrically, if you are familiar with convexity of one-dimensional functions: if you let $x_2 = -x_1$, you get $g(x_1) := f(x_1, -x_1) = -x_1^2$, which shows a "concave down" effect.]
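To see the failure of convexity numerically, here is a sketch assuming $f(x_1, x_2) = x_1 x_2$ (the natural reading, since its Hessian is the matrix above); the midpoint inequality that convexity requires fails along the line $x_2 = -x_1$:

```python
# Assuming f(x1, x2) = x1 * x2, consistent with the Hessian above
f = lambda x1, x2: x1 * x2
g = lambda t: f(t, -t)  # restriction to the line x2 = -x1, equals -t**2

a, b = -1.0, 1.0
# Convexity would require g((a+b)/2) <= (g(a) + g(b)) / 2, but:
print(g((a + b) / 2))     # 0.0
print((g(a) + g(b)) / 2)  # -1.0, so the inequality fails
```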
P.S. Since the determinant of a diagonalizable matrix (and real symmetric matrices are diagonalizable) is the product of its eigenvalues, and the determinant of your second matrix is $0$ (I think), you should have at least one zero eigenvalue. You should double-check that the other eigenvalue is positive.
The test is not quite right. First, the diagonal entries of a symmetric matrix are rarely equal to its eigenvalues. For example,
$$
\begin{pmatrix}
1 & 2 \\
2 & 1
\end{pmatrix}
$$
is a symmetric matrix whose eigenvalues are $3$ and $-1$. I think you may be confusing the terms "symmetric" and "diagonal", as the Hessian will always be symmetric (under mild assumptions).
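If you want to verify this concretely, a one-line NumPy check returns the eigenvalues even though both diagonal entries are $1$:

```python
import numpy as np

# Diagonal entries are both 1, yet the eigenvalues are -1 and 3
print(np.linalg.eigvalsh(np.array([[1.0, 2.0],
                                   [2.0, 1.0]])))  # [-1.  3.]
```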
Second, the test is not correct. You are correct that if the Hessian is positive definite then you have a local minimum, and if the Hessian is negative definite then you have a local maximum. However, a symmetric $ 2 \times 2 $ matrix is negative definite if and only if its upper-left entry $a_1$ is negative and its determinant is positive. E.g.
$$
\begin{pmatrix}
-1 & 0 \\
0& -1
\end{pmatrix}
$$
is negative definite and has determinant equal to one. Also, if the determinant of a symmetric $2 \times 2$ matrix is negative, then the matrix is indefinite regardless of the sign of $a_1$.
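To make the $2 \times 2$ test concrete, here is a small sketch (the function and its labels are my own, for illustration only):

```python
def classify_2x2(H):
    """Classify a symmetric 2x2 matrix by the sign of a1 and the determinant.

    This shortcut is specific to 2x2 matrices; see the final comment below
    for why it does not extend to three or more variables.
    """
    a1 = H[0][0]
    det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
    if det > 0:
        # det > 0 forces a1 != 0 for a symmetric matrix
        return "positive definite" if a1 > 0 else "negative definite"
    if det < 0:
        return "indefinite"  # regardless of the sign of a1
    return "inconclusive (semidefinite or degenerate)"

print(classify_2x2([[-1, 0], [0, -1]]))  # negative definite
print(classify_2x2([[1, 2], [2, 1]]))    # indefinite (det = -3)
```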
A third comment: you can have a local max or min at a point where the Hessian is not positive definite. The Hessian must be positive semidefinite at a local min, but you cannot guarantee that the eigenvalues are nonzero. E.g. if $f(x,y)=x^4+y^4$ then $(0,0)$ is the global minimum, but the Hessian at $(0,0)$ is the zero matrix.
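Concretely, for $f(x,y)=x^4+y^4$ the Hessian is
$$
H(x,y) = \begin{pmatrix} 12x^2 & 0 \\ 0 & 12y^2 \end{pmatrix},
$$
which is positive semidefinite everywhere but equal to the zero matrix at $(0,0)$.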
And finally, while the determinant is a good test in the two-variable case, with three or more variables checking the determinant and the sign of $a_1$ is not enough; there you would check the signs of all leading principal minors (Sylvester's criterion) instead. It doesn't seem you need to worry about this for the course you are taking, but it's nice to keep in mind for the future.
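For a concrete illustration (my own example, not from your course), $\operatorname{diag}(1,-1,-1)$ has $a_1 > 0$ and positive determinant, yet it is indefinite:

```python
import numpy as np

# a1 > 0 and det > 0, yet the matrix is indefinite in three dimensions
H = np.diag([1.0, -1.0, -1.0])
print(H[0, 0], np.linalg.det(H))  # 1.0 1.0
print(np.linalg.eigvalsh(H))      # [-1. -1.  1.] -> mixed signs
```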
Best Answer
I will answer some of your questions.
The entries of a Hessian matrix $H$ of $f$ are the second partials $H_{ij}=\partial_i\partial_jf$, and it is a standard result in multivariable calculus (Clairaut's theorem) that $\partial_i\partial_jf=\partial_j\partial_if$ provided both second partials are continuous. In your case, the entries of the Hessian are constants, hence continuous. This means that it is not at all accidental that the matrix is symmetric.
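If you want to see this symmetry in action, here is a small SymPy sketch (my choice of example function, purely illustrative):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = x1 * x2  # an illustrative smooth function; any C^2 function works

# The mixed second partials agree (Clairaut's theorem) ...
print(sp.diff(f, x1, x2) == sp.diff(f, x2, x1))  # True
# ... so the Hessian is symmetric
print(sp.hessian(f, (x1, x2)))  # Matrix([[0, 1], [1, 0]])
```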
By the spectral theorem, symmetric matrices can always be diagonalised (with real eigenvalues), so the symmetry of the matrix does play a role in allowing one to determine whether the Hessian is positive/negative (semi)definite. Provided you found the eigenvalues correctly, you have drawn the correct conclusion about $H_1$ and $H_2$.
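In NumPy terms (a sketch reusing the $2 \times 2$ swap matrix from the first answer), `np.linalg.eigh` returns real eigenvalues together with an orthonormal eigenbasis, which is exactly what the spectral theorem promises for symmetric matrices:

```python
import numpy as np

H = np.array([[0.0, 1.0],
              [1.0, 0.0]])

eigenvalues, Q = np.linalg.eigh(H)  # columns of Q are orthonormal eigenvectors
# The spectral decomposition reconstructs H (up to floating point)
print(np.allclose(H, Q @ np.diag(eigenvalues) @ Q.T))  # True
```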
Finally, if the Hessian is positive/negative definite everywhere, then yes, the function will be strictly convex/concave.