$ \DeclareMathOperator{\tr}{tr}$
You can get some information using the concepts of Schur-convexity and majorization:
Let $X$ be an $n \times n$ real symmetric matrix, with spectral decomposition
$ X = U \Lambda U^T$, where $U$ is an orthogonal matrix and $\Lambda$ is diagonal, carrying the eigenvalues $\lambda_1, \dots, \lambda_n$ of $X$. Then your function is
$$
g_\lambda(X)= \frac{1}{\lambda} \log \tr e^{\lambda X} =
\frac{1}{\lambda} \log\sum_{k=1}^n e^{\lambda \lambda_k}.
$$
(here $e^{\lambda X}$ denotes the matrix exponential).
Now, $g_\lambda$ is a Schur-convex function of the eigenvalue vector (the proof is easy; the proof method given in the Wikipedia article on Schur-convex functions works). If $X$ and $Y$ are two real symmetric matrices as above, with eigenvalues $\lambda_i$ and $\omega_i$ respectively, both arranged in decreasing order, then a classical majorization result (a consequence of Ky Fan's maximum principle) says that the vector of eigenvalues of $\mu X + (1-\mu) Y$, for $0 \le \mu \le 1$, is majorized by the vector $\mu \lambda + (1-\mu) \omega$. Since Schur-convex functions respect majorization, this gives
$$
g_\lambda(\mu X + (1-\mu) Y) \le \frac{1}{\lambda} \log \sum_{k=1}^n e^{\lambda(\mu \lambda_k + (1-\mu) \omega_k)}
$$
which might be of some help in further analysis.
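(For a quick numerical sanity check of this bound, here is a minimal sketch; the helper name `g`, the choices $n = 4$, $\lambda = 1$, $\mu = 0.3$, and the random test matrices are my own, purely illustrative choices.)

```python
import numpy as np

def g(X, lam=1.0):
    # g_lambda(X) = (1/lambda) * log tr exp(lambda X), computed via the eigenvalues
    return np.log(np.sum(np.exp(lam * np.linalg.eigvalsh(X)))) / lam

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)); X = (A + A.T) / 2   # random real symmetric X
B = rng.standard_normal((4, 4)); Y = (B + B.T) / 2   # random real symmetric Y
mu, lam = 0.3, 1.0

lhs = g(mu * X + (1 - mu) * Y, lam)
# Right-hand side: the same formula applied to mu*lambda + (1-mu)*omega,
# with both eigenvalue vectors sorted in decreasing order.
lx = np.sort(np.linalg.eigvalsh(X))[::-1]
wy = np.sort(np.linalg.eigvalsh(Y))[::-1]
rhs = np.log(np.sum(np.exp(lam * (mu * lx + (1 - mu) * wy)))) / lam
assert lhs <= rhs + 1e-12
```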
EDIT: The following paper seems to give an answer: http://people.orie.cornell.edu/aslewis/publications/96-convex.pdf
Your function is convex. The result proved there concerns spectral matrix functions of a Hermitian or real symmetric argument ($g_\lambda$ is spectral, meaning that it depends only on the eigenvalues of the matrix argument): the convex spectral functions are characterized as exactly the symmetric convex functions of the eigenvalues. $g_\lambda$ is clearly symmetric in the eigenvalues, and you proved yourself that it is convex. That is all that is needed, after reading the paper above.
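(As a numerical illustration of this conclusion, the following sketch checks the joint-convexity inequality directly on random symmetric matrices; again, all names and test choices here are mine.)

```python
import numpy as np

def g(X, lam=1.0):
    # g_lambda(X) = (1/lambda) * log tr exp(lambda X)
    return np.log(np.sum(np.exp(lam * np.linalg.eigvalsh(X)))) / lam

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)); X = (A + A.T) / 2
B = rng.standard_normal((4, 4)); Y = (B + B.T) / 2
for mu in np.linspace(0.0, 1.0, 11):
    # joint convexity: g(mu X + (1-mu) Y) <= mu g(X) + (1-mu) g(Y)
    assert g(mu * X + (1 - mu) * Y) <= mu * g(X) + (1 - mu) * g(Y) + 1e-12
```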
This answer addresses the first question.
A real symmetric matrix is positive semidefinite if and only if all its eigenvalues are nonnegative; $0$ counts, no matter how many times it occurs.
Also, for a $C^2$ function, positive semidefiniteness of the Hessian everywhere is equivalent to convexity; see this Math.SE question. Hessian matrices of real-valued $C^2$ functions are symmetric, so $0$ eigenvalues count in this equivalence too; if your computation were correct, then the first function would be convex.
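(In code, this eigenvalue test for positive semidefiniteness is a one-liner; here is a minimal sketch, with the function name and the small tolerance being my own choices.)

```python
import numpy as np

def is_psd(H, tol=1e-10):
    # A real symmetric matrix is PSD iff every eigenvalue is >= 0 (zeros allowed);
    # the tolerance absorbs floating-point noise around 0.
    return bool(np.all(np.linalg.eigvalsh(H) >= -tol))

print(is_psd(np.array([[0.0, 0.0], [0.0, 2.0]])))  # True: eigenvalues 0 and 2
```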
However, your first eigenvalue computation is incorrect. Indeed, we see that
$$H \begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} -1 \\ 1 \end{pmatrix} = (-1) \begin{pmatrix} 1 \\ -1 \end{pmatrix},$$
so there is an eigenvalue of $-1$. Also, there is an eigenvalue of $1$, since
$$H \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
Since an $n \times n$ matrix has at most $n$ eigenvalues (here $n = 2$), the eigenvalues are exactly $\pm 1$.
[How did I magically create these eigenvectors? Well, I saw that $H^2 = I$, so I figured the constants were nice, and since the matrix looks like a reflection across the line $x_1 = x_2$ (i.e., swapping the roles of $x_1$ and $x_2$), I figured the lines $x_2 = x_1$ and $x_2 = -x_1$ would be relevant.]
Regarding the usual characteristic-polynomial method for finding the eigenvalues: the equation $\det (\lambda I - H ) = 0$ becomes
$$\lambda^2 - 1 = 0.$$
Did you make a computational error here?
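(One can confirm this numerically; the sketch below assumes, as the computations above suggest, that $H$ is the $2 \times 2$ matrix with $0$ on the diagonal and $1$ off it.)

```python
import numpy as np

H = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(np.linalg.eigvalsh(H))  # [-1.  1.]
print(np.poly(H))             # ~[ 1.  0. -1.], i.e. lambda^2 - 1
```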
In any event, with the correct computation, the first function is not convex. [Geometrically, if you are familiar with convexity of one-dimensional functions: setting $x_2 = -x_1$ and defining $g(x_1) := f(x_1, -x_1) = -x_1^2$ shows a "concave down" effect along that line, which rules out convexity.]
P.S. Since for diagonalizable matrices (in particular real symmetric matrices) the determinant is the product of the eigenvalues, and the determinant of your second matrix is $0$ (I think), there must be at least one zero eigenvalue. You should double-check that the other eigenvalue is nonnegative; since the eigenvalues sum to the trace, it equals the trace of the matrix.
Best Answer
1) Jointly convex.
2) Element-wise convexity means convexity in one variable at a time, with the other variables held constant. Joint convexity implies element-wise convexity, but the converse implication is false: consider the function $x^2+y^2-4xy$, which is convex in $x$ and in $y$ separately but not jointly (see the check after this list).
3) Only if the function is twice differentiable. For example, $f(x)=\|x\|$ is convex, but it is not differentiable at zero, so it has no Hessian there.
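The Hessian of the counterexample in 2) makes the failure of joint convexity explicit; here is a quick, purely illustrative check of my own:

```python
import numpy as np

# The Hessian of f(x, y) = x^2 + y^2 - 4xy is constant:
H = np.array([[ 2.0, -4.0],
              [-4.0,  2.0]])
print(np.linalg.eigvalsh(H))  # [-2.  6.]: both diagonal entries are positive
# (so f is convex in each variable separately), but one eigenvalue is negative,
# so the Hessian is not PSD and f is not jointly convex.
```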