1) Jointly convex.
2) Element-wise convexity is convexity in one variable at a time, with the other variables held constant; joint convexity implies element-wise convexity, but the converse is false. Consider the function $f(x,y)=x^2+y^2-4xy$.
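To see this counterexample concretely, here is a quick numeric sketch (the helper function is mine, not from the question): the Hessian of $f(x,y)=x^2+y^2-4xy$ is the constant matrix $\begin{pmatrix}2 & -4\\ -4 & 2\end{pmatrix}$, whose eigenvalues are $6$ and $-2$.

```python
import math

def sym2x2_eigenvalues(a, b, d):
    """Eigenvalues of the symmetric 2x2 matrix [[a, b], [b, d]]."""
    mean = (a + d) / 2
    radius = math.sqrt(((a - d) / 2) ** 2 + b ** 2)
    return mean - radius, mean + radius

# Hessian of f(x, y) = x^2 + y^2 - 4xy is constant: [[2, -4], [-4, 2]].
lo, hi = sym2x2_eigenvalues(2, -4, 2)
print(lo, hi)  # -2.0 6.0
```

The diagonal entries $f_{xx}=f_{yy}=2$ are positive, so $f$ is convex in each variable separately, yet the negative eigenvalue $-2$ shows it is not jointly convex.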
3) Only if the function is twice differentiable. For example, $f(x)=\|x\|$ is convex, but it is not differentiable at zero, so the Hessian does not exist there.
This answer addresses the first question.
A real-valued, symmetric matrix is positive-semidefinite if and only if all its eigenvalues are nonnegative -- $0$ counts, no matter how many times it occurs.
Also, positive-semidefiniteness everywhere of the Hessian is equivalent to convexity; see this Math.SE question. Hessian matrices of real-valued $C^2$ functions are symmetric. So $0$ eigenvalues do count in this equivalence too; if your computation was correct, then the first function would be convex.
However, your first eigenvalue computation is incorrect. Indeed, we see that
$$H \begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} -1 \\ 1 \end{pmatrix} = (-1) \begin{pmatrix} 1 \\ -1 \end{pmatrix},$$
so there is an eigenvalue of $-1$. Also, there is an eigenvalue of $1$, since
$$H \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
Since a matrix has at most as many eigenvalues as its dimension, the eigenvalues of this $2 \times 2$ matrix are exactly $\pm 1$.
[How did I magically create these eigenvectors? Well, I saw that $H^2 = I$, so I figured the constants were nice, and since the matrix looks like a reflection over the axis $x_1 = x_2$ (i.e., changing the roles of $x_1$ and $x_2$), I figured the lines $x_2 = x_1$ and $x_2 = -x_1$ would be relevant.]
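Both eigenpair claims are easy to verify by direct multiplication (a small Python sketch; `matvec` is a hypothetical helper):

```python
# Verify the two claimed eigenpairs of H = [[0, 1], [1, 0]] by direct multiplication.
H = [[0, 1], [1, 0]]

def matvec(M, v):
    """Multiply a 2x2 matrix by a length-2 vector."""
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

print(matvec(H, [1, -1]))  # [-1, 1] = (-1) * [1, -1]  -> eigenvalue -1
print(matvec(H, [1, 1]))   # [1, 1]  = (+1) * [1, 1]   -> eigenvalue +1
```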
Regarding the "usual" method of the characteristic polynomial for finding the eigenvalues, the equation $\det (\lambda I - H ) = 0$ reduces to
$$\lambda^2 - 1 = 0.$$
Did you make a computational error here?
In any event, with the correct computation, the first function is not convex. [Geometrically, if you are familiar with convexity of one-dimensional functions: setting $x_2 = -x_1$ gives $g(x_1) \overset{\text{def}}{=} f(x_1, -x_1) = -x_1^2$, which shows a "concave down" effect.]
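Assuming (as the Hessian and this restriction suggest) that the first function is $f(x_1, x_2) = x_1 x_2$, the failure of convexity along the line $x_2 = -x_1$ can be checked against the midpoint inequality:

```python
# Assumed first function (consistent with the Hessian above): f(x1, x2) = x1 * x2.
f = lambda x1, x2: x1 * x2
g = lambda t: f(t, -t)  # restriction to the line x2 = -x1, i.e. g(t) = -t^2

a, b = -1.0, 1.0
# Midpoint convexity would require g((a+b)/2) <= (g(a) + g(b)) / 2; here 0 > -1.
assert g((a + b) / 2) > (g(a) + g(b)) / 2  # convexity fails along this line
```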
P.S. Since for diagonalizable matrices (including real symmetric matrices) the determinant is the product of the eigenvalues, and the determinant of your second matrix is $0$ (I think), you should have at least one zero eigenvalue. You should double-check that the other eigenvalue is nonnegative.
Best Answer
Think about the case where $n=1$. You are given that $$\begin{pmatrix}g''(x) & g'(x)\\ g'(x) & 1\end{pmatrix}$$
is positive semi-definite. This is true if and only if all principal minors are non-negative. In particular, the determinant gives $g''(x)\cdot 1 - [g'(x)]^2 \geq 0$, so $$g''(x)\geq [g'(x)]^2.$$
(Also $g''(x)\geq 0$, but we already knew that because we are told that $g$ is convex).
The first derivative of $h$ is $$h'(x)=\exp(-g(x))g'(x)$$
The second derivative of $h$ is $$\begin{align}h''(x)&=\frac{d}{dx}\left[\exp(-g(x))g'(x)\right]\\&=-\exp(-g(x))[g'(x)]^2+\exp(-g(x))g''(x)\\ &=\exp(-g(x))\left(g''(x)-[g'(x)]^2\right)\end{align}$$
Since the first term in the product is positive, and we know the term in brackets is non-negative, we can conclude that $h''(x)\geq 0$ and so $h$ is convex.
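Up to an additive constant, the derivative $h'(x)=\exp(-g(x))g'(x)$ corresponds to $h(x)=-\exp(-g(x))$. As a concrete illustration (my choice of $g$, not from the question), $g(x)=-\log(\cos x)$ on $(-\pi/2,\pi/2)$ satisfies $g''(x)-[g'(x)]^2=\sec^2 x-\tan^2 x=1\geq 0$, and then $h(x)=-\cos x$, which is indeed convex there:

```python
import math

# Illustration (g chosen here, not from the question): g(x) = -log(cos x)
# on (-pi/2, pi/2) satisfies g''(x) - [g'(x)]^2 = sec^2(x) - tan^2(x) = 1 >= 0.
# With h(x) = -exp(-g(x)) = -cos(x), we expect h''(x) = cos(x) >= 0 there.
g = lambda x: -math.log(math.cos(x))
h = lambda x: -math.exp(-g(x))           # equals -cos(x)

def second_diff(f, x, eps=1e-4):
    """Central finite-difference estimate of f''(x)."""
    return (f(x + eps) - 2 * f(x) + f(x - eps)) / eps ** 2

for x in [-1.2, -0.5, 0.0, 0.7, 1.3]:
    assert second_diff(h, x) >= 0        # h is convex on (-pi/2, pi/2)
```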
Now "all" you need to do is rewrite that for the general case.