Compute the eigenvalues of the Hessian.
If all the eigenvalues are nonnegative, it is positive semidefinite.
If all the eigenvalues are positive, it is positive definite.
If all the eigenvalues are nonpositive, it is negative semidefinite.
If all the eigenvalues are negative, it is negative definite.
Otherwise, it is indefinite.
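This eigenvalue test is easy to check numerically. Here is a small sketch using NumPy (the function name `classify` is my own; note that the zero matrix is reported as positive semidefinite, though it is negative semidefinite as well):

```python
import numpy as np

def classify(H, tol=1e-10):
    """Classify a symmetric matrix by the signs of its eigenvalues."""
    # eigvalsh is for symmetric matrices and returns real eigenvalues
    w = np.linalg.eigvalsh(H)
    if np.all(w > tol):
        return "positive definite"
    if np.all(w >= -tol):
        return "positive semidefinite"
    if np.all(w < -tol):
        return "negative definite"
    if np.all(w <= tol):
        return "negative semidefinite"
    return "indefinite"

print(classify(np.array([[2.0, 0.0], [0.0, 3.0]])))  # positive definite
print(classify(np.array([[0.0, 1.0], [1.0, 0.0]])))  # indefinite
```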
Edit:
For that example, you have found $c=(0,0,0)$.
$$H(f(c))=\begin{bmatrix} 0 & 1 & 1 \\ 1 & 0 & 0 \\ 1 & 0 & 0\end{bmatrix}$$
$$H(f(c))-\lambda I= \begin{bmatrix} -\lambda & 1 & 1 \\ 1 & -\lambda & 0 \\ 1 & 0 & -\lambda\end{bmatrix}$$
Expanding the determinant along the third row,
\begin{align}\det(H(f(c))-\lambda I)&= \det\left(\begin{bmatrix}1 & 1 \\ -\lambda & 0 \end{bmatrix} \right) -\lambda \det \left( \begin{bmatrix}-\lambda & 1 \\ 1& -\lambda \end{bmatrix} \right)
\\&=\lambda-\lambda(\lambda^2-1)
\\&=\lambda(2-\lambda^2)\end{align}
Hence the eigenvalues are $0$, $\sqrt{2}$ and $-\sqrt{2}$; since they have mixed signs, the Hessian is indefinite.
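You can confirm these eigenvalues numerically, e.g. with NumPy:

```python
import numpy as np

# Hessian of the example at the critical point c = (0, 0, 0)
H = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 0.0],
              [1.0, 0.0, 0.0]])
eigs = np.linalg.eigvalsh(H)  # sorted ascending
print(eigs)  # approximately [-sqrt(2), 0, sqrt(2)]
```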
The test is not quite right. First, the diagonal entries of a symmetric matrix are generally not equal to its eigenvalues. For example,
$$
\begin{pmatrix}
1 & 2 \\
2 & 1
\end{pmatrix}
$$
is a symmetric matrix whose eigenvalues are $3$ and $-1$. I think you may be confusing the terms "symmetric" and "diagonal", as the Hessian will always be symmetric (under mild assumptions, e.g. continuous second partial derivatives).
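A quick numerical check of that example:

```python
import numpy as np

# the diagonal entries are 1 and 1, but the eigenvalues are -1 and 3
w = np.linalg.eigvalsh(np.array([[1.0, 2.0], [2.0, 1.0]]))
print(w)  # [-1.  3.]
```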
Second, the test is not correct. You are right that if the Hessian is positive definite then you have a local minimum, and if the Hessian is negative definite then you have a local maximum. However, a $2 \times 2$ symmetric matrix is negative definite if the top-left entry $a_1$ is negative and the determinant is positive. E.g.
$$
\begin{pmatrix}
-1 & 0 \\
0& -1
\end{pmatrix}
$$
is negative definite and has determinant equal to one. Also, if the determinant of a $2 \times 2$ symmetric matrix is negative, then the matrix is indefinite regardless of the sign of $a_1$.
A third comment: you can have a local max or min at a point where the Hessian is not definite. The Hessian must be positive semidefinite at a local min, but you cannot guarantee that the eigenvalues are nonzero. E.g. if $f(x,y)=x^4+y^4$, then $(0,0)$ is the global minimum, but the Hessian at $(0,0)$ is the zero matrix.
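You can verify this symbolically, e.g. with SymPy:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**4 + y**4
H = sp.hessian(f, (x, y))        # [[12*x**2, 0], [0, 12*y**2]]
print(H.subs({x: 0, y: 0}))      # the zero matrix at the minimum (0, 0)
```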
And finally, while the determinant is a good test in the two-variable case, if you go to three or more variables, testing the determinant and the sign of $a_1$ is not enough: you need the signs of all the leading principal minors (Sylvester's criterion). It doesn't seem you need to worry about this for the course you are taking, but it's nice to keep in mind for the future.
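For three or more variables, one standard test is Sylvester's criterion: a symmetric matrix is positive definite if and only if all of its leading principal minors are positive. A minimal sketch with NumPy (the helper `leading_minors` is my own):

```python
import numpy as np

def leading_minors(H):
    """Determinants of the upper-left k x k submatrices, k = 1..n."""
    n = H.shape[0]
    return [np.linalg.det(H[:k, :k]) for k in range(1, n + 1)]

# a positive definite example: all leading minors are positive
H = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
print(leading_minors(H))  # approximately [2.0, 3.0, 4.0]
```

For negative definiteness the minors instead alternate in sign, starting negative.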
Best Answer
The Fundamental Strategy of Calculus is to take a nonlinear function (difficult) and approximate it locally by a linear function (easy). If $f:\mathbb R^n \to \mathbb R$ is differentiable at $x_0$, then our local linear approximation for $f$ is $$ f(x) \approx f(x_0) + \nabla f(x_0)^T(x - x_0). $$ But why not approximate $f$ instead by a quadratic function? The best quadratic approximation to a smooth function $f:\mathbb R^n \to \mathbb R$ near $x_0$ is $$ f(x) \approx f(x_0) + \nabla f(x_0)^T (x - x_0) + \frac12 (x - x_0)^T Hf(x_0)(x - x_0) $$ where $Hf(x_0)$ is the Hessian of $f$ at $x_0$.
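As a quick numerical sanity check of the quadratic approximation, take $f(x,y)=e^x\cos y$ at $x_0=(0,0)$, where the gradient $(1,0)$ and Hessian $\operatorname{diag}(1,-1)$ are easy to compute by hand (this particular $f$ is just an illustrative choice):

```python
import numpy as np

def f(v):
    x, y = v
    return np.exp(x) * np.cos(y)

x0 = np.array([0.0, 0.0])
grad = np.array([1.0, 0.0])      # gradient of f at (0, 0)
H = np.array([[1.0,  0.0],
              [0.0, -1.0]])      # Hessian of f at (0, 0)

def quad_approx(v):
    d = v - x0
    return f(x0) + grad @ d + 0.5 * d @ H @ d

v = np.array([0.1, 0.1])
print(abs(f(v) - quad_approx(v)))  # small: the error is O(|v - x0|^3)
```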