That matrix is symmetric, and the spectral theorem from linear algebra says that a symmetric matrix is orthogonally diagonalizable. That means there are two perpendicular directions along which the matrix acts as scaling, by $\lambda_1$ in one and by $\lambda_2$ in the other.
These $\lambda_i$ are the quadratic coefficients of the parabolic approximations to $f$ at $(x_0,y_0)$ as you move through that point in the direction of each eigenspace. Since you are already looking at a critical point, each quadratic approximation has its vertex at $(x_0,y_0)$. If the two $\lambda_i$ are opposite in sign, you have two parabolas orthogonal to each other opening in different directions, clearly creating a saddle. If the two $\lambda_i$ have the same sign, then depending on that sign you have either a max or a min.
But the determinant of a $2\times2$ matrix equals the product of its two eigenvalues. So you can see how a negative determinant implies $\lambda_i$ of opposite sign, which implies a saddle point, and a positive determinant similarly implies either a max or a min.
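Numerically (a sketch with numpy; the matrix here is just a made-up symmetric example, not from the question):

```python
import numpy as np

# A made-up symmetric 2x2 "Hessian" with eigenvalues of opposite sign.
H = np.array([[2.0, -3.0],
              [-3.0, 2.0]])

# eigh is for symmetric matrices: real eigenvalues, orthonormal eigenvectors.
eigenvalues, eigenvectors = np.linalg.eigh(H)   # -1 and 5

# The determinant equals the product of the eigenvalues, so det < 0
# forces one negative and one positive eigenvalue: a saddle.
assert np.isclose(np.linalg.det(H), eigenvalues.prod())
assert eigenvalues[0] < 0 < eigenvalues[1]
```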
Locally at any $(x_0,y_0)$, there is a higher dimensional version of the Taylor series, grouped here by increasing order of derivative:
$$\begin{align*}
f(x,y)&=f(x_0,y_0)+\Big[f_x(x_0,y_0)\cdot(x-x_0)+f_y(x_0,y_0)\cdot(y-y_0)\Big]\\
&\phantom{{}={}}+\frac12\Big[f_{xx}(x_0,y_0)\cdot(x-x_0)^2+f_{xy}(x_0,y_0)\cdot(x-x_0)(y-y_0)\\
&\phantom{{}={}}+f_{yx}(x_0,y_0)\cdot(y-y_0)(x-x_0)+f_{yy}(x_0,y_0)\cdot(y-y_0)^2\Big]+\cdots\\
&=f(x_0,y_0)+\nabla f(x_0,y_0)\cdot\left((x,y)-(x_0,y_0)\right)^T\\
&\phantom{{}={}}+\frac12\left((x,y)-(x_0,y_0)\right)\cdot H(x_0,y_0)\cdot\left((x,y)-(x_0,y_0)\right)^T+\cdots
\end{align*}$$
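As a numerical sanity check of this expansion (a sketch with numpy; the function and base point are illustrative choices of mine, with the derivatives computed by hand):

```python
import numpy as np

# Illustrative smooth function f(x, y) = sin(x) * exp(y) at a sample base point.
f = lambda x, y: np.sin(x) * np.exp(y)
x0, y0 = 0.3, -0.2

# Hand-computed first and second partial derivatives at (x0, y0).
fx,  fy  = np.cos(x0) * np.exp(y0),  np.sin(x0) * np.exp(y0)
fxx, fyy = -np.sin(x0) * np.exp(y0), np.sin(x0) * np.exp(y0)
fxy = np.cos(x0) * np.exp(y0)

grad = np.array([fx, fy])
H = np.array([[fxx, fxy],
              [fxy, fyy]])

# Second-order Taylor polynomial, exactly as in the display above.
def taylor2(x, y):
    d = np.array([x - x0, y - y0])
    return f(x0, y0) + grad @ d + 0.5 * d @ H @ d

# For a small displacement the error is third order, hence tiny.
h = 1e-2
assert abs(f(x0 + h, y0 + h) - taylor2(x0 + h, y0 + h)) < 1e-5
```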
When you are at a critical point, this simplifies to
$$\begin{align*}
f(x,y)&=f(x_0,y_0)+\frac12\left((x,y)-(x_0,y_0)\right)\cdot H(x_0,y_0)\cdot\left((x,y)-(x_0,y_0)\right)^T+\cdots
\end{align*}$$
And if we could change coordinates to variables $s$ and $t$ that run in the directions of $H$'s eigenspaces, based at the critical point, we'd just have
$$f(s,t)=f(0,0)+\frac12\lambda_1s^2+\frac12\lambda_2t^2+\cdots$$ which I hope helps you see the parabolas and the role of the eigenvalues of $H$.
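To illustrate (a sketch with numpy; the $f$ here is a made-up quadratic with a critical point at the origin, so the expansion has no higher-order terms and holds exactly):

```python
import numpy as np

# f(x, y) = x**2 - 3*x*y + y**2 has a critical point at the origin,
# with Hessian H = [[2, -3], [-3, 2]] (eigenvalues -1 and 5).
def f(u):
    x, y = u
    return x**2 - 3*x*y + y**2

H = np.array([[2.0, -3.0],
              [-3.0, 2.0]])
lams, V = np.linalg.eigh(H)   # columns of V are unit eigenvectors

# Moving distance s along the i-th eigendirection traces the parabola
# (1/2) * lambda_i * s**2 -- exactly, since f is quadratic.
for lam, v in zip(lams, V.T):
    for s in (0.3, 0.7, 1.5):
        assert np.isclose(f(s * v), 0.5 * lam * s**2)
```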
To say that the test is inconclusive when the determinant $f_{xx} f_{yy} - f_{xy}^2$ is zero at a point is to say just that: The test doesn't tell us anything, so if we want to determine the type of the critical point, we must do a little more. (Indeed, the functions $(x, y) \mapsto x^4 + y^4$, $(x, y) \mapsto x^4 - y^4$, and $(x, y) \mapsto -x^4 - y^4$ all have a critical point at $(0, 0)$ with zero Hessian determinant, but the three functions respectively have a minimum, a saddle point, and a maximum there.)
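One can verify with sympy that the Hessian determinant vanishes at the origin for all three quartics (a small sketch):

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)

# Three functions with the same (zero) Hessian determinant at (0, 0),
# but with a minimum, a saddle, and a maximum there, respectively.
for f in (x**4 + y**4, x**4 - y**4, -x**4 - y**4):
    D = sp.diff(f, x, 2) * sp.diff(f, y, 2) - sp.diff(f, x, y)**2
    assert D.subs({x: 0, y: 0}) == 0   # the test is inconclusive in every case
```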
First, note that something like this issue already occurs for single-variable functions. Checking shows that $g(x) := x^4$ has a critical point at $x = 0$, and computing the second derivative there gives $g''(0) = 0$, so we cannot conclude whether $g$ has a minimum, a maximum, or neither, at that point. We can still determine the type of critical point, however, by observing that $g(x) > 0$ for any $x \neq 0$, and so the critical point must be an (isolated) minimum.
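This one-variable check is easy to reproduce with sympy (an illustrative sketch; the sign spot-check is mine):

```python
import sympy as sp

x = sp.symbols('x', real=True)
g = x**4

# Critical point at x = 0, but the second derivative vanishes there,
# so the single-variable Second Derivative Test is inconclusive.
assert sp.diff(g, x).subs(x, 0) == 0
assert sp.diff(g, x, 2).subs(x, 0) == 0

# Yet g(x) > 0 for every x != 0 (spot-checked here), so x = 0 is a minimum.
assert all(g.subs(x, v) > 0 for v in (-1, -0.25, 0.25, 1))
```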
In the case of our two-variable function $f(x, y)$, we can proceed as follows. Computing gives $$f(x, 0) = x^4,$$ which we've already said has an isolated minimum at $x = 0$. On the other hand, $$f(0, y) = y^4 - y^2.$$ Applying the (single-variable) Second Derivative Test gives $\frac{d^2}{dy^2} [f(0, y)] \big|_{y = 0} = -2$, so $f(0, y)$ has an isolated maximum at $y = 0$. So $f(x, y)$ takes on both positive and negative values in any open set containing $(0, 0)$, and $(0, 0)$ must therefore be a saddle point.
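Numerically (a sketch; the explicit formula $f(x,y)=x^4+y^4-y^2$ is my reconstruction, chosen to be consistent with the two restrictions $f(x,0)=x^4$ and $f(0,y)=y^4-y^2$ used above):

```python
# Assumed form of f, matching f(x, 0) = x**4 and f(0, y) = y**4 - y**2.
def f(x, y):
    return x**4 + y**4 - y**2

# Arbitrarily close to (0, 0), f takes both signs: positive along the
# x-axis, negative along the y-axis. That is exactly a saddle point.
for eps in (0.1, 0.01):
    assert f(eps, 0) > 0
    assert f(0, eps) < 0
```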
If we look at a 3D plot of the surface (figure omitted here), it supports the following:
You are correct: using the typical tests (from the question "Find all critical points of $f(x,y) = x^3 - 12xy + 8y^3$ and state maximum, minimum, or saddle points"), the results are inconclusive, and you do not have a local or global minimum or maximum at the critical point $(0,0)$.