On the line $y=0$, $g(x,y)=x^6$, which is positive for $x \neq 0$, so along this line $g$ has a strict local minimum at the origin.
On the curve $x=y^2$, $g(x,y)=y^{12}-y^{10}=y^{10}(y^2-1)$, which is negative for $0<|y|<1$, so along this curve $g$ has a strict local maximum at the origin.
More details, as requested:
A saddle point is stationary, but neither a local max nor a local min. $g(x,y)$ is stationary at the origin, because both partials are zero there. $(0,0)$ is not a local max by the first observation above, and it is not a local min by the second.
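The two observations can be checked numerically using only the stated restrictions of $g$; the formula for $g$ itself is not needed. A quick sketch:

```python
# Numerical check of the two observations above, using only the stated
# restrictions of g along the line y = 0 and the curve x = y**2.

def g_along_line(x):
    """Restriction of g to the line y = 0."""
    return x**6

def g_along_curve(y):
    """Restriction of g to the curve x = y**2."""
    return y**10 * (y**2 - 1)

samples = [t / 100 for t in range(1, 100)]  # nonzero parameters in (0, 1)

# g > 0 along the line (for x != 0), so (0, 0) is not a local max...
assert all(g_along_line(x) > 0 for x in samples)
# ...and g < 0 along the curve (for 0 < |y| < 1), so it is not a local min.
assert all(g_along_curve(y) < 0 for y in samples)
```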
To say that the test is inconclusive when the determinant $f_{xx} f_{yy} - f_{xy}^2$ is zero at a point is to say just that: The test doesn't tell us anything, so if we want to determine the type of the critical point, we must do a little more. (Indeed, the functions $(x, y) \mapsto x^4 + y^4$, $(x, y) \mapsto x^4 - y^4$, and $(x, y) \mapsto -x^4 - y^4$ all have a critical point at $(0, 0)$ with zero Hessian determinant, but the three functions respectively have a minimum, a saddle point, and a maximum there.)
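The parenthetical can be checked symbolically; a quick sketch, assuming sympy is available:

```python
import sympy as sp

x, y = sp.symbols('x y')
half = sp.Rational(1, 2)

# The three quartics from the parenthetical, with the expected signs of
# their values along the positive x- and y-axes near the origin.
cases = [
    (x**4 + y**4,   1,  1),   # minimum: positive along both axes
    (x**4 - y**4,   1, -1),   # saddle:  positive along x, negative along y
    (-x**4 - y**4, -1, -1),   # maximum: negative along both axes
]

for f, sign_x, sign_y in cases:
    # (0, 0) is a critical point and the Hessian determinant vanishes there,
    # so the second-derivative test is inconclusive for all three...
    assert [sp.diff(f, v).subs({x: 0, y: 0}) for v in (x, y)] == [0, 0]
    assert sp.hessian(f, (x, y)).det().subs({x: 0, y: 0}) == 0
    # ...yet the type of critical point shows in values near the origin.
    assert sp.sign(f.subs({x: half, y: 0})) == sign_x
    assert sp.sign(f.subs({x: 0, y: half})) == sign_y
```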
First, note that something like this issue already occurs for single-variable functions. Checking shows that $g(x) := x^4$ has a critical point at $x = 0$, and computing the second derivative there gives $g''(0) = 0$, so we cannot conclude whether $g$ has a minimum, a maximum, or neither at that point. We can still determine the type of critical point, however, by observing that $g(x) > 0$ for any $x \neq 0$, and so the critical point must be an (isolated) minimum.
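A small sympy sanity check of this paragraph (sympy assumed available):

```python
import sympy as sp

x = sp.symbols('x')
g = x**4

# Both derivative tests come back empty at x = 0...
assert sp.diff(g, x).subs(x, 0) == 0      # critical point
assert sp.diff(g, x, 2).subs(x, 0) == 0   # second derivative test inconclusive
# ...but g(x) > 0 for x != 0 pins down an isolated minimum.
nonzero = [-1, -sp.Rational(1, 100), sp.Rational(1, 100), 1]
assert all(g.subs(x, t) > 0 for t in nonzero)
```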
In the case of our two-variable function $f(x, y)$, we can proceed as follows: Computing gives $$f(x, 0) = x^4,$$ which we've already seen has an isolated minimum at $x = 0$. On the other hand, $$f(0, y) = y^4 - y^2.$$ Applying the (single-variable) Second Derivative Test gives $\frac{d^2}{dy^2} [f(0, y)] \Big|_{y = 0} = -2 < 0$, so $f(0, y)$ has an isolated maximum at $y = 0$. Thus $f(x, y)$ takes on both positive and negative values in any open set containing $(0, 0)$, and so $(0, 0)$ must be a saddle point.
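The whole argument can be verified symbolically. The sketch below assumes (my assumption; the answer only states the two restrictions) that $f(x, y) = x^4 + y^4 - y^2$, which reproduces both restrictions:

```python
import sympy as sp

x, y = sp.symbols('x y')
# One function consistent with the restrictions f(x, 0) = x**4 and
# f(0, y) = y**4 - y**2, chosen here purely for the check (an assumption):
f = x**4 + y**4 - y**2

assert f.subs(y, 0) == x**4                           # restriction to y = 0
assert sp.expand(f.subs(x, 0) - (y**4 - y**2)) == 0   # restriction to x = 0
assert sp.diff(f.subs(x, 0), y, 2).subs(y, 0) == -2   # as computed above
# The 2D second-derivative test is inconclusive at the origin...
assert sp.hessian(f, (x, y)).det().subs({x: 0, y: 0}) == 0
# ...yet f takes both signs arbitrarily close to (0, 0): a saddle point.
assert f.subs({x: sp.Rational(1, 10), y: 0}) > 0
assert f.subs({x: 0, y: sp.Rational(1, 10)}) < 0
```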
Best Answer
I presume we're talking about a twice differentiable function $f$ defined in a neighbourhood of a point $p = (p_1, p_2, \ldots, p_n)$ such that the gradient $\nabla f(p) = 0$, and the Hessian matrix $H = H(f)(p)$ is indefinite. Thus there exist eigenvectors $u$ and $v$ of $H$ corresponding to eigenvalues $\lambda$ and $\mu$ which are positive and negative respectively. From the Taylor expansion $$ f(p + \epsilon x) = f(p) + \epsilon x^T \nabla f(p) + \frac{\epsilon^2}{2} x^T H x + o(\epsilon^2)$$ we find that $$f(p + \epsilon u) = f(p) + \frac{\epsilon^2 \lambda u^T u}{2} + o(\epsilon^2)$$ and similarly $$f(p + \epsilon v) = f(p) + \frac{\epsilon^2 \mu v^T v}{2} + o(\epsilon^2)$$ so $f(p + \epsilon u) > f(p) > f(p + \epsilon v)$ for sufficiently small $\epsilon>0$, and thus $p$ is a saddle point.
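A numerical illustration of this eigenvector argument; the particular $f$, $p$, and $H$ below are my own example choices, not from the answer:

```python
import numpy as np

# A concrete instance of the argument: f(x, y) = x**2 - y**2 has
# gradient 0 at p = (0, 0) and an indefinite Hessian there.
def f(q):
    x, y = q
    return x**2 - y**2

p = np.zeros(2)
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])          # Hessian of f at p

eigvals, eigvecs = np.linalg.eigh(H)  # eigh: H symmetric, ascending eigenvalues
assert eigvals[0] < 0 < eigvals[1]    # indefinite: mu < 0 < lambda

v = eigvecs[:, 0]   # eigenvector for mu = -2
u = eigvecs[:, 1]   # eigenvector for lambda = 2

eps = 1e-3
# f rises along u and falls along v, so p is a saddle point:
assert f(p + eps * u) > f(p) > f(p + eps * v)
```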