Edited per Ryan's clarification below.
Statement 1: Yes, this is fine. If $M$ is neither positive nor negative definite, and has no zero eigenvalues, then it must have at least one positive and one negative eigenvalue. Notice that this is a sufficient but not necessary condition for $M$ to be indefinite: $\left[\begin{array}{ccc}0 & 0 &0\\0 & 1 & 0\\0 & 0 & -1\end{array}\right]$ is indefinite, for instance, even though it has a zero eigenvalue.
Statement 2: No, this is false. Consider for instance $\left[\begin{array}{cc}1 & 0\\0 & 0\end{array}\right]$ which is positive-semidefinite.
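Both counterexamples can be verified directly from the quadratic form $x^T M x$ rather than from eigenvalues. A minimal sketch (the helper name `qform` is ours):

```python
def qform(M, x):
    """Evaluate the quadratic form x^T M x."""
    n = len(M)
    return sum(x[i] * M[i][j] * x[j] for i in range(n) for j in range(n))

# diag(0, 1, -1): indefinite, yet it has a zero eigenvalue.
M1 = [[0, 0, 0], [0, 1, 0], [0, 0, -1]]
print(qform(M1, [0, 1, 0]))  # 1  -> not negative semidefinite
print(qform(M1, [0, 0, 1]))  # -1 -> not positive semidefinite

# diag(1, 0): x^T M x = x_1^2 >= 0, so positive semidefinite, not indefinite.
M2 = [[1, 0], [0, 0]]
print(qform(M2, [1, 0]))  # 1
print(qform(M2, [0, 1]))  # 0
```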
It is impossible to characterize indefinite matrices from the leading minors alone. For example, if the first row and column of a symmetric matrix $M$ are zero, the matrix might be positive-semidefinite, negative-semidefinite, or indefinite, yet all of its leading minors will be zero.
A complete, correct statement requires looking at all principal minors. Writing $\Delta_k$ for a principal minor of order $k$, a symmetric matrix $M$ is indefinite (has both positive and negative eigenvalues) if and only if:
- $\Delta_k < 0$ for some even $k$; or
- $\Delta_{k_1} > 0$ and $\Delta_{k_2} < 0$ for two odd $k_1$ and $k_2$ (the orders need not be distinct; the minors may come from different principal submatrices).
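This characterization can be checked by brute force, enumerating every principal minor. A sketch with exact integer arithmetic (function names are ours; Laplace expansion is fine for the small matrices here):

```python
from itertools import combinations

def det(M):
    """Determinant by Laplace expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def principal_minors(M):
    """Yield (order k, value) for every principal minor of the symmetric matrix M."""
    n = len(M)
    for k in range(1, n + 1):
        for rows in combinations(range(n), k):
            yield k, det([[M[i][j] for j in rows] for i in rows])

def is_indefinite(M):
    minors = list(principal_minors(M))
    even_negative = any(k % 2 == 0 and d < 0 for k, d in minors)
    odd_positive = any(k % 2 == 1 and d > 0 for k, d in minors)
    odd_negative = any(k % 2 == 1 and d < 0 for k, d in minors)
    return even_negative or (odd_positive and odd_negative)

print(is_indefinite([[0, 0, 0], [0, 1, 0], [0, 0, -1]]))  # True
print(is_indefinite([[1, 0], [0, 0]]))                    # False (positive semidefinite)
```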
Knowing that $M$ is not strictly positive- or negative-definite does not really help. You can check that if $M$ satisfies neither of these conditions, then it must satisfy one of the rows of the purple box.
EDIT: Proof of the "only if" direction. Let $M$ be indefinite. Suppose, for contradiction, that neither of the above two conditions holds. Then every even-order minor is nonnegative, and either all of the odd-order minors are nonnegative, or all are nonpositive.
In the former case, $M$ satisfies the third row of the purple box above, and $M$ is positive-semidefinite, a contradiction.
In the latter case, $M$ satisfies the fourth row of the purple box above, and $M$ is negative-semidefinite, a contradiction.
EDIT 3: Proof of the "if" direction. Suppose one of the even-dimensional minors is negative, and suppose, for contradiction, that $M$ is positive-semidefinite or negative-semidefinite. Then by row three or four of the purple box (as appropriate), that minor is in fact positive, a contradiction. Therefore $M$ is neither positive- nor negative-semidefinite, and so is indefinite.
Suppose instead one of the odd-dimensional minors is positive, and another is negative, and suppose $M$ is positive-semidefinite. Then both of those minors are positive, a contradiction. Now suppose $M$ is negative-semidefinite. Then both of those minors are negative, a contradiction. The only remaining possibility is that $M$ is indefinite.
You can prove it easily with the (already mentioned) Cauchy interlacing theorem, together with the facts that the determinant equals the product of the eigenvalues and that a real symmetric matrix is positive definite if and only if all of its eigenvalues are positive.
The statement is obviously true for $1\times 1$ matrices. Assume that "all leading principal minors of $A$ are positive implies $A$ is positive definite" is true for $k\times k$ matrices, $k\leq n-1$, and consider $A$ to be $n\times n$ with all leading principal minors positive. By the induction assumption, we know that the leading principal $(n-1)\times(n-1)$ submatrix of $A$ is positive definite. By the interlacing property, the $n-1$ largest eigenvalues of $A$ are therefore positive; only the smallest could possibly be nonpositive. However, the smallest eigenvalue cannot be nonpositive either, since otherwise the determinant of $A$ (the product of all $n$ eigenvalues) would not be positive.
There is a method, algorithm really, that deserves to be better known. Given a symmetric matrix $H$ of integers, it provides a matrix $P$ with rational (or integer) entries and $\det P = 1,$ along with a diagonal matrix $D,$ such that $$ P^T H P = D. $$ Since $\det P = 1 \;$ (and $P$ is usually upper triangular), it is not so hard to find $Q = P^{-1},$ after which $$ Q^T D Q = H. $$
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
$$ P^T H P = D $$ $$\left( \begin{array}{rrr} 1 & 0 & 0 \\ - 1 & 1 & 0 \\ 0 & - 1 & 1 \\ \end{array} \right) \left( \begin{array}{rrr} 1 & 1 & 1 \\ 1 & 2 & 2 \\ 1 & 2 & 3 \\ \end{array} \right) \left( \begin{array}{rrr} 1 & - 1 & 0 \\ 0 & 1 & - 1 \\ 0 & 0 & 1 \\ \end{array} \right) = \left( \begin{array}{rrr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ \end{array} \right) $$ $$ Q^T D Q = H $$ $$\left( \begin{array}{rrr} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 1 & 1 \\ \end{array} \right) \left( \begin{array}{rrr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ \end{array} \right) \left( \begin{array}{rrr} 1 & 1 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \\ \end{array} \right) = \left( \begin{array}{rrr} 1 & 1 & 1 \\ 1 & 2 & 2 \\ 1 & 2 & 3 \\ \end{array} \right) $$
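The algorithm itself is just symmetric (congruence) Gaussian elimination: every column operation is mirrored by the corresponding row operation, and $P$ accumulates the column operations. A minimal sketch over the rationals, with no pivoting (so it assumes every pivot encountered is nonzero, which holds for the example above; the function name is ours):

```python
from fractions import Fraction

def congruence_diagonalize(H):
    """Return (P, D) with P^T H P = D and det P = 1.
    Assumes every pivot encountered is nonzero (no pivoting)."""
    n = len(H)
    H = [[Fraction(x) for x in row] for row in H]
    P = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            f = H[j][i] / H[i][i]
            for k in range(n):      # column operation: col_j -= f * col_i ...
                H[k][j] -= f * H[k][i]
            for k in range(n):      # ... mirrored by the row operation row_j -= f * row_i
                H[j][k] -= f * H[i][k]
            for k in range(n):      # record the column operation in P
                P[k][j] -= f * P[k][i]
    return P, H

H = [[1, 1, 1], [1, 2, 2], [1, 2, 3]]
P, D = congruence_diagonalize(H)
# P is upper triangular with unit diagonal (so det P = 1) and D is diagonal,
# reproducing the worked example above.
```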
See, for example, the question "reference for linear algebra books that teach reverse Hermite method for symmetric matrices".
Illustrated here, with notation change $D$ = h2.