[Math] Probability that a random integer matrix is invertible

linear-algebra, matrices, probability

Assume that $A_{n\times n}$ is a random square matrix, i.e. each of $A$'s entries is selected independently from a uniform distribution over $\mathbf{Z}$.

I want to show that $\Pr(A\text{ is invertible})=1$.

I only know how to show this for $n=2$ because:

  1. With probability 1, no entry is zero (the probability of uniformly selecting any single fixed element from an infinite set is zero).
  2. Given 1, after selecting $A_{1,1}$, $A_{1,2}$ and $A_{2,1}$, let $\lambda \in \mathbf{Q}$ be the (unique, if it exists) constant satisfying $A_{2,1} = \lambda A_{1,1}$. If we then select $A_{2,2} \neq \lambda A_{1,2}$ (which happens with probability 1 by the same reasoning), row 2 is linearly independent of row 1.

However, the same argument does not work for row 3, since there are infinitely many pairs $\lambda_1, \lambda_2$ satisfying $A_{3,1} = \lambda_1 A_{1,1} + \lambda_2 A_{2,1}$.

Best Answer

On any line $A+tB$ in matrix space, the set of singular matrices is cut out by the polynomial equation $$ \det(A+tB)=0. $$ This is a polynomial in $t$ of degree at most $n$, so unless it vanishes identically it has only finitely many roots. This strongly hints that for most probability distributions the chance of hitting a singular matrix is zero.
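As a small illustration of this (with two arbitrarily chosen $3\times 3$ integer matrices, not taken from the question), $\det(A+tB)$ has degree at most $3$ in $t$, so evaluating it at four points determines the polynomial exactly by interpolation, and its roots are the only singular matrices on the line:

```python
import numpy as np

# Two arbitrary example matrices; B is chosen with det(B) != 0
A = np.array([[1, 2, 0], [3, 1, 4], [0, 5, 2]], dtype=float)
B = np.array([[2, 0, 1], [1, 1, 0], [4, 2, 3]], dtype=float)

# det(A + tB) is a polynomial in t of degree <= 3, so four sample
# points determine it exactly via interpolation.
ts = np.array([0.0, 1.0, 2.0, 3.0])
dets = [np.linalg.det(A + t * B) for t in ts]

coeffs = np.polyfit(ts, dets, 3)   # highest power first; leading coeff is det(B)
singular_ts = np.roots(coeffs)     # at most 3 values of t give a singular matrix
print(coeffs)
print(singular_ts)
```

The leading coefficient recovered by the interpolation is $\det(B)$, and since it is nonzero here, the cubic has exactly three roots (counted with multiplicity, possibly complex), so all but finitely many matrices on the line are invertible.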

Put another way, the set of singular matrices is the zero set of the determinant polynomial, a hypersurface in $\mathbf{R}^{n\times n}$, which therefore has Lebesgue measure zero. Any probability measure with a Radon-Nikodym density relative to Lebesgue measure then also assigns the singular matrices measure zero.
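As a sanity check (not a proof — strictly speaking there is no uniform distribution on all of $\mathbf{Z}$), one can sample entries uniformly from a large finite range as a stand-in and observe that singular matrices essentially never occur; the determinant is computed with exact integer arithmetic to avoid floating-point tolerance issues:

```python
import numpy as np

def det3(m):
    """Exact 3x3 determinant via cofactor expansion (Python ints)."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e*i - f*h) - b * (d*i - f*g) + c * (d*h - e*g)

rng = np.random.default_rng(0)
trials, bound = 10_000, 10**6  # entries uniform on a large finite range

# .tolist() converts to Python ints, so det3 is exact (no overflow)
singular = sum(
    det3(rng.integers(-bound, bound + 1, size=(3, 3)).tolist()) == 0
    for _ in range(trials)
)
print(singular, "singular matrices out of", trials)
```

Heuristically, once the first two rows are fixed the determinant is an affine function of each remaining entry with almost surely nonzero slope, so a singular matrix requires one entry to hit one specific value out of $2\cdot 10^6 + 1$ choices, matching the measure-zero intuition.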
