I think the simplest way to see it is to consider the dimensions of the matrices $A$ and $A^{-1}$ and apply simple multiplication.
So assume, without loss of generality, that $A$ is $m \times n$ with $n \neq m$. Then $A^{-1}$ has to be $n \times m$, because that is the only shape for which the product $AA^{-1}=I_m$ is even defined.
But the definition of an inverse also requires $A^{-1}A$ to be the identity, and this product is $n \times n$, so instead of $I_m$ you get $I_n$, which is not in accordance with the definition of an inverse (see ZettaSuro's answer).
Hence $m$ must be equal to $n$.
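To see this concretely, here is a minimal numpy sketch (the $2 \times 3$ matrix `A` is a made-up example, not from the question): a non-square matrix can have a one-sided inverse, but the two products have different sizes, so no single matrix can serve as a two-sided inverse.

```python
import numpy as np

# A made-up 2x3 matrix (m = 2, n = 3) with full row rank.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# A right inverse R (n x m) with A @ R = I_2 exists because A has full row rank.
R = A.T @ np.linalg.inv(A @ A.T)

print(np.allclose(A @ R, np.eye(2)))  # True:  A R = I_2 (an m x m identity)
print(np.allclose(R @ A, np.eye(3)))  # False: R A is 3x3 with rank 2, so it cannot be I_3
```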
Yes, the nullity is the same as the dimension of the null space.
Observe that if $A$ and $B$ are similar, then there exists an invertible matrix $P$ such that $A=PBP^{-1}$. Using this we can also say that $A-\lambda I=PBP^{-1}-\lambda I = P(B-\lambda I)P^{-1}$, which means $A-\lambda I$ and $B-\lambda I$ are also similar. So to prove what you have asked, it is enough to show that similar matrices have the same nullity. Here is an outline of that argument.
Let us assume there exists an invertible matrix $P$ such that $B=PAP^{-1}$.
Let $S=\{x_1,x_2, \ldots, x_k\}$ be a basis of the null space $N(B)$; then $Bx_i=0$ for all $i \in \{1,2, \ldots, k\}$. Observe that
\begin{align*}
Bx_i & = 0 \\
PAP^{-1}x_i & = 0\\
A(P^{-1}x_i) & = 0
\end{align*}
Thus the set $\{P^{-1}x_1, P^{-1}x_2, \ldots, P^{-1}x_k\}$ is contained in $N(A)$. You can easily show that this set is linearly independent (apply $P$ to any vanishing linear combination and use the linear independence of the $x_i$). Now if we can show that $\text{span}\{P^{-1}x_1, P^{-1}x_2, \ldots, P^{-1}x_k\}=N(A)$, then we are done.
Let $y \in N(A)$; then $Ay=P^{-1}BPy=0$, and multiplying on the left by $P$ gives $BPy=0$, so $Py \in N(B)$. Thus $Py=c_1x_1+c_2x_2+ \dotsb + c_kx_k$ for some scalars $c_i$. Now multiply both sides by $P^{-1}$ to get $y=c_1P^{-1}x_1+ \dotsb + c_kP^{-1}x_k$, so $y$ is in the span, as required.
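If it helps, the whole argument can be sanity-checked numerically. The sketch below (using numpy; the matrices `A` and `P` are arbitrary choices, not from the question) builds $B = PAP^{-1}$ and confirms that $A$ and $B$ have the same nullity via the rank-nullity theorem.

```python
import numpy as np

# An arbitrary singular 3x3 matrix: the third row is the sum of the first two,
# so rank(A) = 2 and nullity(A) = 1.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 1.0]])

# An arbitrary invertible P (diagonally dominant, hence invertible),
# giving a matrix B similar to A.
P = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 1.0],
              [0.0, 1.0, 4.0]])
B = P @ A @ np.linalg.inv(P)

def nullity(M):
    # Rank-nullity theorem: nullity = number of columns - rank.
    return M.shape[1] - np.linalg.matrix_rank(M)

print(nullity(A), nullity(B))  # 1 1 -- similar matrices, same nullity
```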
Best Answer
The first practical application that comes to mind is the handling of matrix equations and the solving of systems of linear equations, see here. If you have an equation of the form $$Ax=y$$ with $A\in\mathbb R^{n\times n}$, $x,y\in\mathbb R^{n\times 1}$, and you know that $A$ is invertible, you can find the solution via multiplication: $$Ax=y \iff A^{-1}Ax=A^{-1}y \iff I_nx=A^{-1}y \iff x=A^{-1}y.$$
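As a quick illustration, here is a minimal numpy sketch with made-up values for $A$ and $y$; it carries out exactly the computation above. Note that in floating-point practice one would usually call `np.linalg.solve` rather than form $A^{-1}$ explicitly.

```python
import numpy as np

# A small invertible system A x = y (made-up example values).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
y = np.array([3.0, 5.0])

x_via_inverse = np.linalg.inv(A) @ y   # x = A^{-1} y, as in the equivalence above
x_via_solve   = np.linalg.solve(A, y)  # numerically preferred: no explicit inverse

print(np.allclose(x_via_inverse, x_via_solve))  # True
print(np.allclose(A @ x_via_solve, y))          # True: x really solves A x = y
```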
But this is only one application; there is much more theory about matrices to discover, e.g. in linear algebra. Assume $\mathcal V$ and $\mathcal W$ are finite-dimensional vector spaces over the same field $F$; then every linear map from $\mathcal V$ to $\mathcal W$ can be represented by a matrix $A\in F^{m\times n}$ with $\dim(\mathcal V)=n$ and $\dim(\mathcal W)=m$.
Now let $\dim(\mathcal V)=\dim(\mathcal W)$, so that we have a matrix $A\in F^{n\times n}$. If we know that $A$ is invertible, we immediately know that the corresponding linear map $\varphi: \mathcal V\rightarrow\mathcal W$ is bijective, and that the linear map corresponding to $A^{-1}$ is $\varphi^{-1}$. Using some properties of linear maps we also know, for example, that $0$ is an eigenvalue of neither $\varphi$ nor $\varphi^{-1}$. So just by knowing that $A$ is invertible, we know a lot about the corresponding linear map.
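For instance, one can check numerically (a small numpy sketch; the matrix `A` is just an illustrative choice) that an invertible matrix has no zero eigenvalue, and that the eigenvalues of $A^{-1}$ are the reciprocals of those of $A$:

```python
import numpy as np

# An illustrative invertible (upper triangular) matrix with eigenvalues 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eig_A    = np.linalg.eigvals(A)
eig_Ainv = np.linalg.eigvals(np.linalg.inv(A))

print(eig_A)  # [2. 3.] -- no zero eigenvalue, since A is invertible
print(np.allclose(np.sort(eig_Ainv), np.sort(1 / eig_A)))  # True: reciprocal eigenvalues
```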