Let $A$ be an $n\times m$ matrix. This matrix represents a linear transformation $L_A\colon\mathbb{R}^m\to\mathbb{R}^n$. A left inverse would correspond to a linear transformation $T\colon\mathbb{R}^n\to\mathbb{R}^m$ such that $T\circ L_A = \mathrm{Id}_{\mathbb{R}^m}$. For the composition to be the identity, it is necessary that $L_A$ be one-to-one; in particular, we need $m\leq n$ and $A$ to have full rank.
A right inverse would correspond to a linear transformation $U\colon \mathbb{R}^n\to\mathbb{R}^m$ such that $L_A\circ U=\mathrm{Id}_{\mathbb{R}^n}$. For the composition to be the identity, we need $L_A$ to be onto; in particular, we need $m\geq n$ and $A$ to have full rank.
In other words, a necessary condition for $A$ to have a one-sided inverse is that it be of full rank (that is, $\mathrm{rank}(A)=\min(n,m)$).
In fact, the condition is sufficient as well:
Suppose first that $n\leq m$ and $\mathrm{rank}(A)=n$. Then $L_A$ is onto, so $A\mathbf{e}_1,\ldots,A\mathbf{e}_m$ span $\mathbb{R}^n$, and we can pare this spanning set down to a basis. If $i_1<\cdots<i_n$ are such that $A\mathbf{e}_{i_1},\ldots,A\mathbf{e}_{i_n}$ form a basis for $\mathbb{R}^n$, define $U\colon\mathbb{R}^n\to \mathbb{R}^m$ by $U(A\mathbf{e}_{i_j}) = \mathbf{e}_{i_j}$ and extend linearly. Since the $A\mathbf{e}_{i_j}$ form a basis, this can be done and defines $U$ uniquely; then $L_A\circ U=\mathrm{Id}_{\mathbb{R}^n}$, since the two sides agree on the basis vectors $A\mathbf{e}_{i_j}$. Computing the coordinate matrix of $U$ relative to the standard bases of $\mathbb{R}^n$ and $\mathbb{R}^m$ gives a right inverse for $A$.
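For concreteness, here is a small NumPy sketch of this construction (my own illustration, not part of the original argument; the helper name is hypothetical). It picks $n$ independent columns of $A$, builds the matrix that sends each $A\mathbf{e}_{i_j}$ back to $\mathbf{e}_{i_j}$, and checks that the result is a right inverse:

```python
import numpy as np

# Sketch of the right-inverse construction above. Assumes A has full rank n
# with n <= m.
def right_inverse_from_basis(A):
    n, m = A.shape
    # Greedily pick indices i_1 < ... < i_n whose columns form a basis of R^n.
    idx = []
    for j in range(m):
        if np.linalg.matrix_rank(A[:, idx + [j]]) > len(idx):
            idx.append(j)
    B = A[:, idx]                 # n x n invertible; its columns are the A e_{i_j}
    E = np.zeros((m, n))
    E[idx, np.arange(n)] = 1      # j-th column of E is e_{i_j}
    # U sends A e_{i_j} = B e_j to e_{i_j}, i.e. U = E B^{-1}; then A U = B B^{-1} = I.
    return E @ np.linalg.inv(B)

A = np.array([[1., 2., 3.],
              [0., 1., 1.]])     # rank 2, n = 2 <= m = 3
U = right_inverse_from_basis(A)
print(np.allclose(A @ U, np.eye(2)))  # True
```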
Next, suppose that $n\geq m$ and $\mathrm{rank}(A)=m$. Then $L_A$ is one-to-one, so $A\mathbf{e}_1,\ldots, A\mathbf{e}_m$ are linearly independent. Complete them to a basis $A\mathbf{e}_1,\ldots,A\mathbf{e}_m,\mathbf{w}_{m+1},\ldots,\mathbf{w}_n$ of $\mathbb{R}^n$, and define $T\colon \mathbb{R}^n\to\mathbb{R}^m$ by $T(A\mathbf{e}_i)=\mathbf{e}_i$, with $T(\mathbf{w}_j)$ arbitrary (say, $\mathbf{0}$). Then $T\circ L_A=\mathrm{Id}_{\mathbb{R}^m}$, since both sides agree on $\mathbf{e}_1,\ldots,\mathbf{e}_m$; computing the coordinate matrix of $T$ relative to the standard bases of $\mathbb{R}^n$ and $\mathbb{R}^m$ gives a left inverse for $A$.
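A companion sketch for the left-inverse construction (again my own illustration; the helper name is hypothetical). It completes the columns of $A$ to a basis of $\mathbb{R}^n$ using standard basis vectors as the $\mathbf{w}_j$, then sends $A\mathbf{e}_i\mapsto\mathbf{e}_i$ and $\mathbf{w}_j\mapsto\mathbf{0}$:

```python
import numpy as np

# Sketch of the left-inverse construction. Assumes A has full rank m with n >= m.
def left_inverse_from_completion(A):
    n, m = A.shape
    C = A.copy()
    for j in range(n):
        w = np.eye(n)[:, [j]]
        if np.linalg.matrix_rank(np.hstack([C, w])) > C.shape[1]:
            C = np.hstack([C, w])          # keep w_j only if it extends the basis
    # C = [A | W] is n x n invertible, and T = [I_m | 0] C^{-1} does the job:
    # C^{-1} A is the first m columns of I_n, so T A = I_m.
    return np.hstack([np.eye(m), np.zeros((m, n - m))]) @ np.linalg.inv(C)

A = np.array([[1., 0.],
              [2., 1.],
              [3., 1.]])                   # rank 2, n = 3 >= m = 2
T = left_inverse_from_completion(A)
print(np.allclose(T @ A, np.eye(2)))       # True
```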
Note, moreover, that if $n\neq m$, then there are many different one-sided inverses for $A$ (when $A$ has full rank); so one should not talk about "the" left (or right) inverse of $A$.
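To see the non-uniqueness concretely (a toy example of my own, not from the original answer): every $\begin{pmatrix}1 & c\end{pmatrix}$ is a left inverse of the $2\times 1$ matrix below, so "the" left inverse is not well defined.

```python
import numpy as np

# Every [1, c] is a left inverse of A: [1, c] @ [[1], [0]] = [[1]].
A = np.array([[1.],
              [0.]])
for c in (0., 1., -7.):
    T = np.array([[1., c]])
    print(np.allclose(T @ A, np.eye(1)))   # True for every c
```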
Let $A$ be a full-rank $m\times n$ matrix. By full rank we mean $\DeclareMathOperator{\rank}{rank}\rank(A)=\min\{m,n\}$.
If $m<n$, then $A$ has a right inverse given by
$$
A^{-1}_{\text{right}}=A^\top(AA^\top)^{-1}
$$
If $m>n$, then $A$ has a left inverse given by
$$
A^{-1}_{\text{left}}=(A^\top A)^{-1} A^\top
$$
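Both formulas are easy to sanity-check numerically; here is a minimal NumPy sketch (my own example matrices, assuming full rank as in the text):

```python
import numpy as np

A_wide = np.array([[1., 2., 3.],
                   [4., 5., 6.]])             # m = 2 < n = 3, full row rank
R = A_wide.T @ np.linalg.inv(A_wide @ A_wide.T)
print(np.allclose(A_wide @ R, np.eye(2)))     # A R = I: a right inverse

A_tall = A_wide.T                             # m = 3 > n = 2, full column rank
L = np.linalg.inv(A_tall.T @ A_tall) @ A_tall.T
print(np.allclose(L @ A_tall, np.eye(2)))     # L A = I: a left inverse
```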
Of course, a right inverse is useful for solving an equation of the form $XA=Y$ (postmultiplying by it isolates $X$), and a left inverse is useful for solving an equation of the form $AX=Y$ (premultiplying by it isolates $X$); since these are only one-sided inverses, the candidate obtained this way is an actual solution exactly when the equation is consistent.
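A toy example of my own for the second use: with $A$ tall and of full column rank, premultiplying $AX=Y$ by the left inverse recovers $X$, provided the equation is consistent in the first place.

```python
import numpy as np

A = np.array([[1., 0.],
              [2., 1.],
              [3., 1.]])
L = np.linalg.inv(A.T @ A) @ A.T     # left inverse: L A = I
X_true = np.array([[1.], [2.]])
Y = A @ X_true                       # a consistent right-hand side by construction
X = L @ Y                            # recover X from A X = Y
print(np.allclose(X, X_true))        # True
```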
Best Answer
Yes, it's true (assuming that $n$ and $m$ are finite for this answer). Here's an argument, in three parts: (1) if $AX = I$, the columns of $A$ must span the target space; (2) likewise, if $XA = I$, the rows of $A$ must span; (3) there can be at most $n$ independent vectors in $\mathbb{R}^n$.
Details:
The first column of $AX$ can be thought of as a linear combination of the columns of $A$, with coefficients given by the entries of the first column of $X$; the same goes for each other column. The fact that $AX = I$ then means that the span of the columns of $I$ is contained in the span of the columns of $A$. Hence, if $A$ is $n \times k$, the columns of $A$ span $\mathbb{R}^n$, so among them there are $n$ independent columns.
The same argument applies to rows: if $XA = I$, the rows of $A$ span $\mathbb{R}^k$, so there are $k$ independent rows.
For part 3, one cheap argument relies on determinants: $p$ (column) $n$-vectors are independent if and only if you can select $p$ of the $n$ rows to form a $p \times p$ matrix whose determinant is nonzero. Since for $p > n$ there are not even $p$ rows to select, you cannot have more than $n$ independent $n$-vectors. The same applies to row vectors.
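Here is the determinant criterion made concrete (my own illustration; the helper name is hypothetical): test every $p\times p$ minor of the $n\times p$ matrix whose columns are the given vectors.

```python
import numpy as np
from itertools import combinations

# Columns of V are independent iff some p x p minor of V is nonzero.
def has_nonzero_minor(V):
    n, p = V.shape
    return any(not np.isclose(np.linalg.det(V[list(rows), :]), 0)
               for rows in combinations(range(n), p))

V = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])             # two independent 3-vectors
print(has_nonzero_minor(V))          # True
W = np.array([[1., 2.],
              [2., 4.],
              [3., 6.]])             # proportional columns: dependent
print(has_nonzero_minor(W))          # False
```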
Finally, if the matrix is non-square, say $n \times k$ with $n \neq k$, then the number of independent rows or columns is at most $\min(n,k)$; so the matrix cannot have both $n$ independent columns and $k$ independent rows, and by parts 1 and 2 either a left or a right inverse fails to exist.