Suppose that $A$ is a square matrix and that $B$ is a matrix such that $BA=I$. We want to show that $B$ is the unique left inverse of $A$.
Note that a system $A\mathbf{x}=\mathbf{b}$ has at most one solution, namely $B\mathbf{b}$: if $A\mathbf{x}=\mathbf{b}$, then
$$\mathbf{x} = I\mathbf{x} = BA\mathbf{x} = B\mathbf{b}.$$
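This computation is easy to check numerically; here is a minimal NumPy sketch (the particular $A$ and $\mathbf{b}$ are chosen only for illustration):

```python
import numpy as np

# Illustrative check: if B A = I, then any solution of A x = b equals B b.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.linalg.inv(A)              # here B A = I, so B is a left inverse of A
assert np.allclose(B @ A, np.eye(2))

b = np.array([3.0, 2.0])
x = np.linalg.solve(A, b)         # a solution of A x = b
assert np.allclose(x, B @ b)      # x = I x = B A x = B b
```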
If $CA=I$, then again a system $A\mathbf{x}=\mathbf{b}$ has at most one solution, namely $C\mathbf{b}$. Thus, $B\mathbf{b}=C\mathbf{b}$ for any $\mathbf{b}$ for which the system has a solution.
If we can show that $A\mathbf{x}=\mathbf{e}_i$ has a solution for each $i$, where $\mathbf{e}_i$ is the $i$th standard basis vector ($1$ in the $i$th entry, $0$s elsewhere), this will show that $B=C$, since their corresponding columns agree.
Since $A\mathbf{x}=\mathbf{0}$ has at most one solution, and $\mathbf{x}=\mathbf{0}$ is a solution, the system $A\mathbf{x}=\mathbf{0}$ has only the trivial solution. That means that the reduced row-echelon form of $A$ is $I$. Because the reduced row-echelon form of $A$ is $I$, performing row reduction on the augmented matrix $[A\mid\mathbf{e}_i]$ yields $[I\mid\mathbf{y}]$ for some $\mathbf{y}$, and this $\mathbf{y}$ is a solution to $A\mathbf{x}=\mathbf{e}_i$. Since this vector equals both $\mathbf{b}_i=B\mathbf{e}_i$ (the $i$th column of $B$) and $\mathbf{c}_i=C\mathbf{e}_i$ (the $i$th column of $C$), as noted above, the $i$th columns of $B$ and $C$ are equal. Thus $B=C$, and $A$ has a unique left inverse.
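The column-by-column construction can also be sketched numerically: solving $A\mathbf{x}=\mathbf{e}_i$ for each $i$ assembles the columns of the inverse (the matrix here is chosen only for illustration):

```python
import numpy as np

# Solve A x = e_i for each standard basis vector e_i; the solutions are
# the columns of the (unique) two-sided inverse of A.
A = np.array([[1.0, 2.0],
              [3.0, 5.0]])
n = A.shape[0]
cols = [np.linalg.solve(A, np.eye(n)[:, i]) for i in range(n)]
B = np.column_stack(cols)
assert np.allclose(B @ A, np.eye(n))  # B is a left inverse of A
assert np.allclose(A @ B, np.eye(n))  # ...and a right inverse as well
```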
Now suppose that $A$ is a square matrix with a right inverse $B$, so that $AB=I$. We want to show that $B$ is the unique right inverse of $A$. Taking transposes, we get $I = I^T = (AB)^T = B^TA^T$. By what was proven above, $B^T$ is the unique left inverse of $A^T$. If $AC=I$, then $C^TA^T=I^T = I$, so $C^T=B^T$, and hence $C=B$. Thus, $B$ is the unique right inverse of $A$.
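The transpose trick is easy to verify numerically; a small sketch using a randomly generated matrix (invertible with probability $1$):

```python
import numpy as np

# If A B = I, then B^T A^T = (A B)^T = I^T = I,
# so B^T is a left inverse of A^T.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = np.linalg.inv(A)                       # A B = I
assert np.allclose(A @ B, np.eye(3))
assert np.allclose(B.T @ A.T, np.eye(3))   # B^T is a left inverse of A^T
```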
If your matrix $A$ is square, then $A$ has a left inverse if and only if $A$ is invertible; likewise, $A$ has a right inverse if and only if $A$ is invertible.
Nonsquare matrices are never invertible, but it is possible for a nonsquare matrix to have a right inverse. For example, consider the projection $\Bbb R^2\to \Bbb R$ onto the first coordinate. This is given by the matrix $$\begin{bmatrix}1 &0\end{bmatrix}.$$ A right inverse is $$\begin{bmatrix}1\\ 0\end{bmatrix}.$$
Of course, the second matrix in this example is a noninvertible matrix with a left inverse.
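The projection example can be checked directly (a minimal NumPy sketch):

```python
import numpy as np

# The projection R^2 -> R and its right inverse from the example above.
P = np.array([[1.0, 0.0]])     # 1x2: projection onto the first coordinate
R = np.array([[1.0],
              [0.0]])          # 2x1
assert np.allclose(P @ R, np.eye(1))      # P R = I_1: R is a right inverse
assert not np.allclose(R @ P, np.eye(2))  # R P != I_2: neither is invertible

# The right inverse here is not unique: [1, t]^T works for any t.
Rt = np.array([[1.0],
               [5.0]])
assert np.allclose(P @ Rt, np.eye(1))
```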
It has to be a square matrix. This can be shown using basic facts about systems of linear equations:
Let $A$ be an $m \times n$ matrix. An $n \times m$ matrix $B$ is a left inverse of $A$ if and only if $$ \sum_{k=1}^m B_{ik} A_{kj} = (B \cdot A)_{ij} = I_{ij} = \delta_{ij} \quad \text{for every $1 \leq i,j \leq n$}. $$ This is a system of linear equations in the variables $B_{ij}$, with $nm$ variables and $n^2$ equations. A consistent system with more variables than equations cannot have a unique solution, so uniqueness forces $n^2 \geq nm$, and thus $n \geq m$.
On the other hand, we have $BAx = x$ for every $x \in K^n$, so the system $By = x$ has a solution (namely $y = Ax$) for every $x \in K^n$. By looking at the echelon form of $B$, we see that this can only happen if $n \leq m$: if $n > m$, then the echelon form of $B$ contains zero rows, and some $x$ is not in the image of $B$. So we must have $n = m$, and $A$ and $B$ are both square matrices.
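The rank argument in this last step can be illustrated numerically (the matrix and vector are chosen only for illustration): a $3\times 2$ matrix $B$ has rank at most $2$, so $By=x$ cannot be solvable for every $x\in\Bbb R^3$.

```python
import numpy as np

# A 3x2 matrix has at most 2 pivots, so its echelon form has a zero row
# and some x lies outside its column space.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
assert np.linalg.matrix_rank(B) == 2      # rank < number of rows (3)

x = np.array([0.0, 0.0, 1.0])             # not in the column space of B
y, res, rank, sv = np.linalg.lstsq(B, x, rcond=None)
assert not np.allclose(B @ y, x)          # B y = x has no exact solution
```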