$A$ is an invertible matrix.
Find $B$ such that $AB = BA = I$ or show that it is impossible.
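As a quick numerical sketch of this check (using NumPy; the $2\times 2$ matrix here is an arbitrary invertible example, not one from the question):

```python
import numpy as np

# An arbitrary invertible 2x2 example matrix (any invertible A would do).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# Candidate inverse B.
B = np.linalg.inv(A)

# Check both AB = I and BA = I up to floating-point tolerance.
I = np.eye(2)
assert np.allclose(A @ B, I)
assert np.allclose(B @ A, I)
```

For a singular matrix, `np.linalg.inv` raises `LinAlgError` instead, which is how "show that it is impossible" surfaces numerically.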
$A$ is row equivalent to the $n \times n$ identity matrix.
Reduce $A$ to reduced row echelon form and see whether the result is the identity.
$A$ has $n$ pivot positions.
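The row-reduction check can be done symbolically, e.g. with SymPy (again with an arbitrary invertible example matrix):

```python
import sympy as sp

# An arbitrary invertible example matrix (not taken from the text).
A = sp.Matrix([[2, 1],
               [1, 1]])

# rref() returns (reduced matrix, tuple of pivot column indices).
R, pivots = A.rref()

# A is invertible iff its RREF is the identity,
# equivalently iff there are n pivot positions.
assert R == sp.eye(2)
assert len(pivots) == A.rows
```

The pivot count in `pivots` is exactly the "n pivot positions" criterion of the next item.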
Pretty much the same as above.
The equation $Ax = 0$ has only the trivial solution.
Solve the system.
The columns of $A$ form a linearly independent set.
Write down your columns as vectors in $\Bbb R^n$. See if $\{x_1,x_2,\ldots,x_n\}$ is linearly dependent or independent by your usual means.
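One of the "usual means" is a rank computation: the columns are independent iff the rank equals the number of columns. A sketch with NumPy (example vectors are assumptions):

```python
import numpy as np

# Columns of A written as vectors in R^n (arbitrary example vectors).
cols = [np.array([2.0, 1.0]), np.array([1.0, 1.0])]
A = np.column_stack(cols)

# Independent iff rank(A) equals the number of columns.
assert np.linalg.matrix_rank(A) == A.shape[1]
```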
The linear transformation $x ↦ Ax$ is one-to-one.
Solve $Av = 0$ as above. If there is a nonzero solution $v_0$, then for any $x$ the vector $y = x + v_0$ satisfies $Ay = Ax$ while $y \ne x$, so the map is not one-to-one. If the only solution is $v_0 = 0$, the map is one-to-one.
The equation $Ax = b$ has at least one solution for each $b$ in $ℝ^n$.
If $A$ is invertible, $A^{-1}b$ is the solution. If not, find $b$ such that $Ax = b$ has no solution (hint: reduce the augmented matrix $(A \mid b)$ to row echelon form; if the reduced $A$ has a zero row, choose $b$ so that the corresponding entry of the transformed $b$ is nonzero).
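Both cases of the hint can be sketched numerically (the matrices and right-hand sides are illustrative examples):

```python
import numpy as np

# Invertible case: A^{-1} b gives the solution for any b.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
b = np.array([3.0, 2.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)

# Singular case: rows are dependent (row2 = 2*row1), so some b is inconsistent.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b_bad = np.array([0.0, 1.0])  # breaks the row relation, so Sx = b_bad has no solution

# Inconsistency shows up as rank([S | b]) > rank(S).
aug = np.column_stack([S, b_bad])
assert np.linalg.matrix_rank(aug) > np.linalg.matrix_rank(S)
```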
The columns of $A$ span $ℝ^n$.
Write down the columns of $A$ as vectors in $\Bbb R^n$ and see whether they generate $\Bbb R^n$ or not. Since there are $n$ columns, this is equivalent to checking whether the columns are linearly independent.
The linear transformation $x ↦ Ax$ maps $ℝ^n$ onto $ℝ^n$.
You must check surjectivity of $x ↦ Ax$: for every $b\in ℝ^n$ you must find a solution to $Ax = b$, or exhibit a $b$ for which there is none.
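A failure of surjectivity can be exhibited via least squares: for a rank-deficient matrix, the least-squares "solution" for a suitable $b$ does not actually satisfy $Ax = b$. A sketch (example matrix and $b$ are assumptions):

```python
import numpy as np

# Rank-deficient example: image is the line spanned by (1, 2).
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([0.0, 1.0])  # not on that line

# lstsq returns the best x minimizing ||Sx - b||, plus the rank of S.
x, residuals, rank, _ = np.linalg.lstsq(S, b, rcond=None)

assert rank < 2                    # the map does not cover all of R^2
assert not np.allclose(S @ x, b)   # even the best x misses b: no exact solution
```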
There is an $n \times n$ matrix $C$ such that $CA = I$.
Write down a general matrix $C$, multiply by $A$, and show that the coefficients of $C$ can be chosen to give the identity, or show that they cannot.
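The "general matrix $C$" method can be carried out symbolically; a sketch with SymPy, where the example matrix and the coefficient names `c0..c3` are assumptions:

```python
import sympy as sp

# An arbitrary invertible example matrix (not from the text).
A = sp.Matrix([[2, 1],
               [1, 1]])

# A general 2x2 matrix C with unknown coefficients c0..c3.
c = sp.symbols('c0:4')
C = sp.Matrix(2, 2, c)

# Impose CA = I entrywise and solve the resulting linear system.
sol = sp.solve(list(C * A - sp.eye(2)), list(c))
C_found = C.subs(sol)

# The coefficients can indeed be chosen to give the identity.
assert C_found * A == sp.eye(2)
```

For a singular $A$ the same system is inconsistent and `sp.solve` returns no solution.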
There is an $n \times n$ matrix $D$ such that $AD = I$.
As above.
$A^T$ is an invertible matrix.
As the first one.
No, it is not.
The simplest example would be the null map, which sends every vector to $0$. It is linear, but not onto unless the codomain is $\{0\}$.
Other examples would be the projections on a sub-vector space:
Let $E$ and $F$ be two vector spaces of finite dimensions $m$ and $n$ respectively (with $m>n$).
Then take $H$ a sub-vector space of $F$ of dimension $k<n$.
Define $p_H$ the projection on $H$.
Then $p_H$ is linear, but not onto because $\mathrm{dim}(\mathrm{Im}(p_H))=\mathrm {dim}(H)=k<n$.
In your particular example, the rank of your matrix is $1$, so its image is a space of dimension $1$, which cannot be all of $\mathbb R^2$. So it is indeed not onto.
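The rank argument can be checked numerically; since the poster's matrix is not shown here, the rank-$1$ matrix below is just an illustrative stand-in:

```python
import numpy as np

# A generic rank-1 matrix on R^2 (illustrative; the poster's matrix is not given).
P = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Its image has dimension rank(P) = 1 < 2, so x -> Px is not onto R^2.
assert np.linalg.matrix_rank(P) == 1
```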
Best Answer
Suppose
$L^k = 0, \; k \ge 1; \tag 1$
then consider the identity, which holds for any $m \ge 1$,
$L^m - I = (L - I)\left(\displaystyle \sum_{j=0}^{m - 1} L^j\right) = (L - I)(L^{m - 1} + L^{m - 2} + \ldots + L + I); \tag 2$
this equation may easily be proved (by induction on $m$ if you like), and is quite likely familiar to the reader either from high-school algebra or the study of roots of unity in field theory. Be that as it may, with (1) in place we see that (2) becomes, with $m = k$,
$-I = (L - I)(L^{k - 1} + L^{k - 2} + \ldots + L + I), \tag 3$
which shows that $I - L$ is invertible with inverse
$(I - L)^{-1} = L^{k - 1} + L^{k - 2} + \ldots + L + I. \tag 4$
The particular case at hand may be resolved by taking $k = 3$.
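The $k = 3$ case can be verified concretely; the strictly upper triangular matrix below is one choice of nilpotent $L$ with $L^3 = 0$ (any such $L$ works):

```python
import numpy as np

# A concrete nilpotent matrix with L^3 = 0 (strictly upper triangular).
L = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
assert np.allclose(np.linalg.matrix_power(L, 3), 0)

I = np.eye(3)
# The claimed inverse of I - L for k = 3: I + L + L^2.
S = I + L + L @ L
assert np.allclose((I - L) @ S, I)
assert np.allclose(S @ (I - L), I)
```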