Let $(e_1,\ldots,e_n)$ denote the standard basis of $\mathbb R^n$ and suppose that $(f_1,\ldots,f_{n+1})$ is a set of linearly independent vectors. We can write
$$f_1=\sum_{k=1}^n a_k e_k$$
and since $f_1\ne 0$ there is some $a_k\ne 0$; WLOG suppose $a_1\neq0$, so
$$e_1=\frac{1}{a_1}\left(f_1-\sum_{k=2}^n a_k e_k\right)$$
hence we see that $(f_1,e_2,\ldots,e_n)$ spans $\mathbb R^n$.
Now repeat the same exchange $n$ times in total (formally, by induction): we find that $(f_1,\ldots,f_{n})$ spans $\mathbb R^n$, so the vector $f_{n+1}$ is a linear combination of the other $f_i$'s, which contradicts their linear independence.
If the induction step was not obvious, consider the following:
Since we have established that $(f_1,e_2,\ldots,e_n)$ spans $\mathbb R^n$, write
$$f_2=b_1f_1+\sum_{k=2}^n b_k e_k$$
Since $f_2\neq0$, there is at least one $b_l\neq0$. Note also that $b_2,b_3,\ldots,b_n$ cannot all be zero, since otherwise $f_2=b_1f_1$, contradicting the assumed linear independence of $(f_1,\ldots,f_{n+1})$. So we can pick $b_l\neq0$ with $l\geq 2$; WLOG suppose $b_3\neq0$. Now,
$$e_3=\frac{1}{b_3}\left(f_2-b_1f_1-b_2e_2-\sum_{k=4}^n b_k e_k\right)$$
From this we see that $(f_1,e_2,f_2,e_4,\ldots,e_n)$ spans $\mathbb R^n$; we have replaced $e_3$ with $f_2$. The assumed linear independence of the $f_i$'s means that we can repeat this process to replace all of the $e_i$'s.
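The exchange procedure above can be carried out mechanically. Here is a minimal Python sketch for $n=3$ (the vectors `f` are made-up example data, not from the text): at each step it expresses the next $f_j$ in the current basis and swaps it in for a remaining $e_k$ whose coefficient is nonzero.

```python
from fractions import Fraction

def solve(basis, v):
    """Express v as a combination of the n basis vectors (Gaussian elimination)."""
    n = len(v)
    # augmented matrix: columns are the basis vectors, right-hand side is v
    M = [[Fraction(basis[j][i]) for j in range(n)] + [Fraction(v[i])]
         for i in range(n)]
    for c in range(n):
        p = next(r for r in range(c, n) if M[r][c] != 0)  # find a pivot row
        M[c], M[p] = M[p], M[c]
        M[c] = [x / M[c][c] for x in M[c]]                # normalize pivot row
        for r in range(n):
            if r != c and M[r][c] != 0:                   # eliminate column c
                M[r] = [a - M[r][c] * b for a, b in zip(M[r], M[c])]
    return [M[i][n] for i in range(n)]

e = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]   # standard basis of R^3
f = [(1, 2, 3), (0, 1, 4), (2, 0, 1)]   # independent vectors (example data)

basis = list(e)
for step, fj in enumerate(f):
    coeffs = solve(basis, fj)
    # independence guarantees some coefficient on a remaining e_k is nonzero
    k = next(i for i in range(step, 3) if coeffs[i] != 0)
    basis[k] = fj
    basis[step], basis[k] = basis[k], basis[step]  # keep the f's at the front

# after three exchanges every e_i has been replaced by an f_j
```

After the loop, `basis` equals `[(1, 2, 3), (0, 1, 4), (2, 0, 1)]`: all three standard basis vectors have been exchanged for the $f_j$'s while spanning is preserved at every step.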
The row space is generated by two nonzero vectors. It is easy to see these are linearly independent, so they form a basis for the row space.
Similarly, the column space is generated by three nonzero vectors. Two of them are identical, and the third is independent of the duplicated one, so these yield a basis of two vectors for the column space.
For the null space of $A$, we note first that it must be 2-dimensional by the rank-nullity theorem. Since the first column of $A$ is zero, the vector $(1,0,0,0)$ lies in the null space. To find a second null-space vector, write out $Ax = 0$ explicitly:
$$x_1\begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}
+ x_2\begin{bmatrix} 3 \\ 0 \\ 1 \end{bmatrix}
+ x_3\begin{bmatrix} 3 \\ 0 \\ 0 \end{bmatrix}
+ x_4\begin{bmatrix} 3 \\ 0 \\ 1 \end{bmatrix} = 0.$$
The 2nd column equals the 4th, so subtracting the 4th from the 2nd gives zero; this is exactly $Ax$ with $x = (0,1,0,-1)$. The two vectors $(1,0,0,0)$ and $(0,1,0,-1)$ are linearly independent, so they form a basis for the null space.
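As a quick sanity check, the matrix $A$ read off from the columns above can be multiplied against both null-space vectors in plain Python (no libraries needed):

```python
# rows of A, read off from the four columns displayed above
A = [[0, 3, 3, 3],
     [0, 0, 0, 0],
     [0, 1, 0, 1]]

def matvec(M, x):
    """Matrix-vector product over plain Python lists."""
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

for v in [(1, 0, 0, 0), (0, 1, 0, -1)]:
    assert matvec(A, v) == [0, 0, 0]   # both vectors lie in N(A)
```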
By the rank-nullity theorem again, we see that the null space of $A^T$ has dimension $1$, so all we need to do is find one non-zero vector in this space. Since the second column of $A^T$ is null, $(0,1,0)$ works.
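To double-check this, one can multiply $A^T$ by $(0,1,0)$ directly; the rows of $A^T$ are the columns of $A$ (plain Python again):

```python
A = [[0, 3, 3, 3],
     [0, 0, 0, 0],
     [0, 1, 0, 1]]
At = [list(col) for col in zip(*A)]   # transpose: rows of A^T = columns of A

def matvec(M, x):
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

print(matvec(At, (0, 1, 0)))  # → [0, 0, 0, 0], so (0,1,0) is in N(A^T)
```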
Best Answer
It turns out $A=LU$ has rank $3$. To see this, we'll show that $R(A)=\mathbb{R}^3$. Choose $b\in \mathbb{R}^3$ arbitrarily. Note that $Ax=b$ has a solution iff $\Big[U\Big|L^{-1}b\Big]$ is consistent. The latter augmented system is consistent since $U$ has $3$ pivot columns, so $A$ has rank $3$; any basis for $\mathbb{R}^3$ will suffice as a basis for $R(A)$. Next, observe that $Ax=0$ iff $Ux=0$, so $N(A)=N(U)$. You can find $N(A^T)$ using the fact that $\Big[R(A)\Big]^{\perp}=N(A^T)$. Lastly, to find $R(A^T)$, use the facts that $$\text{rank}(A^T)=\text{rank}(A)=3$$ $$\text{rank}(U^T)=3$$ $$R(A^T)\subseteq R(U^T)$$ to conclude $R(A^T)=R(U^T)$.
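The key step, $N(A)=N(U)$, can be illustrated with a made-up factorization (this $L$ and $U$ are hypothetical, not the matrices from the exercise): a null-space vector found by back-substitution in $Ux=0$ automatically satisfies $Ax=0$, since $Ax=L(Ux)$.

```python
# hypothetical example: L unit lower triangular, U with 3 pivot columns
L = [[1, 0, 0],
     [2, 1, 0],
     [1, 3, 1]]
U = [[1, 2, 0, 3],
     [0, 1, 4, 1],
     [0, 0, 2, 2]]

def matmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

def matvec(M, x):
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

A = matmul(L, U)              # A = LU

# back-substitution on Ux = 0 with the free variable x4 = 1
x = (-9, 3, -1, 1)
assert matvec(U, x) == [0, 0, 0]   # x is in N(U)
assert matvec(A, x) == [0, 0, 0]   # hence x is in N(A), since Ax = L(Ux)
```

The reverse inclusion $N(A)\subseteq N(U)$ holds because $L$ is invertible, so $N(A)=N(U)$ as claimed.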