You can continue with Gaussian elimination by treating $z$ as a constant and reducing the system to a two-variable problem:
$$
\left[\begin{array}{rrr|r}
\color{green}{1} & \color{green}{4} & \color{red}{2} & 0 \\
\color{green}{0} & \color{green}{-3} & \color{red}{-2} & 1 \\
0 & 0 & 0 & 0
\end{array}\right] \longrightarrow
\left[\begin{array}{rr|r}
\color{green}{1} & \color{green}{4} & 0\color{red}{-2z} \\
\color{green}{0} & \color{green}{-3} & 1\color{red}{+2z} \\
\end{array}\right]$$
Note that you need to change the sign of the column corresponding to $z$, since you are moving it to the right-hand side of the equations.
You will obtain the general solution in terms of $z$.
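For instance, back-substitution in the reduced two-variable system gives
$$y=-\frac{1+2z}{3},\qquad x=-2z-4y=\frac{4+2z}{3},$$
so every solution has the form $\left(\frac{4+2z}{3},\,-\frac{1+2z}{3},\,z\right)$ with $z$ a free parameter.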
It’s not quite Gaussian elimination, since that only left-multiplies by invertible matrices. The underlying idea is that you can find bases for the domain and codomain in which the matrix of $f$ has the requisite form. $X$ and $Y$ are just the corresponding change-of-basis matrices.
Recall that the columns of the matrix are the images $f(v_i)$ of the domain’s basis vectors. So, choose an ordered basis $(v_{r+1},\dots,v_m)$ for the kernel of $f$ and extend it to a complete basis $(v_1,\dots,v_r,v_{r+1},\dots,v_m)$ for the domain. Relative to this basis, the first $r$ columns of the matrix are nonzero, while the rest are zero.
The images $\{f(v_1),\dots,f(v_r)\}$ of the first $r$ basis vectors are linearly independent (prove this!). Extend this to an ordered basis $(f(v_1),\dots,f(v_r),w_{r+1},\dots,w_n)$ of the codomain. Relative to this basis, $f(v_1)=(1,0,0,\dots,0)^T$, $f(v_2)=(0,1,0,\dots,0)^T$, and so on, which gives you $I_r$ padded below with zeros for the first $r$ columns of the matrix.
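Putting the two choices together, the matrix of $f$ relative to these bases is the block form
$$\left[\begin{array}{cc} I_r & 0 \\ 0 & 0 \end{array}\right],$$
with $I_r$ in the upper-left corner and zeros elsewhere.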
You can compute a suitable $X$ and $Y$ by performing Gaussian elimination twice. First, augment $A$ with the appropriately-sized identity matrix and row-reduce to obtain $$[A\mid I_n] \to [B\mid X].$$ The last $n-r$ rows of $B$ will be zero as required, and $XA=B$. Then, augment $B^T$ and reduce again: $$[B^T\mid I_m] \to [C^T\mid Y^T].$$ The last $m-r$ columns of $C$ are zero, the last $n-r$ rows were already zero when you obtained $B$, and since $C^T$ is in RREF, the upper-left block is $I_r$. Finally, $C^T=Y^TB^T$, so $C=XAY$.

The fact that the row rank and column rank are equal is part of the content of the Rank-Nullity theorem, but you can see why that must be so from the basis construction above. Observe, too, that $X$ and $Y$ are not unique: you are free to choose the kernel basis and its extension to a basis of the domain, as well as the extension of $\{f(v_1),\dots,f(v_r)\}$ to a basis of the codomain. Each of these choices potentially generates a different $X$ and $Y$.
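Here is a minimal SymPy sketch of that two-pass procedure (my own illustration, not from the original answer), run on the coefficient matrix of the system above; the names `B`, `X`, `C`, `Y` mirror the notation used here:

```python
import sympy as sp

# Coefficient matrix of the system above (n x m, rank 2).
A = sp.Matrix([[1, 4, 2],
               [0, -3, -2],
               [0, 0, 0]])
n, m = A.shape
r = A.rank()

# First pass: row-reduce [A | I_n] to get B with its last n - r rows zero,
# together with the invertible X recording the row operations, so X*A = B.
B_X = A.row_join(sp.eye(n)).rref()[0]
B, X = B_X[:, :m], B_X[:, m:]
assert X * A == B

# Second pass: row-reduce [B^T | I_m]; transposing back gives C = X*A*Y
# with I_r in the upper-left block and zeros everywhere else.
C_Y = B.T.row_join(sp.eye(m)).rref()[0]
C, Y = C_Y[:, :n].T, C_Y[:, n:].T
assert C == X * A * Y

expected = sp.zeros(n, m)
expected[:r, :r] = sp.eye(r)
assert C == expected
```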
Best Answer
Not everything is about computing solutions. In general, it is useful to know the structure of the set of solutions. Sometimes one wants to construct a system that has certain properties.