The unique matrix that satisfies $EA = B$ is the matrix that "swaps" the
first and third rows. It is given as
$$E=\begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0
\\ \end{bmatrix}.$$
Edit:
Due to a question in the comments, here comes a bit longer explanation.
(1) The rows of $B$ are the same as the rows of $A$, except that the first and third rows are swapped.
(2) $E$, defined above, is the special matrix that swaps the first and third rows of any $3 \times 3$ matrix $O$ when it multiplies $O$ from the left. This can be seen, for example, by direct matrix multiplication:
$$\begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0
\\ \end{bmatrix}\begin{bmatrix} p_1 & p_2 & p_3 \\ q_1 & q_2 & q_3 \\ r_1 & r_2 & r_3
\\ \end{bmatrix} = \begin{bmatrix} r_1 & r_2 & r_3 \\ q_1 & q_2 & q_3 \\ p_1 & p_2 & p_3
\\ \end{bmatrix}. $$
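The row-swapping effect of left-multiplication by $E$ can be checked numerically; here the generic matrix $O$ is filled in with arbitrary entries for illustration:

```python
import numpy as np

# The permutation matrix E that swaps the first and third rows.
E = np.array([[0, 0, 1],
              [0, 1, 0],
              [1, 0, 0]])

# A generic 3x3 matrix standing in for O (entries chosen arbitrarily).
O = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# Left-multiplication by E permutes the rows: row 1 <-> row 3.
print(E @ O)
```

The product agrees with `O[[2, 1, 0]]`, i.e. $O$ with its first and third rows exchanged.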
Hence, as a particular case, we also have $EA=B$. (Moreover, since $A$ and $B$ are non-singular matrices, the solution to the matrix equation $XA=B$ is unique, namely $X=BA^{-1}$; computing this product again yields $X=E$.)
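The uniqueness claim can be illustrated numerically. The specific $A$ from the question is not reproduced here, so the sketch below uses a hypothetical non-singular $A$ and sets $B = EA$; solving $XA = B$ via $X = BA^{-1}$ then recovers $E$:

```python
import numpy as np

E = np.array([[0., 0., 1.],
              [0., 1., 0.],
              [1., 0., 0.]])

# Hypothetical non-singular A (a stand-in; the A from the question is not shown here).
A = np.array([[2., 1., 0.],
              [0., 3., 1.],
              [1., 0., 4.]])
B = E @ A  # B is A with its first and third rows swapped

# Unique solution of XA = B when A is invertible.
X = B @ np.linalg.inv(A)
print(np.allclose(X, E))  # True
```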
(3) An elementary matrix differs from the identity matrix by a single elementary row operation. After swapping the first and third rows of $E$ (which is an elementary row operation) we arrive at the matrix
$$\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1
\\ \end{bmatrix},$$
which is exactly the identity matrix. Hence $E$ is an elementary matrix.
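This one-swap-away-from-$I$ relationship can be checked directly; note in passing that, since applying the swap twice undoes it, $E$ is its own inverse:

```python
import numpy as np

E = np.array([[0, 0, 1],
              [0, 1, 0],
              [1, 0, 0]])

# Swapping the first and third rows of E recovers the identity matrix,
# so E differs from I by a single elementary row operation.
print(E[[2, 1, 0]])

# Applying the swap twice undoes it: E is its own inverse.
print(np.allclose(E @ E, np.eye(3)))  # True
```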
Best Answer
An elementary matrix $E$ is a square matrix obtained by performing a single elementary row operation on $I$. For example, $\left[ \begin{matrix} 1 & 0 \\ 1 & 1 \end{matrix}\right]$ is an elementary matrix because adding the first row of $I$ to the second row of $I$ gives us this matrix. Moreover, matrix multiplication on the left by an elementary matrix is equivalent to performing the corresponding elementary row operation. Thus, with $A$ as in your question and $E$ as above, we have that $$EA=\left[ \begin{matrix} 1& 0 \\ 1 & 1 \end{matrix}\right]\left[ \begin{matrix} 3 & 1 \\ 1 & -2 \end{matrix}\right]=\left[ \begin{matrix} 3 & 1 \\ 3+1 & 1+(-2) \end{matrix}\right] =\left[ \begin{matrix} 3 & 1 \\ 4 & -1 \end{matrix}\right]$$
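The computation above is easy to verify numerically:

```python
import numpy as np

E = np.array([[1, 0],
              [1, 1]])   # adds row 1 to row 2 when applied on the left
A = np.array([[3, 1],
              [1, -2]])

# Left-multiplying by E performs the row operation on A.
print(E @ A)
# [[ 3  1]
#  [ 4 -1]]
```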
A square matrix $A$ is invertible if and only if it can be reduced to the identity matrix, which is to say that by multiplying by finitely-many elementary matrices on the left we get the identity: $$E_nE_{n-1}\cdots E_2E_1A=I$$ so that $$A=E_1^{-1}E_2^{-1}\cdots E_{n-1}^{-1}E_n^{-1}.$$ The above is well-defined since every elementary matrix is invertible (its inverse corresponds to the elementary row operation that reverses the elementary row operation corresponding to the original elementary matrix).
Thus, the procedure is to row-reduce $A$ to the identity $I$, keeping track of which operations we used, and then to multiply the corresponding inverse elementary matrices in the opposite order, as indicated above.
For example, take $A=\left[ \begin{matrix} 1 & 2 \\ 2 & 1 \end{matrix}\right]$. We can reduce this to $I$ by subtracting two times the second row from the first row, giving us $\left[ \begin{matrix} -3 & 0 \\ 2 & 1 \end{matrix}\right]$. We can then add $2/3$ of the first row to the second to give us $\left[ \begin{matrix} -3 & 0 \\ 0 & 1 \end{matrix}\right]$. Finally, we multiply the first row by $-1/3$. This gives us the decomposition $$\left[ \begin{matrix} 1 & 2 \\ 2 & 1 \end{matrix}\right] =\left[ \begin{matrix} 1 & 2 \\ 0 & 1 \end{matrix}\right] \left[ \begin{matrix} 1 & 0 \\ -2/3 & 1 \end{matrix}\right] \left[ \begin{matrix} -3 & 0 \\ 0 & 1 \end{matrix}\right].$$
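As a sanity check, multiplying the three elementary factors from the decomposition above does recover $A$:

```python
import numpy as np

# The three elementary factors from the decomposition above (each is the
# inverse of one of the row operations used in the reduction).
F1 = np.array([[1., 2.], [0., 1.]])    # inverse of "subtract 2*(row 2) from row 1"
F2 = np.array([[1., 0.], [-2/3, 1.]])  # inverse of "add (2/3)*(row 1) to row 2"
F3 = np.array([[-3., 0.], [0., 1.]])   # inverse of "multiply row 1 by -1/3"

# The product reproduces the original matrix A.
print(F1 @ F2 @ F3)
# [[1. 2.]
#  [2. 1.]]
```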