The word affine probably has a dozen meanings, but quotient space is not one of them.
The quotient space $V/W$ of a vector space $V$ by a subspace $W\subsetneq V$ is itself a vector space, but it is not a subspace of $V$: its elements $v+W$ are affine subsets of $V$ rather than vectors. These affine sets all have the same dimension as $W$. Nevertheless,
$$\dim V/W=\dim V-\dim W$$
as you said.
To understand the concept of a quotient space, consider a scenario in which the information in $W$ is not important to you and you want to discard it. One can find a complementary subspace $U$ of $W$ in $V$ (assuming $V$ is finite dimensional) such that:
$$
V=W\oplus U
$$
Any vector $v\in V$ can then be uniquely decomposed as $v=v_W+v_U$ with $v_W\in W$ and $v_U\in U$, so you simply discard $v_W$ and keep $v_U$. The problem is that there is usually no natural choice of the complementary subspace $U$. The quotient space instead provides a single model for all possible choices of complementary subspace. More specifically, the quotient space $V/W$ comes with a natural projection:
$$
\pi:V\to V/W, \pi(v)=v+W
$$
which is a linear surjection. $\pi$ restricted to any complementary subspace $U$ of $W$ is an isomorphism:
$$
\pi|_U:U\xrightarrow[]{\sim}V/W
$$
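This can be made concrete with a small numerical sketch (my own illustration, not part of the answer): in $\mathbb{R}^2$, take $W$ spanned by $(1,0)$ and two different complements $U_1$ and $U_2$. Decomposing the same vector $v$ against each complement gives different $U$-components, but their difference lies in $W$, so both represent the same coset $v+W$ — exactly why $V/W$ serves as a single model for all complements.

```python
import numpy as np

# Sketch: V = R^2, W = span{(1, 0)}, with two different complements
#   U1 = span{(0, 1)} and U2 = span{(1, 1)}.
w = np.array([1.0, 0.0])
u1 = np.array([0.0, 1.0])
u2 = np.array([1.0, 1.0])
v = np.array([3.0, 2.0])

def decompose(v, w, u):
    """Solve v = a*w + b*u and return the components (v_W, v_U)."""
    a, b = np.linalg.solve(np.column_stack([w, u]), v)
    return a * w, b * u

vW1, vU1 = decompose(v, w, u1)   # v = (3,0) + (0,2)
vW2, vU2 = decompose(v, w, u2)   # v = (1,0) + (2,2)

# The U-components differ, but their difference is a multiple of w,
# i.e. it lies in W -- both components name the same coset v + W.
diff = vU1 - vU2
print(vU1, vU2, diff)
```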
To find a basis for a quotient space, start with a basis for the subspace you are quotienting by (i.e. $U$). Then take a basis (or spanning set) for the whole vector space (i.e. $V=\mathbb{R}^4$) and see which vectors remain independent when added to your basis for $U$.
In your case, it's pretty easy to see that $\{ (2,-1,0,0), (0,0,3,-1) \}$ forms a basis for $U$ (essentially $U$ is a null space, so you could use Gaussian elimination to find such a basis). We now need $\dim(\mathbb{R}^4)-\dim(U)= 4-2=2$ more vectors to complete this to a basis for $\mathbb{R}^4$. It should be fairly obvious that $(1,0,0,0)$ and $(0,0,1,0)$ are independent of our basis for $U$ (to verify, you could always put all four vectors into a matrix, row reduce, and check that you get the identity matrix).
Ok. We now have a basis $\{ (2,-1,0,0), (0,0,3,-1), (1,0,0,0), (0,0,1,0) \}$ for $\mathbb{R}^4$ in which the first two vectors form a basis for $U$ and the last two are independent of them. The last two vectors are representatives of cosets which form a basis for $\mathbb{R}^4/U$. That is, $\{ (1,0,0,0)+U, (0,0,1,0)+U \}$ is a basis for $\mathbb{R}^4/U$.
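The independence check suggested above can be run mechanically; here is a quick sketch using sympy (my own verification, not part of the answer). Putting the four vectors as columns of a matrix, full rank confirms they form a basis of $\mathbb{R}^4$:

```python
from sympy import Matrix

# The four vectors from the answer, placed as columns of a 4x4 matrix.
cols = [[2, -1, 0, 0], [0, 0, 3, -1], [1, 0, 0, 0], [0, 0, 1, 0]]
M = Matrix(cols).T  # transpose so each vector is a column

# Rank 4 means the columns are independent, hence a basis of R^4.
print(M.rank())  # -> 4
```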
Here's more or less an algorithm: Suppose $U$ is a subspace of $V$. Let $A=[a_1\;a_2\;\cdots\;a_m]$ be a matrix whose columns span $U$. Let $B = [b_1\;b_2\;\cdots\;b_n]$ be a matrix whose columns span $V$. Row reduce the matrix $[A\;B] = [a_1\;\cdots\;a_m\;b_1\;\cdots\;b_n]$ to locate its pivot columns. Say the pivot columns among the $a_i$'s are $c_1,\dots,c_k$ and the pivot columns among the $b_j$'s are $d_1,\dots,d_\ell$. Then $\{c_1,\dots,c_k\}$ forms a basis for $U$ and $\{d_1+U,\dots,d_\ell+U\}$ forms a basis for $V/U$.
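This algorithm transcribes directly into code; the following is a sketch using sympy's `rref` (function name and structure are my own, not from the answer). Running it on the example above ($U$ spanned by $(2,-1,0,0)$ and $(0,0,3,-1)$, $V=\mathbb{R}^4$ spanned by the standard basis) recovers the coset representatives found earlier:

```python
from sympy import Matrix

def quotient_basis(A_cols, B_cols):
    """Given vectors spanning U and vectors spanning V, return
    (basis of U, coset representatives giving a basis of V/U)."""
    A = Matrix(A_cols).T            # columns span U
    B = Matrix(B_cols).T            # columns span V
    M = A.row_join(B)               # the matrix [A  B]
    pivots = M.rref()[1]            # indices of pivot columns of [A  B]
    m = A.cols
    U_basis = [A.col(j) for j in pivots if j < m]      # pivots among the a_i
    reps = [B.col(j - m) for j in pivots if j >= m]    # pivots among the b_j
    return U_basis, reps

A_cols = [[2, -1, 0, 0], [0, 0, 3, -1]]
B_cols = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
U_basis, reps = quotient_basis(A_cols, B_cols)
print([list(v) for v in reps])  # -> [[1, 0, 0, 0], [0, 0, 1, 0]]
```

Note that spanning sets, not bases, suffice as input: redundant columns simply fail to be pivots and are discarded.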
Best Answer
$\rho=\pi|_V$ means $\rho$ is the restriction of $\pi$ to $V$.
$\pi$ is onto. So any element $y$ of $\mathbb R^{2}/U$ is $\pi (x)$ for some $x$. Now since $U$ and $V$ are two different lines it follows that $U+V$ is two-dimensional, and hence $U+V=\mathbb R^{2}$. Thus, we can write $x=x_1+x_2$ with $x_1 \in U$ and $x_2 \in V$. Now $y=\pi (x)=\pi(x_1)+\pi (x_2)=0+\pi (x_2)$. But $\pi(x_2)=\rho (x_2)$. This proves that $y =\rho (x_2)$ and we have proved that $\rho$ is surjective.
Injectivity: $x \in V$, $\rho (x)=0$ implies $\pi (x)=0$, which implies $x \in U$. But $x \in V$ as well, so $x=0$.