Since you asked for an intuitive way to understand covariance and contravariance, I think this will do.
First of all, remember that the reason for having covariant or contravariant tensors is that you want to represent the same object in a different coordinate system. The new representation is obtained through a transformation built from partial derivatives, and in tensor analysis a good transformation is one that leaves invariant the quantity you are interested in.
For example, we consider the transformation from one coordinate system $x^1,...,x^{n}$ to another $x^{'1},...,x^{'n}$:
$x^{i}=f^{i}(x^{'1},x^{'2},...,x^{'n})$ where $f^{i}$ are certain functions.
Take a look at a couple of specific quantities. How do the coordinate differentials transform? The answer is:
$dx^{i}=\displaystyle \frac{\partial x^{i}}{\partial x^{'k}}dx^{'k}$
Every quantity which, under a transformation of coordinates, transforms like the coordinate differentials is called a contravariant tensor.
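As a sanity check, here is a minimal numerical sketch of the contravariant rule (my own illustration, with polar coordinates $x = r\cos\theta$, $y = r\sin\theta$ standing in for the map $x^{i}=f^{i}(x^{'1},x^{'2})$): the Jacobian applied to the primed differentials reproduces the actual change in the unprimed coordinates.

```python
import math

# Stand-in example: Cartesian (x, y) expressed in polar (r, theta),
# i.e. x^i = f^i(x'^1, x'^2) with x = r cos(theta), y = r sin(theta)
def cart(r, theta):
    return (r * math.cos(theta), r * math.sin(theta))

def jacobian(r, theta):
    # J[i][k] = d x^i / d x'^k
    return [[math.cos(theta), -r * math.sin(theta)],
            [math.sin(theta),  r * math.cos(theta)]]

r, theta = 2.0, 0.7
dr, dtheta = 1e-6, 2e-6          # contravariant components dx'^k

# The transformation law: dx^i = (d x^i / d x'^k) dx'^k  (sum over k)
J = jacobian(r, theta)
dx = [J[i][0] * dr + J[i][1] * dtheta for i in range(2)]

# Compare with the actual finite difference of the coordinates
x0, y0 = cart(r, theta)
x1, y1 = cart(r + dr, theta + dtheta)
print(abs((x1 - x0) - dx[0]) < 1e-10)  # True
print(abs((y1 - y0) - dx[1]) < 1e-10)  # True
```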
How do the derivatives of some scalar $\Phi$ transform?
$\displaystyle \frac{\partial \Phi}{\partial x^{i}}=\frac{\partial \Phi}{\partial x^{'k}}\frac{\partial x^{'k}}{\partial x^{i}}$
Every quantity which, under a coordinate transformation, transforms like the derivatives of a scalar is called a covariant tensor.
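The covariant rule can be checked numerically the same way (again my own stand-in example: $\Phi = x^2 y$ in the polar coordinates above). The gradient computed in the primed system, pushed through $\partial x^{'k}/\partial x^{i}$, must agree with the Cartesian gradient.

```python
import math

# Stand-in scalar: Phi = x^2 * y, with primed coordinates (r, theta)
# and x = r cos(theta), y = r sin(theta); in polar, Phi = r^3 cos^2 sin
r, theta = 1.5, 0.4
x, y = r * math.cos(theta), r * math.sin(theta)

# Gradient in the primed (polar) system: d Phi/d r, d Phi/d theta
dPhi_dr = 3 * r**2 * math.cos(theta)**2 * math.sin(theta)
dPhi_dtheta = r**3 * (math.cos(theta)**3
                      - 2 * math.cos(theta) * math.sin(theta)**2)

# Jacobian of the inverse map: d x'^k / d x^i
dr_dx, dr_dy = math.cos(theta), math.sin(theta)
dtheta_dx, dtheta_dy = -math.sin(theta) / r, math.cos(theta) / r

# Covariant transformation (the chain rule), summing over k:
# d Phi/d x^i = (d Phi/d x'^k)(d x'^k/d x^i)
dPhi_dx = dPhi_dr * dr_dx + dPhi_dtheta * dtheta_dx
dPhi_dy = dPhi_dr * dr_dy + dPhi_dtheta * dtheta_dy

# Direct Cartesian gradient of Phi = x^2 y is (2xy, x^2)
print(abs(dPhi_dx - 2 * x * y) < 1e-12)  # True
print(abs(dPhi_dy - x**2) < 1e-12)       # True
```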
Accordingly, a natural generalization is a quantity which transforms like the product of the components of two contravariant tensors, that is,
$A^{ik}=\displaystyle \frac{\partial x^{i}}{\partial x^{'l}}\frac{\partial x^{k}}{\partial x^{'m}}A^{'lm}$
which is called a contravariant tensor of rank two. The same idea applies to covariant tensors of rank $n$ and to mixed tensors of rank $n$.
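To see the rank-two rule in action, here is a small sketch (my own example, with a made-up invertible matrix standing in for the Jacobian): build $A^{'lm}$ as an outer product $v^{'l}w^{'m}$, transform it with one Jacobian factor per index, and check that this agrees with transforming $v'$ and $w'$ separately and then taking the outer product.

```python
# Stand-in Jacobian dx^i/dx'^k (any invertible matrix works here)
J = [[1.0, 2.0],
     [0.5, 3.0]]

vp = [1.0, -2.0]                 # contravariant components, primed frame
wp = [3.0, 0.5]
Ap = [[vp[l] * wp[m] for m in range(2)] for l in range(2)]  # A'^{lm}

# A^{ik} = (dx^i/dx'^l)(dx^k/dx'^m) A'^{lm}, summing over l and m
A = [[sum(J[i][l] * J[k][m] * Ap[l][m]
          for l in range(2) for m in range(2))
      for k in range(2)] for i in range(2)]

# Transform v' and w' individually, then take the outer product
v = [sum(J[i][l] * vp[l] for l in range(2)) for i in range(2)]
w = [sum(J[k][m] * wp[m] for m in range(2)) for k in range(2)]

print(all(abs(A[i][k] - v[i] * w[k]) < 1e-12
          for i in range(2) for k in range(2)))  # True
```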
With the analogy to coordinate differentials and the derivatives of a scalar in mind, take a look at this picture from Wikipedia, which I think will help make it clearer:
The contravariant components of a vector are obtained by projecting onto the coordinate axes. The covariant components are obtained by projecting onto the normal lines to the coordinate hyperplanes.
Finally, you may want to read: Basis vectors
By the way, I don't recommend relying blindly on the picture given by matrices, especially when you are doing calculations.
You can represent "contravariant vectors" as columns and "covariant vectors" as rows if you want.
It's just a convention. The dual space of the space of column vectors can be naturally identified with the space of row vectors, because matrix multiplication can then correspond to the "pairing" between a "covariant vector" and a "contravariant vector".
Remember that "covariant vectors" are defined as scalar-valued linear maps on the space of "contravariant vectors", so if $\omega$ is a covariant vector and $v$ is a contravariant vector, then $\omega(v)$ is a real number that depends linearly on both $v$ and $\omega$. If you make $v$ correspond to a column vector, and make $\omega$ correspond to a row vector then $$ \omega(v)=\omega v=(\omega_1,...,\omega_n)\left(\begin{matrix}v^1 \\ \vdots \\ v^n\end{matrix}\right)=\omega_1v^1+...+\omega_nv^n. $$
If $\omega$ were the column instead, the matrix multiplication above would read $\omega(v)=v\omega$, which would not look as aesthetically pleasing, since we are used to displaying the argument of a function to the right of the function, and in this case $v$ is the argument.
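The row-times-column pairing above is, concretely, just this (a trivial sketch with made-up components):

```python
# Pairing of a covariant omega (row) with a contravariant v (column):
# omega(v) = omega_1 v^1 + ... + omega_n v^n
omega = [2.0, -1.0, 0.5]
v = [1.0, 4.0, 2.0]

pairing = sum(o * c for o, c in zip(omega, v))
print(pairing)  # -1.0
```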
To get the components of the contravariant vector $v = v^i e_i$, where $e_i$ is the natural basis, we dot with the basis vectors $e^j$ of the dual space, $$v\cdot e^j = v^i e_i\cdot e^j = v^i \delta_{i}^j = v^j.$$ Likewise, to find the components of a covariant vector $w = w_i e^i$ we dot with basis vectors from the natural basis, $$w\cdot e_j = w_i e^i\cdot e_j = w_i \delta^{i}_j = w_j.$$
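Here is a concrete sketch of that extraction in $\mathbb{R}^2$ (my own made-up non-orthonormal basis): the dual basis vectors are the rows of the inverse of the matrix whose columns are the natural basis, which is exactly what makes $e^j \cdot e_i = \delta^j_i$ hold.

```python
# Natural basis e_i as the columns of E: e_1 = (1, 0), e_2 = (1, 2)
E = [[1.0, 1.0],
     [0.0, 2.0]]

# Inverse of the 2x2 matrix, by hand: its rows are the dual basis e^j
det = E[0][0] * E[1][1] - E[0][1] * E[1][0]
Einv = [[ E[1][1] / det, -E[0][1] / det],
        [-E[1][0] / det,  E[0][0] / det]]

# Duality check: e^j . e_i = delta^j_i
delta = [[sum(Einv[j][a] * E[a][i] for a in range(2)) for i in range(2)]
         for j in range(2)]
print(delta[0][0] == 1.0 and delta[0][1] == 0.0)  # True

# Build v = v^1 e_1 + v^2 e_2 and recover the components by dotting
# with the dual basis: v . e^j = v^j
v_comp = [3.0, -1.0]
v = [sum(E[a][i] * v_comp[i] for i in range(2)) for a in range(2)]
recovered = [sum(Einv[j][a] * v[a] for a in range(2)) for j in range(2)]
print(recovered)  # [3.0, -1.0]
```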
Sometimes the natural basis vectors are called covariant (since their indices are downstairs) and the dual basis vectors contravariant (since their indices are upstairs). With this convention a contravariant vector, with contravariant components, is written in terms of the covariant basis!
After a while, you get used to this sort of nonsense.
Addendum: The terms contravariant and covariant refer to how an object transforms under coordinate transformation, $x\to x'$. In physics, where one is often dealing with coordinates, this is especially vivid. Does the thing transform contravariantly with $\frac{\partial {x'}^j}{\partial x^i}$ or covariantly with $\frac{\partial {x}^j}{\partial {x'}^i}$? That is why the terminology is not so bad. $e^i$ really does transform contravariantly. This has to be the case so that $$\begin{eqnarray*} v &=& v^i e_i \\ &=& v^i \delta_i^j e_j \\ &=& v^i \frac{\partial {x'}^k}{\partial x^i} \frac{\partial {x}^j}{\partial {x'}^k} e_j \\ &=& {v'}^i {e'}_i. \end{eqnarray*}$$ To add another wrinkle, physicists also often say that an object that is invariant under transformation is covariant!
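The invariance chain in the addendum can be checked numerically as well (a sketch with a made-up invertible matrix for $\frac{\partial x'^j}{\partial x^i}$): the components pick up one Jacobian, the basis vectors pick up the inverse Jacobian, and $v = v^i e_i$ comes out unchanged.

```python
# Stand-in Jacobian J'[j][i] = dx'^j/dx^i (invertible)
Jp = [[2.0, 1.0],
      [1.0, 1.0]]
det = Jp[0][0] * Jp[1][1] - Jp[0][1] * Jp[1][0]
Jinv = [[ Jp[1][1] / det, -Jp[0][1] / det],   # Jinv[j][i] = dx^j/dx'^i
        [-Jp[1][0] / det,  Jp[0][0] / det]]

e = [[1.0, 0.0], [0.0, 1.0]]      # natural basis e_i (Cartesian arrays)
v_comp = [3.0, -2.0]              # contravariant components v^i

# v'^i = (dx'^i/dx^j) v^j   and   e'_i = (dx^j/dx'^i) e_j
vp = [sum(Jp[i][j] * v_comp[j] for j in range(2)) for i in range(2)]
ep = [[sum(Jinv[j][i] * e[j][a] for j in range(2)) for a in range(2)]
      for i in range(2)]

# v^i e_i and v'^i e'_i as actual arrays: they must coincide
v_old = [sum(v_comp[i] * e[i][a] for i in range(2)) for a in range(2)]
v_new = [sum(vp[i] * ep[i][a] for i in range(2)) for a in range(2)]
print(v_old)  # [3.0, -2.0]
print(all(abs(v_old[a] - v_new[a]) < 1e-12 for a in range(2)))  # True
```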