Usually, a matrix is thought of as the representation of a linear operator: a map that takes a vector and spits out another vector. Say $A$ is some linear operator and $v$ is some vector; then $A(v)$ is the output vector.
An equivalent way of looking at it, however, is to say that there is a map $B$ that takes two vectors $v, w$ and spits out a scalar, given by $B(v,w) = A(v) \cdot w$, say. Such a map is what is usually described in the literature when talking about tensors.
Where do contravariance and covariance come in? Well, the above idea of a tensor is actually a bit of a cheat: it used the inner product, and there might not be one, in which case we cannot freely convert between vectors and covectors. So instead of saying that $B$ takes two vectors as arguments, let $B$ be a map taking one vector $v$ and one covector $\alpha$, so that $B(v, \alpha) = \alpha(A(v))$.
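Here is a minimal numpy sketch of the two views; the particular matrix and vectors are made up for illustration.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # the linear operator
v = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])

# With an inner product available: B(v, w) = A(v) . w
B_vw = np.dot(A @ v, w)

# Without one: feed a covector alpha instead, B(v, alpha) = alpha(A(v)).
# In coordinates a covector is just a row of coefficients.
alpha = np.array([0.0, 1.0])
B_valpha = alpha @ (A @ v)

print(B_vw, B_valpha)  # 3.0 and 3.0: alpha is the covector dual to w
```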
(You'll note that, if there is a way to convert from vectors to covectors, then any tensor acting on $p$ vectors and $q$ covectors could be converted to one that acts on $p+q$ vectors, for instance.)
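A hedged sketch of that conversion in coordinates: with an inner product $g$, a covector $\alpha$ corresponds to the vector $g^{-1}\alpha$. The metric and covector below are invented for the example.

```python
import numpy as np

# Trading a covector slot for a vector slot via an inner product g
# (both g and alpha are arbitrary choices for illustration).
g = np.array([[2.0, 0.0],
              [0.0, 1.0]])     # the inner product, as a matrix
alpha = np.array([4.0, 3.0])   # a covector, as a row of coefficients

a = np.linalg.solve(g, alpha)  # the vector "raised" from alpha
x = np.array([1.0, 1.0])
# alpha(x) agrees with <a, x> in the inner product g:
assert np.isclose(alpha @ x, a @ g @ x)
```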
A general tensor can take any number of vector and covector arguments, mixed in any proportion.
In physics, it's common to look at the components of a tensor with respect to some basis: rather than supply whatever vectors or covectors might be relevant to a problem, we supply a set of basis vectors and covectors instead, so we need only remember the coefficients. If $e_i$ is the $i$th basis vector and $e^j$ is the $j$th basis covector, then $B(e_i, e^j) = {B_i}^j$ takes us from the more math-inclined definition of a tensor to the form more familiar to a physicist.
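To see the component extraction concretely, here is a small sketch in the standard basis of $\mathbb{R}^2$, where the basis covector $e^j$ is just the row picking out the $j$th coordinate (the matrix is again made up):

```python
import numpy as np

# Extracting components {B_i}^j = B(e_i, e^j) in the standard basis.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
n = A.shape[0]
E = np.eye(n)   # its columns are the e_i, its rows the e^j

B = np.empty((n, n))
for i in range(n):
    for j in range(n):
        B[i, j] = E[j] @ (A @ E[:, i])   # B(e_i, e^j) = e^j(A(e_i))

# In this basis the components are just the matrix entries of A,
# transposed because the vector slot comes first:
assert np.allclose(B, A.T)
```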
The word tensor is often abused. Firstly, a tensor is simply an element of the tensor product of some vector spaces (or bimodules, or the like). In this sense there are, of course, non-square tensors: for example, an element of $V\otimes_k W$ would be called a tensor, for any $k$-vector spaces $V$ and $W$. But the words covariant and contravariant don't have any meaning here.
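As a quick numpy illustration (dimensions chosen arbitrarily): a simple tensor $v \otimes w$ with $\dim V = 2$ and $\dim W = 3$ is represented in coordinates by an outer product, a non-square $2\times 3$ array; a general element of $V\otimes_k W$ is a sum of such outer products.

```python
import numpy as np

# A simple tensor v ⊗ w, represented in coordinates by an outer product.
v = np.array([1.0, 2.0])          # in V, dim 2
w = np.array([3.0, 0.0, -1.0])    # in W, dim 3
t = np.outer(v, w)
print(t.shape)   # (2, 3): a perfectly good non-square tensor
```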
Secondly (and this is more closely aligned with the topic of your question), tensor might also mean a tensor-valued function (tensor in the first sense above) on a manifold. For example, let $T_p M$ denote the tangent space to a smooth manifold $M$ at the point $p\in M$. A tensor can mean a choice of element $Z_p\in T_pM\otimes \cdots \otimes T_pM \otimes (T_pM)^\ast\otimes \cdots\otimes (T_pM)^\ast$ for each point $p\in M$, depending differentiably on $p$. For example, vector fields are tensors in this sense. The words covariant and contravariant have their origins here, in how the components of $Z$ behave under coordinate transformations on $M$.
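To make that transformation behavior concrete, here is a hedged sketch (the chart, point, and vector are all my own choices) of the contravariant rule: the components of a tangent vector transform by the Jacobian of the coordinate change.

```python
import numpy as np

# Contravariant transformation of a tangent vector on R^2, using polar
# coordinates (r, theta) as one chart; all specifics here are invented.

def jacobian(q):
    # d(x, y) / d(r, theta), the Jacobian of the coordinate change
    r, th = q
    return np.array([[np.cos(th), -r * np.sin(th)],
                     [np.sin(th),  r * np.cos(th)]])

q = np.array([2.0, np.pi / 4])   # the point, in polar coordinates
Z_polar = np.array([1.0, 0.5])   # components of Z in the polar chart

# Contravariant rule: Cartesian components = Jacobian @ polar components
Z_cart = jacobian(q) @ Z_polar
print(Z_cart)
```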
For a "non-square" tensor of this type, one especially important example is the second fundamental form. If $M^k$ is a Riemannian manifold isometrically immeresed in some Riemannian manifold $N^{k+n}$, then the second fundamental form is roughly this: for a point $p\in M$ and a pair of tangent vectors $v,w\in T_pM\subset T_pN$, there is a normal vector $S_p(v,w)\in (T_pM)^\perp$ which is something like a second derivative (hence measures curvature). Since $S_p$ chews on two tangent vectors and spits out a normal vector, we can think of $S_p$ as an element of $(T_pM)^\perp\otimes (T_pM)^\ast\otimes (T_pM)^\ast$. This $S$ is a very important non-square tensor (dimensions $n\times k\times k$)!
For a more precise response to your questions:
Not really. Tensors don't really act on anything. However, $\operatorname{End}(V)\cong V\otimes V^\ast$, so operators can be thought of as tensors, but not usually vice versa. A tensor usually just means an element of a tensor product of vector spaces (to a mathematician) or a tensor-valued function (to a physicist).
I would say this is right. Without any context there's no reason to call an element of $V\otimes W^\ast$ a tensor of type $(1,1)$ or $(2,0)$, or whatever. These notions are undefined in general.
No.
Yes! See above.
In physics, the terms "vector" and, more generally, "tensor" refer to objects that transform like a vector or tensor under a change of coordinates.
First of all, a matrix is just the representation of a tensor in a certain coordinate frame; the tensor itself is an abstract notion, independent of notation and choice of coordinates. Secondly, a tensor in physics is an object that, when expressed in another coordinate frame, takes the form
$$T'_{ij}=P_{ik}P_{jl}T_{kl}$$ (summation over repeated indices implied) if the matrix $P$ takes the original frame to the primed (') frame.
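A quick numerical check of this law (a sketch; the tensor and rotation are arbitrary), using the matrix form $T' = P\,T\,P^{\mathsf T}$:

```python
import numpy as np

# Check T'_{ij} = P_{ik} P_{jl} T_{kl}, i.e. T' = P T P^T in matrix form.
rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3))   # an arbitrary 2nd-rank tensor

theta = 0.3                       # P: rotation to the primed frame
P = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

T_prime = np.einsum('ik,jl,kl->ij', P, P, T)
assert np.allclose(T_prime, P @ T @ P.T)
```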
In this sense, general non-square matrices aren't tensors, and neither are matrices that merely encode some set of linear equations rather than an object tied to the geometry of physical space.
Such tensors are also very commonly symmetric (or Hermitian, in the complex case).
Moreover, we are talking here about tensors of 2nd rank, but a tensor can be of any rank (a tensor of 1st rank is a vector, for instance). Force is a vector, but an arbitrary set of three variables is not, in the context of physics, called a vector. Even the magnetic field is not, strictly speaking, a vector once you go to the relativistic space-time description, because it doesn't transform as one (plus, its parity is wrong under coordinate transformations that invert space, so even classically it's called a pseudovector).
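For the pseudovector point, here is a small sketch (my example, not from the answer): under an orientation-reversing orthogonal map $R$, the cross product picks up an extra factor of $\det R$ compared to a true vector.

```python
import numpy as np

# The cross product transforms as a pseudovector: for orthogonal R,
# (Rv) x (Rw) = det(R) * R (v x w), so a reflection flips its sign.
R = np.diag([-1.0, 1.0, 1.0])    # a reflection, det R = -1
v = np.array([1.0, 2.0, 3.0])
w = np.array([0.0, 1.0, 0.0])

lhs = np.cross(R @ v, R @ w)
rhs = np.linalg.det(R) * (R @ np.cross(v, w))
assert np.allclose(lhs, rhs)     # equal only with the det(R) factor
```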
This is a physicist's interpretation of the question of what is and is not a tensor.
If there are any questions, I can elaborate more.