Most vector spaces I've met don't have a natural basis. However, this is a question that comes up when teaching linear algebra: you want to motivate abstract vector spaces instead of working with $\mathbb{R}^n$ (or your favourite field in place of $\mathbb{R}$). One simple example is this.
Consider $\mathbb{R}^n$ ($n>2$) as a Euclidean space relative to the "dot" product, and let $v = (1,1,\dots,1)$. Then the subspace $V \subset \mathbb{R}^n$ of vectors orthogonal to $v$ does not have a natural basis. If you don't like introducing an inner product, then take $V$ to be the annihilator of $v$ in the dual of $\mathbb{R}^n$. This actually comes up when discussing the root space of $\mathfrak{su}(n)$, say.
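To make the point concrete, here is a small numerical sketch (not part of the argument, and the variable names are just illustrative): the difference vectors $e_i - e_{i+1}$ and the vectors $e_1 - e_i$ each form a perfectly good basis of $V$, and nothing singles out one choice over the other.

```python
import numpy as np

n = 5
v = np.ones(n)
e = np.eye(n)                      # standard basis vectors of R^n

# Two equally reasonable bases for V = {x : x . v = 0}:
basis_consecutive = [e[i] - e[i + 1] for i in range(n - 1)]   # e_i - e_{i+1}
basis_first = [e[0] - e[i] for i in range(1, n)]              # e_1 - e_i

for basis in (basis_consecutive, basis_first):
    assert all(b @ v == 0 for b in basis)                     # each lies in V
    assert np.linalg.matrix_rank(np.array(basis)) == n - 1    # and spans V (dim n-1)
```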
The answer is no, I think. Here is a proof sketch.
Let $(S_n)_{n\in\omega}$ be a family of "pairs of socks"; that is, each $S_n$ has exactly 2 elements, the $S_n$ are pairwise disjoint, but there is no set which meets infinitely many $S_n$ in exactly one point. Let $S$ be the union of the $S_n$.
Let $V$ be a vector space with basis $S$ over the 3-element field. For each $v\in V$ and each $s\in S$, let $c_s(v)$ be the $s$-coordinate of $v$. (In your notation: $v(s)$.)
Consider the subspace $W$ of all vectors $w$ with the following property: for all $n$, if $S_n = \{a,b\}$, then $c_a(w)+c_b(w)=0$. The set of all $n$ such that $c_a(w) \neq 0$ for both (equivalently, for either) $a\in S_n$ will be called the domain of $w$. Clearly, each domain is finite; moreover, for each finite subset of $\omega$ of size $k$ there are exactly $2^k$ vectors $w\in W$ with this domain, since on each pair the condition $c_a(w)+c_b(w)=0$ with both coordinates nonzero leaves only the two choices $(1,2)$ and $(2,1)$.
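A tiny finite sketch of this count (encoding a vector by the tuple of its nonzero pair-coordinates is just a convenience here, not part of the proof):

```python
from itertools import product

# Over F_3, a vector w in W is determined by choosing, for each pair S_n in its
# domain, nonzero coordinates (c_a, c_b) with c_a + c_b = 0 (mod 3);
# the only possibilities are (1, 2) and (2, 1).
def vectors_with_domain(domain):
    """All w in W (encoded as {n: (c_a, c_b)}) whose domain is exactly `domain`."""
    return [dict(zip(domain, combo))
            for combo in product([(1, 2), (2, 1)], repeat=len(domain))]

domain = [0, 3, 7]                                            # a finite subset of omega, k = 3
assert len(vectors_with_domain(domain)) == 2 ** len(domain)   # 2^k vectors, as claimed
```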
I will show:
- From any basis $C$ of $W$ we can define a 1-1 sequence of elements of $W$.
- From any 1-1 sequence of elements of $W$ we can define a 1-1 sequence of elements of $S$.
Together, this will show that there is no basis, since $S$ contains no countably infinite subset: from a 1-1 sequence in $S$ one could form the set of those elements that occur before their partner in the sequence, and this set would meet infinitely many $S_n$ in exactly one point.
For each set $D$ which appears as the domain of some element of the basis $C$, let $x_D$ be the sum of all elements of $C$ with this domain. Then $x_D \neq 0$, and for $D\neq D'$ we get $x_D\neq x_{D'}$.
From a well-order of the finite subsets of $\omega$ (which exists explicitly, without any choices) we thus obtain a well-ordered sequence of distinct nonzero vectors. Since $C$ must be infinite, and only finitely many of its elements (at most $2^{|D|}$) can share the same domain $D$, infinitely many domains $D$ occur, so we have obtained an infinite 1-1 sequence of vectors in $W$.
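For concreteness, one explicit, choice-free well-order of the finite subsets of $\omega$ is by binary codes; any other explicit enumeration would do just as well (a sketch, with illustrative function names):

```python
def code(D):
    """Position of the finite set D in the canonical order: sum of 2^i over i in D."""
    return sum(2 ** i for i in D)

def subset_at(m):
    """The finite subset of omega sitting at position m in that order."""
    return {i for i in range(m.bit_length()) if (m >> i) & 1}

# The enumeration is explicit, so listing the x_D involves no arbitrary choices.
assert [subset_at(m) for m in range(6)] == [set(), {0}, {1}, {0, 1}, {2}, {0, 2}]
assert all(code(subset_at(m)) == m for m in range(100))
```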
We are now given an infinite sequence $(w_n)$ of distinct vectors of $W$. The union of their domains cannot be finite (otherwise the $w_n$ would range over a finite set of vectors), so we may wlog assume that the sequence $k_n:= \max(\operatorname{dom}(w_n))$ is strictly increasing. (Thin out, if necessary; this requires no choices.)
Now let $a_n$ be the element of $S_{k_n}$ such that $c_{a_n}(w_n)=1$; since $k_n$ lies in the domain of $w_n$, exactly one of the two coordinates of $w_n$ on $S_{k_n}$ equals $1$ (the other is $2$). Then the set of those $a_n$ meets infinitely many of the $S_k$ in exactly one point, contradicting our assumption on the family $(S_n)$.
Best Answer
The linearly independent set $\{ e^{sx} \}$ is generated by a simple mechanism: namely, it consists of eigenvectors of the operator $\frac{d}{dx}$, acting on a vector space on which all of its eigenspaces are $1$-dimensional. The rational functions, I think, don't naturally appear in this way, but they are all annihilated by some polynomial differential operator in the Weyl algebra $k \left[ x, \frac{d}{dx} \right]$.
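A quick symbolic check of this mechanism for a few arbitrary exponents (just a sketch; any distinct values of $s$ would do): each $e^{sx}$ is an eigenvector of $\frac{d}{dx}$ with eigenvalue $s$, and a nonvanishing Wronskian certifies that finitely many of them are linearly independent.

```python
import sympy as sp

x = sp.symbols('x')
s_values = [1, 2, 3]                       # any distinct eigenvalues would do
fs = [sp.exp(s * x) for s in s_values]

# Each e^{s x} is an eigenvector of d/dx with eigenvalue s ...
assert all(sp.simplify(sp.diff(f, x) - s * f) == 0 for f, s in zip(fs, s_values))

# ... and a nonzero Wronskian certifies their linear independence.
wronskian = sp.Matrix([[sp.diff(f, x, k) for f in fs] for k in range(len(fs))]).det()
assert sp.simplify(wronskian) != 0
```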
In general, if you have an algebra $A$ acting on a vector space $V$, it's interesting to look at the vectors $v \in V$ such that the $A$-submodule $Av$ generated by $v$ is simple. (This is one of a few possible natural generalizations of being an eigenvector.) Then if $v_i, i \in I$, are vectors such that the corresponding $A$-submodules $A v_i$ are simple and pairwise nonisomorphic, the $v_i$ are linearly independent.
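Here is a sketch of why, under the (implicit) assumption that $A$ is a unital $k$-algebra, so that $v_i \in Av_i$ and in particular $v_i \neq 0$. The sum $\sum_i Av_i$ is direct: if some $Av_j$ met $\sum_{i\neq j} Av_i$ nontrivially, then by simplicity $Av_j$ would be contained in that sum, which is a semisimple module all of whose simple submodules are isomorphic to one of the $Av_i$ with $i \neq j$, contradicting pairwise nonisomorphism. Now a linear relation $\sum_i c_i v_i = 0$ (with only finitely many $c_i \in k$ nonzero) has $c_i v_i \in Av_i$, so directness forces $c_i v_i = 0$, and hence $c_i = 0$, for every $i$.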