Heuristically, one way to see this is that the standard action of the complex numbers on $V = \mathbb R^{2n}$ is by rotation. That is, if $(e_1, \ldots, e_{2n})$ is a basis for $V$, then we define multiplication by $i$ as
$ i e_{2k-1} = e_{2k}, \quad i e_{2k} = - e_{2k-1},$ for $k = 1, \ldots, n$,
so multiplying a vector $v$ by a complex number $\lambda$ corresponds to a scaling by a real number combined with a rotation.
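To make this concrete, here is a minimal numpy sketch (the function name `complex_structure` is my own) of the matrix $J$ implementing multiplication by $i$ on $\mathbb R^{2n}$, under the identification $z_k = x_{2k-1} + i x_{2k}$:

```python
import numpy as np

# Complex structure J on R^{2n}: J e_{2k-1} = e_{2k}, J e_{2k} = -e_{2k-1}.
def complex_structure(n):
    block = np.array([[0.0, -1.0], [1.0, 0.0]])  # multiplication by i on R^2
    return np.kron(np.eye(n), block)             # one 2x2 block per complex coordinate

n = 3
J = complex_structure(n)
assert np.allclose(J @ J, -np.eye(2 * n))  # J^2 = -I, just as i^2 = -1

# Applying J to the real vector matches multiplying the complex vector by i.
x = np.arange(1.0, 2 * n + 1)    # real coordinates (x_1, ..., x_6)
z = x[0::2] + 1j * x[1::2]       # identified complex vector in C^3
Jx = J @ x
assert np.allclose(Jx[0::2] + 1j * Jx[1::2], 1j * z)
```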
Now, if we have a rotation $A$ on the space $V$ and we want to find a line $l$ "invariant" under $A$, then we can look for a complex number $\lambda$ such that the action of $A$ on $l$ coincides with the action of $\lambda$ on $l$. Thus, we can try to look for complex eigenvalues $\lambda$ of $A$.
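For instance, a plane rotation has no real eigenvalues (no real line is fixed), but it does have the complex eigenvalues $e^{\pm i\theta}$, which is exactly the "multiplication by a complex number" picture above. A quick numpy check:

```python
import numpy as np

theta = 0.7
# Rotation of R^2 by the angle theta
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

vals = np.linalg.eigvals(A)
# The eigenvalues are e^{i theta} and e^{-i theta}: complex, not real.
assert np.allclose(sorted(vals, key=lambda v: v.imag),
                   [np.exp(-1j * theta), np.exp(1j * theta)])
```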
This line of "reasoning" completely breaks down in the odd-dimensional case, because we cannot define a complex structure on an odd-dimensional real vector space, but it may hint at why we would look for complex eigenvalues at all. It then becomes a matter of algebra to check that this actually works, and that it does so in a vector space of any finite dimension.
Finally, for the existence: as Alex already pointed out, we look for eigenvalues by finding roots of polynomials. Every nonconstant polynomial admits a root over the complex numbers (the fundamental theorem of algebra), which translates into the existence of a complex eigenvalue.
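The eigenvalue-via-polynomial step can be seen directly in numpy: the characteristic polynomial of the $90°$ rotation is $\lambda^2 + 1$, which has no real roots, yet its complex roots $\pm i$ are exactly the eigenvalues.

```python
import numpy as np

A = np.array([[0.0, -1.0], [1.0, 0.0]])  # rotation by 90 degrees; no real eigenvalues

coeffs = np.poly(A)   # characteristic polynomial coefficients: lambda^2 + 1
roots = np.roots(coeffs)

# The roots of the characteristic polynomial are the eigenvalues of A.
assert np.allclose(sorted(roots, key=lambda v: v.imag), [-1j, 1j])
assert np.allclose(sorted(np.linalg.eigvals(A), key=lambda v: v.imag), [-1j, 1j])
```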
The complex numbers and the algebra of all $n \times n$ complex matrices are particular examples of (complex) Banach algebras with involution. The involution is somewhat (though not always perfectly) analogous to complex conjugation. The algebra of bounded linear operators on a Hilbert space is one example of such an algebra (mentioned in the reference of Halmos in Theo's comment), which includes both of these cases as special cases. There, the involution sends an operator to its adjoint, and the analogy with complex conjugation is quite strong. The behaviour of an operator with respect to this involution is strongly reflected in the behaviour of its spectrum (in the finite-dimensional case, the spectrum of a matrix is exactly its set of eigenvalues).
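Two instances of this spectrum/involution interplay are easy to check numerically for matrices: the spectrum of the adjoint $A^*$ is the complex conjugate of the spectrum of $A$, and self-adjoint elements (the analogue of real numbers under conjugation) have real spectrum. A sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# The involution on n x n complex matrices: the conjugate transpose (adjoint).
A_star = A.conj().T

spec = np.sort_complex(np.linalg.eigvals(A))
spec_star = np.sort_complex(np.linalg.eigvals(A_star))
# sigma(A*) is the complex conjugate of sigma(A).
assert np.allclose(spec_star, np.sort_complex(spec.conj()))

# Self-adjoint elements have real spectrum, like real numbers under conjugation.
H = A + A_star
assert np.allclose(np.linalg.eigvals(H).imag, 0.0)
```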
However, the complex numbers are unique among (complex) Banach algebras: up to isomorphism, the only complex Banach division algebra (that is, a Banach algebra in which every non-zero element has a multiplicative inverse) is the complex numbers themselves (this is the Gelfand-Mazur theorem).
What you are probably looking for is a discussion of Clifford algebras, or perhaps one tailored to so-called geometric algebras, which fall under much the same umbrella.
Clifford algebras explain the connection between elements of a special algebra and rotations of a vector space (and more). In particular, they generalize what you see for the reals, the complexes, and the quaternions acting on $\mathbb R$, $\mathbb R^2$, and $\mathbb R^3$ respectively.
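The quaternion case can be sketched in a few lines: a unit quaternion $q$ rotates a vector $v \in \mathbb R^3$ via $v \mapsto q v q^{-1}$, with $v$ embedded as a purely imaginary quaternion. The helper names below are my own; this is a minimal illustration, not a full Clifford-algebra implementation.

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def rotate(q, v):
    """Rotate v in R^3 by the unit quaternion q: v -> q v q^{-1}."""
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])  # inverse of a unit quaternion
    return qmul(qmul(q, np.array([0.0, *v])), q_conj)[1:]

# Rotation by pi/2 about the z-axis: q = cos(theta/2) + sin(theta/2) k.
theta = np.pi / 2
q = np.array([np.cos(theta / 2), 0.0, 0.0, np.sin(theta / 2)])
assert np.allclose(rotate(q, [1.0, 0.0, 0.0]), [0.0, 1.0, 0.0])
```

Note the half-angle in $q$: the map $v \mapsto q v q^{-1}$ rotates by twice the angle appearing in $q$, which is the double cover of the rotation group showing up concretely.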