Always try out examples, starting with the simplest possible ones (it may take some thought to decide which examples are the simplest). Does, for instance, the identity matrix have complex eigenvectors? This is pretty easy to answer, right?
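A quick numerical check of this simplest example (a minimal sketch using numpy; the complex vector is an arbitrary illustrative choice):

```python
import numpy as np

I2 = np.eye(2)                     # the 2x2 identity matrix
v = np.array([1 + 2j, 3 - 1j])     # an arbitrary nonzero complex vector

# I2 @ v equals 1 * v, so every nonzero complex vector is an
# eigenvector of the identity matrix with eigenvalue 1.
print(np.allclose(I2 @ v, 1 * v))  # True
```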
Now for the general case: if $A$ is any real matrix with real eigenvalue $\lambda$, then we have a choice of looking for real eigenvectors or complex eigenvectors. The theorem here is that the $\mathbb{R}$-dimension of the space of real eigenvectors for $\lambda$ is equal to the $\mathbb{C}$-dimension of the space of complex eigenvectors for $\lambda$. It follows that (i) we will always have non-real eigenvectors (this is easy: if $v$ is a real eigenvector, then $iv$ is a non-real eigenvector) and (ii) there will always be a $\mathbb{C}$-basis for the space of complex eigenvectors consisting entirely of real eigenvectors.
As for the proof: the $\lambda$-eigenspace is the kernel of the (linear transformation given by the) matrix $\lambda I_n - A$. By the rank-nullity theorem, the dimension of this kernel is equal to $n$ minus the rank of the matrix. Since the rank of a real matrix doesn't change when we view it as a complex matrix (e.g. the reduced row echelon form is unique so must stay the same upon passage from $\mathbb{R}$ to $\mathbb{C}$), the dimension of the kernel doesn't change either. Moreover, if $v_1,\ldots,v_k$ are real vectors which are linearly independent over $\mathbb{R}$, then they are also linearly independent over $\mathbb{C}$ (to see this, just write out a linear dependence relation over $\mathbb{C}$ and decompose it into real and imaginary parts), so any given $\mathbb{R}$-basis for the eigenspace over $\mathbb{R}$ is also a $\mathbb{C}$-basis for the eigenspace over $\mathbb{C}$.
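One can sanity-check the rank claim numerically; here is a minimal sketch using numpy (the matrix $A$ and the eigenvalue $\lambda = 2$ are illustrative choices, not part of the argument above):

```python
import numpy as np

A = np.array([[2., 1., 0.],
              [0., 2., 0.],
              [0., 0., 3.]])
lam = 2.0
n = A.shape[0]

M_real = lam * np.eye(n) - A          # lambda*I - A as a real matrix
M_cplx = M_real.astype(complex)       # the same matrix viewed over C

# The rank, and hence the kernel dimension n - rank, is unchanged
# on passing from R to C.
print(np.linalg.matrix_rank(M_real))  # 2
print(np.linalg.matrix_rank(M_cplx))  # 2
```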
If an $N\times N$ matrix $A$ has an eigenvalue $\lambda$, there are infinitely many vectors $v$ satisfying $Av = \lambda v$; that is, $A$ has infinitely many eigenvectors for the eigenvalue $\lambda$. In fact,
$$E_{\lambda} := \{v \in \mathbb{R}^N \mid Av = \lambda v\}$$
is a subspace of $\mathbb{R}^N$.
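Concretely, $E_{\lambda}$ is the null space of $A - \lambda I$, so a basis for it can be computed directly. A minimal sketch (the diagonal matrix and the use of scipy's `null_space` are illustrative choices):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[3., 0., 0.],
              [0., 3., 0.],
              [0., 0., 5.]])
lam = 3.0

# E_lambda = ker(A - lambda*I); the columns of B form an orthonormal basis.
B = null_space(A - lam * np.eye(3))
print(B.shape[1])  # 2: a two-dimensional subspace, hence infinitely many eigenvectors
```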
This answers your first question, but let me point out an error in your thinking.
When we say $A$ has $N$ eigenvalues, we mean that the characteristic equation for $A$, $|\lambda I - A| = 0$, has $N$ zeroes; this is always true by the fundamental theorem of algebra, but some of the eigenvalues may be complex (e.g. for a $2\times 2$ rotation matrix). Note that we count eigenvalues with multiplicity, so $\lambda$ could be repeated multiple times. We call the order of the zero $\lambda$ the algebraic multiplicity of $\lambda$.
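For instance, a $90^\circ$ rotation matrix has no real eigenvalues (a sketch using numpy):

```python
import numpy as np

theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The characteristic polynomial is lambda^2 + 1, with roots +/- i.
print(np.linalg.eigvals(R))  # approximately [0.+1.j, 0.-1.j]
```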
For any eigenvalue $\lambda$, some say that $\lambda$ has $k$ corresponding eigenvectors if $\dim E_{\lambda} = k$ (this terminology is not often defined but is instead used in verbal communication). If $A$ has distinct eigenvalues $\lambda_1, \dots, \lambda_M$, one might say that $A$ has $\dim E_{\lambda_1} + \dots + \dim E_{\lambda_M}$ corresponding eigenvectors. The dimension of $E_{\lambda}$ is called the geometric multiplicity of $\lambda$.
We have the following result relating the two notions of multiplicity:
The geometric multiplicity is less than or equal to the algebraic multiplicity.
There are cases where the geometric multiplicity of $\lambda$ is strictly less than the algebraic multiplicity, and therefore $A$ has fewer than $N$ corresponding eigenvectors. For example,
$$A = \left[\begin{matrix}1 & 1\\ 0 & 1\end{matrix}\right]$$
has a repeated eigenvalue of $1$ but only has a one-dimensional eigenspace.
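Numerically (a minimal sketch using numpy):

```python
import numpy as np

A = np.array([[1., 1.],
              [0., 1.]])

vals, _ = np.linalg.eig(A)
print(vals)  # [1., 1.]: the eigenvalue 1 has algebraic multiplicity 2

# Geometric multiplicity = dim ker(A - 1*I) = 2 - rank(A - I) = 1.
print(2 - np.linalg.matrix_rank(A - np.eye(2)))  # 1
```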
To reconcile the difference between geometric multiplicity and algebraic multiplicity, one can consider generalised eigenvectors.
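For the matrix above, a generalised eigenvector is a vector annihilated by a power of $A - \lambda I$ rather than by $A - \lambda I$ itself; a minimal sketch (the choice of $v$ is illustrative):

```python
import numpy as np

A = np.array([[1., 1.],
              [0., 1.]])
N = A - np.eye(2)       # A - lambda*I with lambda = 1

v = np.array([0., 1.])  # not an eigenvector: (A - I) v is nonzero
print(N @ v)            # [1., 0.]
print(N @ N @ v)        # [0., 0.]: v is a generalised eigenvector
```

Together with the ordinary eigenvector $(1,0)$, such a $v$ gives a basis of $\mathbb{R}^2$, restoring the count of $N$ vectors.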
Identify each of the following functions from $\mathbb{R}^2$ to $\mathbb{R}^2$ with their matrix representations: $f(x,y)=(x,0)$, $g(x,y)=(2x,0)$, $h(x,y)=(0,y)$, $i(x,y)=((x+y)/2,(x+y)/2)$. Now $f,g$ have the same eigenvectors but different eigenvalues. And $h,i$ have the same eigenvalues but different eigenvectors.
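A sketch of the four matrix representations and their eigendata (an illustrative numpy check):

```python
import numpy as np

F = np.array([[1., 0.], [0., 0.]])   # f(x,y) = (x, 0)
G = np.array([[2., 0.], [0., 0.]])   # g(x,y) = (2x, 0)
H = np.array([[0., 0.], [0., 1.]])   # h(x,y) = (0, y)
J = np.array([[.5, .5], [.5, .5]])   # i(x,y) = ((x+y)/2, (x+y)/2)

for name, M in [("f", F), ("g", G), ("h", H), ("i", J)]:
    vals, vecs = np.linalg.eig(M)
    print(name, vals)
# f, g share the eigenvectors e1, e2 but have eigenvalue sets {1,0} and {2,0};
# h, i share the eigenvalue set {0,1}, but i's eigenvectors are (1,1) and (1,-1).
```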