Once we have one eigenvector $v$ corresponding to the eigenvalue $\lambda$, any nonzero scalar multiple of $v$ is also an eigenvector corresponding to $\lambda$. So rather than counting eigenvectors, we count linearly independent eigenvectors (for example, those corresponding to a particular eigenvalue), or equivalently we measure the dimensions of the eigenspaces.
One useful fact is that the geometric multiplicity of an eigenvalue can never exceed its algebraic multiplicity. In general one is forced to solve for the eigenspace, but depending on the information you have, you may be able to make some deductions without solving for the eigenspace(s).
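To make the two multiplicities concrete, here is a minimal SymPy sketch (the $3\times 3$ matrix is a made-up example, not from the question): the algebraic multiplicity is read off from `eigenvals()`, and the geometric multiplicity is the nullity of $A - \lambda I$.

```python
import sympy as sp

# Made-up example: eigenvalue 2 is repeated, but its eigenspace is only 1-dimensional.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 3]])

# eigenvals() returns {eigenvalue: algebraic multiplicity}.
for lam, alg_mult in A.eigenvals().items():
    # Geometric multiplicity = dimension of the eigenspace = nullity of A - lam*I.
    geo_mult = len((A - lam * sp.eye(3)).nullspace())
    print(lam, alg_mult, geo_mult)
# Output: 2 2 1  and  3 1 1 -- the geometric multiplicity never exceeds the algebraic one.
```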
Again in regard to your last question, you need to be clear on what you are asking.
For example, $A = \left(\begin{matrix} 1 & 1\\0 & 1\end{matrix}\right)$ has a single repeated eigenvalue $\lambda = 1$ (algebraic multiplicity two). Solving for the corresponding eigenspace, we get $v = t \left(\begin{matrix} 1\\0\end{matrix}\right)$, $t\in\mathbb{R}$. So if we are asked to give a set of linearly independent eigenvectors, we can write down only one (any nonzero scalar multiple of $\left(\begin{matrix} 1\\0\end{matrix}\right)$ will do).
On the other hand, the $2\times 2$ identity matrix also has only one eigenvalue $\lambda = 1$ (algebraic multiplicity two), but its corresponding eigenspace has dimension $2$ (note that any nonzero vector is an eigenvector). So here we have two linearly independent eigenvectors.
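Both examples are easy to check by machine; assuming SymPy is available, the dimension of the eigenspace for $\lambda = 1$ is just the nullity of $A - I$:

```python
import sympy as sp

shear = sp.Matrix([[1, 1], [0, 1]])  # the matrix A from above
identity = sp.eye(2)                 # the 2x2 identity matrix

print(len((shear - sp.eye(2)).nullspace()))     # 1 -> one independent eigenvector
print(len((identity - sp.eye(2)).nullspace()))  # 2 -> two independent eigenvectors
print((shear - sp.eye(2)).nullspace())          # basis: [Matrix([[1], [0]])]
```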
The answer is yes. Let's assume that $v_1, v_2$ are the (linearly independent) eigenvectors corresponding to the same eigenvalue $\alpha_1$, and that $v_3$ corresponds to a different eigenvalue. Observe that whenever $\lambda_1 v_1 + \lambda_2 v_2 \neq 0$, it is an eigenvector for the eigenvalue $\alpha_1$ and is therefore linearly independent of our third vector $v_3$, since eigenvectors for distinct eigenvalues are linearly independent. This means that if $\lambda_1 v_1 + \lambda_2 v_2 + \lambda_3 v_3 = 0$, we necessarily have $\lambda_3 = 0$: otherwise $v_3 = -\frac{1}{\lambda_3}(\lambda_1 v_1 + \lambda_2 v_2)$ would lie in the eigenspace of $\alpha_1$, a contradiction. Now $\lambda_3 = 0$ implies $\lambda_1 v_1 + \lambda_2 v_2 = 0$, which by the assumed independence of $v_1$ and $v_2$ yields $\lambda_1 = \lambda_2 = 0$.
Note that this is nothing other than the observation that the sum of eigenspaces for different eigenvalues is a direct sum.
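Here is a small NumPy illustration of this fact (the matrices $P$ and $D$ below are made up for the sketch): two eigenvectors for a repeated eigenvalue together with an eigenvector for a different eigenvalue form a linearly independent set, as a rank computation confirms.

```python
import numpy as np

P = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])  # columns will be the eigenvectors
D = np.diag([5.0, 5.0, 7.0])    # alpha_1 = 5 (repeated), alpha_2 = 7
A = P @ D @ np.linalg.inv(P)    # A has eigenvalues 5, 5, 7 by construction

v1, v2, v3 = P[:, 0], P[:, 1], P[:, 2]
assert np.allclose(A @ v1, 5 * v1) and np.allclose(A @ v2, 5 * v2)
assert np.allclose(A @ v3, 7 * v3)

# Full rank <=> {v1, v2, v3} is linearly independent.
print(np.linalg.matrix_rank(np.column_stack([v1, v2, v3])))  # 3
```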
Best Answer
The answer is the zero matrix, obviously.
EDIT, here is a simple reason: write the matrix as $(c_1\ c_2)$, where $c_1$ and $c_2$ are both $2\times1$ column vectors. For any eigenvector $(a_1\ a_2)^T$ with eigenvalue $0$, we have $a_1c_1 + a_2c_2 = 0$. Similarly, for another eigenvector $(b_1\ b_2)^T$, we have $b_1c_1 + b_2c_2 = 0$. Multiplying the first equation by $b_2$, the second by $a_2$, and subtracting gives $(a_1b_2 - a_2b_1)c_1 = 0$; since the eigenvectors are linearly independent, $a_1b_2 - a_2b_1 \neq 0$, so $c_1 = 0$. The same elimination gives $(a_1b_2 - a_2b_1)c_2 = 0$, so $c_2 = 0$ also.
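As a sanity check, the argument can be replayed symbolically. The following SymPy sketch (the symbol names are mine, not from the answer) solves the four linear equations $Mu = Mv = 0$ for the entries of $M$; SymPy's generic solution implicitly assumes $a_1b_2 - a_2b_1 \neq 0$, which is exactly the linear-independence condition.

```python
import sympy as sp

c11, c12, c21, c22 = sp.symbols('c11 c12 c21 c22')  # entries of the matrix M
a1, a2, b1, b2 = sp.symbols('a1 a2 b1 b2')          # entries of the two eigenvectors

M = sp.Matrix([[c11, c12], [c21, c22]])
u = sp.Matrix([a1, a2])  # first eigenvector, eigenvalue 0
v = sp.Matrix([b1, b2])  # second eigenvector, eigenvalue 0

# M*u = 0 and M*v = 0 give four linear equations in the entries of M.
eqs = list(M * u) + list(M * v)
print(sp.solve(eqs, [c11, c12, c21, c22]))
# {c11: 0, c12: 0, c21: 0, c22: 0} -- the generic solution, valid when a1*b2 - a2*b1 != 0.
```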