[Math] Could a repeated eigenvalue of zero still have independent eigenvectors?

linear-algebra, matrices

If a matrix $A$ has repeated eigenvalues, its eigenvector matrix would be dependent, because there aren't enough independent eigenvectors to go along with the repeated eigenvalue.

But if a matrix $A$ has repeated eigenvalues of $0$, that means $A$ is singular, and then the dimension of the nullspace of $A$ would be the number of $0$ eigenvalues that $A$ has. Although I read that the eigenvectors of a matrix are dependent if there are repeated eigenvalues, I'm wondering why the eigenvectors couldn't be independent if the repeated eigenvalues were $0$.

Say a matrix has $2$ eigenvalues of $0$; this implies $\dim(N(A))=2$, so wouldn't the eigenvector for each of these zero eigenvalues be one of the $2$ independent nullspace vectors? Then, putting all the eigenvectors together to form an eigenvector matrix, they would all be independent, wouldn't they?
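A minimal SymPy sketch of this thought, using a diagonal matrix purely for illustration (this matrix is not from the question itself):

```python
import sympy as sp

# Illustrative matrix only: the eigenvalue 0 is repeated twice.
D = sp.Matrix([[3, 0, 0],
               [0, 0, 0],
               [0, 0, 0]])

print(D.eigenvals())   # {3: 1, 0: 2} -- eigenvalue 0 with multiplicity 2
print(D.nullspace())   # two independent nullspace vectors, e2 and e3
```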

What am I not understanding that led me to this thought?
Thanks for any help.

Update:

With the example suggested by Didier Piau,
$$A=\begin{bmatrix}
0 & 1\\
0 & 0
\end{bmatrix}$$
The $\operatorname{rank}(A)=1$, $N(A)=\operatorname{span}\left\{\begin{bmatrix}
1\\
0
\end{bmatrix}\right\}$, and the eigenvalues of $A$ are $0, 0$. True enough, the only eigenvector (up to scaling) can be $\begin{bmatrix}
1\\
0
\end{bmatrix}$, and therefore the eigenvector matrix is dependent. There aren't enough independent eigenvectors to go with the other eigenvalue of zero.
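This can be checked with a short sketch, assuming SymPy is available:

```python
import sympy as sp

# The example matrix suggested by Didier Piau
A = sp.Matrix([[0, 1],
               [0, 0]])

print(A.rank())        # 1
print(A.nullspace())   # a single basis vector: [1, 0]^T
print(A.eigenvals())   # {0: 2} -- eigenvalue 0 repeated twice
print(A.eigenvects())  # only ONE independent eigenvector for eigenvalue 0
```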

But say with another example of matrix $B$,
$$B=\begin{bmatrix}
1 & 2 & 3\\
1 & 2 & 3\\
2 & 4 & 6
\end{bmatrix}$$
The $\operatorname{rank}(B)=1$, $N(B)=\left\{c_{1}\begin{bmatrix}
-2\\
1\\
0
\end{bmatrix}
+
c_{2}\begin{bmatrix}
-3\\
0\\
1
\end{bmatrix} : c_{1}, c_{2} \in \mathbb{R}\right\}$, and the eigenvalues of $B$ are $9, 0, 0$. Then the eigenvectors are…
$$B\begin{bmatrix}
0.5\\
0.5\\
1
\end{bmatrix}=9\begin{bmatrix}
0.5\\
0.5\\
1
\end{bmatrix}$$
Then for the next two eigenvalues of $0$, which eigenvector do I choose? Do I use one nullspace vector for each zero eigenvalue, only one of the two, or both?
$$B\begin{bmatrix}
-2\\
1\\
0
\end{bmatrix}=0\begin{bmatrix}
-2\\
1\\
0
\end{bmatrix}$$
and it could also be…
$$B\begin{bmatrix}
-3\\
0\\
1
\end{bmatrix}=0\begin{bmatrix}
-3\\
0\\
1
\end{bmatrix}$$
Then for the eigenvector matrix, should it be: $\begin{bmatrix}
0.5 & -2 & -3\\
0.5 & 1 & 0\\
1 & 0 & 1
\end{bmatrix}$ or $\begin{bmatrix}
0.5 & -2 & -2\\
0.5 & 1 & 1\\
1 & 0 & 0
\end{bmatrix}$ or $\begin{bmatrix}
0.5 & -3 & -3\\
0.5 & 0 & 0\\
1 & 1 & 1
\end{bmatrix}$?
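A sketch of the same computation, assuming SymPy is available; note that any basis of the nullspace is an equally valid choice of eigenvectors for the two zero eigenvalues:

```python
import sympy as sp

B = sp.Matrix([[1, 2, 3],
               [1, 2, 3],
               [2, 4, 6]])

print(B.rank())       # 1
print(B.eigenvals())  # {9: 1, 0: 2}
# eigenvects() yields (eigenvalue, algebraic multiplicity, basis vectors);
# for eigenvalue 0 it returns TWO linearly independent eigenvectors.
for val, mult, vecs in B.eigenvects():
    print(val, mult, [list(v) for v in vecs])
```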

Best Answer

One can test one's understanding of this question on the matrix $A=\begin{bmatrix}0 & 1\\0 & 0\end{bmatrix}$, since this example already exhibits many of the relevant features of the problem. For example, one cannot find two linearly independent vectors $v$ and $w$ such that $Av=Aw=0$ (try to solve the equation $Av=0$ for $v$), although $0$ is a double root of the characteristic polynomial $\det(XI-A)=X^2$.

This is not specific to the eigenvalue $0$. Recall that for any eigenvalue one distinguishes its algebraic multiplicity $n$ (in this case, $n=2$) from its geometric multiplicity $m$ (in this case, $m=1$). See here for more details and a worked-out example.
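A minimal sketch (assuming SymPy) that makes the algebraic/geometric distinction concrete for the two matrices discussed above:

```python
import sympy as sp

def multiplicities(M):
    # eigenvects() yields (eigenvalue, algebraic multiplicity, eigenvectors);
    # the number of eigenvectors returned is the geometric multiplicity.
    for val, alg, vecs in M.eigenvects():
        print(f"eigenvalue {val}: algebraic {alg}, geometric {len(vecs)}")

A = sp.Matrix([[0, 1], [0, 0]])
B = sp.Matrix([[1, 2, 3], [1, 2, 3], [2, 4, 6]])

multiplicities(A)  # eigenvalue 0: algebraic 2, geometric 1 (defective)
multiplicities(B)  # eigenvalue 0: algebraic 2, geometric 2; eigenvalue 9: 1 and 1
```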