[Math] Eigenvector corresponding to zero eigenvalue / identical eigenvalues, non-identical eigenvectors

eigenvalues-eigenvectors, linear-algebra, matrices, vector-spaces

Assume a symmetric matrix $B\in\mathbb{R}^{n\times n}$ is given, and consider the transformation $$A=JBJ,$$
where $J=I-\frac{1}{n}1_n1_n^T$ and $I$ denotes the identity matrix. This centers the rows and columns of $B$ (the entries of every row of $A$ sum to zero, and the same holds for every column). It is known that there is at least one zero eigenvalue. Can it be shown that the corresponding eigenvector is $1_n=[1~\dots~1]^T\in\mathbb{R}^n$ (i.e., the vector of all $1$'s)?

In case $1_n$ is an eigenvector, I'm trying to work out the geometric implication, similar to the arrow example at
http://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors

What bothers me is the case of identical eigenvalues associated with different eigenvectors; an explanation of this would be appreciated.
It relates to the method I use for spectral decomposition, and it actually inspired the question: on different invocations of the method for extracting the dominant eigenvalue I sometimes get different eigenvectors, but the eigenvalue is always the same (what does that tell us about the matrix?). Since there may be more than one zero eigenvalue, I wonder whether it is possible to restrict the method so that it returns eigenvectors other than the one with all equal entries.
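One common way to keep an iterative method away from a known eigenvector is to re-project each iterate onto the orthogonal complement of that vector. A minimal sketch assuming a basic power iteration (the function name and parameters are illustrative, not from the post):

```python
import numpy as np

def dominant_eigpair_orthogonal_to_ones(A, iters=2000, seed=1):
    """Power iteration that projects out the 1_n direction at every
    step, so the iterate cannot converge to the all-ones vector.
    (Hypothetical helper, shown only as a sketch of the technique.)"""
    n = A.shape[0]
    u = np.ones(n) / np.sqrt(n)           # unit vector along 1_n
    v = np.random.default_rng(seed).standard_normal(n)
    for _ in range(iters):
        v -= (u @ v) * u                  # project out the 1_n component
        v = A @ v
        v /= np.linalg.norm(v)
    v -= (u @ v) * u
    v /= np.linalg.norm(v)
    return v @ A @ v, v                   # Rayleigh quotient, eigenvector
```

Note that since $J$ annihilates $1_n$, the product $Av=JBJv$ is automatically orthogonal to $1_n$ anyway; the explicit projection guards against round-off drift, and the same idea generalizes to deflating any other already-found eigenvector.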

Best Answer

Certainly. Since $1_n^\top1_n=n$, we have $J1_n=(I-\frac1n1_n1_n^\top)1_n=I1_n-\frac1n1_n1_n^\top1_n=1_n-\frac nn1_n=0$. Hence $A1_n=JB(J1_n)=JB\,0=0=0\cdot1_n$, so $1_n$ is indeed an eigenvector of $A$ with eigenvalue $0$.
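As for the repeated-eigenvalue concern in the question: when an eigenvalue has multiplicity greater than one, every nonzero vector in its eigenspace is an equally valid eigenvector, so an iterative solver may legitimately return different vectors on different runs while reporting the same eigenvalue. A small deterministic illustration (the matrix here is an arbitrary example, not from the post):

```python
import numpy as np

# a symmetric matrix with eigenvalue 2 of multiplicity 2
A = np.diag([2.0, 2.0, 5.0])

# two different unit vectors in the eigenspace of eigenvalue 2
u = np.array([1.0, 0.0, 0.0])
w = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)

print(np.allclose(A @ u, 2 * u))   # both are eigenvectors
print(np.allclose(A @ w, 2 * w))   # for the same eigenvalue
```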