[Math] Eigenvalues of a matrix with identical rows

Tags: diagonalization, eigenvalues-eigenvectors, linear-algebra, matrices

It has previously been discussed here that the eigenvalues of an all-ones $n \times n$ matrix $A$ such as the following are given by $0$ with multiplicity $n - 1$ and $n$ with multiplicity $1$, hence a total multiplicity of $n$, which means that the given matrix is diagonalizable.
$$A =
\begin{bmatrix}
1 & 1 & \cdots & 1 \\
1 & 1 & \cdots & 1 \\
\vdots & \vdots & \ddots & \vdots \\
1 & 1 & \cdots & 1 \\
\end{bmatrix}
$$
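
For instance, in the $3 \times 3$ case a direct computation confirms this:
$$\det(A - \lambda I) =
\begin{vmatrix}
1-\lambda & 1 & 1 \\
1 & 1-\lambda & 1 \\
1 & 1 & 1-\lambda \\
\end{vmatrix}
= \lambda^2(3 - \lambda),$$
so the eigenvalues are $0$ (twice) and $3$.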

I recently wrote an exam that asked us to diagonalize a matrix in which several (3) rows contained the same entries, so I was wondering whether there is some general result to apply.

Thus the question I am asking is: given the following $n \times n$ matrix $A$, what are its eigenvalues?
$$A =
\begin{bmatrix}
a_1 & a_2 & \cdots & a_n \\
a_1 & a_2 & \cdots & a_n \\
\vdots & \vdots & \ddots & \vdots \\
a_1 & a_2 & \cdots & a_n \\
\end{bmatrix}
$$
For the sake of simplicity, let's first assume that $a_1, a_2, \ldots, a_n \in \mathbb{R} \setminus \{0\}$; however, what happens if any (or all) of them are zero?

It seems logical that $0$ should be an eigenvalue with multiplicity $n - 1$, since the rank of this matrix is $1$ (assuming at least one nonzero entry), and, by observation, that the other eigenvalue should be the sum of the diagonal entries, $a_1 + a_2 + \cdots + a_n$, with multiplicity $1$. I could not, however, write a formal proof of that second statement.
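
As a quick sanity check of this guess (with $n = 2$ and the entries $a_1 = 2$, $a_2 = 3$ chosen arbitrarily):
$$A = \begin{bmatrix} 2 & 3 \\ 2 & 3 \end{bmatrix}, \qquad \det(A - \lambda I) = (2-\lambda)(3-\lambda) - 6 = \lambda^2 - 5\lambda = \lambda(\lambda - 5),$$
so the eigenvalues are $0$ and $5 = a_1 + a_2$, consistent with the guess, though a single example is of course not a proof.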

Best Answer

You can say a lot more about the matrix you presented. Let's define the following two vectors

$$ u=\begin{pmatrix}1\\1\\\vdots\\1\end{pmatrix} \:{\rm and}\:v=\begin{pmatrix}a_{1}\\a_{2}\\\vdots\\a_{n}\end{pmatrix} $$

Then your matrix is exactly

$$A=uv^{T}$$
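
Indeed, spelling out the outer product entrywise,
$$\left(uv^{T}\right)_{ij}=u_{i}v_{j}=1\cdot a_{j}=a_{j},$$
which is exactly the $(i,j)$ entry of $A$, since every row of $A$ equals $\left(a_{1},a_{2},\ldots,a_{n}\right)$.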

Assume $v \neq 0$ (otherwise $A$ is the zero matrix, a trivial case).

First Case: $\sum_{i=1}^{n}a_{i}\neq 0$.

You can easily see that

  • The eigenvalue $\lambda=0$ is of multiplicity $n-1$, with $n-1$ linearly independent eigenvectors given by any basis of the subspace ${\rm Span}\left\{v\right\}^{\perp}$: indeed, $Aw=uv^{T}w=\left(v^{T}w\right)u=0$ for every $w\perp v$, and this subspace has dimension $n-1$ because ${\rm Span}\left\{v\right\}\oplus{\rm Span}\left\{v\right\}^{\perp}=\mathbb{R}^{n}$ and ${\rm Span}\left\{v\right\}$ is of dimension $1$.
  • The eigenvalue $\lambda=\sum_{i=1}^{n}a_{i}$ corresponds to the eigenvector $u$, since

$$Au=uv^{T}u=\left(v^{T}u\right)u=\left(\sum_{i=1}^{n}a_{i}\right)u$$

Therefore, in this case you have $n$ linearly independent eigenvectors and the matrix is diagonalizable.
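
As a concrete illustration (with the entries $a=\left(1,2,3\right)$ chosen arbitrarily, so that $\sum_{i}a_{i}=6\neq 0$): every row of $A$ is $\left(1,2,3\right)$, so $Au=\left(6,6,6\right)^{T}=6u$, while for instance $w=\left(2,-1,0\right)^{T}$ satisfies $v^{T}w=0$ and hence $Aw=0$.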

Second Case: $\sum_{i=1}^{n}a_{i}=0$.

In this case the first point still applies, but the matrix is not diagonalizable, since it has only $n-1$ linearly independent eigenvectors (you can't have $n$ linearly independent eigenvectors with eigenvalue $\lambda=0$ unless the matrix is zero). It does, however, admit the following Jordan canonical form (note that $A^{2}=u\left(v^{T}u\right)v^{T}=0$, so $A$ is nilpotent, and ${\rm rank}\,A=1$ forces exactly one Jordan block of size $2$, all other blocks being of size $1$)

$$A\sim\begin{pmatrix}J_{2}\left(0\right)&0_{2\times\left(n-2\right)}\\0_{\left(n-2\right)\times2}&0_{\left(n-2\right)\times\left(n-2\right)}\end{pmatrix}=J_{2}\left(0\right)\oplus 0_{\left(n-2\right)\times\left(n-2\right)}$$

where

$$J_{2}\left(0\right)=\begin{pmatrix}0&1\\0&0\end{pmatrix}$$

is a $2\times 2$ Jordan block with eigenvalue $\lambda=0$.
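
For a concrete instance of this case (with entries $a=\left(1,-1\right)$ chosen arbitrarily so that the sum is zero), take $n=2$:
$$A=\begin{pmatrix}1&-1\\1&-1\end{pmatrix},\qquad A^{2}=\begin{pmatrix}0&0\\0&0\end{pmatrix},$$
so $A$ is a nonzero nilpotent matrix; it cannot be diagonalizable (a diagonalizable nilpotent matrix is zero), and its Jordan canonical form is exactly $J_{2}\left(0\right)$.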