Picking up from where you left off.
From $\text{rank}(A-5I)=3$ and from $\text{rank}(A+2I)+\text{rank}(A+3I)+\text{rank}(A-5I)=9$ you get
$$\text{rank}(A+2I)+\text{rank}(A+3I)=6.$$
Now prove that $A+2I$ is invertible. What does that tell you about the rank of $A+2I$?
Infer that $\text{rank}(A+3I)=2$. What does that tell you about the geometric multiplicity of $-3$?
Conclude.
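The original matrix $A$ isn't given here, but the rank bookkeeping can be sanity-checked numerically. A $4\times4$ diagonal matrix with eigenvalues $5$, $-3$, $-3$, and one freely chosen extra eigenvalue (here $1$) is one hypothetical matrix consistent with the hints — this concrete choice is an assumption, used only to illustrate the arithmetic:

```python
import numpy as np

# Hypothetical 4x4 matrix consistent with the hints (an assumption:
# eigenvalues 5, -3, -3, plus one freely chosen eigenvalue, here 1).
A = np.diag([5.0, -3.0, -3.0, 1.0])
I = np.eye(4)

r5 = np.linalg.matrix_rank(A - 5 * I)   # 3, so the 5-eigenspace has dimension 1
r2 = np.linalg.matrix_rank(A + 2 * I)   # 4, since A + 2I is invertible
r3 = np.linalg.matrix_rank(A + 3 * I)   # 2, so the (-3)-eigenspace has dimension 2

print(r5, r2, r3, r5 + r2 + r3)  # 3 4 2 9
```

Note that invertibility of $A+2I$ forces its rank to equal $n$, which together with the sum $9$ pins down $n=4$ and $\text{rank}(A+3I)=2$.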
In general what you can say is that the rank of an $n\times n$ matrix is at least equal to the number of nonzero eigenvalues (counted with algebraic multiplicity). Equality can fail: the simplest counterexample is the matrix
$$
\begin{bmatrix}0 & 1 \\ 0 & 0\end{bmatrix}
$$
which has only the zero eigenvalue, but has rank $1$.
You surely know that, for an eigenvalue $\lambda$ of the matrix $A$, the following inequalities hold:
$$
1\le d\le m
$$
where $d$ is the dimension of the eigenspace $E_A(\lambda)=\{v\in\mathbb{C}^n:Av=\lambda v\}$, usually called the geometric multiplicity of $\lambda$, and $m$ is the algebraic multiplicity, that is, the maximum exponent $m$ such that $(\lambda-X)^m$ divides the characteristic polynomial $\det(A-XI_n)$ ($I_n$ is the $n\times n$ identity matrix).
In the particular case of $\lambda=0$, the eigenspace $E_A(0)$ is the null space of $A$, so its dimension is $n-k$, where $k$ is the rank of $A$. The number of nonzero eigenvalues (counted with their algebraic multiplicity) is $n-m$, where $m$ is the algebraic multiplicity of the zero eigenvalue. Since $n-k\le m$ by the above inequality, we get $n-m\le k$.
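The inequality $n-m\le k$ (rank at least the number of nonzero eigenvalues) can be illustrated numerically with the counterexample above, where the inequality is strict, and a diagonalizable example, where equality holds:

```python
import numpy as np

def nonzero_eigs(A, tol=1e-9):
    # Count eigenvalues with |lambda| > tol (algebraic multiplicity included).
    return int(np.sum(np.abs(np.linalg.eigvals(A)) > tol))

# The nilpotent counterexample: rank 1, but no nonzero eigenvalues.
N = np.array([[0.0, 1.0], [0.0, 0.0]])
print(np.linalg.matrix_rank(N), nonzero_eigs(N))  # 1 0

# For a diagonalizable matrix, rank equals the number of nonzero eigenvalues.
D = np.diag([3.0, -1.0, 0.0, 0.0])
print(np.linalg.matrix_rank(D), nonzero_eigs(D))  # 2 2
```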
In the special case where $A=uu^H$ for some nonzero vector $u$ ($H$ denotes the hermitian transpose), as happens for the all-ones matrix with $u$ the all-ones vector, $A$ has rank $1$ and there is at least one nonzero eigenvalue: indeed,
$$
Au=uu^Hu=(u^Hu)u
$$
and the scalar $\mu=u^Hu=\|u\|^2$ is a nonzero eigenvalue. Thus, by the above considerations, the geometric multiplicity of the zero eigenvalue must be the same as the algebraic multiplicity, both equal to $n-1$.
The characteristic polynomial of such a rank $1$ matrix is thus $(-X)^{n-1}(\mu-X)$.
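A short numerical check of the rank-$1$ case, with a hypothetical real vector $u$ (so the hermitian transpose is just the ordinary transpose):

```python
import numpy as np

# A hypothetical nonzero vector u; A = u u^T has rank 1.
u = np.array([1.0, 2.0, -1.0]).reshape(-1, 1)
A = u @ u.T

mu = float(u.T @ u)                # the nonzero eigenvalue u^H u = 6 here
eigs = np.sort(np.linalg.eigvalsh(A))  # A is symmetric, so eigvalsh applies

print(np.linalg.matrix_rank(A))    # 1
print(eigs)                        # approximately [0, 0, 6]
```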
If you are dealing only with real matrices, just replace $\mathbb{C}$ with $\mathbb{R}$ and the hermitian transpose with the transpose.
Best Answer
The rank is $1$, as the columns of the matrix are spanned by a single nonzero vector, specifically $$\operatorname{colspace}(J_{n \times n}) = \operatorname{span}\left\{\begin{bmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{bmatrix}\right\}.$$ This implies the column space is $1$-dimensional, and this dimension is the rank (by definition).
The Rank-Nullity theorem implies that the nullspace of $J_{n \times n}$ has dimension $n - 1$. The nullspace is the subspace of vectors $x$ such that $Jx = 0 = 0x$, which is to say, the eigenspace corresponding to $\lambda = 0$. This means we may find $n - 1$ linearly independent eigenvectors for $J$.
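These two facts (rank $1$, nullity $n - 1$) are easy to confirm numerically; here is a quick sketch with NumPy for $n = 5$ (the choice of $n$ is arbitrary):

```python
import numpy as np

n = 5
J = np.ones((n, n))

rank = np.linalg.matrix_rank(J)   # 1
nullity = n - rank                # 4, by rank-nullity

# Basis-free cross-check: the nullspace is the 0-eigenspace.
eigs = np.linalg.eigvalsh(J)      # J is symmetric, so eigvalsh applies
zero_dim = int(np.sum(np.abs(eigs) < 1e-9))

print(rank, nullity, zero_dim)    # 1 4 4
```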
So now we are left with two possibilities. Either we have another nonzero eigenvalue, or we have only $0$ as our eigenvalue, with algebraic multiplicity $n$. The latter would mean that $J$ is not diagonalisable; the previous calculations alone do not rule it out, but it can be very quickly rejected, since $J$ is symmetric and real symmetric matrices are always diagonalisable!
At this point, we just need a good guess as to the eigenvalue, or indeed, an eigenvector. It's not hard to see that
$$\begin{bmatrix} 1 & 1 & \cdots & 1 \\ 1 & 1 & \cdots & 1 \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & \cdots & 1 \end{bmatrix}\begin{bmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{bmatrix} = n\begin{bmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{bmatrix}.$$
This confirms, by definition, that there is another eigenvalue: $n$. We now have the complete picture: there are two eigenvalues, $0$ with multiplicity $n - 1$ and $n$ with multiplicity $1$.
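As a final check, the whole spectrum of $J$ can be computed numerically (a sketch for $n = 6$, an arbitrary choice):

```python
import numpy as np

n = 6
J = np.ones((n, n))
eigs = np.sort(np.linalg.eigvalsh(J))  # symmetric matrix: eigvalsh applies

# n - 1 zero eigenvalues and a single eigenvalue equal to n.
print(eigs)  # approximately [0, 0, 0, 0, 0, 6], up to floating-point noise

# The all-ones vector is an eigenvector for the eigenvalue n.
v = np.ones(n)
print(np.allclose(J @ v, n * v))  # True
```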