Let's break this down (I'm going for more of an intuitive answer here):
When you multiply a matrix $M$ and a vector $v_i$, you take each row of the matrix and form an inner product with the vector to get the elements of the resulting vector $v_o$ -- so row 1 of $M$ times $v_i$ gives you the first element of $v_o$, row 2 of $M$ times $v_i$ gives you the second element of $v_o$, etc.
If $v_i$ is an eigenvector of $M$, then multiplying them will give you a $v_o$ that is effectively $v_i$ times a constant, where the constant is the eigenvalue corresponding to the eigenvector.
Since the whole output vector ($v_o$) is a scalar multiple of the input vector ($v_i$), the eigenvalues cannot be directly related to specific rows of $M$: each element of $v_o$ "interacted" with only a single row of $M$ -- a different row for each element -- during the multiplication, yet every element of $v_o$ was scaled by the same (eigen)value from the corresponding element of $v_i$.
As for the columns, every column of $M$ affects every element of the output vector, so how do we separate out the distinct eigenvalues?
The whole of $M$ can be seen as a transformation. For certain vectors, the transformation only ends up scaling that vector -- this is just a result of applying a particular transformation to a particular vector.
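As a quick sanity check of this intuition, here is a small numpy sketch (the matrix and vectors are made up for illustration):

```python
import numpy as np

# A diagonal matrix makes the eigenvectors easy to see by hand.
M = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# v is an eigenvector of M with eigenvalue 2: M @ v just scales v.
v = np.array([1.0, 0.0])
v_o = M @ v
print(np.allclose(v_o, 2.0 * v))   # True: the whole output is v times a constant

# A generic vector is not merely scaled: each component is stretched
# by a different amount, so the direction changes.
w = np.array([1.0, 1.0])
w_o = M @ w                        # [2., 3.] -- not a scalar multiple of w
```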
Now consider the bordered matrix $\pmatrix{\mathbf{0} & u \\ v^\top & a}$ and apply a block-diagonal similarity transformation:
$$\pmatrix{Q & 0 \\ 0 & 1}\pmatrix{\mathbf{0} & u \\ v^\top & a}\pmatrix{Q^{-1} & 0 \\ 0 & 1}=\pmatrix{\mathbf{0} & Qu \\ v^\top Q^{-1} & a}$$
This similarity operation preserves the value of the product of the vectors:
$$\left(v^\top Q^{-1}\right)\left(Q u\right) = v^\top u $$
Say that $Q$ performs Gaussian elimination on the column $u$, so that you are left with a zero column except for a single $1$; restricting to the two non-trivial coordinates gives the $2 \times 2$ sub-matrix
$$\pmatrix{0 & 1 \\ v^\top u & a}$$
This sub-matrix carries the only two non-trivial eigenvalues (the rest are zero). It may be transformed with a similarity parameterized by some $k$:
\begin{align}
&\pmatrix{1 & 0 \\ k & 1}\pmatrix{0 & 1 \\ v^\top u & a}\pmatrix{1 & 0 \\ -k & 1} \\
&= \pmatrix{0 & 1 \\ v^\top u & k+a}\pmatrix{1 & 0 \\ -k & 1} \\
&= \pmatrix{-k & 1 \\ v^\top u - k^2 - ka & k+a}
\end{align}
and of course you want to solve
$$v^\top u - k^2 - ka = 0$$
which makes the matrix upper triangular, so the two eigenvalues can be read off the diagonal: $\lambda_0=-k$ and $\lambda_1 = k+a$.
$$ k^2 + ka - v^\top u = 0 \Rightarrow k=\frac{-a \pm \sqrt{a^2 + 4 v^\top u}}{2}$$
$$\lambda = \frac{a \pm \sqrt{a^2 + 4 v^\top u}}{2}$$
Since your elements are $0$ or $1$, you have $v^\top u\ge 0$: when $v^\top u \gt 0$ one of these eigenvalues is positive and the other negative, and when $v^\top u = 0$ you get another zero eigenvalue ($\lambda_0 = 0$ and $\lambda_1 = a$).
So not all of the eigenvalues are positive, since there are zero eigenvalues, and they are all non-negative only when $v^\top u = 0$.
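A quick numerical check of this closed form, with made-up $0/1$ vectors $u$, $v$ and a scalar $a$, assuming the bordered-matrix setup above:

```python
import numpy as np

# Hypothetical 0/1 data, just for illustration.
u = np.array([1.0, 0.0, 1.0])
v = np.array([1.0, 1.0, 0.0])
a = 1.0
n = len(u)

# Bordered matrix [[0, u], [v^T, a]] from the derivation above.
M = np.zeros((n + 1, n + 1))
M[:n, n] = u
M[n, :n] = v
M[n, n] = a

vu = v @ u                          # v^T u  (here: 1 > 0)
disc = np.sqrt(a**2 + 4 * vu)
predicted = sorted([(a - disc) / 2, (a + disc) / 2])

eigs = sorted(np.linalg.eigvals(M).real, key=abs)
# n - 1 eigenvalues are zero; the two remaining ones match the formula,
# one negative and one positive since v^T u > 0.
assert np.allclose(eigs[:n - 1], 0.0, atol=1e-9)
assert np.allclose(sorted(eigs[n - 1:]), predicted)
assert predicted[0] < 0 < predicted[1]
```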
Best Answer
In general what you can say is that the rank of an $n\times n$ matrix is at least equal to the number of nonzero eigenvalues (counted with algebraic multiplicity). The simplest example where the inequality is strict is the matrix $$ \begin{bmatrix}0 & 1 \\ 0 & 0\end{bmatrix} $$ which has only the zero eigenvalue, but has rank $1$.
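This example is easy to confirm numerically (a minimal numpy check):

```python
import numpy as np

# Nilpotent matrix: rank 1, but its only eigenvalue is 0,
# so the rank can strictly exceed the number of nonzero eigenvalues.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])

rank = np.linalg.matrix_rank(N)      # 1
eigs = np.linalg.eigvals(N)          # both eigenvalues are 0
```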
You surely know that, for an eigenvalue $\lambda$ of the matrix $A$, the following inequalities hold: $$ 1\le d\le m $$ where $d$ is the dimension of the eigenspace $E_A(\lambda)=\{v\in\mathbb{C}^n:Av=\lambda v\}$, usually called the geometric multiplicity of $\lambda$, and $m$ is the algebraic multiplicity, that is, the maximum exponent $m$ such that $(\lambda-X)^m$ divides the characteristic polynomial $\det(A-XI_n)$ ($I_n$ is the $n\times n$ identity matrix).
In the particular case of $\lambda=0$, the eigenspace $E_A(0)$ is the null space of $A$, so its dimension is $n-k$, where $k$ is the rank of $A$. The number of nonzero eigenvalues (counted with their algebraic multiplicity) is $n-m$, where $m$ is the algebraic multiplicity of the zero eigenvalue. Since $n-k\le m$ by the above inequality, we get $n-m\le k$.
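A small worked example of these multiplicities (the matrix is made up: a $2\times2$ nilpotent Jordan block together with the eigenvalue $5$):

```python
import numpy as np

# For this A: algebraic multiplicity of 0 is m = 2, rank is k = 2,
# geometric multiplicity of 0 is d = n - k = 1, and there is
# n - m = 1 nonzero eigenvalue.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 5.0]])
n = A.shape[0]

k = np.linalg.matrix_rank(A)                 # rank: 2
eigs = np.linalg.eigvals(A)                  # [0, 0, 5]
num_nonzero = int(np.sum(~np.isclose(eigs, 0.0)))
d = n - k                                    # dim of the null space: 1

# d <= m (i.e. n - k <= m), hence n - m <= k:
assert num_nonzero <= k
```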
In the special case where $A$ has rank $1$ and every row is a nonzero multiple of the first row, there is at least one nonzero eigenvalue: indeed, $A=uu^H$ for some vector $u\ne0$ ($H$ denotes the hermitian transpose), so $$ Au=uu^Hu=(u^Hu)u $$ and the scalar $\mu=u^Hu$ is a nonzero eigenvalue. Thus, by the above considerations, the geometric multiplicity of the zero eigenvalue must be the same as its algebraic multiplicity, both equal to $n-1$.
The characteristic polynomial of a rank $1$ matrix is thus $(0-X)^{n-1}(\mu-X)$.
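To illustrate with numpy (the complex vector $u$ is arbitrary, chosen just for the example):

```python
import numpy as np

# Rank-1 Hermitian matrix A = u u^H for a made-up complex vector u.
u = np.array([1.0 + 2.0j, 0.5j, 3.0])
A = np.outer(u, u.conj())               # u u^H
n = len(u)

mu = (u.conj() @ u).real                # u^H u, the single nonzero eigenvalue

eigs = sorted(np.linalg.eigvals(A).real, key=abs)
# n - 1 zero eigenvalues plus the nonzero eigenvalue mu.
assert np.allclose(eigs[:n - 1], 0.0, atol=1e-9)
assert np.isclose(eigs[n - 1], mu)

# Au = (u^H u) u, so u itself is the eigenvector for mu.
assert np.allclose(A @ u, mu * u)
```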
If you are dealing only with real matrices, just change references to $\mathbb{C}$ with $\mathbb{R}$ and the hermitian transpose with the transpose.