Let $A$ be any $n \times n$ matrix, symmetric or not, over any field $\Bbb F$, and suppose that $A$ is possessed of $n$ eigenvectors $v_1$, $v_2$, $\ldots$, $v_n$ with the corresponding eigenvalues $\lambda_1$, $\lambda_2$, $\ldots$, $\lambda_n$ such that
$Av_i = \lambda_i v_i, \; 1 \le i \le n; \tag 1$
with our OP, user 774633, we define the matrix whose columns are the eigenvectors $v_i$:
$V = [v_1 \; v_2 \; \ldots \; v_n], \tag 2$
and observe that
$AV = [Av_1 \; Av_2 \; \ldots \; Av_n]; \tag 3$
now in accord with (1) we may write
$AV = [\lambda_1 v_1 \; \lambda_2 v_2 \; \ldots \; \lambda_n v_n]. \tag 4$
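For concreteness, here is a small worked example of my own devising (not drawn from the question): taking
$A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}, \; v_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \; v_2 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}, \; \lambda_1 = 3, \; \lambda_2 = 1,$
we find
$AV = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} = \begin{bmatrix} 3 & 1 \\ 3 & -1 \end{bmatrix} = [3 v_1 \; 1 v_2],$
precisely the column structure (4) predicts.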
Now again in accord with our OP we set
$D = \text{diag}[\lambda_1 \; \lambda_2 \; \ldots\; \lambda_n], \tag 5$
and we have
$VD = [v_1 \; v_2 \; \ldots \; v_n]\text{diag}[\lambda_1 \; \lambda_2 \; \ldots\; \lambda_n], \tag 6$
and if we write $V$ and $D$ in full matrix form (which explicitly presents every element) we obtain
$V = \begin{bmatrix} v_{11} & v_{12} & \ldots & v_{1n} \\
v_{21} & v_{22} & \ldots & v_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
v_{n1} & v_{n2} & \ldots & v_{nn} \end{bmatrix} = [v_{ij}], \tag 7$
and
$D = \begin{bmatrix} \lambda_1 & 0 & 0 & \ldots & 0 \\
0 & \lambda_2 & 0 & \ldots & 0 \\
0 & 0 & \lambda_3 & \ldots & 0 \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
0 & 0 & 0 & \ldots & \lambda_n \end{bmatrix} = [\delta_{ij} \lambda_i]. \tag 8$
Note that in the matrix (7) the row index is the first index, $i$, in the entries $v_{ij}$, in conformity with the standard practice for writing out matrices; adopting this convention clarifies the ensuing calculations.
We may thus exploit the ordinary rule for matrix multiplication and we find
$VD = [v_{ij}][\delta_{ij} \lambda_i] = \left [ \displaystyle \sum_{k = 1}^n v_{ik}\delta_{kj}\lambda_k \right] = [v_{ij} \lambda_j], \tag 9$
and it is clear that
$[v_{ij} \lambda_j] = [\lambda_1 v_1 \; \lambda_2 v_2 \; \ldots \; \lambda_n v_n], \tag{10}$
whence, taking (4), (9), and (10) in concert, we arrive at
$AV = VD, \tag{11}$
the requisite relation 'twixt $A$, $V$, and $D$.
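As a quick sanity check of (11), here is a minimal numerical sketch assuming NumPy; the matrix $A$ below is an arbitrary illustration, not taken from the question:

```python
import numpy as np

# A minimal numerical check of AV = VD from (11); the matrix A is an
# arbitrary illustrative choice, not taken from the original question.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are
# the corresponding eigenvectors -- precisely the V of (2) and D of (5).
eigenvalues, V = np.linalg.eig(A)
D = np.diag(eigenvalues)

# The two products agree up to floating-point round-off.
assert np.allclose(A @ V, V @ D)
```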
Perhaps a somewhat more elegant demonstration of (11) may be had via the observation that the $j$-th column of the matrix $D$ is in fact the vector
$\mathbf e_j = [\delta_{ij}], \tag{12}$
composed of all $0$s save for a single $1$ in the $j$-th row, multiplied by the scalar $\lambda_j$, that is, $\lambda_j \mathbf e_j$. We may thus write
$D = \begin{bmatrix}\lambda_1 \mathbf e_1 & \lambda_2 \mathbf e_2 & \ldots & \lambda_n \mathbf e_n \end{bmatrix}; \tag{13}$
it is furthermore easy to see that
$V\mathbf e_j = v_j, \tag{14}$
whence
$VD = V\begin{bmatrix}\lambda_1 \mathbf e_1 & \lambda_2 \mathbf e_2 & \ldots & \lambda_n \mathbf e_n \end{bmatrix} = \begin{bmatrix}\lambda_1 V\mathbf e_1 & \lambda_2 V\mathbf e_2 & \ldots & \lambda_n V\mathbf e_n \end{bmatrix}$
$= \begin{bmatrix}\lambda_1 v_1 & \lambda_2 v_2 & \ldots & \lambda_n v_n \end{bmatrix} = \begin{bmatrix} Av_1 & Av_2 & \ldots & Av_n \end{bmatrix} = AV \tag{15}$
in accord with (3) and (4).
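The column-extraction identity (14) and the column-scaling in (15) are likewise easy to confirm numerically; this sketch again assumes NumPy and reuses the small illustrative $V$ and $\lambda_j$ from the example above:

```python
import numpy as np

# Illustrating (13)-(15): D's j-th column is lambda_j * e_j, so V D
# scales the j-th column of V by lambda_j.  V and lam reuse the small
# illustrative example given earlier.
V = np.array([[1.0, 1.0],
              [1.0, -1.0]])
lam = np.array([3.0, 1.0])
D = np.diag(lam)

e1 = np.array([1.0, 0.0])            # standard basis vector e_1
assert np.allclose(V @ e1, V[:, 0])  # (14): V e_j extracts column j
assert np.allclose(V @ D, V * lam)   # each column of V scaled by its lambda_j
```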
Best Answer
You do not need the diagonal entries of $D$ to be positive. Since $\det(I+AB)=\det(I+BA)$, we have $\det(I+xy^T)=1+y^Tx$ (which is not hard to prove directly). If $D$ is invertible, then
$$ \det(D+ss^T) = \det\left(D(I+D^{-1}ss^T)\right) = \det(D) \det(I+D^{-1}ss^T) = (1+s^TD^{-1}s) \det(D). $$
Writing $d_r$ for the $r$-th diagonal entry of $D$, this gives
$$ \det(D+ss^T) = \det(D) \left(1+\sum_r \frac{s_r^2}{d_r}\right). $$
If two diagonal entries of $D$ are zero, then $\det(D+ss^T)=0$. If just one is zero, $d_n$ say, then
$$ \det(D+ss^T) = s_n^2\prod_{r=1}^{n-1} d_r. $$
If you define $\delta_r=\prod_{k\ne r} d_k$ (which equals $\det(D)/d_r$ whenever $d_r \ne 0$), then
$$ \det(D+ss^T) = \det(D) + \sum_r \delta_r s_r^2, $$
which is neater and holds in all cases.
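As a quick numerical confirmation, here is a minimal sketch assuming NumPy; the helper `rank_one_det` and the random test data are illustrative choices of mine, not part of the original answer:

```python
import numpy as np

# Numerical sanity check of det(D + s s^T) = det(D) + sum_r delta_r s_r^2,
# with delta_r the product of all diagonal entries except the r-th.
# The helper name and the random test data are illustrative choices.
def rank_one_det(d, s):
    """det(D) + sum_r delta_r * s_r**2 for D = diag(d)."""
    delta = np.array([np.prod(np.delete(d, r)) for r in range(len(d))])
    return np.prod(d) + np.sum(delta * s**2)

rng = np.random.default_rng(0)
for make_singular in (False, True):
    d = rng.standard_normal(5)
    if make_singular:
        d[-1] = 0.0  # exercise the case of a zero diagonal entry
    s = rng.standard_normal(5)
    direct = np.linalg.det(np.diag(d) + np.outer(s, s))
    assert np.isclose(direct, rank_one_det(d, s))
```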