There is no significance to the fact that all eigenvalues are distinct in the context of this question; it is a superfluous hypothesis. A real symmetric matrix is always diagonalisable, whether or not its eigenvalues are distinct.
Maybe a failed attempt to put the student at ease by giving a supposedly reassuring hypothesis?
Your source talks about matrices over the complex numbers and $M$ is supposed to be Hermitian, that is, equal to its “conjugate transpose”.
I'll denote by $M^*$ the conjugate transpose of a (generic) matrix $M$. A couple of definitions; let $M$ be a square matrix:

- $M$ is *Hermitian* if $M^*=M$;
- $M$ is *normal* if $MM^*=M^*M$;
- $M$ is *unitary* if $M^*M=I$ (equivalently, $M^{-1}=M^*$).
The source also talks about the spectral theorem, which says that every normal matrix $M$ can be written as
$$
M=\sum_{k=1}^r \lambda_k P_k\tag{*}
$$
where $\lambda_1,\dots,\lambda_r$ are the distinct eigenvalues of $M$ and $P_1,\dots,P_r$ are the projection matrices onto the eigenspaces; in particular they satisfy $P_k^2=P_k$ and $P_k^*=P_k$. Moreover $P_kP_l$ is the null matrix when $k\ne l$.
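A quick numerical sanity check of the decomposition (*), using NumPy (the matrix here is just an illustrative example):

```python
import numpy as np

# A small Hermitian (here: real symmetric) example, chosen for illustration.
M = np.array([[2.0, 1.0], [1.0, 2.0]])

# eigh handles Hermitian matrices; columns of V are orthonormal eigenvectors.
eigvals, V = np.linalg.eigh(M)

# Group eigenvectors by (numerically) distinct eigenvalues and build the
# projections P_k onto each eigenspace.
distinct = np.unique(np.round(eigvals, 10))
projections = []
for lam in distinct:
    cols = V[:, np.isclose(eigvals, lam)]
    projections.append(cols @ cols.conj().T)

# Reassemble M from the decomposition (*): M = sum_k lambda_k P_k.
M_rebuilt = sum(lam * P for lam, P in zip(distinct, projections))
assert np.allclose(M, M_rebuilt)

# Each P_k is an orthogonal projection: P_k^2 = P_k and P_k^* = P_k,
# and P_k P_l = 0 for k != l.
for k, P in enumerate(projections):
    assert np.allclose(P @ P, P)
    assert np.allclose(P.conj().T, P)
    for l, Q in enumerate(projections):
        if k != l:
            assert np.allclose(P @ Q, np.zeros_like(P))
```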
Clearly any Hermitian matrix is normal and it's easy to deduce that if $M$ is Hermitian then its eigenvalues are real: indeed, using its spectral decomposition (*), we have, from $M=M^*$,
$$
\sum_{k=1}^r \lambda_kP_k=\sum_{k=1}^r \lambda_k^*P_k
$$
and upon multiplying this by $P_l$ we get $\lambda_lP_l=\lambda_l^*P_l$; since $P_l$ is not the null matrix, it follows that $\lambda_l=\lambda_l^*$.
The statement that any Hermitian matrix has real eigenvalues can also be proved without the spectral theorem, directly from the definition: let $v$ be an eigenvector for the eigenvalue $\lambda$; then
$$
\lambda(v^*v)=v^*(\lambda v)=v^*Mv=v^*M^*v=(Mv)^*v=(\lambda v)^*v=\lambda^*(v^*v)
$$
Since $v^*v\ne0$, we get $\lambda=\lambda^*$.
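This fact is easy to observe numerically as well. Below, a random Hermitian matrix (a hypothetical example) is fed to the general eigenvalue routine, which makes no symmetry assumption, and the eigenvalues still come out real up to round-off:

```python
import numpy as np

rng = np.random.default_rng(0)
# Build a random Hermitian matrix B = C + C^* (hypothetical example).
C = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
B = C + C.conj().T

# np.linalg.eig makes no symmetry assumption, yet the eigenvalues of a
# Hermitian matrix are real up to numerical round-off.
eigvals = np.linalg.eig(B)[0]
assert np.allclose(eigvals.imag, 0.0, atol=1e-10)
```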
Another important fact about normal matrices is that they can be diagonalized with a unitary matrix; this is actually an equivalent condition.
A square matrix $M$ is normal if and only if there exists a unitary matrix $U$ (that is, $U^{-1}=U^*$) such that $D=U^{-1}MU$ is diagonal.
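A sketch of this statement in NumPy, for the Hermitian case (the matrix is again a random hypothetical example); `eigh` returns exactly such a unitary $U$:

```python
import numpy as np

rng = np.random.default_rng(1)
C = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
M = C + C.conj().T          # Hermitian, hence normal

# eigh returns the eigenvalues and a unitary U whose columns are
# orthonormal eigenvectors of M.
w, U = np.linalg.eigh(M)

# U is unitary: U^* U = I, i.e. U^{-1} = U^*.
assert np.allclose(U.conj().T @ U, np.eye(3))

# D = U^{-1} M U = U^* M U is diagonal, with the eigenvalues on the diagonal.
D = U.conj().T @ M @ U
assert np.allclose(D, np.diag(w), atol=1e-10)
```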
Now, suppose $M$ is Hermitian and positive definite, that is, $v^*Mv>0$ for every $v\ne0$. Let $e_k$ be the $k$-th vector of the canonical basis and set $v_k=Ue_k$; then
$$
0<v_k^*Mv_k=e_k^*U^*UDU^*Ue_k=e_k^*De_k
$$
and, clearly, $e_k^*De_k$ is the entry at place $(k,k)$ in $D$. Since $D$ has on its diagonal the eigenvalues of $M$, we are done.
Conversely, if $D$ has all its diagonal entries positive, then, for every $v\ne0$ we have
$$
v^*Mv=v^*UDU^*v=(U^*v)^*D(U^*v)>0
$$
because $D$ is positive definite and $U^*v\ne0$ ($U$ is invertible).
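Both directions of this equivalence can be checked numerically on a random positive definite matrix (a hypothetical example built as $G^\top G+I$, which is always positive definite):

```python
import numpy as np

rng = np.random.default_rng(2)
# A random positive definite symmetric matrix (hypothetical example):
# G^T G is positive semidefinite, and adding I makes it definite.
G = rng.normal(size=(4, 4))
M = G.T @ G + np.eye(4)

# Positive definite => all eigenvalues positive.
assert np.all(np.linalg.eigvalsh(M) > 0)

# Conversely, v^* M v > 0 for (randomly sampled) nonzero v.
for _ in range(100):
    v = rng.normal(size=4)
    assert v @ M @ v > 0
```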
Best Answer
1.)
Suppose $D\succ \mathbf 0$. Then
$$
2D-DAD = D^\frac{1}{2}\big(2I-D^\frac{1}{2}AD^\frac{1}{2}\big)D^\frac{1}{2},
$$
so $2D-DAD$ and $2I-D^\frac{1}{2}AD^\frac{1}{2}$ are congruent and, by Sylvester's Law of Inertia, they have the same signature.
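A numerical illustration of the congruence and of Sylvester's Law of Inertia, on random matrices (hypothetical example data; the matrix square root is computed via the eigendecomposition of $D$):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.normal(size=(n, n)); A = (A + A.T) / 2        # real symmetric A
G = rng.normal(size=(n, n)); D = G.T @ G + np.eye(n)  # D > 0

# Matrix square root of D via its eigendecomposition.
w, U = np.linalg.eigh(D)
D_half = U @ np.diag(np.sqrt(w)) @ U.T

X = 2 * D - D @ A @ D
Y = 2 * np.eye(n) - D_half @ A @ D_half

# Congruence: X = D^{1/2} Y D^{1/2}.
assert np.allclose(X, D_half @ Y @ D_half)

def signature(S, tol=1e-9):
    """(# positive, # negative, # zero) eigenvalues of symmetric S."""
    e = np.linalg.eigvalsh(S)
    return (np.sum(e > tol), np.sum(e < -tol), np.sum(np.abs(e) <= tol))

# Sylvester's Law of Inertia: congruent matrices share their signature.
assert signature(X) == signature(Y)
```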
2.)
Suppose $\operatorname{rank}\big(D\big) = r$ and, with the usual ordering, $d_{1,1}\geq d_{2,2}\geq\dots\geq d_{r,r}\gt d_{r+1, r+1}=0$. Then
$2D-DAD = D^\frac{1}{2}\big(2I-D^\frac{1}{2}AD^\frac{1}{2}\big)D^\frac{1}{2}\succeq \mathbf 0$
implies that the leading $r\times r$ submatrix of $2D-DAD$ is positive semidefinite and that the matrix is zero everywhere else. In particular, working with the pseudoinverse of $D^\frac{1}{2}$, this implies that the leading $r\times r$ submatrix of $\big(2I-D^\frac{1}{2}AD^\frac{1}{2}\big)$ is PSD; i.e. the leading $r\times r$ submatrices of $2D-DAD$ and $\big(2I-D^\frac{1}{2}AD^\frac{1}{2}\big)$ are congruent.
Conveniently, $\big(2I-D^\frac{1}{2}AD^\frac{1}{2}\big)$ is real symmetric and block diagonal (the last $n-r$ rows and columns of $D^\frac{1}{2}AD^\frac{1}{2}$ vanish), so its spectrum (and hence signature) splits into that of the leading $r\times r$ block, which we know is PSD, and that of the lower-right block, which is $2I_{n-r}$. Hence $\big(2I-D^\frac{1}{2}AD^\frac{1}{2}\big)\succeq \mathbf 0$.
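The block structure and the conclusion can be verified numerically on a hypothetical rank-$r$ example (here $A$ is scaled down so that the hypothesis $2D-DAD\succeq\mathbf 0$ holds):

```python
import numpy as np

rng = np.random.default_rng(4)
n, r = 4, 2
A = rng.normal(size=(n, n)); A = (A + A.T) / 2  # real symmetric A

# A rank-r PSD diagonal D (hypothetical example), ordered as in the text.
d = np.array([3.0, 1.0, 0.0, 0.0])
D = np.diag(d)
D_half = np.diag(np.sqrt(d))

Y = 2 * np.eye(n) - D_half @ A @ D_half

# The last n-r rows/columns of D^{1/2} A D^{1/2} vanish, so Y is block
# diagonal with lower-right block 2 I_{n-r}.
assert np.allclose(Y[r:, :r], 0) and np.allclose(Y[:r, r:], 0)
assert np.allclose(Y[r:, r:], 2 * np.eye(n - r))

# Scale A down so that 2D - DAD is PSD, then check that
# 2I - D^{1/2} A D^{1/2} is PSD as well (up to round-off).
A_small = A / (10 * np.linalg.norm(A, 2))
X = 2 * D - D @ A_small @ D
Y2 = 2 * np.eye(n) - D_half @ A_small @ D_half
assert np.all(np.linalg.eigvalsh(X) >= -1e-9)
assert np.all(np.linalg.eigvalsh(Y2) >= -1e-9)
```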