If $\lambda_n$ is the largest eigenvalue of the normalized Laplacian matrix of a graph $G$, then $$\lambda_n=\sup_{x\ne0}\frac{\sum_{u\sim v}\big(x(u)-x(v)\big)^2}{\sum_v\big(x(v)\big)^2\deg(v)}\le2,$$ where the sum in the numerator runs over the edges of $G$, since $$\big(x(u)-x(v)\big)^2\le2\Big(\big(x(u)\big)^2+\big(x(v)\big)^2\Big).$$ Equality therefore forces $x(u)=-x(v)$ for every edge $\{u,v\}$ in $G$. Now, since $x\ne0$, pick a vertex $u$ with $x(u)\ne0$; the sign of $x$ then alternates along every walk starting at $u$, so the connected component containing $u$ is bipartite, with the bipartition given by the signs of $x$. Hence $G$ has a connected bipartite component.
On the other hand, if $G$ has a connected bipartite component, then we can choose $x$ to be $+1$ on one side of the bipartition, $-1$ on the other, and $0$ elsewhere; every edge of that component then contributes $4$ to the numerator and $4$ (via the degrees) to the denominator's total of $2\cdot(\text{number of edges})\cdot 2$, so the quotient equals $2$ and $\lambda_n=2$.
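A quick numerical sketch of this equivalence (using NumPy; the helper `normalized_laplacian` is my own illustration, not part of the original answer): the 4-cycle is connected and bipartite, so its largest normalized-Laplacian eigenvalue is exactly $2$, while the triangle is not bipartite and stays strictly below $2$.

```python
import numpy as np

def normalized_laplacian(A):
    """I - D^{-1/2} A D^{-1/2} for an adjacency matrix with no isolated vertices."""
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.eye(len(A)) - D_inv_sqrt @ A @ D_inv_sqrt

# C4: connected and bipartite -> largest eigenvalue is 2
C4 = np.array([[0, 1, 0, 1],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [1, 0, 1, 0]], dtype=float)

# C3 (triangle): not bipartite -> largest eigenvalue stays below 2
C3 = np.array([[0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]], dtype=float)

print(np.linalg.eigvalsh(normalized_laplacian(C4)).max())  # ≈ 2.0
print(np.linalg.eigvalsh(normalized_laplacian(C3)).max())  # ≈ 1.5
```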
For (a), yes, the matrices in (2) and (3) are positive definite (assuming $G$ is connected). They are easily seen to be PSD, as any principal submatrix of a PSD matrix is again PSD. Moreover, the matrix-tree theorem you cite implies that the determinant of the submatrix in (2) is a positive integer, since any connected graph has at least one spanning tree; so that matrix is PSD and nonsingular, hence positive definite. Applying Cauchy's interlacing theorem then shows that the matrices in (3) are also positive definite. Of course, symmetry alone already implies diagonalizability.
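A small check of the matrix-tree argument (a NumPy sketch, assuming the matrix in (2) is the combinatorial Laplacian $\mathbf{L}=\Delta-A$ with one row and the matching column deleted, as in the matrix-tree theorem): for $K_4$ the reduced determinant equals $4^{4-2}=16$, the number of spanning trees by Cayley's formula, and all eigenvalues of the reduced matrix are positive.

```python
import numpy as np

# K4: complete graph on 4 vertices; Cayley's formula gives 4^(4-2) = 16 spanning trees
A = np.ones((4, 4)) - np.eye(4)
L = np.diag(A.sum(axis=1)) - A

# Delete row 0 and column 0, as in the matrix-tree theorem
L_reduced = L[1:, 1:]

print(round(np.linalg.det(L_reduced)))           # 16 spanning trees
print(np.linalg.eigvalsh(L_reduced).min() > 0)   # True: positive definite
```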
The matrix in (4) is not positive semi-definite in the traditional sense, as it is not symmetric, but it is nonetheless diagonalizable with nonnegative eigenvalues (see (c)).
For (b), the algebraic multiplicity of the zero eigenvalue of $\Delta^{-1}\mathbf{L}$ is the same as that of the zero eigenvalue of $\mathbf{L}$ (because, assuming $G$ has no isolated vertices, $\Delta^{-1}$ is invertible), which equals the number of connected components of $G$.
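To illustrate (a NumPy sketch, not part of the original answer): for a graph made of two disjoint edges, the zero eigenvalue of $\Delta^{-1}\mathbf{L}$ has multiplicity $2$, matching the two connected components.

```python
import numpy as np

# Two disjoint edges: 4 vertices, 2 connected components, no isolated vertices
A = np.zeros((4, 4))
A[0, 1] = A[1, 0] = 1
A[2, 3] = A[3, 2] = 1
D = np.diag(A.sum(axis=1))
L = D - A

# Random-walk Laplacian Delta^{-1} L: its zero eigenvalue has the same
# multiplicity as that of L, namely the number of components (here 2)
RW = np.linalg.inv(D) @ L
eig = np.linalg.eigvals(RW)
print(int(sum(abs(eig) < 1e-9)))  # 2
```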
For (c), again assuming no isolated vertices, we have $\Delta^{-1}\mathbf{L}=\Delta^{-1/2}\big(\Delta^{-1/2}\mathbf{L}\Delta^{-1/2}\big)\Delta^{1/2}$, so $\Delta^{-1}\mathbf{L}$ is similar to $\Delta^{-1/2}\mathbf{L}\Delta^{-1/2}$, which is easily seen to be PSD by inspecting the quadratic form and applying the change of variables $y=\Delta^{-1/2}x$. Thus $\Delta^{-1}\mathbf{L}$ is similar to a diagonalizable matrix with nonnegative eigenvalues, so the same holds for it.
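A numerical confirmation of this similarity (a NumPy sketch using the path $P_3$ as an example, not part of the original answer): the nonsymmetric $\Delta^{-1}\mathbf{L}$ and the symmetric $\Delta^{-1/2}\mathbf{L}\Delta^{-1/2}$ have identical spectra.

```python
import numpy as np

# Path graph P3; degrees are (1, 2, 1), so no isolated vertices
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
d = A.sum(axis=1)
L = np.diag(d) - A

RW = np.diag(1 / d) @ L                          # Delta^{-1} L (nonsymmetric)
sym = np.diag(d**-0.5) @ L @ np.diag(d**-0.5)    # Delta^{-1/2} L Delta^{-1/2} (symmetric)

print(np.sort(np.linalg.eigvals(RW).real))   # ≈ [0, 1, 2]
print(np.sort(np.linalg.eigvalsh(sym)))      # ≈ [0, 1, 2]: same spectrum
```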
For (d), see the above answers.
Best Answer
This is a quick exercise in matrix algebra:
$$ D^{-1/2} (D- A) D^{-1/2} = I - D^{-1/2} A D^{-1/2} $$
Then notice that $\lambda$ is an eigenvalue of $M$ if and only if $1-\lambda$ is an eigenvalue of $I-M$. In fact, the eigenvectors of $M$ corresponding to $\lambda$ are exactly the eigenvectors of $I-M$ corresponding to $1-\lambda$, and vice versa.
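This shift can be verified numerically (a NumPy sketch on a small example graph, not part of the original answer): the eigenvalues of $D^{-1/2}(D-A)D^{-1/2}$ are exactly $1-\mu$ for the eigenvalues $\mu$ of $M=D^{-1/2}AD^{-1/2}$.

```python
import numpy as np

# A small connected graph on 4 vertices (no isolated vertices)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
Dm = np.diag(d**-0.5)

M = Dm @ A @ Dm                   # D^{-1/2} A D^{-1/2}
N = Dm @ (np.diag(d) - A) @ Dm    # D^{-1/2} (D - A) D^{-1/2} = I - M

mu = np.sort(np.linalg.eigvalsh(M))
lam = np.sort(np.linalg.eigvalsh(N))
print(np.allclose(lam, 1 - mu[::-1]))  # True: spectra related by mu -> 1 - mu
```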