I know that $UU^{T}=\mathbf{I}$, but what about $\boldsymbol{\chi} \boldsymbol{\chi}^T$, where $\boldsymbol{\chi}$ contains the first $k$ eigenvectors of the normalized Laplacian $\mathbf{L}$?
Because $\chi^T\chi = I_k$, $\chi\chi^T$ is the orthogonal projection onto the span of the columns of $\chi$, i.e. onto the span of the first $k$ eigenvectors of $\mathbf{L}$.
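A small numeric check of this claim (the path-graph construction and all names here are my own, chosen just for illustration): build the normalized Laplacian of a 5-node path, take its first $k$ eigenvectors, and verify that $\chi\chi^T$ is symmetric, idempotent, and fixes the span of $\chi$, exactly as an orthogonal projection must.

```python
import numpy as np

n, k = 5, 2
# Adjacency matrix of the path graph on 5 nodes
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
d = A.sum(axis=1)
# Normalized Laplacian L = I - D^{-1/2} A D^{-1/2}
L = np.eye(n) - A / np.sqrt(np.outer(d, d))

w, U = np.linalg.eigh(L)   # eigenvalues ascending, orthonormal eigenvectors
chi = U[:, :k]             # first k eigenvectors as columns
P = chi @ chi.T

assert np.allclose(chi.T @ chi, np.eye(k))  # chi^T chi = I_k
assert np.allclose(P, P.T)                  # symmetric
assert np.allclose(P @ P, P)                # idempotent: a projection
assert np.allclose(P @ chi, chi)            # acts as identity on the span of chi
```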
Assume that I sum up all elements of the matrix $\boldsymbol{\chi} \boldsymbol{\chi}^T $ (call it $S$). Is there any condition to ensure that the sum $S$ decreases or takes the smallest possible value?
Let $\chi_k$ denote the matrix $\chi$ constructed above with $k$ columns. Let $e = (1,1,\dots,1)^T$. The sum of the elements of $\chi_k\chi_k^T$ is equal to
$$
S_k = e^T[\chi_k\chi_k^T]e = [e^T\chi_k][\chi_k^Te] = (\chi_k^Te)^T(\chi_k^Te) = \sum_{j=1}^k (e^Tv_j)^2,
$$
where $v_j$ denotes the $j$th unit eigenvector (i.e. the $j$th column of $\chi_k$). Note that $e^Tv_j$ is the sum of the entries of $v_j$. Since each term in the sum is non-negative, $S_k$ is non-decreasing as $k$ increases; it takes its smallest possible value, $S_k = 0$, exactly when each of $v_1, \dots, v_k$ is orthogonal to $e$.
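The identity above is easy to verify numerically; here is a sketch (the cycle graph is my own choice, picked because every vertex has degree $2$, which keeps the normalized Laplacian simple). For each $k$ we compute $S_k = e^T(\chi_k\chi_k^T)e$ directly and compare it with the cumulative sum $\sum_{j \le k}(e^Tv_j)^2$, then check monotonicity in $k$.

```python
import numpy as np

n = 6
# Adjacency matrix of the cycle graph on 6 nodes (every degree is 2)
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
L = np.eye(n) - A / 2.0            # normalized Laplacian, since d_i = 2

w, U = np.linalg.eigh(L)           # eigenvalues ascending
e = np.ones(n)

# S_k computed directly from chi_k chi_k^T, for k = 1, ..., n
S = [e @ (U[:, :k] @ U[:, :k].T) @ e for k in range(1, n + 1)]

# Matches the closed form sum_{j<=k} (e^T v_j)^2, and is non-decreasing
assert np.allclose(S, np.cumsum((U.T @ e) ** 2))
assert all(S[i] <= S[i + 1] + 1e-12 for i in range(n - 1))
```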
Is there any relation between $S$, $\boldsymbol{\chi} \boldsymbol{\chi}^T$, and the topology of the graph?
There is no relation to the topology of the graph that I can see. If you're interested in a more thorough answer, then this question should probably be asked as its own, separate post.
Let $A$ be any $n \times n$ matrix, symmetric or not, over any field $\Bbb F$, and suppose that $A$ is possessed of $n$ eigenvectors $v_1$, $v_2$, $\ldots$, $v_n$ with the corresponding eigenvalues $\lambda_1$, $\lambda_2$, $\ldots$, $\lambda_n$ such that
$Av_i = \lambda_i v_i, \; 1 \le i \le n; \tag 1$
with our OP user 774633 we define the matrix whose columns are the eigenvectors $v_i$:
$V = [v_1 \; v_2 \; \ldots \; v_n], \tag 2$
and observe that
$AV = [Av_1 \; Av_2 \; \ldots \; Av_n]; \tag 3$
now in accord with (1) we may write
$AV = [\lambda_1 v_1 \; \lambda_2 v_2 \; \ldots \; \lambda_n v_n]. \tag 4$
Now again in accord with our OP we set
$D = \text{diag}[\lambda_1 \; \lambda_2 \; \ldots\; \lambda_n], \tag 5$
and we have
$VD = [v_1 \; v_2 \; \ldots \; v_n]\text{diag}[\lambda_1 \; \lambda_2 \; \ldots\; \lambda_n], \tag 6$
and if we write $V$ and $D$ in full matrix form (which explicitly presents every element) we obtain
$V = \begin{bmatrix} v_{11} & v_{12} & \ldots & v_{1n} \\
v_{21} & v_{22} & \ldots & v_{2n} \\
\vdots & \vdots & \ldots & \vdots \\
v_{n1} & v_{n2} & \ldots & v_{nn} \end{bmatrix} = [v_{ij}], \tag 7$
and
$D = \begin{bmatrix} \lambda_1 & 0 & 0 & \ldots & 0 \\
0 & \lambda_2 & 0 & \ldots & 0 \\
0 & 0 & \lambda_3 & \ldots & 0 \\
\vdots & \vdots & \vdots & \ldots & \vdots \\
0 & 0 & 0 & \ldots & \lambda_n \end{bmatrix} = [\delta_{ij} \lambda_i]. \tag 8$
Note that the row indices in the matrix (7) are the first or $i$ indices in the entries $v_{ij}$, in conformance with the standard practice for writing out matrices; adopting this convention clarifies the ensuing calculations.
We may thus exploit the ordinary rule for matrix multiplication and we find
$VD = [v_{ij}][\delta_{ij} \lambda_i] = \left [ \displaystyle \sum_{k = 1}^n v_{ik}\delta_{kj}\lambda_k \right] = [v_{ij} \lambda_j], \tag 9$
and it is clear that
$[v_{ij} \lambda_j] = [\lambda_1 v_1 \; \lambda_2 v_2 \; \ldots \; \lambda_n v_n], \tag{10}$
whence, taking (3),(4) and (9) in concert, we arrive at
$AV = VD, \tag{11}$
the requisite relation 'twixt $A$, $V$, and $D$.
Perhaps a somewhat more elegant demonstration of (11) may be had via the observation that the $j$-th column of the matrix $D$ is in fact the vector
$\mathbf e_j = [\delta_{ij}], \tag{12}$
consisting of all $0$s save for a single $1$ in the $j$-th row, multiplied by the scalar $\lambda_j$, that is, $\lambda_j \mathbf e_j$. We may thus write
$D = \begin{bmatrix}\lambda_1 \mathbf e_1 & \lambda_2 \mathbf e_2 & \ldots & \lambda_n \mathbf e_n \end{bmatrix}; \tag{13}$
it is furthermore easy to see that
$V\mathbf e_i = v_i, \tag{14}$
whence
$VD = V\begin{bmatrix}\lambda_1 \mathbf e_1 & \lambda_2 \mathbf e_2 & \ldots & \lambda_n \mathbf e_n \end{bmatrix} = \begin{bmatrix}\lambda_1 V\mathbf e_1 & \lambda_2 V\mathbf e_2 & \ldots & \lambda_n V\mathbf e_n \end{bmatrix}$
$= \begin{bmatrix}\lambda_1 v_1 & \lambda_2 v_2 & \ldots & \lambda_n v_n \end{bmatrix} = \begin{bmatrix} Av_1 & Av_2 & \ldots & Av_n \end{bmatrix} = AV \tag{15}$
in accord with (3) and (4).
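Relation (11) is also easy to confirm numerically; a minimal sketch (the example matrix is my own, chosen upper triangular with distinct diagonal entries so that $n$ independent eigenvectors are guaranteed):

```python
import numpy as np

# A non-symmetric matrix with distinct eigenvalues 2, 3, 5
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

lam, V = np.linalg.eig(A)   # columns of V are the eigenvectors v_i
D = np.diag(lam)            # D = diag(lambda_1, ..., lambda_n)

assert np.allclose(A @ V, V @ D)   # the requisite relation AV = VD
```

Note that symmetry of $A$ is not required, only the existence of $n$ eigenvectors, in keeping with the hypotheses above.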
Best Answer
Yes! Note that $P' = P^{-1}$. In general, the eigenvalues of $P \Lambda P^{-1}$ are the same as the eigenvalues of $\Lambda$, even if $\Lambda$ is not diagonal and $P$ is not orthogonal. To see this, note that their characteristic polynomials are the same:
$$\det(tI - P \Lambda P^{-1}) = \det (tPIP^{-1} - P\Lambda P^{-1}) = \det( P(tI-\Lambda)P^{-1}) = \det (P) \det (tI-\Lambda) \det(P^{-1}) = \det (tI-\Lambda).$$
For symmetry of $P\Lambda P'$ we only need $\Lambda' = \Lambda$, which certainly holds when $\Lambda$ is diagonal. Then, $$(P\Lambda P')' = (P')' \Lambda' P' = P \Lambda P'.$$
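Both facts can be checked numerically; here is a sketch (the specific $\Lambda$ and the QR-based construction of a random orthogonal $P$ are my own choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
Lam = np.diag([1.0, 4.0, 9.0])                    # diagonal Lambda
P, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # a random orthogonal P

M = P @ Lam @ P.T   # here P' = P^{-1}, so this is a similarity transform

# Same eigenvalues as Lambda (eigvalsh returns them in ascending order)
assert np.allclose(np.linalg.eigvalsh(M), [1.0, 4.0, 9.0])
# And M is symmetric, since Lambda is
assert np.allclose(M, M.T)
```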