Eigenvectors matrix multiplied by its transpose $\boldsymbol{\chi} \boldsymbol{\chi}^T $

calculus, eigenvalues-eigenvectors, graph theory, linear algebra, real-analysis

  • Let $V$ be the set of datapoints, and assume that each point is represented by a vertex. Then, given a similarity matrix $ \mathbf{M}$, we define a graph $G = (V, \mathbf{M})$ generated using $k$-nearest neighbors.

  • Let $\mathbf{D}$ denote the diagonal degree matrix of $\mathbf{M}$. Then, we define the normalized weight matrix $\mathbf{W}$ using $\mathbf{D}$, so that

\begin{equation*}
\mathbf{W}= \mathbf{D}^{-\frac{1}{2}} \mathbf{M} \mathbf{D}^{-\frac{1}{2}}.
\end{equation*}

  • Therefore, we define the normalized Laplacian matrix of $G$ as
    \begin{equation*}
    \mathbf{L} = \mathbf{I} - \mathbf{W} = \mathbf{I} - \mathbf{D}^{-\frac{1}{2}} \mathbf{M} \mathbf{D}^{-\frac{1}{2}},
    \end{equation*}

    where $\mathbf{I}$ is the identity matrix.

  • Since the normalized Laplacian matrix $\mathbf{L}$ is positive semi-definite, it can be decomposed into an orthogonal matrix of eigenvectors $\mathbf{U}=[u_1,\dots,u_n]$ and a diagonal matrix of eigenvalues $\mathbf{\Lambda}=\operatorname{diag}(\lambda_1,\dots,\lambda_n)$, represented as follows:
    \begin{equation*}
    \mathbf{L}= \mathbf{U}\Lambda \mathbf{U}^{T}.
    \end{equation*}

In the graph setting, the eigenvalues of $ \mathbf{L}$ can be interpreted as graph frequencies, and they always lie in the interval $[0, 2]$.
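As a quick numerical check (a minimal sketch with NumPy, using a 4-vertex path graph's adjacency matrix as a stand-in for the similarity matrix $\mathbf{M}$), one can build $\mathbf{L}$ as above and confirm both the eigendecomposition and the $[0, 2]$ range of the spectrum:

```python
import numpy as np

# A small symmetric similarity matrix M (adjacency of a 4-vertex path
# graph, standing in for a k-NN similarity matrix).
M = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

D_inv_sqrt = np.diag(1.0 / np.sqrt(M.sum(axis=1)))  # D^{-1/2}
W = D_inv_sqrt @ M @ D_inv_sqrt                     # normalized weight matrix
L = np.eye(4) - W                                   # normalized Laplacian

# L is symmetric PSD, so eigh returns real eigenvalues in ascending
# order and an orthogonal matrix of eigenvectors.
lam, U = np.linalg.eigh(L)

print(lam)                                     # all eigenvalues lie in [0, 2]
print(np.allclose(U @ np.diag(lam) @ U.T, L))  # L = U Λ U^T  → True
```

For this path graph the spectrum works out to $\{0, 0.5, 1.5, 2\}$, hitting both endpoints of the $[0, 2]$ interval.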


  • Now, let $\mathbf{H}=\mathbf{U} h(\mathbf{\Lambda}) \mathbf{U}^{T}$, where $h(\mathbf{\Lambda})=\operatorname{diag}(h(\lambda_1),\dots,h(\lambda_n))$.

  • I assume that $h(\lambda)=1$ if $\lambda \leq \lambda_k$ and $h(\lambda)=0$ otherwise.

Therefore, $\mathbf{U} h(\mathbf{\Lambda}) \mathbf{U}^{T}= \boldsymbol{\chi} \boldsymbol{\chi}^T $, where $\boldsymbol{\chi} \in \mathbb{R}^{n\times k}$ contains the first $k$ eigenvectors of the normalized Laplacian $\mathbf{L}$.
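This identity is easy to verify numerically. The sketch below (using the same toy 4-vertex path-graph Laplacian as an assumed example, with $k = 2$) builds the ideal low-pass filter $h$ and compares $\mathbf{U} h(\mathbf{\Lambda}) \mathbf{U}^T$ with $\boldsymbol{\chi}\boldsymbol{\chi}^T$:

```python
import numpy as np

# Toy normalized Laplacian of a 4-vertex path graph (assumed example).
M = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = M.sum(axis=1)
L = np.eye(4) - M / np.sqrt(np.outer(d, d))   # I - D^{-1/2} M D^{-1/2}

lam, U = np.linalg.eigh(L)                    # ascending eigenvalues

k = 2
h = (lam <= lam[k - 1]).astype(float)  # h(λ)=1 for λ ≤ λ_k, else 0
H = U @ np.diag(h) @ U.T               # filtered operator U h(Λ) U^T

chi = U[:, :k]                         # χ: first k eigenvectors of L
print(np.allclose(H, chi @ chi.T))     # → True
```

Note that the indicator $h$ picks out exactly the first $k$ eigenvectors only when $\lambda_k < \lambda_{k+1}$; with a repeated eigenvalue at the cutoff, the filter would include the whole eigenspace.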

My questions:

  1. I know that $UU^{T}=\mathbf{I}$, but what about $\boldsymbol{\chi} \boldsymbol{\chi}^T$, where $\boldsymbol{\chi}$ contains the first $k$ eigenvectors of the normalized Laplacian $\mathbf{L}$?

  2. Assume that I sum up all elements of the matrix $\boldsymbol{\chi} \boldsymbol{\chi}^T $ (call it $S$). Is there any condition to ensure that the sum $S$ decreases or takes the smallest possible value?

  3. Is there any relation between $S$, $\boldsymbol{\chi} \boldsymbol{\chi}^T$, and the topology of the graph?

Best Answer

I know that $UU^{T}=\mathbf{I}$, but what about $\boldsymbol{\chi} \boldsymbol{\chi}^T$, where $\boldsymbol{\chi}$ contains the first $k$ eigenvectors of the normalized Laplacian $\mathbf{L}$?

Because $\chi^T\chi = I_k$, the matrix $\chi\chi^T$ is the orthogonal projection onto the span of the columns of $\chi$, i.e. onto the span of the first $k$ eigenvectors of $\mathbf{L}$.
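The projection properties depend only on $\chi$ having orthonormal columns, so they can be sketched with any such matrix (here, as an assumed example, the first $k$ columns of a random orthogonal matrix):

```python
import numpy as np

# Any matrix with orthonormal columns yields an orthogonal projector.
# Take the first k columns of a random orthogonal matrix as χ.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
k = 2
chi = Q[:, :k]

P = chi @ chi.T
print(np.allclose(chi.T @ chi, np.eye(k)))  # χ^T χ = I_k        → True
print(np.allclose(P @ P, P))                # idempotent: P² = P → True
print(np.allclose(P, P.T))                  # symmetric          → True
```

Idempotence plus symmetry is exactly the characterization of an orthogonal projection; unlike $UU^T = \mathbf{I}$, the matrix $\chi\chi^T$ has rank $k < n$, so it is not the identity.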

Assume that I sum up all elements of the matrix $\boldsymbol{\chi} \boldsymbol{\chi}^T $ (call it $S$). Is there any condition to ensure that the sum $S$ decreases or takes the smallest possible value?

Let $\chi_k$ denote the matrix $\chi$ constructed above with $k$ columns, and let $e = (1,1,\dots,1)^T$. The sum of the elements of $\chi_k\chi_k^T$ is equal to $$ S_k = e^T[\chi_k\chi_k^T]e = [e^T\chi_k][\chi_k^Te] = (\chi_k^Te)^T(\chi_k^Te) = \sum_{j=1}^k (e^Tv_j)^2, $$ where $v_j$ denotes the $j$th unit eigenvector (i.e. the $j$th column of $\chi_k$), so $e^Tv_j$ is the sum of the entries of $v_j$. Since each term in the sum is non-negative, $S_k$ is non-decreasing in $k$; in particular, $S_n = e^Te = n$, because $\chi_n\chi_n^T = \mathbf{I}$.
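A small numerical sketch (on an arbitrary assumed 4-vertex similarity matrix) confirming both the identity $S_k = \sum_{j \le k} (e^T v_j)^2$ and the monotonicity:

```python
import numpy as np

# Arbitrary small symmetric similarity matrix (assumed example).
M = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = M.sum(axis=1)
L = np.eye(4) - M / np.sqrt(np.outer(d, d))   # normalized Laplacian
lam, U = np.linalg.eigh(L)                    # ascending eigenvalues

e = np.ones(4)
S = []
for k in range(1, 5):
    chi = U[:, :k]
    S_k = np.sum(chi @ chi.T)                       # sum of all entries
    assert np.isclose(S_k, np.sum((chi.T @ e) ** 2))  # matches the identity
    S.append(S_k)

print(S)                             # non-decreasing; S_4 = 4 since χ_4 χ_4^T = I
print(np.all(np.diff(S) >= -1e-12))  # → True
```

The sequence is flat at step $k$ exactly when the $k$th eigenvector sums to zero, which is the only way $S_k$ can fail to strictly increase.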

Is there any relation between $S$, $\boldsymbol{\chi} \boldsymbol{\chi}^T$, and the topology of the graph?

There is no relation to the topology of the graph that I can see. If you're interested in a more thorough answer, then this question should probably be asked as its own, separate post.