[Math] Dot product of the column vectors from a matrix and their transposes through matrix multiplication

inner-products, linear-algebra, products, sums-of-squares

I have a matrix of data in which each data set is a column vector. For every column I want the dot product of that column with itself. If I multiply the transpose of the matrix by the matrix itself, the values I want appear on the diagonal of the result. Is there a way to perform this calculation and obtain only the vector of those diagonal elements? Each diagonal element is the sum of the squares of the entries of the corresponding column.

Original $m \times n$ matrix $M$ with $n$ data sets of $m$ points each:

$M = \left[ \begin{array}{ccc} M_{1,1} & \cdots & M_{1,n} \\ \vdots & \ddots & \vdots \\ M_{m,1} & \cdots & M_{m,n} \end{array} \right] = \left[ \begin{array}{ccc} \vert & & \vert \\ c_1 & \cdots & c_n \\ \vert & & \vert \end{array} \right]$

Desired $1 \times n$ matrix holding the dot products $c_i^T \cdot c_i$, i.e. the sum of the squares of each column's entries:

$R = \left[ \begin{array}{ccc} c_1^T \cdot c_1 & \cdots & c_n^T \cdot c_n \end{array} \right]$

Which are the diagonal elements of:

$M^T \cdot M = S = \left[ \begin{array}{ccc} R_1 & \cdots & \ast \\ \vdots & \ddots & \vdots \\ \ast & \cdots & R_n \end{array} \right]$

How do I calculate just the diagonal and get that vector, instead of the whole matrix $S$?
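
For concreteness, here is a minimal NumPy sketch of the computation described above (the data and variable names are illustrative, not from the original post): form the full product and then keep only its diagonal.

```python
import numpy as np

# Illustrative data: m points per data set, n data sets, one data set per column of M.
m, n = 5, 3
M = np.arange(m * n, dtype=float).reshape(m, n)

# The approach described above: form the full n x n matrix S = M^T M,
# then keep only its diagonal, i.e. the sum of squares of each column.
S = M.T @ M
R = np.diag(S)
print(R)  # one entry per column of M
```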

Best Answer

Do you need the answer in terms of computational complexity? I can only come up with a non-closed form like this (which is pretty useless, but answers your question to some extent): $$ R = \sum_{i=1}^n \left( e_i \cdot M^T \cdot M \cdot e_i^T \right) e_i, $$ where $e_i = (0 \ \ldots \ 1_i \ \ldots \ 0)$ is the $i$-th standard basis row vector. Each factor $e_i M^T M e_i^T$ is the scalar $S_{i,i} = c_i^T c_i$, so the sum assembles exactly the row vector $R$.
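
In practice the diagonal can be computed directly, without ever forming $S$: square the entries of $M$ elementwise and sum down each column, which yields $c_i^T c_i$ for every column at once. A short NumPy sketch, assuming the data sit in an array `M` with one data set per column (the array contents are illustrative):

```python
import numpy as np

# Illustrative data, one data set per column.
M = np.arange(15, dtype=float).reshape(5, 3)

# Diagonal of M^T M without computing the off-diagonal entries:
# square every entry and sum down each column.
R = np.sum(M**2, axis=0)

# Equivalent formulation with einsum: contract each column with itself.
R_einsum = np.einsum('ij,ij->j', M, M)

# Both agree with the diagonal of the full product.
assert np.allclose(R, np.diag(M.T @ M))
assert np.allclose(R_einsum, R)
```

This costs $O(mn)$ work instead of the $O(mn^2)$ needed to build the whole matrix $S$.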