[Math] Tensor similarity measure

eigenvalues-eigenvectors, tensor-products

In some practical applications, such as diffusion tensor imaging (DTI), the diffusion data is often represented by a symmetric positive definite second-order tensor (essentially a 3×3 matrix). The similarity between two diffusion tensors is then measured by the tensor inner product
$$s(A,B) = \sum_{i=1}^3 \sum_{j=1}^3 \lambda^A_i \lambda^B_j \, \vec{e}^A_i \cdot \vec{e}^B_j,$$
where the $\vec{e}^A_i$ are the eigenvectors of $A$ and the $\lambda^A_i$ the corresponding eigenvalues.
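For concreteness, here is a minimal NumPy sketch of this measure as I read the formula (the function name is my own; I use `np.linalg.eigh`, which returns eigenvectors as columns of an orthogonal matrix):

```python
import numpy as np

def tensor_similarity(A, B):
    """s(A,B) = sum_{i,j} lam_i^A lam_j^B (e_i^A . e_j^B),
    with the plain (unsquared) dot product as written in the question."""
    lamA, EA = np.linalg.eigh(A)   # columns of EA are eigenvectors of A
    lamB, EB = np.linalg.eigh(B)
    dots = EA.T @ EB               # dots[i, j] = e_i^A . e_j^B
    return np.sum(np.outer(lamA, lamB) * dots)
```

Note that with the unsquared dot product the value depends on the (arbitrary) sign chosen for each eigenvector, which is one reason the squared form in the answer below may be preferable.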

In this context, I have two questions:

  1. What is the reasoning behind using the dot product (cosine similarity) between the respective eigenvectors of the two tensors? As far as I understand, this measures how aligned the eigenvectors of the two tensors are, but I'm having trouble understanding its geometric interpretation. A simple geometric picture would be very helpful (for instance, the dot product of two unit vectors can be interpreted as the cosine of the angle between them).
  2. What similarity measure can we use if one (or both) of the matrices is singular?

Best Answer

A natural measure for the similarity of two vectors is their inner product, and it has a geometric interpretation. Matrices can also be regarded as vectors, and the corresponding inner product is $$ A:B = \sum_{i,j}a_{ij}b_{ij} = \operatorname{Tr}(A^TB). $$ If the matrices $A$ and $B$ are symmetric, they can be written in terms of their eigenvectors and eigenvalues, and thus \begin{eqnarray} A:B &=& A^T:B \\&=& \operatorname{Tr}(AB) \\&=& \operatorname{Tr}\left(\left[\sum_i\lambda_i^Ae_i^A(e_i^A)^T\right]\left[\sum_j\lambda_j^Be_j^B(e_j^B)^T\right]\right) \\&=& \sum_{i,j}\lambda_i^A\lambda_j^B\operatorname{Tr}\left(e_i^A(e_i^A)^Te_j^B(e_j^B)^T\right) \\&=& \sum_{i,j}\lambda_i^A\lambda_j^B\operatorname{Tr}\left((e_i^A)^Te_j^B(e_j^B)^Te_i^A\right) \\&=& \sum_{i,j}\lambda_i^A\lambda_j^B\left(e_i^A\cdot e_j^B\right)^2. \end{eqnarray} This is very close to $s(A,B)$, although not the same. If the eigenvectors of the two matrices are equal, then $s(A,B)=A:B$. This is admittedly not a full answer but might give some ideas about the meaning of $s(A,B)$.
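The identity derived above can be checked numerically. The sketch below (the helper `rand_spd` is my own construction for generating test matrices) compares $\operatorname{Tr}(A^TB)$ against the eigendecomposition form $\sum_{i,j}\lambda_i^A\lambda_j^B(e_i^A\cdot e_j^B)^2$:

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_spd(n=3):
    """Random symmetric positive definite matrix (illustrative helper)."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = rand_spd(), rand_spd()
lamA, EA = np.linalg.eigh(A)       # columns of EA are eigenvectors of A
lamB, EB = np.linalg.eigh(B)

frob = np.trace(A.T @ B)           # A:B = Tr(A^T B)
spectral = np.sum(np.outer(lamA, lamB) * (EA.T @ EB) ** 2)

print(np.isclose(frob, spectral))  # the two expressions agree
```

Because the dot products appear squared here, the result is independent of the sign ambiguity in the eigenvectors, unlike $s(A,B)$ with unsquared dot products.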
