Thanks to everyone who kindly answered or commented on my question; it was helpful.
I found two ways to solve my problem.
1. The RV coefficient.
Take each column of a matrix as an independent realization of a random vector. So, to compare the matrices $A_1 \in \mathbb{R}^{n \times k}$ and $A_2 \in \mathbb{R}^{m \times k}$, with $m,n \in \mathbb{N}^+$, I turn this problem into measuring the dependence of two random vectors $\mathbf{a_1} \in \mathbb{R}^n$ and $\mathbf{a_2} \in \mathbb{R}^m$, of which $A_{1}$ and $A_{2}$ hold the $k$ independent realizations; both matrices are assumed to be centered.
The RV coefficient is defined as follows:
$$ RV(X,Y)=\frac{\operatorname{tr}(XX^{'}YY^{'})}{\sqrt{\operatorname{tr}\big((XX^{'})^{2}\big)\operatorname{tr}\big((YY^{'})^{2}\big)}}$$
Substituting $X= A_{1}^{'}$ and $Y= A_{2}^{'}$ then gives the linear dependence.
However, this coefficient can only measure the linear dependence of two random vectors, so even if it equals zero, you can only say the two vectors have no linear relationship with each other.
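A minimal NumPy sketch of the RV computation above (the helper name `rv_coefficient` and the toy data are illustrative, not from the references):

```python
import numpy as np

def rv_coefficient(A1, A2):
    """RV coefficient between A1 (n x k) and A2 (m x k), whose k columns
    are independent realizations of the random vectors a1 and a2."""
    # Substitute X = A1', Y = A2' so rows are the k realizations,
    # then center each variable across realizations.
    X = A1.T - A1.T.mean(axis=0)    # k x n
    Y = A2.T - A2.T.mean(axis=0)    # k x m
    XXt = X @ X.T                   # k x k
    YYt = Y @ Y.T                   # k x k
    num = np.trace(XXt @ YYt)
    den = np.sqrt(np.trace(XXt @ XXt) * np.trace(YYt @ YYt))
    return num / den

rng = np.random.default_rng(0)
A1 = rng.standard_normal((5, 200))                          # n=5, k=200
A2 = 2.0 * A1[:3] + 0.1 * rng.standard_normal((3, 200))     # linear link
print(rv_coefficient(A1, A2))                               # well above 0
print(rv_coefficient(A1, rng.standard_normal((4, 200))))    # near 0
```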
2. The dCov coefficient.
This coefficient can be applied to two matrices with different numbers of rows (random vectors of different dimensions); the number of columns must match, since the empirical definition below pairs samples $i$ and $j$ across $X$ and $Y$.
Definition of the empirical distance covariance:
$$ dCov_n^{2}(X,Y)=\frac{1}{n^{2}} \sum_{i,j=1}^{n} (d_{ij}^X-d_{i.}^{X}-d_{.j}^{X}+d_{..}^{X})(d_{ij}^Y-d_{i.}^{Y}-d_{.j}^{Y}+d_{..}^{Y}) $$
where $d_{ij}$ is the Euclidean distance between samples $i$ and $j$ of the corresponding random vector, $d_{i.}= \frac{1}{n}\sum_{j=1}^{n}d_{ij}$, $d_{.j}= \frac{1}{n}\sum_{i=1}^{n}d_{ij}$, and $d_{..}= \frac{1}{n^2}\sum_{i,j=1}^{n}d_{ij}$.
The empirical distance correlation:
$$dCor_n^{2}(X,Y)=\frac{dCov_n^{2}(X,Y)}{\sqrt{dCov_n^{2}(X,X)dCov_n^{2}(Y,Y)}}$$
I used $dCor_n^{2}$ to measure the similarity, and it works better than the Euclidean distance when the matrices are the same size.
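A matching NumPy/SciPy sketch of $dCov_n^{2}$ and $dCor_n^{2}$ (function names are illustrative; rows of the inputs are the $n$ paired samples, so pass the transposed matrices from the convention above):

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def _double_center(D):
    # d_ij - d_i. - d_.j + d_..
    return (D - D.mean(axis=1, keepdims=True)
              - D.mean(axis=0, keepdims=True) + D.mean())

def dcov2(X, Y):
    """Empirical squared distance covariance; X is n x p, Y is n x q,
    with rows as the n paired samples."""
    A = _double_center(squareform(pdist(X)))   # Euclidean distance matrix
    B = _double_center(squareform(pdist(Y)))
    return (A * B).mean()                      # (1/n^2) sum_ij A_ij B_ij

def dcor2(X, Y):
    """Empirical squared distance correlation."""
    return dcov2(X, Y) / np.sqrt(dcov2(X, X) * dcov2(Y, Y))

rng = np.random.default_rng(1)
A1 = rng.standard_normal((5, 300))                           # n=5, k=300
A2 = np.sin(A1[:3]) + 0.05 * rng.standard_normal((3, 300))   # nonlinear link
print(dcor2(A1.T, A2.T))   # clearly above 0, unlike a purely linear measure
```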
References:
Josse, J. and Holmes, S. (2013). Measures of dependence between random vectors and tests of independence: a literature review. arXiv preprint arXiv:1307.7383. http://arxiv.org/abs/1307.7383
Székely, G. J., Rizzo, M. L., and Bakirov, N. K. (2007). Measuring and testing dependence by correlation of distances. The Annals of Statistics, 35(6): 2769-2794.
Best Answer
It sounds like you want an inner product such that a vector dotted with itself gives its squared magnitude (norm). For real vectors $\mathbf{x}$ and $\mathbf{y}$ this is $$\langle\mathbf{x},\mathbf{y}\rangle := \mathbf{x}^\top \mathbf{y}$$
Note that a norm should always return a non-negative real value so that $$\langle\mathbf{x},\mathbf{x}\rangle \ge 0$$
Note also that if $\mathbf{x}$ has complex entries, this definition fails to induce a norm. To see this, consider $\mathbf{x}^\top = (i,0,0,\dots)$, giving $$\mathbf{x}^\top \mathbf{x} = -1$$
Thus for complex vectors the inner product is slightly different. To obtain a norm over the complex field, define $$\langle\mathbf{x},\mathbf{y}\rangle := \mathbf{x}^\dagger \mathbf{y}$$ where $\dagger$ denotes the conjugate transpose rather than the plain transpose. In that case, for the example $\mathbf{x}^\top = (i,0,0,\dots)$, \begin{align} \mathbf{x}^\dagger \mathbf{x} &= (-i,0,0,\dots) (i,0,0,\dots)^\top\\ & = -i \cdot i \\ &= -(-1)\\ & = 1 \\ \end{align}
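A quick NumPy check of the two definitions on that example (`np.vdot` conjugates its first argument, so it computes $\mathbf{x}^\dagger\mathbf{y}$):

```python
import numpy as np

x = np.array([1j, 0.0, 0.0])

print(x @ x)          # (-1+0j): plain transpose fails to give a length
print(np.vdot(x, x))  # (1+0j):  conjugate transpose gives |x|^2 >= 0
```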
In terms of your desire to measure their "similarity", I am supposing you want the angle between them, which (for real vectors) requires solving the formula $$\cos\theta = \frac{\mathbf{x}^\top \mathbf{y}}{\sqrt{(\mathbf{x}^\top\mathbf{x})(\mathbf{y}^\top\mathbf{y})}}$$
Or more simply for $\vert \mathbf{x} \vert = 1$ and $\vert \mathbf{y} \vert = 1$
$$\cos\theta = \mathbf{x}^\top \mathbf{y}$$
The magnitude of the right-hand side lies between zero and one. Zero means the two vectors are orthogonal ($90$ degrees, or $\pi/2$ radians); a magnitude of one means they are scalar multiples of each other.
For complex vectors, the magnitude still gives the "similarity" between them, while the complex argument gives the phase factor required to fully reach that similarity. In other words, if your result is $\mathbf{x}^\dagger\mathbf{y} = -i$, then $i\mathbf{x}$ is a real scalar multiple of $\mathbf{y}$.
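A small sketch putting the real and complex cases together (the helper name `cosine_similarity` is illustrative):

```python
import numpy as np

def cosine_similarity(x, y):
    # x† y / (|x| |y|); np.vdot conjugates x, so complex input is handled
    return np.vdot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))

# Real case: the usual cosine of the angle between the vectors.
x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0]) / np.sqrt(2)
print(cosine_similarity(x, y))     # 0.7071... = cos(pi/4)

# Complex case from the text: the result is -i, with magnitude 1,
# and indeed i*x equals -y, a real scalar multiple of y.
x = np.array([1j, 0.0])
y = np.array([1.0, 0.0])
c = cosine_similarity(x, y)
print(c, abs(c))                   # (-0-1j) 1.0
```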