Do matrices of vectors exist as a mathematical notion, and what applications might they have?

matrices, vectors

As a thought experiment, I was exploring the possibility of replacing the scalar entries of a square matrix with vectors to make a "vector matrix", and I've made a few findings that I'd like to share. I don't know if this concept already exists or whether it has useful applications, so I'd like some feedback on it. To begin, say you have a 2$\times$2 matrix of the form:

$$\begin{pmatrix}
\vec{a} & \vec{b} \\
\vec{c} & \vec{d}
\end{pmatrix}$$

The determinant of such a matrix can then be defined through the dot products of the vectors (assuming the vectors all have the same number of coordinates). This "dot product determinant" of the 2$\times$2 matrix would be $\vec{a}\cdot\vec{d}-\vec{b}\cdot\vec{c}$, which yields a scalar. Note that switching $\vec{b}$ and $\vec{c}$ in the matrix does not change the determinant, since the dot product is commutative. Also, if this were a 1$\times$1 or a 3$\times$3 matrix, the determinant would be a vector instead of a scalar. The pattern is that each term of the cofactor expansion involves all n vectors, and the dot products pair them off two at a time, so when n is even everything pairs off into a scalar, while when n is odd one unpaired vector remains as a factor. It seems that for any n$\times$n "vector matrix", the "dot product determinant" is a scalar when n is even and a vector when n is odd.
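
To make this concrete, here is a small NumPy sketch (the vector values are arbitrary examples) that computes the 2$\times$2 "dot product determinant" and checks that swapping $\vec{b}$ and $\vec{c}$ leaves it unchanged:

```python
import numpy as np

# Arbitrary example vectors; any vectors of equal length would work
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
c = np.array([7.0, 8.0, 9.0])
d = np.array([1.0, 0.0, 1.0])

# "Dot product determinant" of the 2x2 vector matrix [[a, b], [c, d]]
det_dot = np.dot(a, d) - np.dot(b, c)
print(det_dot)  # a scalar: 4 - 122 = -118

# Swapping b and c leaves the result unchanged, since dot products commute
det_dot_swapped = np.dot(a, d) - np.dot(c, b)
print(det_dot == det_dot_swapped)  # True
```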

It is also possible to define a determinant through the cross products of the vectors, although this is presumably limited to vectors in 3D space, as far as I know. Assuming all the vectors are in 3D space, the "cross product determinant" of the above 2$\times$2 matrix can be defined as $\vec{a}\times\vec{d}-\vec{b}\times\vec{c}$, which yields a vector. In fact, for any n$\times$n "vector matrix" with all vectors in 3D space, the "cross product determinant" is always a vector. Also, switching $\vec{b}$ and $\vec{c}$ in the matrix generally changes the determinant, since the cross product is anticommutative: $\vec{c}\times\vec{b}=-\vec{b}\times\vec{c}$.
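
Here is the corresponding sketch for the "cross product determinant", again with arbitrary 3D vectors, showing that swapping $\vec{b}$ and $\vec{c}$ generally changes the result:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
c = np.array([7.0, 8.0, 9.0])
d = np.array([1.0, 0.0, 1.0])

# "Cross product determinant" of [[a, b], [c, d]]: a 3D vector
det_cross = np.cross(a, d) - np.cross(b, c)

# Swapping b and c flips the sign of the second term (c x b = -(b x c)),
# so the result differs unless b x c = 0
det_cross_swapped = np.cross(a, d) - np.cross(c, b)

print(det_cross)
print(det_cross_swapped)
print(np.array_equal(det_cross, det_cross_swapped))  # False
```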

My question is two-fold: Can someone tell me if this concept of a "vector matrix" already exists? If it does exist, are there existing applications for it? If it doesn't, do any applications come to mind?

Best Answer

Yes, such matrices do exist, and they have useful applications in the theory of quantum mechanics and in deep learning.

Consider a matrix $A$ and a vector $\vec{B}$.

The expression $A \otimes B$ is called the tensor product; when written out in coordinates, it is also known as the Kronecker product.

The result is a block matrix in which each entry of $A$ is replaced by that entry multiplied by the vector $\vec{B}$.
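
As a concrete illustration, here is a small NumPy sketch (with arbitrary values) that computes the Kronecker product via np.kron:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([5, 6])

# Each entry A[i, j] is replaced by the block A[i, j] * B
print(np.kron(A, B))
# [[ 5  6 10 12]
#  [15 18 20 24]]
```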

Here's a useful link that explains the concept you're talking about: https://www.math3ma.com/blog/the-tensor-product-demystified
