[Math] Vector Multiplication with Multiple Kronecker Products

kronecker-product · linear-algebra · matrices

My question concerns matrix-vector multiplication when the matrix has Kronecker structure, in which case the product can be computed much faster than by forming the full matrix.

I know how to compute this for a matrix $A = A_1 \otimes A_2$, which has two components $A_i$:
$$Ax = (A_1 \otimes A_2)x = (A_1 \otimes A_2)\operatorname{vec}(X) = \operatorname{vec}(A_2 X A_1^T)$$
where $\operatorname{vec}(X) = x$ is the vectorization of $X$.
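For concreteness, here is a small NumPy check of this two-factor identity. The only subtlety is that $\operatorname{vec}$ stacks columns, while NumPy's default `reshape` is row-major, so the transposes below account for the ordering:

```python
import numpy as np

rng = np.random.default_rng(0)
A1 = rng.standard_normal((3, 3))
A2 = rng.standard_normal((4, 4))
x = rng.standard_normal(3 * 4)

# Direct approach: form the full (12 x 12) Kronecker product.
direct = np.kron(A1, A2) @ x

# Kronecker trick: recover X with vec(X) = x (column-major reshape),
# then compute vec(A2 @ X @ A1^T) without forming the big matrix.
X = x.reshape(3, 4).T                  # shape (4, 3), columns of X stacked in x
fast = (A2 @ X @ A1.T).T.reshape(-1)   # vec of the result, column-major

assert np.allclose(direct, fast)
```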

However, I have no idea how to proceed for more components $A_i$. I can imagine doing something as follows:
$$Ax = (A_1 \otimes A_2 \otimes \cdots \otimes A_n)x = \operatorname{vec}\bigl((A_2 \otimes \cdots \otimes A_n) X A_1^T\bigr) = (I_m \otimes A_2 \otimes \cdots \otimes A_n)\operatorname{vec}(X A_1^T)$$
which provides me again with an actual matrix-vector multiplication. I was hoping to get a large identity matrix on the left-hand side this way, but no luck.

(EDIT: I realised it is impossible to do it this way, as the matrices $X$ and $A_1^T$ do not have compatible dimensions.)

I tried looking this up online, but I mainly find the two-component case. Can anyone help me out? Thanks!

Best Answer

This took me a while to figure out. What finally helped me was the supplemental materials of this paper.

To summarize very briefly, what you want to do is to treat the vector $x$ as having a multi-dimensional index $x_{i_1, i_2, \ldots, i_n}$. Then you can sequentially multiply $x$ by the matrices $A_1$, $A_2$, etc. after an appropriate permutation of the indices. See the discussion in Fino & Algazi (1976). Overall, this gives substantial savings in memory and computational complexity: if each $A_i$ is $m \times m$ and $N = m^n$, the naive matrix-vector product costs $O(N^2) = O(m^{2n})$, while the sequential approach costs $O(n\,m^{n+1})$, i.e. you roughly save a factor of 2 in the exponent.
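The sequential approach can be sketched as follows (a minimal NumPy version; the function name `kron_matvec` is my own, and I use `tensordot`/`moveaxis` in place of explicit index permutations):

```python
import numpy as np

def kron_matvec(As, x):
    """Compute (As[0] kron As[1] kron ... kron As[-1]) @ x without
    ever forming the full Kronecker product matrix."""
    dims = [A.shape[1] for A in As]
    # View x as a tensor with one axis per factor; row-major reshape
    # matches the Kronecker ordering (first factor's index varies slowest).
    T = x.reshape(dims)
    for k, A in enumerate(As):
        # Contract A with axis k of T, then move the resulting
        # axis back to position k (the "permutation" step).
        T = np.moveaxis(np.tensordot(A, T, axes=(1, k)), 0, k)
    return T.reshape(-1)
```

Each factor touches the data once, which is where the complexity savings come from; the full Kronecker matrix is never materialized.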

I think the best way to understand the approach is to look at the code for this. I've put up a snippet below that is hopefully helpful: https://gist.github.com/ahwillia/f65bc70cb30206d4eadec857b98c4065