Linear Algebra – Finding the Rotation Matrix in N-Dimensions

linear algebra, rotations

Suppose that we know two real vectors with $n$ components, which are linked by some arbitrary transformation (scaling, rotation, shearing, …).

Now, I think it should be possible to recover the scaling matrix and the rotation matrix separately. For example, the scaling matrix would be a diagonal matrix with $n$ entries representing the $n$ scaling factors.

On the other hand, I could normalize the two vectors and then compute the rotation matrix between them, couldn't I?

1) How can I retrieve the rotation matrix (if that is possible, obviously)?

2) I need such a matrix in order to use it in a computational mechanics context. How would you compute it in practice?

I have read somewhere that in higher dimensions a rotation can be built as a combination of plane rotations, each of which fixes an $(n-2)$-dimensional subspace. Is this a convenient way to compute such a matrix?
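For reference, such plane rotations are often implemented as Givens rotations, each acting in a single coordinate plane and fixing the remaining $n-2$ coordinates. A minimal Octave/MATLAB sketch (the helper `givens_rot` is illustrative, not a built-in):

    % givens_rot.m -- rotation by theta in the (i,j) coordinate plane of R^n,
    % leaving the remaining n-2 coordinates fixed (illustrative helper).
    function G = givens_rot(n, i, j, theta)
      G = eye(n);
      G([i j],[i j]) = [cos(theta) -sin(theta); sin(theta) cos(theta)];
    end

    Q = givens_rot(5,1,2,pi/6) * givens_rot(5,3,4,pi/3);  % compose two plane rotations
    norm(Q'*Q - eye(5))                                   % ~0: the product is orthogonal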

Thank you very much!!

EDIT: Let me put it this way, in order to make clearer what I am trying to do…

Take vector $x = [2, 4, 5, 3, 6]^T$ and vector $y = [6, 2, 0, 1, 7]^T$. I would like to find the rotation matrix that aligns vector $x$ to vector $y$. First of all, I understood that one needs a basis of the orthogonal complement of the plane containing the two vectors. Consequently, one first computes the null space of the matrix whose rows are $x^T$ and $y^T$:

  • Row reduction gives:
    $$A=
    \left[
    \begin{array}{ccccc}
    1 & 0 & -1/2 & -1/10 & 4/5 \\
    0 & 1 & 3/2 & 4/5 & 11/10 \\
    \end{array}
    \right]
    $$

  • Basis vectors of the orthogonal complement (associated with the free variables $x_3, x_4, x_5$):
    $$v_1^{\perp}=
    \left[
    \begin{array}{c}
    1/2 \\ -3/2 \\ 1 \\ 0 \\ 0 \\
    \end{array}
    \right],
    v_2^{\perp}=
    \left[
    \begin{array}{c}
    1/10 \\ -4/5 \\ 0 \\ 1 \\ 0 \\
    \end{array}
    \right],
    v_3^{\perp}=
    \left[
    \begin{array}{c}
    -4/5 \\ -11/10 \\ 0 \\ 0 \\ 1 \\
    \end{array}
    \right]
    $$

  • Then one can apply Gram–Schmidt to the two groups of vectors: the one spanning the plane containing the two initial vectors, and the one spanning the orthogonal subspace. The columns of the resulting matrix form an orthonormal basis of the space:

$$E=
\left[
\begin{array}{ccccc}
0.7255 & -0.0117 & 0.2673 & -0.0716 & -0.6301 \\
0 & 0.4429 & -0.8018 & -0.2409 & -0.3209 \\
-0.3627 & 0.6701 & 0.5345 & -0.3255 & -0.1663 \\
-0.0725 & 0.3555 & 0 & 0.9115 & -0.1937 \\
0.5804 & 0.4778 & 0 & 0 & 0.6594 \\
\end{array}
\right]
$$

The last matrix is correct and represents an orthonormal basis, since the scalar product of columns $i$ and $j$ equals the Kronecker delta $\delta_{ij}$.
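As a numerical cross-check, this basis construction can be reproduced in Octave/MATLAB (a sketch; the columns of `E` may differ from the hand-computed ones by signs or ordering, since any valid orthonormalization will do):

    x = [2,4,5,3,6]';
    y = [6,2,0,1,7]';
    N = null([x'; y']);        % orthonormal basis of the orthogonal complement
    [E, ~] = qr([x y N]);      % orthonormalize, keeping x first and span{x,y} in front
    norm(E'*E - eye(5))        % ~0: the columns form an orthonormal basis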

Now I have almost all the information I need; I can compute the angle between the two vectors from

$$
\cos(\theta)=\frac{x \cdot y}{\|x\|\,\|y\|}
$$

but now how do I construct the rotation matrix?

Best Answer

One way to do this is to find two orthonormal vectors in the plane generated by your two vectors, and then extend them to an orthonormal basis of $\mathbb R^n$. Then, with respect to this basis, consider the rotation by angle $\theta$ in the plane generated by the first two vectors, and the identity on the space generated by the rest of the orthonormal basis. Use Gram–Schmidt to find the orthonormal basis.

As you said in a previous comment, you cannot rotate around an axis except in 3D. Rather, you need to rotate about an $(n-2)$-dimensional subspace.

So suppose you want to rotate $x$ to $y$, and you happen to know they have the same norm. Let $u = x/\|x\|$, and $v = (y-(u\cdot y)u)/\|y-(u\cdot y)u\|$. Then $P=uu^T + vv^T$ is the projection onto the plane generated by $x$ and $y$, and $Q=I-uu^T-vv^T$ is the projection onto the $(n-2)$-dimensional complementary subspace. So the "rotation" part just has to take place on the range of $P$. That is, $z \mapsto (z\cdot u,\ z\cdot v)$ is an isometric isomorphism from the range of $P$ to $\mathbb R^2$. Do the rotation on $\mathbb R^2$, then map back to $\mathbb R^n$ by $[a,b]^T \mapsto au + bv$.
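A quick numerical check of this construction, reusing $x$ and $y$ from the question (a sketch):

    x = [2,4,5,3,6]';  y = [6,2,0,1,7]';
    u = x/norm(x);
    v = y - (u'*y)*u;  v = v/norm(v);
    P = u*u' + v*v';           % projection onto span{x, y}
    Q = eye(5) - P;            % projection onto the (n-2)-dim complement
    norm(P*P - P)              % ~0: P is idempotent
    norm(P*x - x)              % ~0: x lies in the range of P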

So the whole rotation is $$ R \;=\; I - uu^T - vv^T + [u\ v]\,R_\theta\,[u\ v]^T $$ where $$ R_\theta = \begin{bmatrix}\cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{bmatrix} $$ and $\cos(\theta) = x\cdot y/(\|x\|\,\|y\|)$.

(By the way, if a reflection matrix is OK, then consider Householder transformations instead.)
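For completeness, a sketch of that alternative (valid here since $\|x\| = \|y\|$; $w$ is the unit vector along $x - y$):

    w = (x - y)/norm(x - y);   % assumes norm(x) == norm(y)
    H = eye(5) - 2*(w*w');     % Householder reflection: symmetric, H*H = I
    should_be_y = H*x          % maps x onto y, but det(H) = -1 (a reflection)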

Here is some MATLAB code (tested using Octave, so it may have syntax errors in MATLAB).

    x = [2,4,5,3,6]';
    y = [6,2,0,1,7]';
    u = x/norm(x);                 % unit vector along x
    v = y - (u'*y)*u;              % component of y orthogonal to u ...
    v = v/norm(v);                 % ... normalized
    cost = x'*y/norm(x)/norm(y);   % cos(theta)
    sint = sqrt(1 - cost^2);       % sin(theta), taking theta in [0, pi]
    R = eye(length(x)) - u*u' - v*v' + [u v]*[cost -sint; sint cost]*[u v]';
    fprintf('testing\n');
    should_be_identity = R*R'      % R is orthogonal
    should_be_y = R*x              % works since norm(x) == norm(y)
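Running this, `should_be_identity` comes out as the $5\times 5$ identity up to round-off, and `should_be_y` reproduces $y$. The latter works only because $\|x\| = \|y\| = \sqrt{90}$ here; for vectors of different lengths you would also need the scaling factor $\|y\|/\|x\|$.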