I have two planes, each defined by a pair of orthogonal vectors: plane A by $A_x, A_y$ and plane B by $B_x, B_y$. I want to find the rotation matrix that transforms a point with coordinates $(x, y, z)$ to coordinates $(x', y', z')$, such that if the original point has components $(x'', y'')$ on plane A, the transformed point has the same components $(x'', y'')$ on plane B.
[Math] Rotation matrix from plane A to B
3d, linear-algebra, rotations, vector-spaces
Related Solutions
The proof assumes that the dot product is linear, which is not trivial to prove without the standard algebraic definition.
The more straightforward proof would be as follows: Create a triangle with the two vectors $a$ and $b$ so that the third side is $a-b$. Define the dot product as $a\cdot b=a_1b_1+a_2b_2$. Then note that $$||x||^2=x_1^2+x_2^2=(x_1,x_2)\cdot (x_1,x_2)$$ so the magnitude squared of a vector equals the vector dotted with itself.

Then by the law of cosines, letting $\theta$ denote the angle between $a$ and $b$ and recalling that $a-b$ is the side opposite $\theta$, we get $$||a-b||^2=||a||^2+||b||^2-2||a||\,||b||\cos\theta$$

Using the magnitude-squared/dot-product relationship above gives $$(a-b)\cdot (a-b)=a\cdot a+b\cdot b-2||a||\,||b||\cos\theta$$ The dot product is linear and symmetric by our algebraic definition, so the left side can be rewritten to give $$a\cdot a+b\cdot b-2a\cdot b=a\cdot a+b\cdot b-2||a||\,||b||\cos\theta$$ from which it follows that $$a\cdot b=||a||\,||b||\cos\theta$$
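As a quick numeric sanity check of the resulting identity, here is a short NumPy sketch; the two vectors are arbitrary examples, and the angle is measured independently of the dot product so the check is not circular:

```python
import numpy as np

# Two arbitrary example vectors.
a = np.array([3.0, 1.0])
b = np.array([1.0, 2.0])

# Angle between a and b, measured from their polar angles
# rather than from the dot product itself.
theta = np.arctan2(b[1], b[0]) - np.arctan2(a[1], a[0])

lhs = a @ b                                          # algebraic definition a1*b1 + a2*b2
rhs = np.linalg.norm(a) * np.linalg.norm(b) * np.cos(theta)
print(np.isclose(lhs, rhs))  # True
```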
As David K mentions, there are infinitely many rotations that map $\bf a$ to $\bf b$: Given any such rotation $R$, for a rotation $S$ about $\bf b$ by an arbitrary angle, $S \circ R$ is also a rotation that maps $\bf a$ to $\bf b$, and so there is a circle's worth of such rotations.
If the vectors $\bf a$ and $\bf b$ are linearly independent, however, then we can pick out a preferred rotation, namely the one that fixes ${\bf a} \times {\bf b}$ (equivalently, the unique rotation that preserves the plane spanned by $\bf a$ and $\bf b$ as well as the orientation of that plane).
Here's one way to construct the matrix corresponding to this preferred rotation: First, notice that we may as well normalize $\bf a$ and $\bf b$, that is, replace them respectively by the unit vectors $\frac{\bf a}{|{\bf a}|}$ and $\frac{\bf b}{|{\bf b}|}$.
The vector $${\bf n} := \frac{{\bf a} \times {\bf b}}{|{\bf a} \times {\bf b}|}$$ has unit length and is orthogonal to both $\bf a$ and $\bf b$, and hence ${\bf a}$ and ${\bf b}$ together determine an oriented, orthonormal basis of $\Bbb R^3$, namely, $$({\bf a}, {\bf n} \times {\bf a}, {\bf n}).$$ In particular, the matrix $$\begin{pmatrix}{\bf a} & {\bf n} \times {\bf a} & {\bf n}\end{pmatrix}$$ defines a rotation, namely the one that sends the standard basis $({\bf e}_1, {\bf e}_2, {\bf e}_3)$ to the above basis, and its inverse does the reverse.
Now, by symmetry, $$({\bf b}, {\bf n} \times {\bf b}, {\bf n})$$ is also an oriented, orthonormal basis, and the corresponding matrix built by adjoining these (column) vectors sends the standard basis to this one.
Putting this together with the inverse mentioned above maps $({\bf a}, {\bf n} \times {\bf a}, {\bf n})$ to the standard basis $({\bf e}_1, {\bf e}_2, {\bf e}_3)$ and then the standard basis to $({\bf b}, {\bf n} \times {\bf b}, {\bf n})$, that is, by construction $$\begin{pmatrix}{\bf b} & {\bf n} \times {\bf b} & {\bf n}\end{pmatrix}\begin{pmatrix}{\bf a} & {\bf n} \times {\bf a} & {\bf n}\end{pmatrix}^{-1}$$ is a rotation matrix that maps $\bf a$ to $\bf b$ and fixes $\bf n$ (and hence ${\bf a} \times {\bf b}$). (Since the inverted matrix $\begin{pmatrix}{\bf a} & {\bf n} \times {\bf a} & {\bf n}\end{pmatrix}$ is orthogonal, its inverse is just its transpose, saving considerable computation.)
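The construction above can be sketched in a few lines of NumPy; the two input vectors at the bottom are arbitrary examples, and the function name is mine:

```python
import numpy as np

def rotation_a_to_b(a, b):
    """Rotation mapping unit(a) to unit(b) while fixing a x b,
    built as Fb @ Fa^{-1} from the two adapted orthonormal bases."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    n = np.cross(a, b)
    n = n / np.linalg.norm(n)                      # unit normal to the a-b plane
    Fa = np.column_stack((a, np.cross(n, a), n))   # basis (a, n x a, n)
    Fb = np.column_stack((b, np.cross(n, b), n))   # basis (b, n x b, n)
    return Fb @ Fa.T                               # Fa is orthogonal, so Fa^{-1} = Fa^T

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 1.0])
R = rotation_a_to_b(a, b)
print(np.allclose(R @ a, b / np.linalg.norm(b)))  # True: unit(a) maps to unit(b)
```

Note the transpose in the last line of the function: as remarked above, inverting the orthogonal matrix costs only a transpose.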
One can of course expand this expression to find explicit formulas for the entries of the resulting matrix in terms of the components of $\bf a$ and $\bf b$, but I doubt that this would simplify nicely.
Best Answer
In 3 dimensions, there are infinitely many linear transformations that map one plane onto another. We want to find one such transformation, defined by the matrix R, with the following properties: $$R\mathbf{a}_x=\mathbf{b}_x\ ,\ R\mathbf{a}_y=\mathbf{b}_y$$ We can arbitrarily choose a third vector so that each set is a complete, linearly independent basis: $$R\mathbf{a}_z=\mathbf{b}_z$$ where $\mathbf{a}_z$ and $\mathbf{b}_z$ are not contained within their respective planes. Let A be a matrix whose columns are $\mathbf{a}_x$, $\mathbf{a}_y$, and $\mathbf{a}_z$, and likewise with B. Then we have: $$RA=B$$ Right-multiplying by the inverse of A gives: $$R=BA^{-1}$$ We can check that this gives the desired result: $$(x,y,z)=x''\mathbf{a}_x+y''\mathbf{a}_y$$ $$(x',y',z')=R(x,y,z)=R(x''\mathbf{a}_x+y''\mathbf{a}_y)$$ $$=x''R\mathbf{a}_x+y''R\mathbf{a}_y$$ $$=x''\mathbf{b}_x+y''\mathbf{b}_y$$ The choice of $\mathbf{a}_z$ and $\mathbf{b}_z$ does not affect what happens to points on the plane. It only determines how the "off-plane" component is transformed.
If R is to be a rotation matrix, the two bases must be congruent: the conditions $||\mathbf{a}_x||=||\mathbf{b}_x||\ ,\ ||\mathbf{a}_y||=||\mathbf{b}_y||$, and $\mathbf{a}_x\cdot\mathbf{a}_y=\mathbf{b}_x\cdot\mathbf{b}_y$ must be met in order for this to be the case. Choosing $\mathbf{a}_z=\mathbf{a}_x\times\mathbf{a}_y$ and $\mathbf{b}_z=\mathbf{b}_x\times\mathbf{b}_y$ will then give the rotation matrix.
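A minimal NumPy sketch of the $R=BA^{-1}$ construction; the plane bases below are hypothetical examples that already satisfy the matching-norm and matching-dot-product conditions:

```python
import numpy as np

# Hypothetical example bases: orthonormal pairs spanning planes A and B.
ax, ay = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
bx, by = np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])

# Complete each pair to a right-handed basis with a cross product.
az = np.cross(ax, ay)
bz = np.cross(bx, by)

A = np.column_stack((ax, ay, az))
B = np.column_stack((bx, by, bz))
R = B @ np.linalg.inv(A)

# R sends plane-A components (x'', y'') to the same components on plane B:
p = 2.0 * ax + 3.0 * ay
print(np.allclose(R @ p, 2.0 * bx + 3.0 * by))  # True
```

When the conditions above hold, A is a scalar multiple of an orthogonal matrix, so `np.linalg.inv(A)` could be replaced by a (scaled) transpose.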
To see why this is, we need to use the property that the dot product is invariant under rotation. That is, for any vectors $\mathbf{u}$ and $\mathbf{v}$, $\mathbf{u}\cdot\mathbf{v}=(R\mathbf{u})\cdot(R\mathbf{v})$. From this we get 3 equalities:
$$\mathbf{a}_x\cdot\mathbf{a}_y=(R\mathbf{a}_x)\cdot(R\mathbf{a}_y)=\mathbf{b}_x\cdot\mathbf{b}_y$$
$$||\mathbf{a}_x||^2=\mathbf{a}_x\cdot\mathbf{a}_x=(R\mathbf{a}_x)\cdot(R\mathbf{a}_x)=\mathbf{b}_x\cdot\mathbf{b}_x=||\mathbf{b}_x||^2$$
$$||\mathbf{a}_y||^2=\mathbf{a}_y\cdot\mathbf{a}_y=(R\mathbf{a}_y)\cdot(R\mathbf{a}_y)=\mathbf{b}_y\cdot\mathbf{b}_y=||\mathbf{b}_y||^2$$
When choosing $\mathbf{a}_z$ and $\mathbf{b}_z$, we will have 3 more sets of dot products that need to be equal. With the cross-product choice above, it can be shown that $||\mathbf{a}_x\times\mathbf{a}_y||=||\mathbf{b}_x\times\mathbf{b}_y||$, and all of the other dot products are conveniently $0$.
Hope this helps.