Linear Algebra – Can Any Matrix Be Decomposed into Rotation, Reflection, Shear, Scaling, and Projection Matrices?

Tags: linear-algebra, matrices, transformation

It seems to me that any linear transformation in $R^{n\times m}$ is just a series of applications of rotation (actually, I think any rotation can be achieved by applying two reflections, but I'm not sure), reflection, shear, scaling, and projection transformations, with one or more of each kind applied in some order.

This is how I have been picturing it, but I was unable to find a proof of this on the internet.

Is this true? And if so, is there a way to find such a decomposition?

EDIT: To make it clear, I am asking whether it is true that for all $A\in R^{n \times m}$,
$$A=\prod_{i=1}^{k}P_i,$$
where each $P_i$ is a rotation, reflection, shear, scaling, or projection matrix in $R^{n_i\times m_i}$. Here $n,m,k\in N$ and $n_i,m_i\in N$ for all $i$.

And if it is true, how can we decompose $A$ into such a product?

Best Answer

The question is not posed completely clearly, but I think that something close to what the questioner wants should follow quickly from the singular value decomposition, which states that any real matrix $A$ can be written in the form $$ A=UDV, $$ where $U$ and $V$ are square real orthogonal matrices and $D$ is a (possibly rectangular) diagonal matrix with nonnegative entries on the diagonal. Since $U$ and $V$ are orthogonal they are products of rotations and reflections, while $D$ can be thought of as a product of projections and scalings.
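For concreteness, here is a quick numerical illustration of that statement. This is only a sketch, assuming NumPy is available; the matrix `A` below is an arbitrary example, and the third value returned by `np.linalg.svd` (often written $V^T$) plays the role of $V$ in the notation $A=UDV$ used here.

```
# Numerical sketch of the SVD statement above (requires NumPy).
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])        # arbitrary 2x3 example matrix

U, s, V = np.linalg.svd(A)             # U: 2x2 orthogonal, V: 3x3 orthogonal
D = np.zeros(A.shape)                  # rectangular "diagonal" matrix
D[:len(s), :len(s)] = np.diag(s)       # nonnegative singular values on the diagonal

print(np.allclose(A, U @ D @ V))                 # True: A = U D V
print(np.allclose(U @ U.T, np.eye(2)),           # U is orthogonal
      np.allclose(V @ V.T, np.eye(3)))           # V is orthogonal
```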

For example, if $$ A=\left(\begin{array}{cc}1&2x\\0&1\end{array}\right), $$ then $$ A= \left(\begin{array}{cc}\cos \phi&-\sin\phi\\\sin\phi&\cos\phi\end{array}\right) \left(\begin{array}{cc}\sqrt{x^2+1}-x&0\\0&\sqrt{x^2+1}+x\end{array}\right) \left(\begin{array}{cc}\cos\theta&-\sin\theta\\\sin\theta&\cos\theta\end{array}\right), $$ where $$ \phi=-\frac{\pi}{4}-\frac{1}{2}\arctan x, \qquad \theta=\frac{\pi}{4}-\frac{1}{2}\arctan x. $$
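If you want to check this numerically, the following sketch (again assuming NumPy; the value $x=1$ is chosen arbitrarily) confirms that the product of the three matrices above reproduces $A$:

```
# Check the explicit 2x2 decomposition above for an arbitrary choice of x.
import numpy as np

def rot(t):
    # 2x2 rotation matrix through angle t
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

x = 1.0
phi   = -np.pi / 4 - np.arctan(x) / 2
theta =  np.pi / 4 - np.arctan(x) / 2
D = np.diag([np.sqrt(x**2 + 1) - x, np.sqrt(x**2 + 1) + x])

A = np.array([[1.0, 2 * x],
              [0.0, 1.0]])
print(np.allclose(A, rot(phi) @ D @ rot(theta)))   # True
```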

In reply to the comments below: Interpreting a diagonal matrix with positive entries along the diagonal as a scaling relies on allowing the scaling to be non-uniform, i.e., allowing it to scale different axes by different amounts. If the scaling matrices are restricted to be uniform, then, by using examples like the one above, you can write a square diagonal matrix with positive entries as a product of orthogonal matrices, shears, and a uniform scaling.
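To make that last remark concrete, here is a sketch of the $2\times 2$ case (again assuming NumPy; the positive entries $a$ and $b$ are arbitrary choices for the demonstration). The point is that $\operatorname{diag}(a,b)=\sqrt{ab}\,\operatorname{diag}(d,1/d)$ with $d=\sqrt{a/b}$, and $\operatorname{diag}(d,1/d)$ is exactly the middle factor in the decomposition of the shear above when $x=(1/d-d)/2$, so it is a product of rotations and a shear.

```
# Sketch: write diag(a, b) with a, b > 0 as
# (uniform scaling) x (rotation) x (shear) x (rotation),
# reusing the angles from the shear example above.
import numpy as np

def rot(t):
    # 2x2 rotation matrix through angle t
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

a, b = 4.0, 9.0                 # arbitrary positive diagonal entries
s = np.sqrt(a * b)              # uniform scaling factor
d = np.sqrt(a / b)              # diag(a, b) = s * diag(d, 1/d), and det diag(d, 1/d) = 1

x     = (1 / d - d) / 2         # chosen so that sqrt(x^2 + 1) - x = d
phi   = -np.pi / 4 - np.arctan(x) / 2
theta =  np.pi / 4 - np.arctan(x) / 2
shear = np.array([[1.0, 2 * x],
                  [0.0, 1.0]])

# Inverting the shear example: diag(d, 1/d) = R(-phi) @ shear @ R(-theta),
# hence diag(a, b) = s * R(-phi) @ shear @ R(-theta).
reconstructed = s * rot(-phi) @ shear @ rot(-theta)
print(np.allclose(reconstructed, np.diag([a, b])))   # True
```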