Linear Algebra – What is Known About Merely-Orthogonal Matrices?

linear algebra, orthogonal matrices

I'm interested in square matrices whose columns are orthogonal, non-zero vectors but are not necessarily orthonormal. Answers to other questions on this topic have noted that such matrices do not have an agreed-upon name, and that the natural name "orthogonal matrix" already means a matrix with orthonormal columns. So I'm going to use the name "merely-orthogonal" for these matrices and "orthonormal" for those matrices where $Q^TQ=I$, avoiding the term "orthogonal matrix" entirely.

The answer to one question usefully notes that if $M$ is merely-orthogonal then there exist an invertible diagonal matrix $D$ and an orthonormal matrix $Q$ such that $M=QD$. The diagonal entries of $D$ are easily found: they are the norms of the columns of $M$. This immediately implies that
$$M^TM = (QD)^TQD = D^TQ^TQD = D^T D = D^2$$
which is diagonal with non-zero diagonal entries. Hence $M$ is invertible, with
$$M^{-1} = D^{-2}M^T$$
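For concreteness, here is a quick numerical check of these identities on a small example (just a sketch: it assumes numpy, and the example matrix is my own):

```python
import numpy as np

# A merely-orthogonal matrix: orthogonal, non-zero columns of different lengths.
M = np.array([[1.0, -4.0],
              [2.0,  2.0]])            # columns (1,2) and (-4,2) are orthogonal

norms = np.linalg.norm(M, axis=0)      # column norms = diagonal entries of D
D = np.diag(norms)
Q = M @ np.linalg.inv(D)               # Q = M D^{-1} has orthonormal columns

print(np.allclose(Q.T @ Q, np.eye(2)))         # True: Q is orthonormal
print(np.allclose(M.T @ M, D @ D))             # True: M^T M = D^2
M_inv = np.linalg.inv(D @ D) @ M.T             # D^{-2} M^T
print(np.allclose(M_inv, np.linalg.inv(M)))    # True: M^{-1} = D^{-2} M^T
```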
It seems like merely-orthogonal matrices should form a group under multiplication, and that there should be an analogue of the $QR$ decomposition (call it the $MR$ decomposition) in which any invertible matrix $A$ factors as $A = MR$ with $M$ merely-orthogonal and $R$ upper triangular. Since no columns need to be normalized, it seems to me that the $MR$ decomposition would not require the square roots that the $QR$ decomposition does, and so would be computable entirely over the rational numbers.
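To make that last point concrete, here is a rough sketch of the square-root-free (unnormalized) Gram-Schmidt computation I have in mind, using exact rational arithmetic; the function name `mr_decompose` and the list-of-columns convention are just for illustration:

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def mr_decompose(cols):
    """Unnormalized (square-root-free) Gram-Schmidt.

    cols: the columns of an invertible square matrix A, as lists of Fractions.
    Returns (M, R) with A = M R, where M (same list-of-columns convention) has
    pairwise orthogonal, non-zero columns and R (list of rows) is unit upper
    triangular.  Only rational arithmetic is used, so no square roots appear.
    """
    n = len(cols)
    M = []                                    # orthogonal columns, built up
    R = [[Fraction(0)] * n for _ in range(n)]
    for j, a in enumerate(cols):
        m = list(a)
        for i in range(j):
            R[i][j] = dot(a, M[i]) / dot(M[i], M[i])   # rational coefficient
            m = [mk - R[i][j] * qk for mk, qk in zip(m, M[i])]
        R[j][j] = Fraction(1)
        M.append(m)
    return M, R

# Example over the rationals: A has columns (1, 1) and (1, 3).
A = [[Fraction(1), Fraction(1)], [Fraction(1), Fraction(3)]]
M, R = mr_decompose(A)
print(M)   # orthogonal columns (1, 1) and (-1, 1); everything stays rational
print(R)   # unit upper triangular: rows (1, 2) and (0, 1)
```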

So my questions are:

  1. Has anyone studied merely-orthogonal matrices and established these or other results?
  2. Am I right that merely-orthogonal matrices form a group?
  3. Am I right that $MR$ decompositions can be done over the rationals?

Best Answer

They are not closed under multiplication or inverse, hence not a group.

For example, all invertible diagonal matrices are merely orthogonal, but $$ \begin{pmatrix} 1&-1\\ a&b \end{pmatrix} $$ is merely orthogonal iff $ab=1$, and this is not preserved under left multiplication by $\operatorname{diag}(1,\lambda)$ (which scales the second row) for $\lambda\neq\pm 1$. Also, the matrix $$ \begin{pmatrix} \cos\theta&-R\sin\theta\\ \sin\theta&R\cos\theta \end{pmatrix} $$ is merely orthogonal, but its inverse $$ \frac{1}{R}\begin{pmatrix} R\cos\theta&R\sin\theta\\ -\sin\theta&\cos\theta \end{pmatrix} $$ is not (for generic $\theta$) unless $R=\pm 1$.
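Here is a quick numerical check of both counterexamples (a sketch only: the helper `is_merely_orthogonal` is an ad hoc name, and numpy is assumed):

```python
import numpy as np

def is_merely_orthogonal(A, tol=1e-12):
    """True iff A has pairwise orthogonal, non-zero columns (A^T A diagonal)."""
    G = A.T @ A
    off_diagonal = G - np.diag(np.diag(G))
    return bool(np.all(np.abs(off_diagonal) < tol) and np.all(np.diag(G) > tol))

# 1. Products can fail: scale the second row of an ab = 1 example.
a, b, lam = 2.0, 0.5, 3.0
M = np.array([[1.0, -1.0], [a, b]])
D = np.diag([1.0, lam])
print(is_merely_orthogonal(M))        # True  (ab = 1)
print(is_merely_orthogonal(D @ M))    # False (second row scaled by lam != +/-1)

# 2. Inverses can fail: a rotation with the second column rescaled by R.
theta, R = 0.3, 2.0
N = np.array([[np.cos(theta), -R * np.sin(theta)],
              [np.sin(theta),  R * np.cos(theta)]])
print(is_merely_orthogonal(N))                 # True
print(is_merely_orthogonal(np.linalg.inv(N)))  # False, since R != +/-1
```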

Basically, knowing only that a matrix sends the standard basis to an orthogonal basis does not tell you that the next such matrix you apply will send this new orthogonal basis to another orthogonal basis. Similarly, it does not tell you that the standard basis is itself the image of an orthogonal basis, which is what the inverse being merely orthogonal would require.

There are, of course, the conformal linear transformations, which preserve conformal frames (i.e., orthogonal bases whose vectors all have the same length), and these do form a group: the conformal group fixing $0$, consisting of nonzero scalar multiples of orthonormal matrices.
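For contrast with the merely-orthogonal case, here is a small numerical check that such conformal matrices are closed under products and inverses (again only a sketch; `random_conformal` and `is_conformal` are ad hoc helpers):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_conformal(n, scale):
    """A nonzero scalar multiple of a random orthonormal matrix."""
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # orthonormal factor
    return scale * Q

def is_conformal(A, tol=1e-12):
    """True iff A^T A is a positive multiple of the identity."""
    G = A.T @ A
    return bool(np.allclose(G, G[0, 0] * np.eye(len(A))) and G[0, 0] > tol)

A = random_conformal(3, 2.0)
B = random_conformal(3, -0.5)
print(is_conformal(A), is_conformal(B))   # True True
print(is_conformal(A @ B))                # True: closed under products
print(is_conformal(np.linalg.inv(A)))     # True: closed under inverses
```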
