[Math] Linear transformation of orthonormal vectors and orthogonality of transformation matrix

Tags: linear-algebra, linear-transformations

Question:

Let $v_1, v_2, \cdots, v_n$ be orthonormal vectors in $\mathbb{R}^n$. Show that $Av_1, Av_2, \cdots, Av_n$ are also orthonormal if and only if $A \in \mathbb{R}^{n \times n}$ is orthogonal.

What I have done:

$P$: $Av_1, Av_2, \cdots, Av_n$ are orthonormal

$Q$: $A \in \mathbb{R}^{n \times n}$ is orthogonal

$Q \Rightarrow P$ is quite straightforward. But what confuses me is the proof of $P \Rightarrow Q$.

If $P$ holds, then we can write
$$
(Av_i)^T(Av_j)=v_i^TA^TAv_j=
\begin{cases}
1, & i=j\\
0, & i\neq j
\end{cases}
$$
But what comes next? Even if we can say $A^TA = I$ (which I do not think follows directly from the formulas above), how should I prove $AA^T = I$, which is also required by the definition of an orthogonal matrix?

Thank you in advance. Any help would be much appreciated.
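
Edit: as a numerical sanity check of the $P \Rightarrow Q$ direction (not a proof), here is a small NumPy sketch. It builds a matrix $A$ sending one orthonormal basis to another orthonormal set, so that $P$ holds by construction, and then confirms $A^TA \approx I \approx AA^T$. The construction $A = WV^T$ and the variable names are my own choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random orthonormal bases via QR: the columns of V and W are orthonormal.
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
W, _ = np.linalg.qr(rng.standard_normal((n, n)))

# A maps v_i (columns of V) to w_i (columns of W): AV = W, so A = W V^T.
A = W @ V.T

# P holds by construction: the images A v_i = w_i are orthonormal.
assert np.allclose((A @ V).T @ (A @ V), np.eye(n))

# Q follows: A is orthogonal.
assert np.allclose(A.T @ A, np.eye(n))
assert np.allclose(A @ A.T, np.eye(n))
```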

Best Answer

Firstly, since $\{v_i\}$ is an orthonormal system of $n$ vectors in $\mathbb{R}^n$, it forms an orthonormal basis. Therefore, each vector $v \in \mathbb{R}^n$ can be written as the sum
$$v = \sum\limits_{i=1}^n (v, v_i)v_i,$$
where $(\cdot, \cdot)$ is the standard inner product on $\mathbb{R}^n$. This gives us
\begin{multline*}
A^{T}Av = A^{T}A\left(\sum\limits_{i=1}^n (v, v_i)v_i\right) = \sum\limits_{i=1}^n (v, v_i)A^{T}Av_i = \sum\limits_{i=1}^n (v, v_i)\sum\limits_{j=1}^n(A^{T}Av_i, v_j)v_j \\
= \sum\limits_{i=1}^n (v, v_i)\sum\limits_{j=1}^n\left(v_j^{T}A^{T}Av_i\right)v_j = \sum\limits_{i=1}^n\sum\limits_{j=1}^n (v, v_i)\delta_{ij}v_j = \sum\limits_{i=1}^n (v, v_i)v_i = v,
\end{multline*}
where
$$\delta_{ij}=\begin{cases}1, & i=j \\ 0, & i \ne j,\end{cases}$$
and where $v_j^{T}A^{T}Av_i = (Av_j)^{T}(Av_i) = \delta_{ij}$ holds precisely by the hypothesis $P$. Therefore, for every $v \in \mathbb{R}^n$,
$$A^{T}Av = v,$$
which means
$$A^{T}A = I.$$
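
As an aside, the expansion $v = \sum_{i=1}^n (v, v_i)v_i$ that the argument rests on is easy to check numerically. Here is a minimal NumPy sketch (the QR-based construction of the basis and all names are my own illustration, not part of the answer):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Columns of V form an orthonormal basis v_1, ..., v_n of R^n.
V, _ = np.linalg.qr(rng.standard_normal((n, n)))

v = rng.standard_normal(n)

# Coefficients (v, v_i), then the reconstruction sum_i (v, v_i) v_i.
coeffs = V.T @ v
reconstruction = V @ coeffs

assert np.allclose(reconstruction, v)
```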

As @Paul mentioned in the comments, $A^{T}A = I$ means $A^{T} = A^{-1}$, and thus $$AA^{T} = AA^{-1} = I.$$
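
For completeness, here is the one-line reason why $A^{T}A = I$ already forces $A$ to be invertible, which is what the step above uses: taking determinants gives
$$\det(A^{T}A) = \det(A^{T})\det(A) = \det(A)^2 = \det(I) = 1,$$
so $\det(A) \neq 0$, $A^{-1}$ exists, and multiplying $A^{T}A = I$ on the right by $A^{-1}$ yields $A^{T} = A^{-1}$.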
