[Math] orthogonal matrices vs. orthogonal columns

least squares, matrices, orthogonality, regression

I'm reading a book on econometrics and I'm stuck on a problem:

There is a theorem on "Orthogonal Partitioned Regression" which says:

"In the multiple linear least squares regression of $y$ on two sets of variables $X_1$ and $X_2$, if the two sets of variables are orthogonal, then the separate coefficient vectors can be obtained by separate regressions of $y$ on $X_1$ alone and $y$ on $X_2$ alone. …"

Furthermore, the author says that $X_1$ and $X_2$ are orthogonal if:
$$
(X_1)' X_2=0 \, ,
$$
which confuses me, because so far I had always thought the condition for orthogonality between two matrices was $(X_1)' X_2 = I$.

The book is W. Greene, Econometric Analysis, 7th ed., Theorem 3.1, p. 33.

What I found so far is:

Link 1 and Link 2 (search for "imagine"), but neither really helps me with my problem.
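
To convince myself of what the theorem actually claims, I also tried a small numerical sketch with numpy (my own construction, not from the book): when $(X_1)' X_2 = 0$, the coefficients from the joint regression of $y$ on $[X_1, X_2]$ coincide with those from the two separate regressions.

```python
# Minimal numerical sketch of Theorem 3.1 (my own example, not from Greene):
# with X1'X2 = 0, joint and separate least-squares coefficients coincide.
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Start from an orthonormal basis and split its columns, so X1.T @ X2 = 0 exactly.
Q, _ = np.linalg.qr(rng.normal(size=(n, 5)))
X1 = Q[:, :2] @ rng.normal(size=(2, 2))   # columns need not be orthonormal;
X2 = Q[:, 2:5] @ rng.normal(size=(3, 3))  # only the cross-product must vanish
assert np.allclose(X1.T @ X2, 0)

y = rng.normal(size=n)

# Joint regression of y on [X1, X2].
b_joint, *_ = np.linalg.lstsq(np.hstack([X1, X2]), y, rcond=None)

# Separate regressions of y on X1 alone and on X2 alone.
b1, *_ = np.linalg.lstsq(X1, y, rcond=None)
b2, *_ = np.linalg.lstsq(X2, y, rcond=None)

print(np.allclose(b_joint, np.concatenate([b1, b2])))  # True
```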

BR
Fabian

Best Answer

Two vectors $v,w \in \mathbb{R}^n$ are orthogonal iff $v^t w = 0$ where $t$ indicates the transpose. Really, we're using the dot product given by $\langle v, w \rangle = v^t w$.
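
For a concrete check of that dot-product condition (an illustrative example only):

```python
# Two vectors are orthogonal exactly when their dot product v.T @ w is zero.
import numpy as np

v = np.array([1.0, 2.0, -1.0])
w = np.array([3.0, -1.0, 1.0])
print(v @ w)  # 0.0, so v and w are orthogonal
```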

There is a different notion of orthogonality for matrices. Here's one definition of an orthogonal matrix: $O \in \rm{M}_n(\mathbb{R})$ is orthogonal if $O^t O = I$. Equivalently, this means that the columns of $O$ are orthonormal, i.e., that they are orthogonal and have length $1$.
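
For instance, a $2 \times 2$ rotation matrix is orthogonal; a quick numerical check (again just illustrative):

```python
# A rotation matrix satisfies O.T @ O = I; equivalently, its columns are orthonormal.
import numpy as np

theta = 0.7
O = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(O.T @ O, np.eye(2)))              # True: O is an orthogonal matrix
print(np.allclose(np.linalg.norm(O, axis=0), 1.0))  # True: each column has unit length
```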

However, note that only one matrix appears in this second definition: we say that a single matrix $A$ is orthogonal or not, not that $A$ is orthogonal to $B$. So this is not the definition used by the author of your book. Greene's condition $(X_1)' X_2 = 0$ simply says that every column of $X_1$ is orthogonal, as a vector, to every column of $X_2$; it is the first notion applied column by column, not a claim that either matrix is an "orthogonal matrix".
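
To connect the two notions, here is a small sketch (my own example, not from the original answer) in which $X_1$ and $X_2$ are orthogonal to each other in Greene's sense while the stacked matrix $[X_1 \; X_2]$ is not an orthogonal matrix:

```python
# Block-orthogonality (X1'X2 = 0) does not make the stacked matrix an
# "orthogonal matrix" (X'X = I); the two conditions are different.
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))  # 4x4 with orthonormal columns
X1 = Q[:, :2] * 3.0   # rescaling keeps the blocks mutually orthogonal
X2 = Q[:, 2:] * 0.5   # but destroys unit column length

X = np.hstack([X1, X2])                  # square 4x4 matrix [X1 X2]
print(np.allclose(X1.T @ X2, 0))         # True: orthogonal in Greene's sense
print(np.allclose(X.T @ X, np.eye(4)))   # False: X is not an orthogonal matrix
```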
