[Math] Why can’t we add a non-square matrix $A$ to its transpose $A^T$

linear-algebra, matrices, vector-spaces, vectors

The addition operation is commonly defined as follows:

Two matrices can be added only if they have the same number of rows and the same number of columns.
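Concretely, this is the entrywise definition:

$$A, B \in \mathbb{R}^{m \times n} \implies (A + B)_{ij} = A_{ij} + B_{ij}, \qquad 1 \le i \le m,\; 1 \le j \le n,$$

so a sum is only declared for two matrices living in the same space $\mathbb{R}^{m \times n}$.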

But this is a very shallow definition/interpretation.

A deeper interpretation of an $n \times n$ matrix, I would believe, is that it is an element of a vector space, e.g. $A \in \mathbb{R}^{n \times n}$, since it takes $n^2$ entries to specify the matrix (one dot product for each entry).
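To make this precise, $\mathbb{R}^{m \times n}$ is itself a vector space of dimension $mn$:

$$\dim_{\mathbb{R}} \mathbb{R}^{m \times n} = mn, \qquad \text{with basis } \{E_{ij} : 1 \le i \le m,\; 1 \le j \le n\},$$

where $E_{ij}$ has a $1$ in position $(i, j)$ and $0$ elsewhere; for a square matrix this gives the $n^2$ degrees of freedom mentioned above.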

Now, if we use that interpretation of a matrix, assume we have a column vector $B \in \mathbb{R}^{n \times 1}$ and its transpose, a row vector, $B^T \in \mathbb{R}^{1 \times n}$:

$$B = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix}
\qquad\text{and}\qquad
B^T = \begin{bmatrix} b_1 & b_2 & \cdots & b_n \end{bmatrix}$$

But $B$ and $B^T$ are both elements of the same vector space, $B, B^T \in \mathbb{R}^n$. So why is their addition undefined?

Why can't you add a matrix to its transpose, just as you can add two vectors that are elements of the same vector space?

Is it wrong to interpret matrices as I have done? Is there a more rigorous definition of the addition operation for matrices? Are there better ways to interpret matrices?

Best Answer

$B$ and $B^T$ are not elements of the same vector space: $B$ is an element of $\Bbb R^{n\times 1}$, and $B^T$ is an element of $\Bbb R^{1\times n}$. Both of these vector spaces are isomorphic to $\Bbb R^n$, but no two of $\Bbb R^n$, $\Bbb R^{n\times 1}$, and $\Bbb R^{1\times n}$ are actually equal to each other.
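As a concrete illustration, with $n = 2$,

$$B = \begin{bmatrix} b_1 \\ b_2 \end{bmatrix} \in \mathbb{R}^{2 \times 1}, \qquad B^T = \begin{bmatrix} b_1 & b_2 \end{bmatrix} \in \mathbb{R}^{1 \times 2},$$

so the entrywise sum $B + B^T$ is simply not defined, even though the obvious isomorphisms send both to $(b_1, b_2) \in \mathbb{R}^2$. More generally, a matrix $A \in \mathbb{R}^{m \times n}$ has $A^T \in \mathbb{R}^{n \times m}$, and these two spaces coincide exactly when $m = n$, which is why $A + A^T$ makes sense only for square matrices.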
