The rule for the sum of matrices multiplied by the transpose of their sum

linear algebra, matrices, transpose

Let's say we have the expression $(XW + e)$, where each of the terms can be a matrix, and we want to multiply it by its transpose, so that we have $(XW+e)(XW+e)'$.

It seems to work similarly to expanding the square of a binomial. We get

$$XWW'X' + 2XWe' + ee'.$$

Why is this so, and what is the rule here?
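For concreteness, here is a quick numerical check (a sketch in NumPy; the row-vector shapes below are just one illustrative choice of conformable sizes, picked for this example):

```python
import numpy as np

rng = np.random.default_rng(0)

# One illustrative choice of shapes: X is 1 x k, W is k x n, e is 1 x n,
# so X @ W and e are row vectors and every product below is 1 x 1.
k, n = 3, 5
X = rng.standard_normal((1, k))
W = rng.standard_normal((k, n))
e = rng.standard_normal((1, n))

lhs = (X @ W + e) @ (X @ W + e).T

# Expansion with the doubled cross term, as stated above
rhs = X @ W @ W.T @ X.T + 2 * X @ W @ e.T + e @ e.T

print(np.allclose(lhs, rhs))  # True
```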

Best Answer

Start from the full expansion and pick out the two middle (cross) terms, $eW'X' + XWe'$. If these products are scalars (i.e. $1\times1$), then each of them equals its own transpose, so we may transpose either one. Taking the transpose of the first term gives $(eW'X')' = XWe'$, which is the same as the second term, so the two add together to form $2XWe'$.
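Spelled out, this uses the two transpose rules $(A+B)' = A' + B'$ and $(AB)' = B'A'$. The full expansion, valid for any conformable matrices, is

$$(XW+e)(XW+e)' = XW(XW)' + XWe' + e(XW)' + ee' = XWW'X' + XWe' + eW'X' + ee'.$$

When $XWe'$ is $1\times1$ (or, more generally, symmetric) it equals its own transpose $eW'X'$, so the two cross terms collapse into $2XWe'$ and you recover the three-term form in the question. For general matrix shapes, the four-term expansion is the one that always holds.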