[Math] Not understanding derivative of a matrix-matrix product.

calculus, derivatives, matrices, matrix-calculus

I am trying to figure out the derivative of a matrix-matrix multiplication, but to no avail. This document seems to show me the answer, but I am having a hard time parsing and understanding it.

Here is my problem: we have $\mathbf{D} \in \Re^{m \times n}$, $\mathbf{W} \in \Re^{m \times q}$, and $\mathbf{X} \in \Re^{q \times n}$. Furthermore, $\mathbf{D} = \mathbf{W}\mathbf{X}$ (NOT an element-wise multiplication, but a normal matrix-matrix multiply).

I am trying to derive the derivative of $\mathbf{D}$ w.r.t. $\mathbf{W}$, and the derivative of $\mathbf{D}$ w.r.t. $\mathbf{X}$.

The class notes this is taken from seem to indicate that
$$
\frac{\partial \mathbf{D}}{\partial \mathbf{W}} = \mathbf{X}^{T} \text{ and that } \frac{\partial \mathbf{D}}{\partial \mathbf{X}} = \mathbf{W}^{T},
$$

but I am floored as to how he derived this. Furthermore, in taking the derivatives, we are asking ourselves how every element in $\mathbf{D}$ changes with perturbations of every element in, say, $\mathbf{X}$, so wouldn't the resulting combinations blow up to be a lot more than what $\mathbf{W}^{T}$ has? I can't even see how the dimensionality is right here.

EDIT: I'd like to add the context of this question. It's coming from here, and here is my marked screenshot of my problem. How are they deriving those terms? (Note: I understand the chain-rule aspect, and I am not wondering about that. I am asking about the simpler intermediate step.)

[marked screenshot of the class notes]

Thanks.

Best Answer

For the first question alone (without context), I'm going to prove something else first (then check the $\boxed{\textbf{EDIT}}$ for what is actually asked):

Suppose we have three matrices $A$, $X$, $B$ that are $n\times p$, $p\times r$, and $r\times m$, respectively. Any element $w_{ij}$ of their product $W=AXB$ is expressed by:

$$w_{ij}=\sum_{h=1}^r\sum_{t=1}^p a_{it}x_{th}b_{hj}$$ Then we can show that $$s=\frac{\partial w_{ij}}{\partial x_{dc}}=a_{id}b_{cj}$$ (because all terms, except the one multiplied by $x_{dc}$, vanish).

One might deduce (in an almost straightforward way) that the matrix $S$ collecting these partial derivatives is the Kronecker product of $B^T$ and $A$, so that $$\frac{\partial AXB}{\partial X}=B^T\otimes A$$

Replacing either $A$ or $B$ with the appropriate identity matrix gives you the derivative you want.
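As a quick numerical sanity check (my addition, not part of the original answer), here is a minimal NumPy sketch of the equivalent identity $\operatorname{vec}(AXB)=(B^T\otimes A)\operatorname{vec}(X)$ under the column-major vec convention; the matrix sizes and random entries are arbitrary choices.

```python
# Minimal sketch: check vec(AXB) = (B^T kron A) vec(X), i.e. the Jacobian of
# vec(AXB) w.r.t. vec(X) is B^T ⊗ A (column-major vec convention assumed).
import numpy as np

rng = np.random.default_rng(0)
n, p, r, m = 2, 3, 4, 5                      # A: n x p, X: p x r, B: r x m
A = rng.standard_normal((n, p))
X = rng.standard_normal((p, r))
B = rng.standard_normal((r, m))

vec = lambda M: M.flatten(order="F")         # column-major vectorisation
print(np.allclose(vec(A @ X @ B), np.kron(B.T, A) @ vec(X)))   # True

# Replacing B with the identity specialises this to d vec(AX)/d vec(X) = I ⊗ A.
print(np.allclose(vec(A @ X), np.kron(np.eye(r), A) @ vec(X)))  # True
```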

$$\boxed{\textbf{EDIT}}$$

Upon reading the article you added (and after some sleep!), I've noticed that $dD$ is not $\partial D$ in their notation, but rather $\dfrac{\partial f}{\partial D}$, where $f$ is a certain function of $W$ and $X$ while $D=WX$ (the author stated at the beginning that he'd use the strictly-speaking incorrect expression "gradient on" something to mean "partial derivative with respect to" that same thing). This means that the first expression you're having problems with is $$\frac{\partial f}{\partial W}=\frac{\partial f}{\partial D}X^T$$ Any element of $\partial f/\partial W$ can be written as $\partial f/\partial W_{ij}$, and any element of $D$ is $$D_{ij}=\sum_{k=1}^qW_{ik}X_{kj}$$

We can write $$df=\sum_i\sum_j \frac{\partial f}{\partial D_{ij}}\,dD_{ij},$$ hence $$\frac{\partial f}{\partial W_{dc}}=\sum_{i,j} \frac{\partial f}{\partial D_{ij}}\frac{\partial D_{ij}}{\partial W_{dc}}=\sum_j \frac{\partial f}{\partial D_{dj}}\frac{\partial D_{dj}}{\partial W_{dc}}.$$ This last equality is true since all terms with $i\neq d$ drop off. Due to the product $D=WX$, we have $$\frac{\partial D_{dj}}{\partial W_{dc}}=X_{cj},$$ and so $$\frac{\partial f}{\partial W_{dc}}=\sum_j \frac{\partial f}{\partial D_{dj}}X_{cj}=\sum_j \frac{\partial f}{\partial D_{dj}}X^T_{jc}.$$
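To make the perturbation picture concrete, here is a small finite-difference sketch (again my own check, not the author's) of the entry-wise identity $\frac{\partial f}{\partial W_{dc}}=\sum_j \frac{\partial f}{\partial D_{dj}}X_{cj}$; the test function $f(D)=\sum_{ij}\sin(D_{ij})$, the sizes, and the chosen indices $(d,c)$ are arbitrary.

```python
# Check ∂f/∂W_{dc} = Σ_j (∂f/∂D_{dj}) X_{cj} for D = W X by central differences.
import numpy as np

rng = np.random.default_rng(1)
m, q, n = 4, 3, 5
W = rng.standard_normal((m, q))
X = rng.standard_normal((q, n))

f = lambda D: np.sum(np.sin(D))          # arbitrary smooth scalar test function
dfdD = np.cos(W @ X)                     # ∂f/∂D_{ij} = cos(D_{ij})

d, c, eps = 2, 1, 1e-6
W_plus, W_minus = W.copy(), W.copy()
W_plus[d, c] += eps
W_minus[d, c] -= eps
numeric = (f(W_plus @ X) - f(W_minus @ X)) / (2 * eps)   # central difference
analytic = np.sum(dfdD[d, :] * X[c, :])                  # Σ_j (∂f/∂D_{dj}) X_{cj}
print(np.isclose(numeric, analytic))                     # True
```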

This means that the matrix $\partial f/\partial W$ is the product of $\partial f/\partial D$ and $X^T$. I believe this is what you're trying to grasp, and what's asked of you in the last paragraph of the screenshot. Also, as the paragraph after the screenshot hints, you could have started out with small matrices to work this out, noticed the pattern, and then generalized, as I attempted to do directly in the proof above. The same reasoning proves the second expression, $\dfrac{\partial f}{\partial X}=W^T\dfrac{\partial f}{\partial D}$, as well.
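Finally, here is a short sketch of the two matrix-level results, $\frac{\partial f}{\partial W}=\frac{\partial f}{\partial D}X^T$ and $\frac{\partial f}{\partial X}=W^T\frac{\partial f}{\partial D}$, checked against central finite differences (my addition; the test function $f(D)=\tfrac12\sum_{ij}D_{ij}^2$, for which $\partial f/\partial D = D$, is just a convenient choice). Note that both gradients come out with the same shapes as $W$ and $X$: the full four-index object never appears on its own, only contracted against $\partial f/\partial D$, which is the dimensional point you were worried about.

```python
# Check ∂f/∂W = (∂f/∂D) X^T and ∂f/∂X = W^T (∂f/∂D) for D = W X,
# with the test function f(D) = 0.5 * Σ D_{ij}^2, so that ∂f/∂D = D.
import numpy as np

rng = np.random.default_rng(2)
m, q, n = 4, 3, 5
W = rng.standard_normal((m, q))
X = rng.standard_normal((q, n))
D = W @ X

dfdW = D @ X.T            # (∂f/∂D) X^T, shape m x q -- same shape as W
dfdX = W.T @ D            # W^T (∂f/∂D), shape q x n -- same shape as X

def num_grad(f, M, eps=1e-6):
    """Entry-by-entry central-difference gradient of a scalar function f."""
    G = np.zeros_like(M)
    for idx in np.ndindex(M.shape):
        Mp, Mm = M.copy(), M.copy()
        Mp[idx] += eps
        Mm[idx] -= eps
        G[idx] = (f(Mp) - f(Mm)) / (2 * eps)
    return G

f_of_W = lambda Wt: 0.5 * np.sum((Wt @ X) ** 2)
f_of_X = lambda Xt: 0.5 * np.sum((W @ Xt) ** 2)
print(np.allclose(dfdW, num_grad(f_of_W, W)))   # True
print(np.allclose(dfdX, num_grad(f_of_X, X)))   # True
```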