Equation (3.89) seems wrong in Bishop pattern recognition & machine learning book

In Bishop's pattern recognition & machine learning book, I seem to have found a serious mistake in a math equation; serious because all of the subsequent arguments rely on it.

It is eq. (3.89) on page 168:

$$
0 = \frac{M}{2\alpha} - \frac{1}{2}\mathbf{m}_N^T\mathbf{m}_N - \frac{1}{2}\sum_{i}{\frac{1}{\lambda_i + \alpha}}
$$

The above equation is obtained by differentiating eq. (3.86) with respect to $\alpha$:

$$
\ln p(\mathbf{t}|\alpha, \beta)=(M/2)\ln \alpha +(N/2)\ln\beta -E(\mathbf{m}_N)-(1/2)\ln |\mathbf{A}|-(N/2)\ln(2\pi)
$$

where
$$
E(\mathbf{m}_N) = (\beta/2)||\mathbf{t}-\mathbf{\Phi}\mathbf{m}_N||^2 +(\alpha/2)\mathbf{m}_N^T\mathbf{m}_N
$$
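
For reference, the remaining $\alpha$-dependent terms of (3.86) are unproblematic: if $\lambda_i$ denote the eigenvalues of $\beta\boldsymbol{\Phi}^{T}\boldsymbol{\Phi}$ (as in the book, and assuming they are treated as independent of $\alpha$), then $\mathbf{A}$ has eigenvalues $\lambda_i + \alpha$ and

$$
\frac{\partial}{\partial\alpha}\left(\frac{M}{2}\ln\alpha\right) = \frac{M}{2\alpha}, \qquad
\frac{\partial}{\partial\alpha}\left(\frac{1}{2}\ln|\mathbf{A}|\right) = \frac{1}{2}\frac{\partial}{\partial\alpha}\sum_i\ln(\lambda_i+\alpha) = \frac{1}{2}\sum_i\frac{1}{\lambda_i+\alpha}
$$

so the only term in question is $E(\mathbf{m}_N)$.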

However, because $\mathbf{m}_N$ depends on $\alpha$, the derivative cannot simply be $\frac{\partial{E(\mathbf{m}_N)}}{\partial\alpha}= (1/2)\,\mathbf{m}_N^T\mathbf{m}_N$.

The correct derivative should instead be:

$$
\frac{\partial{E(\mathbf{m}_N)}}{\partial\alpha} = \{\beta\mathbf{\Phi}^T(\mathbf{\Phi}\mathbf{m}_N-\mathbf{t}) + \alpha\mathbf{m}_N\}^T\frac{\partial\mathbf{m}_N}{\partial\alpha}+\frac{1}{2}\mathbf{m}_N^T\mathbf{m}_N
$$

Or am I making a big mistake?

Best Answer

You are not making a mistake, you just need to go one step further. First, note that $\mathbf{m}_{N}=\beta \mathbf{A}^{-1} \boldsymbol{\Phi}^{T} \mathbf{t}$ with $\mathbf{A} = \alpha \mathbf{I} + \beta \boldsymbol{\Phi}^{T}\boldsymbol{\Phi}$. With that in mind, we can start by working out your expression

$$ \frac{\partial E\left(\mathbf{m}_{N}\right)}{\partial \alpha}=\left\{\beta \boldsymbol{\Phi}^{T}\left(\boldsymbol{\Phi} \mathbf{m}_{N}-\mathbf{t}\right)+\alpha \mathbf{m}_{N}\right\}^{T} \frac{\partial \mathbf{m}_{N}}{\partial \alpha}+\frac{1}{2} \mathbf{m}_{N}^{T} \mathbf{m}_{N} $$

Now, if we take a closer look, we can find that:

$$ \left\{\beta \boldsymbol{\Phi}^{T}\left(\boldsymbol{\Phi} \mathbf{m}_{N}-\mathbf{t}\right)+\alpha \mathbf{m}_{N}\right\}^{T} \frac{\partial \mathbf{m}_{N}}{\partial \alpha} = \left\{ \beta \boldsymbol{\Phi}^{T}\boldsymbol{\Phi}\mathbf{m}_{N} + \alpha \mathbf{m}_{N} - \beta \boldsymbol{\Phi}^{T}\mathbf{t} \right\}^{T}\frac{\partial \mathbf{m}_{N}}{\partial \alpha} $$

which is the same as $\left\{ \mathbf{A}\mathbf{m}_{N} - \beta \boldsymbol{\Phi}^{T}\mathbf{t} \right\}^{T}\frac{\partial \mathbf{m}_{N}}{\partial \alpha} = \left\{ \beta \mathbf{A}\mathbf{A}^{-1} \boldsymbol{\Phi}^{T} \mathbf{t} - \beta \boldsymbol{\Phi}^{T}\mathbf{t}\right\}^{T} \frac{\partial \mathbf{m}_{N}}{\partial \alpha}= 0$. This means that $\frac{\partial E\left(\mathbf{m}_{N}\right)}{\partial \alpha}=\frac{1}{2} \mathbf{m}_{N}^{T} \mathbf{m}_{N}$, which is exactly what (3.89) uses.
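
If it helps, here is a quick numerical sanity check of that conclusion: a minimal NumPy sketch (toy data, arbitrary sizes, all names hypothetical) that re-solves $\mathbf{m}_N$ at perturbed values of $\alpha$ and compares the finite-difference derivative of $E(\mathbf{m}_N)$ with $\frac{1}{2}\mathbf{m}_N^T\mathbf{m}_N$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy design matrix and targets (hypothetical sizes, only for the check)
N, M = 50, 6
Phi = rng.normal(size=(N, M))
t = rng.normal(size=N)
beta = 2.0

def m_N(alpha):
    """Posterior mean m_N = beta * A^{-1} Phi^T t, with A = alpha*I + beta*Phi^T Phi."""
    A = alpha * np.eye(M) + beta * Phi.T @ Phi
    return beta * np.linalg.solve(A, Phi.T @ t)

def E(alpha):
    """E(m_N) = (beta/2)||t - Phi m_N||^2 + (alpha/2) m_N^T m_N, with m_N re-solved at this alpha."""
    m = m_N(alpha)
    return 0.5 * beta * np.sum((t - Phi @ m) ** 2) + 0.5 * alpha * m @ m

alpha, eps = 0.7, 1e-6

# Total derivative of E w.r.t. alpha by central finite differences
# (this includes the implicit dependence of m_N on alpha)
dE_numeric = (E(alpha + eps) - E(alpha - eps)) / (2 * eps)

# Claimed closed form: dE/dalpha = (1/2) m_N^T m_N
m = m_N(alpha)
dE_closed = 0.5 * m @ m

print(dE_numeric, dE_closed)  # the two should agree up to finite-difference error
```

The two values should agree up to finite-difference error, precisely because the implicit dependence of $\mathbf{m}_N$ on $\alpha$ contributes nothing at the minimizer.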