Calculus – Proving that $\int \exp\{-\lambda w^2\} \, dw = \sqrt{\frac{\lambda}{2\pi}}$

calculus, gaussian-integral, integration, machine learning, statistics

In machine learning, when using L2 Regularization we add a penalty of $\lambda \lVert w \rVert^2$ to our error function to get:

$E = \sum_n (y_n - \hat{y}_n)^2 + \lambda\lVert w \rVert^2$ as our new error function.
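For concreteness, here is a minimal NumPy sketch of this penalized error; the values of $y_n$, $\hat{y}_n$, $w$, and $\lambda$ below are made up purely for illustration:

```python
import numpy as np

# Hypothetical values, purely for illustration.
y     = np.array([1.0, 2.0, 0.5])   # targets y_n
y_hat = np.array([0.9, 2.2, 0.4])   # predictions \hat{y}_n
w     = np.array([0.3, -0.7])       # weight vector
lam   = 0.1                         # regularization strength lambda

# E = sum of squared residuals + lambda * ||w||^2
E = np.sum((y - y_hat) ** 2) + lam * np.dot(w, w)
print(E)  # 0.06 + 0.1 * 0.58 = 0.118
```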

This is equivalent to the negative log-likelihood, which we want to minimize. In order to maximize the likelihood we multiply both sides by $-1$ and get:

$-E = -\sum_n (y_n - \hat{y}_n)^2 - \lambda\lVert w \rVert^2$

which is equivalent to the log-likelihood. Then, exponentiating the function, we obtain:

$\exp(-E) = \left[\prod_n \exp\{-(y_n - w^T x_n)^2\}\right]\exp\{-\lambda\lVert w \rVert^2\}$.

Now I am told this yields two Gaussians; the one of interest to me is the second one:

$\exp\{-\lambda\lVert w \rVert^2\}$.

This is a Gaussian with $\mu = 0$ and $\sigma^2 = \frac{1}{\lambda}$.

I can intuitively get this, since the Gaussian takes the form $\exp\{-\frac{1}{2}\frac{(x-\mu)^2}{\sigma^2}\}$, which is then multiplied by some constant as a normalization factor.

But when I tried to convince myself, I came to the point that $\int\exp\{-\lambda\lVert w \rVert^2\} \, dw = \sqrt{\frac{\lambda}{2\pi}}$. I can't get any further than this. Is there a way I can finish proving this equality to be true?

Best Answer

From the PDF of the $N(0, \sigma^2)$ distribution, we have $$\frac{1}{\sqrt{2\pi \sigma^2}} \int_{\mathbb{R}} \exp\{-\frac{z^2}{2 \sigma^2}\} \, dz = 1.$$

Taking $\sigma^2 = \frac{1}{2 \lambda}$ yields $$\int_{\mathbb{R}} \exp\{-\lambda z^2\} \, dz = \sqrt{\pi / \lambda}.$$
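As a quick sanity check, this value can be verified numerically with `scipy.integrate.quad`; the particular $\lambda$ below is arbitrary and chosen only for the check:

```python
import numpy as np
from scipy.integrate import quad

lam = 2.5  # arbitrary positive lambda, used only for this check

# Integrate exp(-lambda * z^2) over the real line numerically.
numeric, _ = quad(lambda z: np.exp(-lam * z**2), -np.inf, np.inf)

print(numeric)               # ~1.1210
print(np.sqrt(np.pi / lam))  # same value, sqrt(pi / lambda)
```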

Writing $\|w\|^2 = w_1^2 + \cdots + w_d^2$, we have $$\int_{\mathbb{R}^d} \exp\{-\lambda \|w\|^2\} \, dw = \left(\int_{\mathbb{R}} \exp\{-\lambda w_1^2\} \, dw_1 \right)^d = \left(\frac{\pi}{\lambda}\right)^{d/2}.$$
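The factorization can likewise be checked numerically, say for $d = 2$, by comparing the full integral against the $d$-th power of the one-dimensional integral and against $(\pi/\lambda)^{d/2}$; again, the $\lambda$ and $d$ below are arbitrary:

```python
import numpy as np
from scipy.integrate import quad, nquad

lam, d = 1.7, 2  # arbitrary lambda and dimension, used only for this check

# One-dimensional factor: integral of exp(-lambda * w_i^2) over R.
one_dim, _ = quad(lambda t: np.exp(-lam * t**2), -np.inf, np.inf)

# Full d-dimensional integral of exp(-lambda * ||w||^2).
full, _ = nquad(lambda *w: np.exp(-lam * sum(x**2 for x in w)),
                [(-np.inf, np.inf)] * d)

print(full)                      # d-dimensional integral
print(one_dim ** d)              # product of d one-dimensional integrals
print((np.pi / lam) ** (d / 2))  # closed form (pi / lambda)^(d/2)
```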
