[Math] Where do the mean and standard deviation go in the error function

definite-integrals, error-function, normal-distribution

The error function is defined as

$$ \textrm{erf}(x) = \frac{2}{\sqrt{\pi}} \int_0^x e^{-t^2}dt~.$$
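
As a quick sanity check of this definition, Python's standard-library `math.erf` can be compared against a direct quadrature of the integral (the Simpson's-rule helper and the sample point $x=1.3$ are just illustrative choices):

```python
import math

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2*i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2*i * h) for i in range(1, n // 2))
    return s * h / 3

x = 1.3
erf_by_quadrature = (2 / math.sqrt(math.pi)) * simpson(lambda t: math.exp(-t*t), 0.0, x)
print(erf_by_quadrature, math.erf(x))  # the two values agree to many decimal places
```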

However, the normal distribution can take a more general form than the definition of the error function seems to admit, at least at first glance:

$$f(x | \mu,\sigma) = \frac{1}{\sigma\sqrt{2\pi}}e^{\frac{-(x-\mu)^2}{2\sigma^2}} ~.$$
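
One way to see that this is the right general form is to verify numerically that it integrates to $1$ for arbitrary $\mu$ and $\sigma$; a midpoint-rule sketch with illustrative parameter values:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of the normal distribution with mean mu and standard deviation sigma."""
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# Midpoint-rule check that the density integrates to 1 (mu and sigma are arbitrary choices)
mu, sigma, n = 1.7, 0.8, 200_000
a, b = mu - 12 * sigma, mu + 12 * sigma   # mass beyond 12 sigma is negligible
h = (b - a) / n
total = sum(normal_pdf(a + (i + 0.5) * h, mu, sigma) for i in range(n)) * h
print(total)  # ≈ 1.0
```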

I am wondering how to properly account for $\mu$ and $\sigma$ in the error function. Starting with
$$\int_0^x \frac{1}{\sigma\sqrt{2\pi}}e^{\frac{-(x'-\mu)^2}{2\sigma^2}}dx'$$
and making the two substitutions
$$u = x'-\mu ~,$$
$$t = \frac{u}{\sqrt{2}\sigma} ~,$$
which map the limits $x'=0$ and $x'=x$ to $t=-\mu/(\sigma\sqrt{2})$ and $t=(x-\mu)/(\sigma\sqrt{2})$, I get
$$ \frac{1}{\sqrt{\pi}} \int_{-\mu/(\sigma\sqrt{2})}^{(x-\mu)/(\sigma\sqrt{2})} e^{-t^2}dt ~.$$
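
A numerical comparison of the two sides of this substitution (with the lower limit $x'=0$ mapped to $t=-\mu/(\sigma\sqrt{2})$; the values of $x$, $\mu$, $\sigma$ and the Simpson helper are illustrative):

```python
import math

def simpson(f, a, b, n=2000):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2*i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2*i * h) for i in range(1, n // 2))
    return s * h / 3

x, mu, sigma = 2.0, 0.5, 1.3
# Left side: integral of the normal density from 0 to x
lhs = simpson(lambda xp: math.exp(-(xp - mu)**2 / (2 * sigma**2))
              / (sigma * math.sqrt(2 * math.pi)), 0.0, x)
# Right side: the substituted integral with the mapped limits
lo, hi = -mu / (sigma * math.sqrt(2)), (x - mu) / (sigma * math.sqrt(2))
rhs = simpson(lambda t: math.exp(-t*t), lo, hi) / math.sqrt(math.pi)
print(lhs, rhs)  # the two sides agree
```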

This is identical to the error function except for the modified limits of integration and a factor of 2. I am not sure where that factor comes from, but my guess is that it has to do with the evenness of the normal distribution when $\mu=0$. That factor of 2 aside (the normalization of the error function isn't important in my particular use of it), is this

  1. correct?
  2. the standard way of accounting for $\mu$ and $\sigma$ in the error function?

Best Answer

The cumulative distribution function of the standard normal distribution can be written in terms of the error function as

$$\Phi(z) = \frac{1}{2}+\frac{1}{2}\textrm{erf}\left(\frac{z}{\sqrt{2}}\right) ~.$$

When we want to standardise a normal distribution, we make the substitution

$$z = \frac{x - \mu}{\sigma} ~.$$

Using this, we get

$$\Phi\left(\frac{x - \mu}{\sigma}\right) = \frac{1}{2}+\frac{1}{2}\textrm{erf}\left(\frac{x - \mu}{\sigma\sqrt{2}}\right) ~.$$

And so the integral we need to calculate is, almost as you said,

$$\textrm{erf}\left(\frac{x - \mu}{\sigma\sqrt{2}}\right) = \frac{2}{\sqrt{\pi}} \int_0^{(x-\mu)/(\sigma\sqrt{2})} e^{-t^2}dt ~.$$
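
The CDF relation can be checked numerically against Python's built-in `math.erf`, truncating the CDF integral at $\mu - 12\sigma$, where the density is negligible (the parameter values and Simpson helper are illustrative):

```python
import math

def simpson(f, a, b, n=4000):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2*i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2*i * h) for i in range(1, n // 2))
    return s * h / 3

x, mu, sigma = 1.0, -0.4, 2.1
pdf = lambda xp: (math.exp(-(xp - mu)**2 / (2 * sigma**2))
                  / (sigma * math.sqrt(2 * math.pi)))

cdf_numeric = simpson(pdf, mu - 12 * sigma, x)   # P(X <= x) by quadrature
cdf_via_erf = 0.5 + 0.5 * math.erf((x - mu) / (sigma * math.sqrt(2)))
print(cdf_numeric, cdf_via_erf)  # the two values agree
```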