[Math] How to calculate the mean and variance of the logistic distribution


I need a formula for the mean and variance of the logistic distribution so that I can fit some data I have to it. I know that the mean is $\int_{-\infty}^\infty xf(x)\,dx$, where $f(x)$ is the pdf of the distribution, which is $$f(x)=\frac{e^{-\frac{x-\mu}{\sigma}}}{{(1+e^{-\frac{x-\mu}{\sigma}})}^2} $$
Is there a closed-form solution for this integral, or is there an easier method for calculating the mean? I found a proof that the variance is $\frac{\pi^2}{3}$ for the normalized distribution, but I couldn't pull off the same trick for my problem.

I normalized the pdf and substituted, which gives $$\text{mean}=\int_{-\infty}^\infty x\frac{e^{-x}}{{(1+e^{-x})}^2}\,dx $$
Split the integral:
$$ =\int_{-\infty}^0 x\frac{e^{-x}}{{(1+e^{-x})}^2}\,dx+\int_0^\infty x\frac{e^{-x}}{{(1+e^{-x})}^2}\,dx $$
I saw something similar for the second integral, but I don't quite understand it:
$$\int_0^\infty x\frac{e^{-x}}{{(1+e^{-x})}^2}\,dx=\int_0^\infty x\sum_{n=1}^\infty n(-1)^{n-1}e^{-nx}\,dx=\sum_{n=1}^\infty n(-1)^{n-1}\int_0^\infty xe^{-nx}\,dx $$
Doing the integration by parts:
$$=\sum_{n=1}^\infty n(-1)^{n-1}\int_0^\infty xe^{-nx}\,dx= \sum_{n=1}^\infty n(-1)^{n-1}\frac{1}{n^2}=\sum_{n=1}^\infty\frac{(-1)^{n-1}}{n} $$
However, I can't proceed from here.
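
A quick numerical cross-check of these steps (a sketch, assuming NumPy and SciPy are available; `scipy.stats.logistic.pdf` is the same normalized pdf as above):

```python
# Numerical cross-check of the computation above (sketch; assumes NumPy/SciPy).
import numpy as np
from scipy.integrate import quad
from scipy.stats import logistic   # logistic.pdf(x) == exp(-x) / (1 + exp(-x))**2

# Second piece: integral of x * f(x) over [0, inf).
right, _ = quad(lambda x: x * logistic.pdf(x), 0, np.inf)

# Partial sum of sum_{n>=1} n * (-1)^(n-1) / n^2 = sum_{n>=1} (-1)^(n-1) / n.
n = np.arange(1, 100001)
series = np.sum((-1.0) ** (n - 1) / n)
print(right, series)                 # agree to about 4 decimals (~0.6931)

# First piece: x * f(x) is odd, so the integral over (-inf, 0] cancels the second piece.
left, _ = quad(lambda x: x * logistic.pdf(x), -np.inf, 0)
print(left + right)                  # ~0
```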

So, my questions are:

  1. How is this possible? $\frac{e^{-x}}{(1+e^{-x})^2}=\sum_{n=1}^\infty n(-1)^{n-1}e^{-nx}$ (the identity is sketched after this list)

  2. How do I proceed with evaluating the series, and what do I do with the first integral? Or is there an existing answer or an easier way to find the mean of the logistic distribution?

  3. Here I assumed normalized data, so I didn't use the form $f(x)=\frac{e^{-\frac{x-\mu}{\sigma}}}{{(1+e^{-\frac{x-\mu}{\sigma}})}^2} $ because I thought it might be easier to normalize my data and work with that. But is there another method that works directly on the raw data?
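
The identity from question 1 appears to be the geometric series differentiated term by term: for $|u|<1$,
$$\frac{1}{1+u}=\sum_{n=0}^\infty(-1)^nu^n\quad\Longrightarrow\quad\frac{1}{(1+u)^2}=-\frac{d}{du}\,\frac{1}{1+u}=\sum_{n=1}^\infty n(-1)^{n-1}u^{n-1}\,,$$
so setting $u=e^{-x}$ (which satisfies $0<u<1$ for $x>0$) and multiplying by $e^{-x}$ gives $\frac{e^{-x}}{(1+e^{-x})^2}=\sum_{n=1}^\infty n(-1)^{n-1}e^{-nx}$.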

I can find the mean and variance of my dataset using MATLAB, but I'm interested in calculating them myself.

Best Answer

First of all, the pdf of $X$ should be of the form

$$f(x)=\frac{1}{\sigma}\cdot\frac{\exp\left(-\frac{x-\mu}{\sigma}\right)}{\left[1+\exp\left(-\frac{x-\mu}{\sigma}\right)\right]^2}\,,\quad x\in\mathbb R$$

where $\mu$ is real and $\sigma$ is positive.

Normalising the pdf by taking $Y=(X-\mu)/\sigma$, we get the standard logistic pdf $$g(y)=\frac{e^{-y}}{\left(1+e^{-y}\right)^2}\,,\quad y\in\mathbb R$$
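
A quick numerical check of both points (a sketch, assuming NumPy and SciPy; `scipy.stats.logistic` with `loc`/`scale` is exactly the corrected density above):

```python
# Check the 1/sigma factor and the standardization (sketch; assumes NumPy/SciPy).
import numpy as np
from scipy.integrate import quad
from scipy.stats import logistic

mu, sigma = 2.0, 1.5   # arbitrary example parameters

# With the 1/sigma factor the density integrates to 1 ...
print(quad(lambda x: logistic.pdf(x, loc=mu, scale=sigma), -np.inf, np.inf)[0])          # ~1.0
# ... without it (the form in the question) it integrates to sigma instead.
print(quad(lambda x: sigma * logistic.pdf(x, loc=mu, scale=sigma), -np.inf, np.inf)[0])  # ~1.5

# The standardized variable Y = (X - mu)/sigma has the standard logistic pdf g(y).
y = 0.7
print(logistic.pdf(y), np.exp(-y) / (1 + np.exp(-y)) ** 2)                               # equal
```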

Now,

\begin{align}E(Y)&=\int_{\mathbb R}\frac{ye^{-y}}{\left(1+e^{-y}\right)^2}\,dy \\&=\int_0^1\ln\left(\frac{z}{1-z}\right)\,dz\qquad\left[y\mapsto z\text{ such that } z=\frac{1}{1+e^{-y}}\right] \\&=\int_0^1\ln z\,dz-\int_0^1\ln(1-z)\,dz \\&=\int_0^1\ln z\,dz-\int_0^1\ln z\,dz\qquad\left[\text{ using }\int_0^af(x)\,dx=\int_0^af(a-x)\,dx\right] \\&=0 \end{align}

Hence, since $X=\mu+\sigma Y$, $$E(X)=\mu$$
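
Numerically (a sketch, assuming NumPy and SciPy), both the substituted integral and the resulting mean check out for an arbitrary choice of $\mu$ and $\sigma$:

```python
# Numerical check of E(Y) = 0 and E(X) = mu (sketch; assumes NumPy/SciPy).
import numpy as np
from scipy.integrate import quad
from scipy.stats import logistic

# The substituted integral: integral over (0, 1) of ln(z / (1 - z)) dz.
print(quad(lambda z: np.log(z / (1 - z)), 0, 1)[0])                                   # ~0

mu, sigma = 2.0, 1.5   # arbitrary example parameters
print(quad(lambda x: x * logistic.pdf(x, loc=mu, scale=sigma), -np.inf, np.inf)[0])   # ~2.0
print(logistic.mean(loc=mu, scale=sigma))                                             # 2.0
```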

For the variance, see this post. It derives the result $$E(Y^2)=\operatorname{Var}(Y)=\frac{\pi^2}{3}$$

Since $\operatorname{Var}(X)=\sigma^2\operatorname{Var}(Y)$, we get $$\operatorname{Var}(X)=\frac{\pi^2\sigma^2}{3}$$
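
And the corresponding numerical check for the variance (same sketch assumptions as before):

```python
# Numerical check of Var(Y) = pi^2/3 and Var(X) = pi^2 sigma^2 / 3 (sketch; NumPy/SciPy).
import numpy as np
from scipy.integrate import quad
from scipy.stats import logistic

# Standard logistic: E(Y^2) = Var(Y), since E(Y) = 0.
print(quad(lambda y: y**2 * logistic.pdf(y), -np.inf, np.inf)[0], np.pi**2 / 3)   # both ~3.2899

mu, sigma = 2.0, 1.5   # same example parameters as before
print(logistic.var(loc=mu, scale=sigma), np.pi**2 * sigma**2 / 3)                 # both ~7.402
```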

We can also derive the moments using the moment generating function, but the calculation is slightly more involved.
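
For reference, a sketch of that route, using the same substitution $z=\frac{1}{1+e^{-y}}$ as above: for $|t|<1$,
$$M_Y(t)=E\left(e^{tY}\right)=\int_0^1\left(\frac{z}{1-z}\right)^{t}dz=\mathrm{B}(1+t,\,1-t)=\Gamma(1+t)\,\Gamma(1-t)=\frac{\pi t}{\sin(\pi t)}\,,$$
and expanding $\frac{\pi t}{\sin(\pi t)}=1+\frac{\pi^2t^2}{6}+O(t^4)$ recovers $E(Y)=M_Y'(0)=0$ and $E(Y^2)=M_Y''(0)=\frac{\pi^2}{3}$.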