First Approximation of Expected Value of Positive Part of a Random Variable

probability, random-variables, random-functions, taylor-expansion

Consider a random variable $X$ with mean zero ($\mu_X = 0$), known variance $\sigma_X^2$, and all other moments finite but unknown. I would like an estimate of the expected value of the positive part of this random variable: given $X^{+} \equiv \max(0, X)$, I want $\mathbb{E}(X^{+})$. Preferably the estimate would depend only on the variance, since I have no other information, but this may not be possible.

It is simple to apply the standard Taylor-series approach to this problem, e.g., taking $f$ to be the positive-part function:

$$\mathbb{E}\left[f(X)\right]\approx f(\mu _{X})+{\frac {f''(\mu _{X})}{2}}\sigma _{X}^{2}$$

However, since $\mu_X = 0$, this requires $f''(\mu_X) = f''(0)$, which is undefined for the positive-part function. It is easy to construct a smooth function that converges to $x^+$ in some limit and has a well-defined second derivative at zero, but such a smoothing is not unique, so I don't expect the estimate it produces to carry over to $X^+$ itself.
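To illustrate the problem with smoothing, one possible surrogate (my own illustrative choice, not something singled out by the problem) is the softplus function $f_\beta(x) = \log(1 + e^{\beta x})/\beta$, which converges pointwise to $x^+$ as $\beta \to \infty$ but has $f_\beta''(0) = \beta/4$. The second-order Taylor estimate then depends on the arbitrary smoothing parameter $\beta$ and diverges with it:

```python
import math

def taylor_estimate(sigma, beta):
    """Second-order Taylor estimate of E[f_beta(X)] around mu = 0 for the
    softplus surrogate f_beta(x) = log(1 + exp(beta*x)) / beta, which has
    f_beta(0) = log(2)/beta and f_beta''(0) = beta/4."""
    return math.log(2) / beta + (beta / 4) * sigma**2 / 2

# The estimate grows without bound as beta -> infinity, so it tells us
# nothing about E[X^+] itself.
for beta in (1.0, 10.0, 100.0):
    print(f"beta={beta:6.1f}  estimate={taylor_estimate(1.0, beta):.4f}")
```

Any other smoothing (e.g. a different mollifier) would give a different $f''(0)$, which is exactly the non-uniqueness noted above.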

It is not difficult to show that $\mathbb{E}(X^{+}) < \sigma_X$, but I'd prefer to know something like $\mathbb{E}(X^{+}) \approx \alpha \, \sigma_X$, where $\alpha$ is a constant to be determined. (Thanks to stud_iisc for noting that the inequality is strict.)
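Whether a single constant $\alpha$ can work across distributions can be probed numerically. The sketch below (standard library only; the function name is my own) estimates $\alpha = \mathbb{E}(X^+)/\sigma_X$ by Monte Carlo for three mean-zero distributions; the values differ, suggesting $\alpha$ is distribution-dependent:

```python
import random
import statistics

def alpha_estimate(samples):
    # Sample estimate of alpha = E[X^+] / sigma for (approximately) mean-zero data.
    sigma = statistics.pstdev(samples)
    pos_mean = sum(max(0.0, x) for x in samples) / len(samples)
    return pos_mean / sigma

random.seed(0)
n = 200_000
dists = {
    "gaussian N(0,1)": [random.gauss(0.0, 1.0) for _ in range(n)],      # alpha -> 1/sqrt(2*pi) ~ 0.399
    "uniform(-1, 1)":  [random.uniform(-1.0, 1.0) for _ in range(n)],   # alpha -> sqrt(3)/4 ~ 0.433
    "two-point +/-1":  [random.choice((-1.0, 1.0)) for _ in range(n)],  # alpha = 1/2 exactly
}
for name, xs in dists.items():
    print(f"{name:16s} alpha ~ {alpha_estimate(xs):.3f}")
```

Note that the two-point law $\pm\sigma$ attains $\alpha = 1/2$, consistent with the strict inequality $\mathbb{E}(X^+) < \sigma_X$.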

If it is necessary to assume that $X$ is Gaussian to get a result, that may be acceptable, though $X$ may not be Gaussian.

Best Answer

Special case 1: If $X\sim N(0,\sigma^2)$, then $$E(X^+)=\int_{0}^{\infty} x\cdot \frac{1}{\sqrt{2 \pi \sigma^2}}\, e^{-\frac{x^2}{2\sigma^2}}\,dx = \frac{\sigma}{\sqrt{2\pi}} \approx 0.3989\,\sigma.$$
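For completeness, the integral evaluates in closed form via the substitution $u = x^2/(2\sigma^2)$, so that $x\,dx = \sigma^2\,du$:

$$\int_{0}^{\infty} x\cdot \frac{1}{\sqrt{2 \pi \sigma^2}}\, e^{-\frac{x^2}{2\sigma^2}}\,dx = \frac{\sigma^2}{\sqrt{2\pi\sigma^2}}\int_{0}^{\infty} e^{-u}\,du = \frac{\sigma}{\sqrt{2\pi}}.$$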

Special case 2: If $X$ is a non-negative random variable, then $X^+ = X$, so $\mathbb{E}(X^+) = \mu_X = 0$ and $\alpha = 0$. (Indeed, a non-negative random variable with zero mean is zero almost surely.)