Expectation of inverse of the sum of positive random variables

expected value, probability, probability distributions, probability theory, probability-limit-theorems

Suppose we have a sequence of independent, identically distributed positive random variables $X_1,X_2,\cdots\stackrel{i.i.d.}{\sim}\xi$, and I am puzzled about the existence of the expectation
$$\mathbb{E}\frac{1}{X_1+\cdots+X_n}.$$ (Here, existence means the expectation is finite.)

For example, if $X_1,X_2,\cdots\stackrel{i.i.d.}{\sim}\xi\stackrel{d}{=}\chi^2(1)$ (chi-square distribution with 1 degree of freedom), then we know that $\mathbb{E}\frac{1}{X_1}=\infty$ and $\mathbb{E}\frac{1}{X_1+X_2}=\infty$, but $\mathbb{E}\frac{1}{X_1+\cdots+X_n}=\frac{1}{n-2}<\infty$ if $n\geq 3$.
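A quick Monte Carlo sanity check of this example (just an illustrative sketch, not part of the argument; the sample size is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# X_i ~ chi-square(1), so X_1 + ... + X_n ~ chi-square(n),
# and E[1/(X_1+...+X_n)] = 1/(n-2) for n >= 3.
for n in [3, 5, 10]:
    s = rng.chisquare(df=1, size=(200_000, n)).sum(axis=1)
    print(n, np.mean(1.0 / s), 1.0 / (n - 2))
```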

My question is: can we find a positive population distribution $\xi$ such that, for every $n\geq 1$ and $X_1,\cdots,X_n\stackrel{i.i.d.}{\sim}\xi$,
$$\mathbb{E}\frac{1}{X_1+\cdots+X_n}=\infty?$$

Best Answer

An alternative solution offered by my colleague, M.:

Let $\phi_X(t)=\mathbb{E} e^{-tX}$ be the Laplace transform of a r.v. $X>0$. Since $\int_0^\infty e^{-tx}\,dt=\frac1x$ for $x>0$, Tonelli's theorem gives $\mathbb{E}\frac1X=\int_0^\infty \phi_X(t)\,dt$. By independence, $\phi_{X_1+\dots+X_n}(t)=\left[\phi_{\xi}(t)\right]^n$, so $$ \mathbb{E}\frac1{X_1+\dots+X_n}=\int_0^\infty \phi_{X_1+\dots+X_n}(t)\,dt =\int_0^\infty \left[\phi_{\xi}(t)\right]^n dt. $$
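As a numerical illustration of the identity $\mathbb{E}\frac1X=\int_0^\infty \phi_X(t)\,dt$, here is a sketch using a $\Gamma(3,1)$ variable, for which both sides equal $1/2$ (the choice of distribution and the sample size are only for illustration):

```python
import numpy as np
from scipy.integrate import quad

# For X ~ Gamma(k, 1) with k = 3: phi_X(t) = (1+t)^(-k) and E[1/X] = 1/(k-1) = 0.5.
k = 3
integral, _ = quad(lambda t: (1.0 + t) ** (-k), 0.0, np.inf)

rng = np.random.default_rng(1)
mc = np.mean(1.0 / rng.gamma(shape=k, scale=1.0, size=500_000))

print(integral, mc, 1.0 / (k - 1))  # all three should be close to 0.5
```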

Now let $Z_k\sim \Gamma(k,1)$, i.e. $Z_k$ has the density $$ f_{Z_k}(x)=\frac{x^{k-1}e^{-x}}{\Gamma(k)} $$ and the Laplace transform $$ \phi_{Z_k}(t)=\frac1{(1+t)^k}. $$ Now assume that $k$ itself is random, say $k\sim \mathrm{Exp}(1)$, and let $\xi$ have the resulting mixture distribution of $Z_k$. Then $$ \phi_{\xi}(t)=\int_0^\infty \frac{e^{-k}}{(1+t)^k}\, dk=\frac1{1+\ln(1+t)}\sim \frac1{\ln t} $$ for large $t$. Since $[\phi_{\xi}(t)]^n\sim(\ln t)^{-n}$ decays slower than any power of $t$ and hence is not integrable at infinity, $$ \int_0^\infty [\phi_{\xi}(t)]^n\, dt=\infty $$ for all $n\ge 1$.
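A sketch that samples this mixture (draw $k\sim\mathrm{Exp}(1)$, then $\xi\mid k\sim\Gamma(k,1)$) and compares the empirical Laplace transform with the closed form $1/(1+\ln(1+t))$; the sample size and the grid of $t$ values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)

# Sample the mixture: k ~ Exp(1), then xi | k ~ Gamma(k, 1).
k = rng.exponential(scale=1.0, size=1_000_000)
xi = rng.gamma(shape=k, scale=1.0)

# Empirical Laplace transform vs. the closed form 1/(1 + ln(1+t)).
for t in [0.5, 2.0, 10.0, 100.0]:
    print(t, np.mean(np.exp(-t * xi)), 1.0 / (1.0 + np.log1p(t)))
```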