Distribution Function – Maximum of n IID Standard Uniform Random Variables with Poisson Distributed n

convergence, distributions, probability, self-study

I am studying probability theory on my own and am trying to work the following problem from the book:

Let $X_1, X_2, \ldots$ be independent, $U(0, 1)$-distributed random variables, and
let $N_m \in Po(m)$ be independent of $X_1, X_2, \ldots$.
Set $V_m = \max\{X_1, \ldots, X_{N_m}\}$ (with $V_m = 0$ when $N_m = 0$). Determine

(a) the distribution function of $V_m$,

(b) the moment generating function of $V_m$.

(c) Show that $E[V_m] \to 1$ as $m \to \infty$.

(d) Show that $m(1 - V_m)$ converges in distribution as $m \to \infty$, and determine the limit distribution.

I got stumped at the very first part, finding the distribution function of $V_m$. I know that the CDF of each $X_i$ is given by

$F_X(x) = \begin{cases} 0, & x < 0 \\ x, & 0 \leq x \leq 1 \\ 1, & x > 1 \end{cases}$

This implies that, conditional on $N_m = n$ (with $n \geq 1$), the CDF of $V_m$ should be given by

$F_{V_m \mid N_m = n}(x) = \begin{cases} 0, & x < 0 \\ x^n, & 0 \leq x \leq 1 \\ 1, & x > 1 \end{cases}$

Differentiating this gives the conditional density $f_{V_m \mid N_m = n}(x) = n x^{n-1}$. Then, to get to the CDF of $V_m$, I used the law of total probability to find its density:

$f_{V_m}(x) = \sum_{n=0}^{\infty} f_{V_m \mid N_m = n}(x) \cdot f_{N_m}(n) = \sum_{n=1}^{\infty} n x^{n-1} e^{-m} \frac{m^n}{n!} = m e^{m(x-1)}$

The problem is that when I integrate this over $x$ from 0 to 1, the answer is not 1, which implies that this might not be the correct pdf, and hence integrating it may not give me the correct CDF. Indeed, integrating it from 0 to $x$ gives $e^{m(x-1)} - e^{-m}$, which is not the correct answer.
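For what it's worth, here is a quick Monte Carlo sketch (assuming NumPy; the value of $m$ and the sample size are arbitrary) to compare the empirical CDF of $V_m$ against what my candidate density would give:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n_sims = 2.0, 100_000  # arbitrary choices for this check

# Simulate V_m = max(X_1, ..., X_{N_m}) with N_m ~ Poisson(m), and V_m = 0 when N_m = 0.
N = rng.poisson(m, size=n_sims)
V = np.array([rng.uniform(size=n).max() if n > 0 else 0.0 for n in N])

# Candidate CDF obtained by integrating m*exp(m*(x-1)) from 0 to x.
for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    empirical = np.mean(V <= x)
    candidate = np.exp(m * (x - 1)) - np.exp(-m)
    print(f"x={x:.2f}  empirical={empirical:.4f}  candidate={candidate:.4f}")
```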

I am not sure if I have made a big blunder in my understanding of something or a small goof up. I have looked at my calculations a few times to ensure that I have not made any silly error.
I think once I am past this first part, I should be able to get to the rest. I would highly appreciate any help as I feel that this is essential for me to understand before I move forward with my course.
Thanks to anyone who can help.

Best Answer

The calculations in the question look correct, but care is needed because the distribution of $V_\mu$ is not continuous. (I will use $\mu$ instead of $m$ throughout.)

From first definitions we may find the distribution function (CDF) of $V_\mu$ is

$$F_\mu(x) = \Pr(V_\mu \le x) = \sum_{n=0}^\infty x^n \Pr(N_\mu = n) = e^{-\mu}\sum_{n=0}^\infty \frac{(\mu x)^n}{n!} = e^{\mu(x-1)}$$

provided $0 \le x \le 1$. For $x \gt 1$, $F_\mu(x) = 1$ of course, while for $x \lt 0$, necessarily $F_\mu(x) = 0$. Note that $F_\mu(0) = e^{-\mu} = \Pr(N_\mu = 0) \gt 0$, so the distribution has an atom at $0$. Here is its graph when $\mu=1$, showing the jump of size $e^{-1}$ at $x=0$:

[Figure: graph of $F_\mu(x)$ for $\mu = 1$, with a jump at $x = 0$]
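For anyone who wants to reproduce a plot of this kind, here is a minimal sketch (assuming NumPy and Matplotlib are available; the simulated overlay is only a sanity check of the formula above):

```python
import numpy as np
import matplotlib.pyplot as plt

mu = 1.0
rng = np.random.default_rng(1)

# Empirical CDF of V_mu: N_mu ~ Poisson(mu), V_mu = max of N_mu uniforms (0 if N_mu = 0).
N = rng.poisson(mu, size=100_000)
V = np.sort([rng.uniform(size=n).max() if n > 0 else 0.0 for n in N])

# Theoretical CDF on [0, 1]: F_mu(x) = exp(mu*(x-1)), with a jump of size exp(-mu) at x = 0.
x = np.linspace(0.0, 1.0, 401)
F = np.exp(mu * (x - 1))

plt.step(V, np.arange(1, V.size + 1) / V.size, where="post", label="empirical CDF")
plt.plot(x, F, "--", label=r"$e^{\mu(x-1)}$")
plt.xlabel("$x$")
plt.ylabel(r"$F_\mu(x)$")
plt.legend()
plt.show()
```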

The moment generating function, $\phi_\mu(t) = \mathbb{E}(\exp(t V_\mu))$, must be computed with similar care near zero. It can be obtained as a Lebesgue-Stieltjes integral,

$$\phi_\mu(t) = \int_\mathbb{R} e^{t x} dF_{\mu}(x)$$

via integration by parts as

$$\phi_\mu(t) = e^{t x} F_\mu(x) \vert_{-\infty}^1 - \int_0^1 t e^{t x} e^{\mu(x-1)}dx = e^t - t\frac{e^t - e^{-\mu}}{t+\mu}.$$
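As a numerical sanity check on this closed form (a rough sketch assuming NumPy and SciPy; the value of $\mu$ and the test points $t$ are arbitrary), one can evaluate the Lebesgue-Stieltjes integral directly as the atom at $0$ plus the absolutely continuous part on $(0, 1]$, whose density $\mu e^{\mu(x-1)}$ comes from differentiating $F_\mu$:

```python
import numpy as np
from scipy.integrate import quad

mu = 1.5  # arbitrary value for the check

def phi_closed_form(t):
    # phi_mu(t) = e^t - t*(e^t - e^{-mu})/(t + mu), from the integration by parts above
    return np.exp(t) - t * (np.exp(t) - np.exp(-mu)) / (t + mu)

def phi_numeric(t):
    # Atom of size exp(-mu) at x = 0 (where e^{t*0} = 1), plus the density part on (0, 1].
    atom = np.exp(-mu)
    cont, _ = quad(lambda x: np.exp(t * x) * mu * np.exp(mu * (x - 1)), 0.0, 1.0)
    return atom + cont

for t in (-1.0, 0.5, 2.0):
    print(f"t={t:+.1f}  closed form={phi_closed_form(t):.6f}  numeric={phi_numeric(t):.6f}")
```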

As a check, its Maclaurin series begins

$$\phi_\mu(t) = 1 + \left(\frac{\mu-1+e^{-\mu}}{\mu}\right) t + \left(\frac{\mu^2 - 2\mu + 2 - 2e^{-\mu}}{\mu^2}\right)t^2/2 + \cdots$$

The constant term of $1$ shows the total probability mass is $1$. The next two coefficients (of $t$ and of $t^2/2$) are the first and second moments, $\mathbb{E}(V_\mu)$ and $\mathbb{E}(V_\mu^2)$, which will be useful in addressing the rest of the questions.
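For example, in connection with part (c), here is a small numerical sketch (assuming NumPy; the chosen values of $\mu$ are arbitrary) evaluating the coefficient of $t$, namely $\mathbb{E}(V_\mu) = (\mu - 1 + e^{-\mu})/\mu$:

```python
import numpy as np

# E[V_mu] is the coefficient of t in the Maclaurin series above.
def mean_V(mu):
    return (mu - 1 + np.exp(-mu)) / mu

for mu in (1, 5, 20, 100, 1000):
    print(f"mu={mu:5d}  E[V_mu]={mean_V(mu):.6f}")
# The printed values approach 1, consistent with E[V_mu] -> 1 as mu -> infinity.
```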