PDF of $\min$ and $\max$ of $n$ iid random variables

cumulative-distribution-functions, probability, probability-distributions, probability-theory, solution-verification

The Problem: Suppose that $X_1,\dots,X_n$ are independent random variables with the same absolutely continuous distribution. Let $f$ denote their common marginal PDF. Set $Y=\min(X_1,\dots,X_n)$
and $Z=\max(X_1,\dots,X_n)$. Show that $Y$ and $Z$ are both absolutely continuous, and find their marginal PDFs.

My Thoughts: We begin by finding the CDF of $Y$. For $t\in\mathbb R$ we have
\begin{equation*}\begin{split}
F_Y(t)&=P(Y\leq t)=P(\min(X_1,\dots,X_n)\leq t)=1-P(X_1>t,\dots,X_n>t)\\
&=1-P(X_1>t)\cdots P(X_n>t)\\
&=1-[1-F(t)]^n,
\end{split}\end{equation*}

where in the fourth step we used the independence of the random variables $X_1,\dots,X_n$, and in the last step the fact that these random variables all have the same distribution, whose CDF we denote by $F$.
By the absolute continuity of the random variables $X_1,\dots,X_n$, we may differentiate the expression for $F_Y$ using the chain rule to obtain
$$f_Y(t)=nf(t)[1-F(t)]^{n-1}=nf(t)\left[1-\int_{-\infty}^tf(s)\,ds\right]^{n-1}.$$
It follows that $Y$ is an absolutely continuous random variable.
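As a quick illustration (not part of the problem statement): if the $X_i$ are Uniform$(0,1)$, then $F(t)=t$ and $f(t)=1$ for $t\in(0,1)$, and the formula gives
$$f_Y(t)=n(1-t)^{n-1},\qquad 0<t<1,$$
which is the Beta$(1,n)$ density, the well-known distribution of the minimum of $n$ independent uniforms.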
Next, we find the CDF of $Z$. For $t\in\mathbb R$ we have
\begin{equation*}\begin{split}
F_Z(t)&=P(Z\leq t)=P(\max(X_1,\dots,X_n)\leq t)=P(X_1\leq t,\dots,X_n\leq t)\\
&=P(X_1\leq t)\cdots P(X_n\leq t)\\
&=F(t)^n,
\end{split}\end{equation*}

where in the fourth step we used the independence of the random variables $X_1,\dots,X_n$, and in the last step the fact that they are identically distributed. Since these random variables are absolutely continuous, we may differentiate the expression for $F_Z$ using the chain rule to obtain
$$f_Z(t)=nF(t)^{n-1}f(t)=nf(t)\left[\int_{-\infty}^tf(s)\,ds\right]^{n-1}.$$
It follows that $Z$ is an absolutely continuous random variable.
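In the Uniform$(0,1)$ illustration above, the corresponding formula gives
$$f_Z(t)=nt^{n-1},\qquad 0<t<1,$$
the Beta$(n,1)$ density of the maximum of $n$ independent uniforms.

Not part of the proof, but here is a short Monte Carlo sanity check of both formulas in the uniform case; the values of $n$, the sample size, and the bin count are arbitrary choices for the sketch.

```python
import numpy as np

# Monte Carlo sanity check of the derived densities of Y = min and Z = max,
# using Uniform(0,1) variables as in the illustration above.
rng = np.random.default_rng(0)
n, reps = 5, 200_000

X = rng.uniform(size=(reps, n))
Y, Z = X.min(axis=1), X.max(axis=1)        # sample minima and maxima

t = np.linspace(0.05, 0.95, 19)
fY = n * (1 - t) ** (n - 1)                # derived density of the minimum
fZ = n * t ** (n - 1)                      # derived density of the maximum

# Histogram estimates of the simulated densities, evaluated on the same grid.
histY, edges = np.histogram(Y, bins=100, range=(0, 1), density=True)
histZ, _ = np.histogram(Z, bins=100, range=(0, 1), density=True)
centers = (edges[:-1] + edges[1:]) / 2

print(np.max(np.abs(np.interp(t, centers, histY) - fY)))   # small for large reps
print(np.max(np.abs(np.interp(t, centers, histZ) - fZ)))   # small for large reps
```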


Could anyone please provide feedback on the correctness of my proof above?

Thank you for your time; it is much appreciated.

Best Answer

The derivative $F_X'$ of the distribution function of a random variable $X$, if it exists, is always measurable and non-negative, but its integral need not be $1$. So just proving that the derivative exists is not enough. In both cases you can see that the derivative you obtained actually integrates to $1$ and the formula $F_X(x)=\int_{-\infty}^{x} F_X'(t)\,dt$ holds. Hint for computing the integral: putting $h(t)=\int_{-\infty}^{t} f(s)\,ds$ helps in both cases.
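For example, following the hint (a sketch of one way to finish): with $h(t)=\int_{-\infty}^{t}f(s)\,ds$ we have $h'=f$ almost everywhere, $h(t)\to 0$ as $t\to-\infty$ and $h(t)\to 1$ as $t\to\infty$, so
$$\int_{-\infty}^{\infty} nf(t)[1-h(t)]^{n-1}\,dt=\Big[-\big(1-h(t)\big)^{n}\Big]_{-\infty}^{\infty}=1,
\qquad
\int_{-\infty}^{\infty} nf(t)h(t)^{n-1}\,dt=\Big[h(t)^{n}\Big]_{-\infty}^{\infty}=1.$$
Evaluating the same antiderivatives over $(-\infty,x]$ instead gives $1-[1-F(x)]^{n}$ and $F(x)^{n}$, i.e. $F_Y(x)=\int_{-\infty}^{x}f_Y(t)\,dt$ and $F_Z(x)=\int_{-\infty}^{x}f_Z(t)\,dt$, as required.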