[Math] Bernoulli trials conditional probability

Tags: bayesian, probability, probability-theory

Let $\Omega=\{0,1\}^\infty$, let $X_1,X_2,\dots$ be i.i.d. Bernoulli random variables with success probability $p\in(0,1)$, and let $S_n=X_1+\cdots+X_n$ be the number of “successes” or “arrivals” in $n$ steps, so that $\mathbb P(S_n=k)=\binom{n}{k}p^k(1-p)^{n-k}$.

Let $T$ be the time until the first success appears, i.e. $T=\min\{n\mid X_n=1\}$.

I have already calculated $\mathbb P(T=k)=(1-p)^{k-1}p$ and $\mathbb E[T]=1/p$.
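(As a quick numerical sanity check, not part of the original post, the claimed pmf and mean can be compared against a direct simulation of the Bernoulli trials; the function name `first_success_time` and the chosen values of `p` and `k` are illustrative assumptions.)

```python
import random

def first_success_time(p, rng):
    """Flip Bernoulli(p) coins until the first success; return its index."""
    n = 1
    while rng.random() >= p:
        n += 1
    return n

rng = random.Random(0)
p = 0.3
samples = [first_success_time(p, rng) for _ in range(200_000)]

# Compare E[T] = 1/p and P(T=3) = (1-p)^2 p with the simulated values.
empirical_mean = sum(samples) / len(samples)
empirical_p3 = sum(1 for t in samples if t == 3) / len(samples)
exact_p3 = (1 - p) ** 2 * p

print(f"E[T]: exact {1/p:.4f}, simulated {empirical_mean:.4f}")
print(f"P(T=3): exact {exact_p3:.4f}, simulated {empirical_p3:.4f}")
```
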

I have no idea how to calculate the following two conditional probabilities:

(1) The conditional probability of $\{T=k+m\}$ given $\{T>k\}$.

(2) Let $T_1,T_2$ be independent, geometrically distributed random variables with parameter $p$. What is $\mathbb P(T_1=k\mid T_1+T_2=n)$ for $k<n$?

Best Answer

For the first conditional probability, we have $$\Pr(T=k+m\mid T\gt k)=\frac{\Pr(\{T=k+m\} \cap \{T\gt k\})}{\Pr(T\gt k)}.$$ Since $\{T=k+m\}\subseteq\{T\gt k\}$, the numerator is simply $\Pr(T=k+m)=(1-p)^{m+k-1}p$, and the denominator is $\Pr(T\gt k)=(1-p)^k$. Dividing, we get $(1-p)^{m-1}p$.

The above result is intuitively clear. It captures the memorylessness of the distribution. The coin does not remember its history of failures.
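This memorylessness identity can also be checked numerically: conditioning a simulated geometric sample on $T>k$, the fraction landing on $T=k+m$ should match $(1-p)^{m-1}p$. (A minimal sketch; the values of `p`, `k`, and `m` are arbitrary choices for illustration.)

```python
import random

def first_success_time(p, rng):
    """Flip Bernoulli(p) coins until the first success; return its index."""
    n = 1
    while rng.random() >= p:
        n += 1
    return n

rng = random.Random(1)
p, k, m = 0.25, 4, 3

samples = [first_success_time(p, rng) for _ in range(300_000)]

# Restrict to the event {T > k}, then estimate P(T = k+m) within it.
past_k = [t for t in samples if t > k]
conditional = sum(1 for t in past_k if t == k + m) / len(past_k)
exact = (1 - p) ** (m - 1) * p  # P(T = m): the coin forgot its k failures

print(f"P(T=k+m | T>k): simulated {conditional:.4f}, exact {exact:.4f}")
```
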

The above calculation can serve as a template for the second problem.

Added: Again, this is a conditional probability calculation. For the numerator, we want $\Pr(\{T_1=k\} \cap \{T_1+T_2=n\})$, or equivalently $\Pr(\{T_1=k\} \cap \{T_2=n-k\})$. By independence, this is $(1-p)^{k-1}p\cdot(1-p)^{n-k-1}p$, which simplifies to $(1-p)^{n-2}p^2$.

For the denominator, we want the probability that $T_1+T_2=n$. This event happens if $T_1=1$ and $T_2=n-1$, or if $T_1=2$ and $T_2=n-2$, and so on, up to $T_1=n-1$ and $T_2=1$.

The probability that $T_1=i$ and $T_2=n-i$ is, by a calculation identical to the one above, equal to $(1-p)^{n-2}p^2$. For the probability that $T_1+T_2=n$, we add from $i=1$ to $i=n-1$. So we get $(n-1)(1-p)^{n-2}p^2$.

Finally, for the conditional probability, divide. We get the constant $\frac{1}{n-1}$, which does not depend on $k$. This says that the conditional distribution of $T_1$, given that $T_1+T_2=n$, is (discrete) uniform on $\{1,\dots,n-1\}$.
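The uniform conditional distribution is easy to verify by simulation as well: draw independent geometric pairs, keep those summing to $n$, and tabulate $T_1$. (A sketch under illustrative choices of `p` and `n`; `geometric` is a hypothetical helper, not library code.)

```python
import random
from collections import Counter

def geometric(p, rng):
    """Sample a geometric(p) variable on {1, 2, ...} by repeated coin flips."""
    t = 1
    while rng.random() >= p:
        t += 1
    return t

rng = random.Random(2)
p, n = 0.3, 6

# Condition on the event {T1 + T2 = n} by rejection sampling.
counts = Counter()
total = 0
for _ in range(500_000):
    t1, t2 = geometric(p, rng), geometric(p, rng)
    if t1 + t2 == n:
        counts[t1] += 1
        total += 1

for k in range(1, n):
    print(f"P(T1={k} | T1+T2={n}) ~ {counts[k]/total:.3f} (exact {1/(n-1):.3f})")
```

Each of the $n-1$ simulated frequencies should hover near $\frac{1}{n-1}=0.2$, independent of $k$ and of $p$.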
