[Math] Conditional Probabilities – The Mad King's Draft

pr.probability, st.statistics

The Problem of the Mad King's Draft:

Suppose there is a large country with a continuum of citizens, whose names $j$ lie in $[0,1]$, ruled by a king who is either 'mad' or 'nice.' The citizens do not know whether the king is mad but believe both types are equally likely.

The king drafts citizens: the mad king drafts $2$ citizens while the nice king drafts $4$, in each case at random, that is, each citizen is 'equally likely' to be drafted. Drafted citizens do not know how many other citizens are drafted.

What is the posterior of a citizen $j$ who is drafted by the king? This is an important question, since drafted citizens want to escape the draft if the king is mad.

It seems the obvious answer is that the correct posterior should be $1/3$. But why?
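For intuition, the $1/3$ answer can be checked in a finite approximation of the continuum: with $n$ citizens, the mad king drafts $2$ uniformly at random and the nice king drafts $4$, and a drafted citizen's posterior is estimated by Monte Carlo. A sketch (the citizen count, trial count, and function name are my own arbitrary choices):

```python
import random

def posterior_if_drafted(n_citizens=100, trials=100_000, seed=0):
    """Finite approximation of the Mad King's Draft: estimate
    P(king is mad | citizen 0 is drafted) by Monte Carlo."""
    rng = random.Random(seed)
    drafted = mad_and_drafted = 0
    for _ in range(trials):
        mad = rng.random() < 0.5          # both types equally likely a priori
        k = 2 if mad else 4               # mad drafts 2, nice drafts 4
        if 0 in rng.sample(range(n_citizens), k):
            drafted += 1
            mad_and_drafted += mad
    return mad_and_drafted / drafted

print(posterior_if_drafted())  # close to 1/3
```

With $n$ citizens, $P(\text{drafted}\mid\text{mad})=2/n$ and $P(\text{drafted}\mid\text{nice})=4/n$, so Bayes gives $2/(2+4)=1/3$ for every $n$; the puzzle is what justifies this in the continuum limit.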

Ideally, I would like to find a general definition of conditional probabilities that applies to the Mad King's Draft and that is analogous to the standard definition of conditional probabilities, found, for example, on Wikipedia: en.wikipedia.org/wiki/Conditional_expectation

None of these standard definitions apply, which is not satisfying.

I eventually came up with the following definition of conditional expectations. The definition 'generalizes' the standard definition and applies to the Mad King's Draft. I would be interested in whether this definition (i) makes sense and (ii) is in fact implied by a conventional definition.

Given a probability space $(\Omega,\mathfrak{F},\mu)$, let $(A_{j})$ be an uncountable family of sets $A_{j}\in\mathfrak{F}$ such that $\Omega=\bigcup_{j}A_{j}$ and, for each $\omega$, the number of indices $j$ with $\omega\in A_{j}$ is finite.
Take some $\Omega^*\in \mathfrak{F}$ and consider the family of binary random variables $1_{A_{j}}(\omega)$ together with $1_{\Omega^*}(\omega)$.

Definition. The family of functions $E[1_{\Omega^*}(\omega)\mid 1_{A_{j}}(\omega)=1]_{j\in[0,1]}$ is a family of conditional expectations if and only if

$$\int_{\Omega}\sum_{j\in J:\,1_{A_{j}}(\omega)=1} E\left[1_{\Omega^*}\mid 1_{A_{j}}=1\right] d\mu(\omega) = \int_{\Omega}\sum_{j\in J:\,1_{A_{j}}(\omega)=1} 1_{\Omega^*}(\omega)\, d\mu(\omega)$$

for all $J\in\mathfrak{B}$, with $\mathfrak{B}$ denoting the Borel $\sigma$-algebra of $[0,1]$.

Note: If $(A_{j})$ partitions $\Omega$, then the definition reduces to the standard definition of $E[1_{\Omega^*}(\omega)\mid 1_{A_{j}}(\omega)=1]$.

One very good Answer suggests using Palm Probabilities. However, the formula above is implied by Palm probabilities only in the special case in which $Ω^*$ is the union of a subset of elements from $(A_{j})$. The Mad King's Draft is such a special case, because $Ω^*$ ("the king is mad") is the union over all sets in which 2 citizens are drafted. However, the approach does not work more generally.

Example 1: Both kings randomize between drafting $2$ and drafting $4$ citizens, and the mad king drafts $2$ citizens with higher probability than the nice king. If the probabilities with which the two types draft $2$ are $p_{mad}$ and $p_{nice}$, respectively, what is the posterior probability that the king is mad conditional on being drafted?
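For Example 1, the finite approximation again suggests an answer: with equal priors, each type's posterior weight is its expected number of draftees. A hedged sketch (the function names and parameter defaults are mine, not from the question):

```python
import random

def example1_exact(p_mad, p_nice):
    """Intensity-ratio answer suggested by the finite approximation:
    with equal priors, weight each type by its expected draft size."""
    lam_mad = 2 * p_mad + 4 * (1 - p_mad)     # E[# drafted | mad]
    lam_nice = 2 * p_nice + 4 * (1 - p_nice)  # E[# drafted | nice]
    return lam_mad / (lam_mad + lam_nice)

def example1_mc(p_mad, p_nice, n=100, trials=400_000, seed=1):
    """Monte Carlo check in a finite kingdom of n citizens."""
    rng = random.Random(seed)
    drafted = mad_and_drafted = 0
    for _ in range(trials):
        mad = rng.random() < 0.5
        p2 = p_mad if mad else p_nice        # probability of drafting 2
        k = 2 if rng.random() < p2 else 4
        if 0 in rng.sample(range(n), k):
            drafted += 1
            mad_and_drafted += mad
    return mad_and_drafted / drafted
```

With $p_{mad}=1$ and $p_{nice}=0$ this reduces to the original draft and gives $1/3$.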

Example 2: Both kings randomize between drafting no citizen and drafting $1$ citizen, and the mad king drafts $1$ citizen with higher probability than the nice king. If the probabilities with which the two types draft $0$ citizens are $p_{mad}$ and $p_{nice}$, respectively, what is the posterior probability that the king is mad conditional on being drafted? Note (i): in this example, a drafted citizen knows that no other citizen was drafted. Note (ii): there is no problem with defining conditional probabilities in this example in the conventional way.
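Example 2 indeed needs no new machinery: the event "exactly one citizen is drafted" has positive probability, so Bayes' rule applies directly, and by symmetry the identity of the drafted citizen carries no extra information. A minimal sketch (the function name is mine):

```python
def example2_posterior(p_mad, p_nice):
    """P(mad | drafted) by ordinary Bayes' rule with equal priors:
    P(draft 1 | mad) / [P(draft 1 | mad) + P(draft 1 | nice)],
    where P(draft 1 | type) = 1 - p_type."""
    return (1 - p_mad) / ((1 - p_mad) + (1 - p_nice))
```

Since the mad king drafts $1$ with higher probability, $1-p_{mad}>1-p_{nice}$ and the posterior exceeds $1/2$.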

Put differently, in these two examples the conditional probability cannot be expressed as a conditional probability of a subset of the point measures; it needs to be expressed as a conditional probability on the set of random point processes itself.

Best Answer

Let us formalize the story in the following way: Let $U_1,U_2,U_3,U_4$ be 4 independent random variables, uniformly distributed on $[0,1]$ (the possibly drafted citizens), and let $M$ be an independent Bernoulli random variable, with $\mathbb{P}(M=0) = \mathbb{P}(M=1) = 1/2$. ($M=1$ means that the king is mad.) Then we construct the random point process $\eta$ on the unit interval by $$ \eta := \delta_{U_1}+\delta_{U_2} + (1-M)(\delta_{U_3}+\delta_{U_4}). $$ ($\delta_U$ is the Dirac measure on the point $U$.) The question is, given some fixed $j\in[0,1]$, to compute the conditional probability that $M=1$ knowing that $\eta(j)=1$.

If the set modeling the kingdom were finite, this would pose no problem, since the event with respect to which we condition would have positive probability. But here we condition with respect to a negligible event, and it is not a priori clear what this conditional probability means. Fortunately a tool has been developed to deal with this kind of problem: Palm probabilities. In this context, it is a family $(P_j)_{j\in [0,1]}$ of probability measures on the set $ I^* $ of point measures on the unit interval $I$, such that for any Borel set $A\subset I$ and any measurable subset $B\subset I^*$: $$ \mathbb{E} [\mathbb{1}_B(\eta) \ \eta(A)] = \int_A P_j(B)\ d\mu(j), $$ where $\mu$ is the measure on $I$ given by $$ \mu(A):= \mathbb{E} [ \eta(A) ]. $$ If $I$ were finite, then we could easily check that this family of Palm probabilities would be given by $$ P_j(B)= \mathbb{P}(B | \eta(j)=1). $$ In the case of a continuous space $I$, $P_j$ can still be interpreted as the conditional distribution of our point process knowing that it charges $j$.

In the setting of the question, we observe that $\mu(A)=3 |A|$ where $|A|$ denotes the Lebesgue measure of $A$, and then we can easily compute $P_j(B)$, which is given by $$ P_j(B)= \frac{1}{3}\int_I \mathbb{1}_B (\delta_j+\delta_{u_2}) du_2 + \frac{2}{3}\int_{I\times I\times I} \mathbb{1}_B (\delta_j+\delta_{u_2}+\delta_{u_3}+\delta_{u_4}) du_2 du_3 du_4. $$ When $B$ is the set $(M=1)$, $\mathbb{1}_B(\eta)=1$ if $\eta$ charges 2 points, 0 if $\eta$ charges 4 points, hence we get $$ P_j(B)=1/3. $$
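This Palm computation can also be sanity-checked numerically: by exchangeability of the draft, $P_j$ does not depend on $j$, so taking $A=I$ in the defining identity gives $P_j(M=1)=\mathbb{E}[\mathbb{1}_{\{M=1\}}\,\eta(I)]/\mathbb{E}[\eta(I)]$, which is easy to estimate. A sketch (trial count and seed are arbitrary):

```python
import random

def palm_posterior(trials=200_000, seed=2):
    """Estimate P_j(M=1) = E[1{M=1} eta(I)] / E[eta(I)], using that
    eta(I) = 2 when the king is mad and 4 when he is nice."""
    rng = random.Random(seed)
    num = den = 0
    for _ in range(trials):
        mad = rng.random() < 0.5
        eta_I = 2 if mad else 4   # total number of drafted citizens
        if mad:
            num += eta_I
        den += eta_I
    return num / den

print(palm_posterior())  # close to 1/3
```

The exact values are $\mathbb{E}[\mathbb{1}_{\{M=1\}}\eta(I)]=\tfrac{1}{2}\cdot 2=1$ and $\mathbb{E}[\eta(I)]=3$, recovering $1/3$.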
