Lower bound on conditional probability of sum of random variables

probability

Let $X=\left(\sum_{i=1}^n a_iW_i\right)\bmod 2$ and $Y=\left(\sum_{i=1}^n b_iW_i\right) \bmod 2$, where $a_i,b_i\in\{0,1\}$ are constants and the $W_i$ are i.i.d. Bernoulli random variables taking the values $0$ and $1$ with probability $1/2$ each. Note that $n$ is a fixed positive integer, and the case $b_i = 0$ for all $1\leq i\leq n$ is excluded, as pointed out in the comment below.

I am trying to find a lower bound on $\mathbb{P}(Y=1|X=1)$ over the set of all possible values of $a_i,b_i,1\leq i\leq n$.

Firstly, I think this probability is strictly positive for every choice of $a_i,b_i$, because $\mathbb{P}(Y=1\mid X=1)=0$ would force $Y=0$ on the whole event $\{X=1\}$, which by the definition of $X,Y$ is not the case. But I'm not sure what a lower bound for it is. My guess is that the minimum is attained when, say, $X=\left(\sum_{i=1}^n W_i\right)\bmod 2$ and $Y=W_1$, and that in this case $\mathbb{P}(Y=1\mid X=1)\geq \frac{1}{2n}$, but I don't have a proof. Any ideas?
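Since $n$ is fixed and the $W_i$ take only the values $0,1$, the conditional probability for any concrete pair $a,b$ can be computed exactly by enumerating all $2^n$ outcomes. A small sketch (the helper name `cond_prob_y1_given_x1` is mine, and both sums are taken mod 2 as in the definition above):

```python
from itertools import product

def cond_prob_y1_given_x1(a, b):
    """Return P(Y=1 | X=1) exactly, by enumerating all W in {0,1}^n.

    X = (sum a_i W_i) mod 2, Y = (sum b_i W_i) mod 2, with the W_i
    i.i.d. Bernoulli(1/2), so every outcome W has the same weight 2^{-n}.
    """
    n = len(a)
    x1 = 0       # number of outcomes with X = 1
    x1_y1 = 0    # number of outcomes with X = 1 and Y = 1
    for w in product([0, 1], repeat=n):
        x = sum(ai * wi for ai, wi in zip(a, w)) % 2
        y = sum(bi * wi for bi, wi in zip(b, w)) % 2
        if x == 1:
            x1 += 1
            x1_y1 += y
    return x1_y1 / x1

# The conjectured worst case for n = 4: X = (W_1+...+W_4) mod 2, Y = W_1.
print(cond_prob_y1_given_x1([1, 1, 1, 1], [1, 0, 0, 0]))  # -> 0.5
```

For this pair the value is $1/2$, which is consistent with (and well above) the guessed bound $\frac{1}{2n}$.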

Best Answer

The lower bound you are looking for is $1/2$: after excluding the case @saulspatz mentioned, the probability you have there is either $1/2$ or $1$. This follows from the fact that if the sets of $W_i$'s entering $X$ and $Y$ differ in even one index, then $X$ and $Y$ are independent. For Bernoulli random variables, uncorrelatedness implies independence (see https://stats.stackexchange.com/questions/74410/for-which-distributions-does-uncorrelatedness-imply-independence).

So, let $T_X =\{ i \mid a_i = 1 \}$ and $T_Y = \{ i \mid b_i = 1 \}$. (If $T_X$ and $T_Y$ are both empty, then $X=Y=0$ a.s., the event $\{X=1\}$ is null, and the conditional probability is not even defined, so exclude this degenerate case.) For notational convenience, let $Z_i = (-1)^{W_i}$; then $X = \frac{1-\prod_{i \in T_X} Z_i}{2}$, and similarly for $Y$. Since $E\left[\prod_{i \in T_X} Z_i\right] = E\left[\prod_{j \in T_Y} Z_j\right] = 0$ whenever $T_X, T_Y$ are nonempty, $X$ and $Y$ are uncorrelated precisely when $E\left[\prod_{i \in T_X} Z_i \prod_{j \in T_Y} Z_j\right] = 0$.
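The $\pm 1$ representation of the parity can be sanity-checked by brute force. A sketch, using the sign convention $Z_i=(-1)^{W_i}$ (so $Z_i=1$ when $W_i=0$), under which $\left(\sum_{i\in T} W_i\right)\bmod 2 = \frac{1-\prod_{i\in T} Z_i}{2}$ for every index set $T$:

```python
from itertools import product

def parity_matches_sign_product(n):
    """Check (sum_{i in T} W_i) mod 2 == (1 - prod_{i in T} Z_i) / 2,
    with Z_i = (-1)^{W_i}, for every W in {0,1}^n and every subset T."""
    for w in product([0, 1], repeat=n):
        for mask in product([0, 1], repeat=n):   # mask = indicator vector of T
            t = [i for i in range(n) if mask[i]]
            x = sum(w[i] for i in t) % 2
            z_prod = 1
            for i in t:
                z_prod *= (-1) ** w[i]
            if x != (1 - z_prod) // 2:
                return False
    return True

print(parity_matches_sign_product(4))  # -> True
```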

Now, note that if there is an index in $T_X$ but not in $T_Y$, or vice versa, then since $Z_i^2 = 1$ the product collapses to $\prod_{i \in T_X \triangle T_Y} Z_i$, whose expectation is zero; this gives independence, and hence the value $1/2$. If the two sets are identical, then $X = Y$, the variables are perfectly correlated, and the probability you have is equal to $1$.
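This dichotomy can be verified exhaustively for small $n$. A sketch (the helper name `stats` is mine; it enumerates all $2^n$ outcomes and returns both the expectation of the product of the two $Z$-products and the conditional probability):

```python
from itertools import product

def stats(a, b):
    """Return (E[prod_{T_X} Z_i * prod_{T_Y} Z_j], P(Y=1 | X=1)),
    computed exactly by enumerating all W in {0,1}^n, with Z_i = (-1)^{W_i}."""
    n = len(a)
    e_pq = 0.0
    x1 = x1_y1 = 0
    for w in product([0, 1], repeat=n):
        p = 1
        for i in range(n):
            if a[i]:
                p *= (-1) ** w[i]
        q = 1
        for i in range(n):
            if b[i]:
                q *= (-1) ** w[i]
        e_pq += p * q / 2 ** n
        x = sum(a[i] * w[i] for i in range(n)) % 2
        y = sum(b[i] * w[i] for i in range(n)) % 2
        if x == 1:
            x1 += 1
            x1_y1 += y
    return e_pq, x1_y1 / x1

print(stats([1, 1, 0], [0, 1, 1]))  # sets differ -> (0.0, 0.5)
print(stats([1, 1, 0], [1, 1, 0]))  # identical sets -> (1.0, 1.0)
```

The first call has $T_X \ne T_Y$ (they even overlap in one index), yet the expectation vanishes and the conditional probability is $1/2$; the second has $T_X = T_Y$, giving $1$.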

I assumed that both $X$ and $Y$ are Bernoulli$(1/2)$. The special cases are $X = 0$ a.s. (then the event $\{X=1\}$ is null and the conditional probability is undefined) and $Y = 0$ a.s. (then $Y$ is trivially independent of $X$ and the probability is $0$; this is the case excluded in the question).
