Create an example of identically distributed but dependent Bernoulli random variables

distribution-theory, probability, probability distributions, statistics

I need to create an example of identically distributed but dependent Bernoulli random variables $x_1, x_2, \dots, x_n$, i.e. $x_i \in \{0,1\}$, such that
$$P\big(|\mu-\frac{1}{n}\sum_{i=1}^{n}x_i|\geq \frac{1}{2}\big)=1$$

where $\mu=E[x_i]$.

The example should show that independence is crucial for the sample mean to converge to the expected value $\mu=E[x_i]$.

I'm a non-mathematics student struggling a bit to understand the concept. I was wondering how $\mu$ equals $E[x_i]$ and what its relation to independence is. A short explanation would really help.

Best Answer

Take $(x_1,\dots,x_n)$ such that $x_1$ is a Bernoulli r.v. with parameter $1/2$, and $x_1=x_2=\dots=x_n$ a.s. Since every $x_i$ has the same distribution, $\mu=E[x_i]=1/2$, but $$ \frac{1}{n}\sum_{i=1}^n x_i = x_1\in\{0,1\}, $$ so $ \left\lvert \mu - \frac{1}{n}\sum_{i=1}^n x_i \right\rvert = 1/2 $ a.s. This is where independence comes in: here the sample mean is just a single coin flip, so the cancellation of fluctuations that the (independence-based) law of large numbers relies on never happens.
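
If it helps to see the construction numerically, here is a minimal simulation sketch (Python/NumPy, not part of the original answer; the choices $n=100$ and $10{,}000$ trials are illustrative). It copies a single fair coin flip $n$ times, so the sample mean is always $0$ or $1$ and its deviation from $\mu=1/2$ is exactly $1/2$; independent flips are included for contrast.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100          # number of Bernoulli variables per sample
trials = 10_000  # number of simulated samples
mu = 0.5         # E[x_i] for a fair Bernoulli variable

# Dependent case from the answer: one fair coin flip per trial, copied n times,
# so x_1 = x_2 = ... = x_n almost surely and the sample mean equals x_1.
dependent_means = rng.integers(0, 2, size=trials).astype(float)

# Independent case for contrast: n independent fair coin flips per trial.
independent_means = rng.integers(0, 2, size=(trials, n)).mean(axis=1)

# In the dependent construction |mu - sample mean| is exactly 1/2 in every
# trial, so the event {|mu - sample mean| >= 1/2} has empirical probability 1.
print(np.mean(np.abs(mu - dependent_means) >= 0.5))    # 1.0

# With independence the sample mean concentrates around mu (law of large
# numbers), so a deviation of 1/2 essentially never occurs for n = 100.
print(np.mean(np.abs(mu - independent_means) >= 0.5))  # ~0.0
```

The first printed value is $1.0$, matching $P\big(\lvert\mu-\frac{1}{n}\sum_{i=1}^{n}x_i\rvert\geq \frac{1}{2}\big)=1$, while the second is essentially $0$ for independent flips.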
