Entropy of a random variable denoting the number of heads in 3 coin flips

Tags: entropy, information-theory

I am not sure how to work this out, but I split the 3 coin flips into the 8 possible outcomes: HHH, HHT, HTH, HTT, TTT, TTH, THT, THH.

From these I got a probability mass function for the number of heads out of the 3 flips (exactly 1 head is 1/4, exactly 2 heads is 3/8, exactly 3 heads is 1/8).

I have calculated the entropy as $H(X) = -\sum_i P(x_i)\log P(x_i) = -\frac{1}{4}\log\frac{1}{4}-\frac{3}{8}\log\frac{3}{8}-\frac{1}{8}\log\frac{1}{8}$. Is that method correct?

Best Answer

Your method is right, except that you did not compute the correct mass function for the random variable of interest. For more on entropy, see this lecture, where I explain the intuition behind it.

Let $X$ be the random variable denoting the number of heads in your experiment:
\begin{align}
p_0 \triangleq \Pr(X = 0) &= \Pr(TTT) = \frac{1}{8}\\
p_1 \triangleq \Pr(X = 1) &= \Pr(HTT) + \Pr(THT) + \Pr(TTH) = \frac{3}{8}\\
p_2 \triangleq \Pr(X = 2) &= \Pr(HHT) + \Pr(HTH) + \Pr(THH) = \frac{3}{8}\\
p_3 \triangleq \Pr(X = 3) &= \Pr(HHH) = \frac{1}{8}
\end{align}
This is a valid mass function because $\sum_i p_i = 1$, in contrast to yours, which sums to $\frac{1}{4}+\frac{3}{8}+\frac{1}{8} = \frac{3}{4}$ because it omits the zero-heads outcome. Now we can easily compute the entropy
$$H(X) = -\sum_i p_i \log p_i.$$
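As a sanity check, here is a minimal Python sketch (variable names are my own) that enumerates all $2^3$ equally likely flip sequences, recovers the mass function above, and evaluates the entropy in bits (i.e. with $\log_2$):

```python
from itertools import product
from math import log2

# Enumerate all 2^3 = 8 equally likely outcomes of three fair coin flips.
outcomes = list(product("HT", repeat=3))

# Count how many outcomes produce each number of heads (0 through 3).
counts = {k: 0 for k in range(4)}
for seq in outcomes:
    counts[seq.count("H")] += 1

# Probability mass function of X = number of heads.
pmf = {k: c / len(outcomes) for k, c in counts.items()}
print(pmf)            # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}

assert abs(sum(pmf.values()) - 1.0) < 1e-12  # a valid pmf sums to 1

# Entropy H(X) = -sum_i p_i * log2(p_i), in bits.
entropy = -sum(p * log2(p) for p in pmf.values())
print(entropy)        # ≈ 1.8113 bits
```

Numerically, $H(X) = 2\cdot\frac{1}{8}\log_2 8 + 2\cdot\frac{3}{8}\log_2\frac{8}{3} \approx 1.8113$ bits, strictly less than the $3$ bits of the full flip sequence, since $X$ discards the order of the flips.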
