The intuition behind linearity of expectations not requiring independence

expected value, intuition, probability, probability theory

I am confused about the intuition behind linearity of expectation not requiring the random variables involved to be independent. Why is this true? I read that, since the proof that expected values are linear does not use independence anywhere, independence is not a requirement. I don't quite follow that step. Why would we not need to show that the property holds both for independent and for dependent random variables?

This also leaves me confused by questions involving this property. For example: suppose you toss a fair coin 12 times, resulting in a sequence of heads (H) and tails (T). Let N be the number of times that the sequence HTHT appears. For example, HTHT appears twice in HTHTHTTTTTTT. Find E(N).
The answer to this problem is 9/16, which comes from the fact that there is a 1/16 probability that HTHT occurs starting at index $n$, for each $1 \le n \le 9$, and so the answer is $9 \cdot \frac{1}{16}$.
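Writing this out with indicator variables, as I understand the computation, it is: $$ N = \sum_{n=1}^{9} I_n, \qquad I_n = \begin{cases} 1 & \text{if HTHT starts at flip } n, \\ 0 & \text{otherwise,} \end{cases} $$ $$ E(N) = \sum_{n=1}^{9} E(I_n) = \sum_{n=1}^{9} P(\text{HTHT starts at flip } n) = 9 \cdot \frac{1}{16} = \frac{9}{16}. $$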

Why is it that we can add the probabilities that the string HTHT occurs starting at each index? I ask because if, say, HTHT appears in the first four flips, then the probability that HTHT occurs starting at the second index is zero, since the second flip came up T.
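For what it's worth, a quick Monte Carlo check (my own sanity check, not part of the original problem) does agree with 9/16, so my confusion is purely about why the dependence between overlapping starting positions does not matter:

```python
import random

def count_htht(seq):
    # Count occurrences of HTHT over all possible starting indices,
    # including overlapping ones.
    return sum(1 for i in range(len(seq) - 3) if seq[i:i + 4] == "HTHT")

def estimate_expected_count(trials=200_000):
    total = 0
    for _ in range(trials):
        flips = "".join(random.choice("HT") for _ in range(12))
        total += count_htht(flips)
    return total / trials

print(estimate_expected_count())  # typically prints something near 9/16 = 0.5625
```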

An explanation of the intuition of this property would be appreciated.

Best Answer

It's because summation and integration are linear operations: $$ \sum_j (a x_j + b y_j) = a \sum_j x_j + b \sum_j y_j$$ $$ \int (a f(x) + b g(x))\; dx = a \int f(x)\; dx + b \int g(x)\; dx$$ and expected value is defined by an integral (or a sum for the discrete case).
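To spell this out for the discrete case: for any two random variables $X$ and $Y$ defined on the same (countable) sample space, whatever their joint distribution, $$ E(X+Y) = \sum_{\omega} \big(X(\omega) + Y(\omega)\big) P(\omega) = \sum_{\omega} X(\omega) P(\omega) + \sum_{\omega} Y(\omega) P(\omega) = E(X) + E(Y). $$ The dependence between $X$ and $Y$ never enters, because the identity holds outcome by outcome. Applied to the coin problem: $N$ equals the sum of the nine indicators $I_n$ on every outcome, so $E(N) = \sum_n E(I_n) = 9/16$, even though the indicators are far from independent.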
