[Math] Weighted Average Proof

probability, statistics

I've been stuck on this for a while now; it seems pretty straightforward, but I can't seem to prove it.

Given $\mu$ is a weighted average of $\mu_1$ and $\mu_2$ such that $\mu = x_1\mu_1 + x_2\mu_2$ where $x_1$ and $x_2$ are positive real numbers and $x_1 + x_2 = 1$, prove that $\mu$ must lie between $\mu_1$ and $\mu_2$.

And then in the more general case suppose $\mu = x_1\mu_1 + x_2\mu_2 + \cdots + x_n\mu_n$ where again the $x_i$'s are positive and $\sum x_i = 1$, prove that $\mu$ must lie between the smallest and largest of the $\mu_i$'s.
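Not a proof, but as a quick numerical sanity check of the claim (a sketch; the helper `weighted_average` is mine, not from the question):

```python
import random

def weighted_average(weights, values):
    """Return sum of w_i * v_i, checking the weights are positive and sum to 1."""
    assert all(w > 0 for w in weights)
    assert abs(sum(weights) - 1.0) < 1e-12
    return sum(w * v for w, v in zip(weights, values))

# Randomly test: mu always lands between min and max of the mu_i's.
random.seed(0)
for _ in range(1000):
    n = random.randint(2, 6)
    raw = [random.random() + 1e-9 for _ in range(n)]
    weights = [r / sum(raw) for r in raw]        # positive, sums to 1
    values = [random.uniform(-10, 10) for _ in range(n)]
    mu = weighted_average(weights, values)
    assert min(values) - 1e-9 <= mu <= max(values) + 1e-9
```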

Best Answer

Let's get you started.

Suppose without loss of generality that $\mu_1 \leq \mu_2$. Then, since $x_1, x_2 > 0$, $$ \mu = x_1\mu_1 + x_2\mu_2 \leq x_1\mu_2 + x_2\mu_2 = (x_1 + x_2)\mu_2 = \mu_2.$$ The remaining inequalities, including the lower bound, follow in exactly the same way.
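For the general case, the same bound can be written in one line (a sketch filling in the hint above, using $\sum_i x_i = 1$ and $x_i > 0$):

$$\min_j \mu_j = \sum_{i=1}^n x_i \min_j \mu_j \;\leq\; \mu = \sum_{i=1}^n x_i \mu_i \;\leq\; \sum_{i=1}^n x_i \max_j \mu_j = \max_j \mu_j.$$

Each middle inequality holds termwise, because multiplying $\min_j \mu_j \leq \mu_i \leq \max_j \mu_j$ by the positive weight $x_i$ preserves the inequalities.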
