[Math] Average of two square roots vs. square root of the average

inequality, radicals

I was watching a video on YouTube about how colors work in computers, and found this statement:

"The average of two square roots is less than the square root of an
average"

The link to the video where I found this: here

The video even shows an image with the corresponding algebraic expression, although the image uses a $\le$ (less than or equal) sign instead of a $<$ (strictly less than) sign.

Could someone prove why this is true?

P.S. Sorry I haven't included the expression in this question; I don't know how to typeset it.

Best Answer

This is from the comment of André Nicolas, which I will just attempt to clarify. Assume $x, y \ge 0$. We want $$\frac{\sqrt x + \sqrt y}{2} \le \sqrt{\frac{x+y}{2}}.$$

We proceed through a chain of inequalities, each equivalent to the original and to one another:

$$\frac{\sqrt x + \sqrt y}{2} \le \frac{\sqrt{x+y}}{\sqrt 2}$$
$$\sqrt x + \sqrt y \le \frac{2\sqrt{x+y}}{\sqrt 2}$$
$$\sqrt x + \sqrt y \le \sqrt 2\,\sqrt{x+y}$$
Squaring (both sides are nonnegative, so squaring preserves equivalence):
$$x + 2\sqrt{xy} + y \le 2x + 2y$$
$$x + y - 2\sqrt{xy} \ge 0$$
$$\left(\sqrt x - \sqrt y\right)^2 \ge 0,$$
which we know to be true. Since every step is reversible, the inequality we set out to prove is also true. Equality holds exactly when $\sqrt x = \sqrt y$, that is, when $x = y$; this is why the $\le$ sign in the video's image is the right one.
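As a quick numerical sanity check (not part of the original answer), the inequality can be exercised on sampled nonnegative values. The sketch below assumes plain Python with only the standard library; the helper names `avg_of_sqrts` and `sqrt_of_avg` are made up for illustration.

```python
import math
import random

def avg_of_sqrts(x, y):
    """Left-hand side: average of the two square roots."""
    return (math.sqrt(x) + math.sqrt(y)) / 2

def sqrt_of_avg(x, y):
    """Right-hand side: square root of the average."""
    return math.sqrt((x + y) / 2)

random.seed(0)
for _ in range(100_000):
    x = random.uniform(0, 1000)
    y = random.uniform(0, 1000)
    # The inequality should hold for all x, y >= 0
    # (small tolerance added for floating-point rounding).
    assert avg_of_sqrts(x, y) <= sqrt_of_avg(x, y) + 1e-12

# Equality occurs exactly when x == y:
print(avg_of_sqrts(4, 4), sqrt_of_avg(4, 4))  # both 2.0
print(avg_of_sqrts(1, 4), sqrt_of_avg(1, 4))  # 1.5 < ~1.5811
```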
