Find the maximum of $(x_1\cdots x_n)^2$ subject to $x_1^2+\ldots +x_n^2=1$

analysis, extremal-combinatorics, lagrange-multiplier, maxima-minima

Find the maximum of $(x_1\cdots x_n)^2$ subject to $x_1^2+\ldots +x_n^2=1$ and show that $$\left (\prod_{k=1}^na_k\right )^{\frac{1}{n}}\leq \frac{1}{n}\sum_{k=1}^na_k$$

For the first part I applied the method of Lagrange multipliers.

We have the function $f(x_1, \ldots , x_n)=(x_1\cdot \ldots \cdot x_n)^2$ and the constraint $g(x_1, \ldots , x_n)=x_1^2+ \ldots + x_n^2-1=0$.

The Lagrange function is \begin{equation*}L(x_1, \ldots , x_n ,\lambda )=f(x_1, \ldots , x_n)+\lambda g(x_1, \ldots , x_n)=(x_1\cdot \ldots \cdot x_n)^2+\lambda \left (x_1^2+ \ldots + x_n^2-1\right )=\left (\prod_{j=1}^nx_j\right )^2+\lambda \left (\sum_{j=1}^nx_j^2-1\right )\end{equation*}
We calculate the partial derivatives of $L$ :
\begin{align*}&\frac{\partial}{\partial{x_i}}L(x_1, \ldots , x_n ,\lambda )=2\left (\prod_{j=1}^nx_j\right )\cdot \left (\prod_{j=1, j\neq i}^nx_j\right ) +2\lambda x_i \\ & \frac{\partial}{\partial{\lambda }}L(x_1, \ldots , x_n ,\lambda )=\sum_{j=1}^nx_j^2-1 \end{align*} with $1\leq i\leq n$.

To get the extrema we set the partial derivatives equal to zero.

Then we get the following system:
\begin{align*}&2\left (\prod_{j=1}^nx_j\right )\cdot \left (\prod_{j=1, j\neq i}^nx_j\right ) +2\lambda x_i =0 \Rightarrow x_i\cdot \prod_{j=1, j\neq i}^nx_j^2 +\lambda x_i =0 \Rightarrow x_i\cdot \left (\prod_{j=1, j\neq i}^nx_j^2 +\lambda \right ) =0 \\ & \sum_{j=1}^nx_j^2-1=0 \end{align*}
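(Not part of the Lagrange computation, but for reference: a small numerical sanity check in Python/NumPy, with my own choice of $n$, sample size, and the helper name `f`, suggesting the maximum should be $n^{-n}$, attained when all $x_i^2=1/n$.)

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    """The objective (x_1 * ... * x_n)^2."""
    return np.prod(x) ** 2

n = 5
# Candidate suggested by symmetry: all coordinates equal, x_i = 1/sqrt(n).
candidate = np.full(n, 1.0 / np.sqrt(n))
print(f(candidate), n ** (-n))        # both print n^{-n} = 0.00032

# Compare against many random points on the unit sphere.
samples = rng.normal(size=(100_000, n))
samples /= np.linalg.norm(samples, axis=1, keepdims=True)
best_random = max(f(x) for x in samples)
print(best_random <= f(candidate))    # expected: True
```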

How can we continue?

Best Answer

Just to provide a non-calculus alternative.

We can show that the maximum of $(x_1x_2\cdots x_n)^2$ subject to $x_1^2 + \cdots +x_n^2 =1$ must occur when $x_1^2 = \cdots = x_n^2$ by using AM-GM for two non-negatives (which is relatively easy to show):

AM-GM for two non-negatives: If $A,B\ge 0$, then $(A+B)/2 \ge \sqrt{AB}$, and equality holds if and only if $A=B$. (Proof of this later, or try it yourself.)

Now we prove that this maximum occurs when $x_1^2 =\cdots = x_n^2$. Suppose to the contrary, say $x_1^2\neq x_2^2$. Denote the maximum value as $M$, where $$M = (x_1^2 x_2^2)(x_3\cdots x_n)^2.$$ Since $x_1^2 + x_2 ^2 +\cdots+ x_n^2 = 1$, we have $x_1^2 + x_2 ^2 = 1-Q$, where $Q = x_3^2+\cdots+x_n^2$. I claim we can pick better values for $x_1$ and $x_2$ that improve on $M$. Indeed, set $y_1 = y_2 = \sqrt{\frac{1-Q}{2}}$. Then $y_1 ^2 + y_2 ^2 = 1-Q$, so $$y_1^2 +y_2 ^2 + x_3 ^2 +\cdots + x_n^2 = 1.$$ And by AM-GM for two non-negatives, we have $$x_1^2x_2^2 < \left(\frac{x_1^2+x_2^2}{2}\right)^2 = \left(\frac{1-Q}{2}\right)^2 = y_1^2y_2^2,$$ where the inequality is strict because $x_1^2\neq x_2^2$.

Hence $y_1^2y_2^2(x_3\cdots x_n)^2> M$, contradicting the maximality of $M$.

Thus we conclude that the $x_i^2$ are all equal at the maximum, hence $x_i^2 = 1/n$ for each $i$ and the maximum value is $(1/n)^n = n^{-n}$. $\blacksquare$
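(A quick numerical illustration of the exchange step, a sketch rather than part of the proof, with my own choice of $n$ and the helper name `squared_product`: replacing two coordinates by two equal coordinates with the same sum of squares keeps the point on the sphere and never decreases the squared product.)

```python
import numpy as np

rng = np.random.default_rng(1)

def squared_product(x):
    return np.prod(x) ** 2

n = 4
x = rng.normal(size=n)
x /= np.linalg.norm(x)    # a random point on the unit sphere

# Exchange step: replace x_1, x_2 by y_1 = y_2 = sqrt((x_1^2 + x_2^2)/2).
y = x.copy()
y[0] = y[1] = np.sqrt((x[0] ** 2 + x[1] ** 2) / 2)

print(np.isclose(np.linalg.norm(y), 1.0))            # still on the sphere
print(squared_product(y) >= squared_product(x))      # product did not decrease
```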


Proof of AM-GM for two non-negatives.

The inequality is straightforward as $0\le (\sqrt A - \sqrt B)^2 = A + B -2\sqrt{AB} $, so $ \sqrt{AB}\le (A+B)/2$. For the equality case, note $ \sqrt{AB} = (A+B)/2$ if and only if $(\sqrt A - \sqrt B)^2 = 0 $, if and only if $A=B$. $\blacksquare$


Remark. Of course we have to know that $(x_1x_2\cdots x_n)^2$ attains a maximum somewhere on the unit sphere to begin with. This is true because $f(x_1,\ldots ,x_n)=(x_1\cdots x_n)^2$ is continuous on the compact unit sphere (extreme value theorem), but it is trickier to justify without analysis; a non-analysis proof eludes me at the moment.

Remark 2. It isn't too surprising that AM-GM shows up here, as a corollary of this problem is precisely the general AM-GM inequality for $n$ non-negatives.
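For completeness, here is one way the corollary follows (a sketch; assume all $a_k>0$, since the inequality is immediate if some $a_k=0$): put $s=\sum_{k=1}^na_k$ and $x_k=\sqrt{a_k/s}$. Then $x_1^2+\cdots +x_n^2=1$, so by the maximum found above $$\prod_{k=1}^n\frac{a_k}{s}=(x_1\cdots x_n)^2\leq \frac{1}{n^n},$$ and taking $n$-th roots gives $\left (\prod_{k=1}^na_k\right )^{\frac{1}{n}}\leq \frac{s}{n}=\frac{1}{n}\sum_{k=1}^na_k$.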