Calculating the minimum of pointwise maximum of bivariate functions

maxima-minima, optimization, real-analysis

I am trying to solve the following problem:

$$\min_{(x,y) \in \mathcal{D}}f(x,y) = \min_{(x,y) \in \mathcal{D}} \max\{f_1(x,y), f_2(x,y), f_3(x,y)\}$$
So the max function is a pointwise maximum of the three functions, and I want to calculate the minimizing location $(x^*,y^*)$ (and therefore the minimum value) of this pointwise maximum on some domain $\mathcal{D}$. The domain here is $[-1,1] \times [-1,1]$. The specific functions in this case are: $$f_1(x,y) = |1 -y|\,|1-x|, \quad f_2(x,y) = |1 + y|\,|1+x|, \quad f_3(x,y) = \left|\frac{x+y}{2} -y\right|\left|\frac{x+y}{2} -x\right|$$

The problem I am having is that computing the max function itself is difficult, and I know that is not the intended way to solve the problem. I suspect there is some shortcut for locating the minimum $(x^*,y^*)$, but I am not experienced enough to find it.

If it helps at all, the minimum is located where $f_1(x,y) = f_2(x,y) = f_3(x,y)$, but I have no idea how this conclusion was reached and can't seem to work backwards from there either. Using the given domain, this leads to $(x^*,y^*) = \left(\pm \sqrt{\tfrac{1}{2}}, \mp \sqrt{\tfrac{1}{2}}\right)$.
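As a quick numerical sanity check (not a derivation), the claimed point can be plugged into the three functions directly, a minimal sketch in Python:

```python
import math

def f1(x, y): return abs(1 - y) * abs(1 - x)
def f2(x, y): return abs(1 + y) * abs(1 + x)
def f3(x, y): return abs((x + y) / 2 - y) * abs((x + y) / 2 - x)

# Claimed minimizer (sqrt(1/2), -sqrt(1/2))
x, y = math.sqrt(0.5), -math.sqrt(0.5)
print(f1(x, y), f2(x, y), f3(x, y))  # all three agree at 0.5 (up to rounding)
```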

Any help would be greatly appreciated. Thanks!

Best Answer

The conclusion that $f_1(x,y)=f_2(x,y)=f_3(x,y)$ at the optimum can be argued by showing that if $f_i(x,y)>\max\{f_j(x,y), f_k(x,y)\}$ (where $i,j,k$ is a permutation of $1,2,3$), then we can always find some $(x',y')$ in a neighborhood of $(x,y)$ such that $f_i(x,y)>f_i(x',y')>\max\{f_j(x',y'), f_k(x',y')\}$. The pointwise maximum at $(x',y')$ is then $f_i(x',y')$, which is strictly smaller than the maximum $f_i(x,y)$ at $(x,y)$, so $(x,y)$ does not achieve the minimum in your problem.

The argument is not difficult, just a little messy. A straightforward way is to consider the partial derivatives of each of the three functions. Notice that the absolute value signs in $f_1$ and $f_2$ can be dropped for free, since $1-x$, $1-y$, $1+x$, and $1+y$ are all nonnegative on $[-1,1]^2$, while $f_3(x,y)$ can be rewritten as $(x-y)^2/4$. Hence all three functions are actually smooth on the domain, and calculus applies.
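The $f_3$ simplification follows from $\frac{x+y}{2}-y=\frac{x-y}{2}$ and $\frac{x+y}{2}-x=\frac{y-x}{2}$, whose absolute values are equal. A quick random-sample check in Python (a sketch, not part of the argument):

```python
import random

def f3(x, y):
    return abs((x + y) / 2 - y) * abs((x + y) / 2 - x)

random.seed(0)
for _ in range(1000):
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    # |(x-y)/2| * |(y-x)/2| = (x-y)^2 / 4
    assert abs(f3(x, y) - (x - y) ** 2 / 4) < 1e-12
print("f3 == (x - y)^2 / 4 on all samples")
```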

For example, we argue that it is impossible to have \begin{equation}f_1(x,y)>\max\{f_2(x,y), f_3(x,y)\}\end{equation} at the optimum. To this end, let $A=f_1(x,y)$ and $B=\max\{f_2(x,y), f_3(x,y)\}$, where $A>B$, and denote $\theta=A-B$. It is clear that $A>B\ge 0$, which implies $x,y<1$, and thus there exists some $\delta>0$ such that $\max\{x,y\}+\varepsilon<1$ for all $\varepsilon\in (0,\delta)$. Since $$\frac{\partial f_1}{\partial x}=y-1<0,\qquad\frac{\partial f_1}{\partial y}=x-1<0,$$ it is clear that $f_1(x+\varepsilon, y+\varepsilon)<f_1(x,y)$ for all $\varepsilon\in (0,\delta)$. Also, since all three functions are continuous, there exists some $\delta'>0$ such that for all $(x',y')$ that are at most $\delta'$-distant from $(x,y)$, $|f_i(x',y')-f_i(x,y)|<\theta/3$, $i=1,2,3$. Therefore, pick $x''=x+\min\{\delta, \delta'\}/\sqrt 2$ and $y''=y+\min\{\delta, \delta'\}/\sqrt 2$. We thus have $$f_1(x'',y'')>f_1(x,y)-\frac{\theta}{3}>\max\{f_2(x,y), f_3(x,y)\}+\frac\theta 3\ge\max\{f_2(x'', y''), f_3(x'', y'')\}$$ while $$f_1(x'',y'')<f_1(x,y).$$ So the pointwise maximum at $(x'',y'')$ equals $f_1(x'',y'')$, which is strictly smaller than the maximum $f_1(x,y)$ at $(x,y)$, showing that $(x,y)$ fails to achieve the optimum of the min-max problem.
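As an independent check of the equality condition, a brute-force grid minimization of the pointwise maximum over $[-1,1]^2$ lands near $\left(\pm\sqrt{\tfrac12}, \mp\sqrt{\tfrac12}\right)$ with value close to $\tfrac12$. A minimal sketch in Python (the grid resolution is an arbitrary choice):

```python
def f1(x, y): return abs(1 - y) * abs(1 - x)
def f2(x, y): return abs(1 + y) * abs(1 + x)
def f3(x, y): return (x - y) ** 2 / 4  # simplified form of f3

n = 200  # grid resolution; a finer grid gets closer to the true optimum
best_val, best_pt = float("inf"), None
for i in range(n + 1):
    for j in range(n + 1):
        x, y = -1 + 2 * i / n, -1 + 2 * j / n
        val = max(f1(x, y), f2(x, y), f3(x, y))
        if val < best_val:
            best_val, best_pt = val, (x, y)

# Minimum ~0.5, attained on the diagonal y = -x near |x| = sqrt(1/2) ~ 0.707
print(best_pt, best_val)
```

On a $201 \times 201$ grid this recovers a point on the line $y=-x$ where, as in the answer above, all three functions are (approximately) equal.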
