$$\begin{array}{ll} \text{minimize} & f(x) := \max \{ |x_1|,|x_2| \}\\ \text{subject to} & g(x) := a_1 x_1 + a_2 x_2 + b \leq 0\end{array}$$
where $f, g$ : $\mathbb{R}^{2}\rightarrow \mathbb{R}$, and $a_{1}, a_{2}, b > 0$.
Since the global minimum of $f$, $(0,0)$, does not satisfy the constraint, and since $f$ is convex, the minimum is attained on the boundary of $C:=\{x \mid g(x)\leq 0\}$. Hence, the minimizer is a point of the form
$$\left(\frac{-b-a_2x_2}{a_1},x_2\right)$$
Intuitively, I can see that at the minimum, we must have $x_1=x_2$, giving us $\left(\frac{-b}{a_1+a_2}, \frac{-b}{a_1+a_2}\right)$ as the minimizer. How can I show this rigorously?
Best Answer
Generalizing and rephrasing slightly, we have the following optimization problem in $\mathrm x \in \mathbb R^n$
$$\begin{array}{ll} \text{minimize} & \| \mathrm x \|_{\infty}\\ \text{subject to} & \mathrm a^\top \mathrm x \leq b\end{array}$$
where vector $\mathrm a \in \mathbb R^n$ has no zero entries. There are two cases to consider:

- If the origin is in the feasible region, then the minimum is attained at the origin.
- If the origin is not in the feasible region, then the minimum is attained on the boundary of the feasible region.
Hence, in the latter case, we have a least-norm problem in the $\infty$-norm
$$\begin{array}{ll} \text{minimize} & \| \mathrm x \|_{\infty}\\ \text{subject to} & \mathrm a^\top \mathrm x = b\end{array}$$
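One rigorous route is to note that this least-norm problem is equivalent to a linear program: introduce an epigraph variable $t$ and minimize $t$ subject to $-t \leq x_i \leq t$ and $\mathrm a^\top \mathrm x = b$. A minimal numerical sketch with SciPy's `linprog`, using the asker's instance with the assumed example values $a_1 = 1$, $a_2 = 2$, $b = 3$ (so the equality constraint is $x_1 + 2x_2 = -3$):

```python
import numpy as np
from scipy.optimize import linprog

# Assumed example values: a = (1, 2), b = 3, so the active
# constraint from the original problem is a^T x = -b.
a = np.array([1.0, 2.0])
b = 3.0

# Epigraph reformulation: variables (x1, x2, t), minimize t
# subject to  x_i - t <= 0,  -x_i - t <= 0,  a^T x = -b.
c = np.array([0.0, 0.0, 1.0])
A_ub = np.array([
    [ 1.0,  0.0, -1.0],  #  x1 <= t
    [-1.0,  0.0, -1.0],  # -x1 <= t
    [ 0.0,  1.0, -1.0],  #  x2 <= t
    [ 0.0, -1.0, -1.0],  # -x2 <= t
])
b_ub = np.zeros(4)
A_eq = np.array([[a[0], a[1], 0.0]])
b_eq = np.array([-b])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(None, None), (None, None), (0, None)])
x1, x2, t = res.x
print(x1, x2, t)  # expect x1 = x2 = -b/(a1 + a2) = -1 and t = 1
```

The `bounds` argument matters: `linprog` defaults to nonnegative variables, so $x_1, x_2$ must be explicitly unbounded.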
In $\mathbb R^2$, the feasible region of this least-norm problem is a line that is neither vertical nor horizontal (due to the restriction on vector $\mathrm a \in \mathbb R^2$). We would like to find the point on this line that is closest to the origin in the $\infty$-norm. The level sets of $\| \mathrm x \|_{\infty}$ are axis-aligned squares centered at the origin, and a line that is neither vertical nor horizontal first touches a growing square at one of its vertices. Hence, the minimizer is a vertex of the smallest such square, and its entries have the same absolute value.