Proving that the expansion of a convex set is convex

convex-analysis, real-analysis

I am trying to solve problem 2.14 from Stephen Boyd's 'Convex Optimization', which is as follows.

Given a convex set $S \subset \mathbb{R}^n$, a norm $\lVert \cdot \rVert$ on $\mathbb{R}^n$, and a constant $a \geq 0$, we have to prove that the set
$$ S_a = \{ x \in \mathbb{R}^n : d(x,S) \leq a \}, $$
is convex, where $d(x,S)$ is the distance between $x$ and $S$, defined as
$$ d(x,S) = \inf_{y \in S} \ \lVert x - y \rVert. $$

So, I take two points $x_1, x_2 \in S_a$, a constant $\alpha \in [0,1]$, denote $\alpha x_1 + (1-\alpha)x_2$ by $x$, and get
$$ d(x,S) = \inf_{y \in S} \ \lVert \alpha x_1 + (1-\alpha) x_2 - y \rVert
\leq \inf_{y \in S} \left[ \alpha \lVert x_1-y \rVert + (1-\alpha) \lVert x_2-y \rVert \right] $$

by the triangle inequality, but I am stuck here. I feel I am missing a trick since I can't figure out how to use the convexity of $S$. Any help will be greatly appreciated. (This is not a homework question BTW.)

Best Answer

Hint: Take $\epsilon > 0$ arbitrary. Since $x_1, x_2 \in S_a$, there are points $y_1$ and $y_2$ in $S$ such that $\|x_1 - y_1\| < a + \epsilon$ and $\|x_2 - y_2\| < a + \epsilon$ (such points exist because $d(x_i, S) \leq a$ and the distance is defined as an infimum). We want to show that there is a point $z \in S$ such that $\|\alpha x_1 + (1-\alpha)x_2 - z\| < a+\epsilon$. A natural candidate is $z = \alpha y_1 + (1-\alpha)y_2$, which belongs to $S$ by the convexity of $S$. So you want to show $$ \|\alpha x_1 +(1-\alpha)x_2 - (\alpha y_1 +(1-\alpha)y_2 )\| < a + \epsilon. $$ Use the inequalities $\|x_1- y_1\| < a+\epsilon$ and $\|x_2 - y_2\| < a+\epsilon$ for this.
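For reference, that estimate follows by splitting the difference and using the triangle inequality together with the homogeneity of the norm:
$$ \begin{aligned} \|\alpha x_1 + (1-\alpha)x_2 - (\alpha y_1 + (1-\alpha)y_2)\| &= \|\alpha (x_1 - y_1) + (1-\alpha)(x_2 - y_2)\| \\ &\leq \alpha \|x_1 - y_1\| + (1-\alpha)\|x_2 - y_2\| \\ &< \alpha (a+\epsilon) + (1-\alpha)(a+\epsilon) = a + \epsilon. \end{aligned} $$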

After this is proven, use that $\epsilon>0$ can be taken arbitrarily small and conclude.
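Spelled out, the final step reads: with $x = \alpha x_1 + (1-\alpha)x_2$ and $z \in S$ as above,
$$ d(x, S) \leq \|x - z\| < a + \epsilon \quad \text{for every } \epsilon > 0, $$
so $d(x,S) \leq a$, i.e. $x \in S_a$, and $S_a$ is convex.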
