Let's do a transformation $y = \frac{x_2}{3}$. Then your problem will be equivalent to minimizing $x_1^2+y^2$ subject to $x_1\geq 0$, $y\geq 0$, $x_1+y\geq 1$, and $x_1+\frac{y}{6}\geq 1$.
Suppose the third inequality holds with equality, so $x_1 + y = 1$; ignoring the other constraints, the minimizer along this line is $x_1 = y = 1/2$ (the point of the line closest to the origin), and this point satisfies all the other inequalities. Moreover, any other scenario must have $x_1 + y = z > 1$; in the absence of the other constraints, the function is minimized along that line at $x_1 = y = z/2$, which yields the value $z^2/2$, higher than the value attained at $x_1 = y = 1/2$. Therefore, regardless of the other constraints, the minimum value this function can take is $(1/2)^2 + (1/2)^2 = 1/2$. Hence we have found a minimizer.
Finally, this function is strictly convex, which allows us to say that the minimizer is unique.
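As a quick numerical sanity check of the key step above (my own sketch, not part of the original argument), we can brute-force the minimum of $x_1^2 + y^2$ over the constraint $x_1 + y \geq 1$ with $x_1, y \geq 0$ and confirm the value $1/2$ at $(1/2, 1/2)$:

```python
# Numerical sanity check for the key step: minimizing x1^2 + y^2
# subject only to x1 + y >= 1 (with x1, y >= 0).
# The claimed minimizer is x1 = y = 1/2 with value 1/2.

def objective(x1, y):
    return x1 ** 2 + y ** 2

def grid_minimum(n=400):
    """Brute-force search over an (n+1) x (n+1) grid of [0, 2] x [0, 2]."""
    best_val, best_pt = float("inf"), None
    for i in range(n + 1):
        for j in range(n + 1):
            x1, y = 2.0 * i / n, 2.0 * j / n
            if x1 + y >= 1:            # the constraint used in the argument
                val = objective(x1, y)
                if val < best_val:
                    best_val, best_pt = val, (x1, y)
    return best_val, best_pt

val, pt = grid_minimum()
print(val, pt)   # 0.5 at (0.5, 0.5)
```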
Let $f(x) = (f_1(x), ..., f_m(x))$ and $h(x)=(h_1(x), ..., h_p(x))$. Let $D$ be a set. The primal problem is to find $x \in D$ that satisfies $f(x)\leq 0, h(x)=0$. We say the primal problem is feasible if there is such an $x \in D$.
Define
$$ M = \{(\lambda, v) : \lambda \in \mathbb{R}^m, v \in \mathbb{R}^p, \lambda \geq 0\}$$
The dual function $g(\lambda, v)$ is to be maximized subject to the constraint $(\lambda, v) \in M$; since the primal here is a pure feasibility problem (zero objective), $g(\lambda, v) = \inf_{x \in D}\bigl(\sum_{i=1}^m \lambda_i f_i(x) + \sum_{j=1}^p v_j h_j(x)\bigr)$. Define $d^* = \sup_{(\lambda, v) \in M} g(\lambda, v)$. Assuming strong duality holds (e.g., in a convex setting with a constraint qualification), one can show that
$$d^* = \left\{ \begin{array}{ll}
\infty &\mbox{ if the primal is not feasible } \\
0 & \mbox{ if the primal is feasible}
\end{array}
\right.$$
So there are only two possibilities for $d^*$: either $d^* = 0$ or $d^* = \infty$.
It follows that if there is a $(\lambda, v) \in M$ such that $g(\lambda, v)>0$, then certainly the supremum of $g(\lambda, v)$ cannot be zero, so the supremum must be $\infty$. Conversely, if there is no $(\lambda, v) \in M$ such that $g(\lambda, v)>0$, then the supremum certainly cannot be $\infty$ and so the supremum must be $0$. Thus
$$ d^* = \left\{ \begin{array}{ll}
\infty &\mbox{ if there is a $(\lambda, v) \in M$ such that $g(\lambda,v)>0$. } \\
0 & \mbox{ else}
\end{array}
\right.$$
This is likely what the equation in your question means.
I agree with Tony S.F.'s comment that the way of writing $d^*$ in your question (which I believe is equivalent to my second way of writing $d^*$ above) is a bit unusual. However, in your excerpt it seems the authors use the second way to prove the first (rather than the other way around, as I do above).
Actually it is not a bad way to prove it: The authors observe:
$(\lambda, v) \in M \implies (\alpha \lambda, \alpha v) \in M$ for any real number $\alpha \geq 0$.
$g(\alpha \lambda, \alpha v) = \alpha g(\lambda, v)$ for any real number $\alpha \geq 0$.
$(0,0) \in M$.
So if there is a $(\lambda, v) \in M$ such that $g(\lambda, v)>0$, then $g(\alpha\lambda, \alpha v) = \alpha g(\lambda, v) \rightarrow \infty$ as $\alpha\rightarrow\infty$, and so $\sup_{(\lambda, v) \in M} g(\lambda, v) = \infty$. On the other hand, if there is no $(\lambda, v) \in M$ for which $g(\lambda, v)>0$, then the supremum is at most $0$; the supremum cannot be negative because $g(0,0)=0$, so the supremum must be $0$.
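To make the scaling argument concrete, here is a small example of my own (not from the text): take $D = \mathbb{R}$, a single infeasible constraint $f_1(x) = x^2 + 1 \leq 0$, and no equality constraints. Then $g(\lambda) = \inf_x \lambda(x^2+1) = \lambda$ for $\lambda \geq 0$, so $g(1) = 1 > 0$, and scaling $\lambda$ drives $g$ to $\infty$:

```python
# Toy example of the positive-homogeneity argument.
# Infeasible system: f1(x) = x^2 + 1 <= 0 over D = R (no equality constraints).
# Dual function: g(lam) = inf_x lam * (x**2 + 1) = lam for lam >= 0.

def g(lam, grid_size=2001, half_width=10.0):
    """Approximate g(lam) = inf_x lam * (x**2 + 1) by a grid search over x."""
    xs = (-half_width + 2 * half_width * k / (grid_size - 1)
          for k in range(grid_size))
    return min(lam * (x ** 2 + 1) for x in xs)

# g is positively homogeneous: g(alpha * 1) = alpha * g(1),
# so a single positive value certifies sup g = infinity.
for alpha in (1.0, 10.0, 100.0):
    print(alpha, g(alpha))   # g(alpha) equals alpha (up to grid error)
```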
Best Answer
Not every set has a minimum value, but every nonempty set that is bounded below has an infimum. If you used min instead of inf, you would lose the ability to talk about the many optimization problems whose optimal value is approached but never attained.
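As a concrete illustration (my example, not the answerer's): $f(x) = e^{-x}$ on $[0, \infty)$ has infimum $0$ but no minimum, since $e^{-x} > 0$ for every $x$ while the values get arbitrarily small:

```python
import math

# f(x) = exp(-x) on [0, infinity): every value is strictly positive,
# so no point attains a minimum, yet the infimum of the values is 0.
def f(x):
    return math.exp(-x)

values = [f(x) for x in range(0, 50, 10)]
assert all(v > 0 for v in values)   # the minimum is never attained
assert f(100) < 1e-40               # but the values approach the infimum 0
print(values[-1])                   # tiny, yet still strictly positive
```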