The advantage of adding a $\log$ barrier to solve a linear program

convex-optimization, convex-analysis, linear-programming, optimization

Let $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^{m}$, and $x \in \mathbb{R}^{n}$. Let $Ax \leq b$ denote the set of linear inequalities, which can be written as $a_i^Tx-b_i \leq 0$ for all $i= 1, \ldots , m$. Consider the following linear program

$$
\min_{Ax\ \leq\ b} c^Tx \tag{1}
$$

where $c \in \mathbb{R}^{n}$ is an arbitrary vector. If this problem has a solution, then by the representation theorem (Weyl-Minkowski), a solution is attained at a vertex of the constraint set.
Convex Optimization (page 499) says that $f(x)=-\sum_{i=1}^m \log (b_i -a_i^Tx)$ is a self-concordant $\log$-barrier function for the linear inequalities. So the above constrained problem can be turned into the unconstrained problem
$$
\min_{x \in \mathbb{R}^{n}} c^Tx -\sum_{i=1}^m \log (b_i -a_i^Tx) \tag{2}
$$

Let the solutions of (1) and (2) be $x_1^\ast$ and $x_2^\ast$ respectively.

1- Are the solutions the same ($x_1^\ast=x_2^\ast$)? (I mean, the barrier prevents the algorithm from approaching the boundary of the constraint set, because the barrier term goes to infinity there. However, the solution of (1) lies on the boundary. How can (2) recover it?)

2- In terms of iterations and algorithm complexity, which one is superior? I know that for (2) we can use Newton's method to find the answer, but how does using Newton's method accelerate the algorithm?

3- Can we show analytically that the two problems are the same, or that one is superior to the other?

Please address my questions carefully by providing rigorous proofs.

Best Answer

  1. No, the solutions are not the same. On p. 499, the book is merely presenting an example of a self-concordant function.

  2. When you ask about algorithm complexity for problem (1), which algorithm do you have in mind? At this point in the book's development, it is not clear how to solve problem (1) efficiently. The book will present a strategy that utilizes log-barrier functions in chapter 11.

  3. No, the problems are not equivalent; a small example is given below.
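
To see points 1 and 3 concretely, here is a small one-dimensional example (my own, not from the book). Consider the LP
$$
\min_{0 \leq x \leq 1} x,
$$
whose solution $x_1^\ast = 0$ lies on the boundary of the feasible set. The corresponding problem (2) is
$$
\min_{0 < x < 1} \; x - \log(x) - \log(1-x),
$$
and setting the derivative $1 - 1/x + 1/(1-x)$ to zero gives $x_2^\ast = (3-\sqrt{5})/2 \approx 0.382$, which is strictly inside the feasible interval. So $x_1^\ast \neq x_2^\ast$, and minimizing (2) on its own does not solve (1).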

You should read chapter 11 (Interior-point methods), and specifically section 11.2, which presents a strategy to minimize $f_0(x)$ subject to the constraints $f_i(x) \leq 0$, $Ax = b$ by solving a sequence of subproblems of the form
\begin{align} \text{minimize} & \quad f_0(x) - \sum_{i=1}^m (1/t) \log(-f_i(x)) \\ \text{subject to} &\quad Ax = b \end{align}
with increasing values of $t$. Newton's method is used to solve each subproblem. As $t$ increases, the solution of the subproblem becomes a better approximation to the solution of the original problem. That is how log-barrier functions are used to solve problems with inequality constraints.
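
If a concrete picture helps, here is a rough numerical sketch of that strategy specialized to the LP in (1), written in Python/NumPy. It is my own simplified illustration, not code from the book; the function names, the parameters $t_0$ and $\mu$, and the tolerances are arbitrary choices. Each centering step minimizes $t\,c^Tx - \sum_{i=1}^m \log(b_i - a_i^Tx)$ with Newton's method, and $t$ is then increased by a factor $\mu$.

```python
import numpy as np

def newton_centering(A, b, c, x, t, tol=1e-8, max_iter=50):
    """Minimize t*c^T x - sum_i log(b_i - a_i^T x) by Newton's method.

    Assumes x is strictly feasible (A x < b). Simplified sketch, not the
    book's pseudocode.
    """
    def f(z):
        s = b - A @ z
        return np.inf if np.any(s <= 0) else t * (c @ z) - np.sum(np.log(s))

    for _ in range(max_iter):
        d = b - A @ x                          # slacks, all positive
        grad = t * c + A.T @ (1.0 / d)         # gradient of centering objective
        hess = A.T @ np.diag(1.0 / d**2) @ A   # Hessian of centering objective
        step = np.linalg.solve(hess, -grad)    # Newton direction
        lam2 = -grad @ step                    # squared Newton decrement
        if lam2 / 2.0 <= tol:
            break
        # Backtracking line search; f returns +inf outside the domain,
        # so iterates stay strictly feasible.
        s, alpha, beta = 1.0, 0.25, 0.5
        while f(x + s * step) > f(x) + alpha * s * (grad @ step):
            s *= beta
        x = x + s * step
    return x

def barrier_method(A, b, c, x0, t0=1.0, mu=10.0, eps=1e-6):
    """Outer loop: the central point x*(t) is m/t-suboptimal for the LP,
    so stop once m/t < eps."""
    m = A.shape[0]
    x, t = x0, t0
    while m / t >= eps:
        x = newton_centering(A, b, c, x, t)
        t *= mu
    return x

# Toy instance: minimize x subject to 0 <= x <= 1 (true solution is x = 0)
A = np.array([[-1.0], [1.0]])
b = np.array([0.0, 1.0])
c = np.array([1.0])
print(barrier_method(A, b, c, x0=np.array([0.5])))  # close to 0
```

On the toy instance at the bottom ($\min x$ subject to $0 \leq x \leq 1$), the iterates approach the boundary solution $x = 0$ as $t$ grows, even though each individual centering problem keeps them strictly inside the feasible set.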
