Converting job assignment problem into a 0-1 linear programming problem

Tags: binary-programming, linear-programming, word-problem

There are $n$ people who need to be assigned to execute $n$ jobs, one person per job. (That is, each person is assigned to exactly one job and each job is assigned to exactly one person.) The cost that would accrue if the $i$-th person is assigned to the $j$-th job is a known quantity $C[i, j]$ for each pair $i, j = 1, \dots , n$. The problem is to assign the people to the jobs to minimize the total cost of the assignment. Express the assignment problem as a $0$–$1$ linear programming problem.

Let $M$ be an $n \times n$ matrix where the value at $M_{ij}$ indicates the cost of assigning person $i$ to job $j$, or $C[i, j]$.

Let $x$ be an $n \times n$ matrix where the value at $x_{ij}$ denotes whether person $i$ was assigned to job $j$. If the assignment occurred, $x_{ij}$ is 1; otherwise, it is 0.

I've managed to come up with these definitions, but unfortunately I'm not sure how to proceed with the problem. The canonical form of a linear programming problem is as follows:

maximize $c^Tx$, subject to $Ax\leq b$, and $x\geq 0$

However, it seems the goal of this problem is to minimize, not to maximize. How do I convert this minimization problem into a maximization problem? Furthermore, how would the constraints be defined by this sort of problem? Are my definitions adequate for the problem?

Best Answer

The required LP is

$\begin{align} \min & \sum_{i=1}^n\sum_{j=1}^n M_{ij} x_{ij} \\ & \sum_{i=1}^n x_{ij} = 1 \quad \forall j \in \{1,\dots, n\} \tag{each job is assigned to exactly one person} \\ & \sum_{j=1}^n x_{ij} = 1 \quad \forall i \in \{1,\dots, n\} \tag{each person is assigned to exactly one job} \\ & x_{ij} \in \{0,1\} \quad \forall i,j \in \{1, \dots, n\} \end{align}$

To understand the objective function, note that for each job $j$, there exists a unique $i_0 \in \{1, \dots, n\}$ such that $x_{ij} = \begin{cases} 1 &\text{if } i = i_0 \\ 0 &\text{otherwise,} \end{cases}$ so the cost for the $j$-th job is $M_{i_0 j} = \sum \limits_{i=1}^n M_{ij} x_{ij}$. Sum the previous equality over $j \in \{1,\dots,n\}$ to get the above double sum.
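Since the feasible $0$–$1$ matrices $x$ are exactly the $n \times n$ permutation matrices, the formulation can be sanity-checked for small $n$ by brute force. The sketch below (the cost matrix is a made-up example, not from the question) enumerates all permutations, builds the corresponding $x$, and evaluates the double-sum objective:

```python
from itertools import permutations

def assignment_brute_force(C):
    """Enumerate all feasible 0-1 matrices x (one 1 in each row and each
    column, i.e. permutation matrices) and return the minimum total cost
    together with a minimizing matrix."""
    n = len(C)
    best_cost, best_x = None, None
    for perm in permutations(range(n)):  # perm[i] = job assigned to person i
        x = [[1 if j == perm[i] else 0 for j in range(n)] for i in range(n)]
        cost = sum(C[i][j] * x[i][j] for i in range(n) for j in range(n))
        if best_cost is None or cost < best_cost:
            best_cost, best_x = cost, x
    return best_cost, best_x

# hypothetical 3x3 cost matrix for illustration
C = [[4, 1, 3],
     [2, 0, 5],
     [3, 2, 2]]
cost, x = assignment_brute_force(C)
print(cost)  # → 5

# the constraints hold: each row and each column of x sums to 1
assert all(sum(row) == 1 for row in x)
assert all(sum(x[i][j] for i in range(3)) == 1 for j in range(3))
```

Brute force is $O(n!)$ and only serves to validate the model; the LP (or the Hungarian algorithm) is what one would actually solve for large $n$.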

How do I convert this minimization problem into a maximization problem?

To minimize a quantity is equivalent to maximizing its additive inverse.
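Concretely, $\min f = -\max(-f)$ over any feasible set, and a minimizer of $f$ is a maximizer of $-f$. A one-line illustration over a finite set of candidate objective values:

```python
# min f = -max(-f): minimizing a list of values is the same as
# negating them, maximizing, and negating the result back
values = [7, 3, 9, 5]
assert min(values) == -max(-v for v in values)
```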

Furthermore, how would the constraints be defined by this sort of problem?

Use the standard technique of transforming an equality constraint to a set of two inequality constraints:

$$x = a \iff \begin{cases} - x &\le -a \\ x &\le a. \end{cases}$$
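Applied to the assignment LP, each of the $2n$ equality constraints (row sums and column sums equal to $1$) splits into two $\le$ rows, giving $4n$ rows of $A v \le b$ in canonical form. A minimal sketch (the flattening convention $v[i \cdot n + j] = x_{ij}$ is a choice made here, not part of the question):

```python
def canonical_assignment_constraints(n):
    """Build A, b so that the 2n equality constraints of the n x n
    assignment LP become 4n inequality rows A v <= b, where v is the
    matrix x flattened row-major (v[i*n + j] = x_ij)."""
    A, b = [], []

    def add_equality(row):
        # row . v = 1  becomes  row . v <= 1  and  -row . v <= -1
        A.append(row)
        b.append(1)
        A.append([-a for a in row])
        b.append(-1)

    for i in range(n):  # person i does exactly one job (row sum = 1)
        add_equality([1 if k // n == i else 0 for k in range(n * n)])
    for j in range(n):  # job j is done by exactly one person (column sum = 1)
        add_equality([1 if k % n == j else 0 for k in range(n * n)])
    return A, b

# check: the identity assignment (person i -> job i) is feasible for n = 3
n = 3
A, b = canonical_assignment_constraints(n)
v = [1 if k // n == k % n else 0 for k in range(n * n)]
assert all(sum(a * x for a, x in zip(row, v)) <= bi for row, bi in zip(A, b))
```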

Are my definitions adequate for the problem?

They are more than enough. You don't need to define $M$, since you already have $C$, which represents the costs.

To write the LP in a more concise matrix form, denote

  1. $\mathbf{1} \in \{0,1\}^n$: the $n \times 1$ vector with every entry equal to $1$.
  2. $C = (C[i,j])_{ij} = (c_{ij})_{ij} \in \mathbb{R}^{n \times n}$: the cost matrix, whose $(i,j)$-th element is $C[i,j]$, also written $c_{ij}$.
  3. $(C^T)_{ij} = c_{ji}$: the $(i,j)$-th element of $C^T$ is the $(j,i)$-th element of $C$.
  4. $X = (x_{ij})_{ij} \in \{0,1\}^{n \times n}$: the matrix of assignment variables.

Remark: I prefer capital letters for matrices, and small letters with subscripts for the corresponding matrix elements.

Note that the cost function is \begin{align} \sum_{i=1}^n \sum_{j=1}^n C[i,j] x_{ij} &= \sum_{i=1}^n \sum_{j=1}^n (C^T)_{ji} x_{ij} \tag{take transpose of $C$} \\ &= \sum_{j=1}^n \left( \sum_{i=1}^n (C^T)_{ji} x_{ij} \right) \tag{change order of summation} \\ &= \sum_{j=1}^n \left( C^T X \right)_{jj} \tag{definition of matrix product} \\ &= \mathop{\mathrm{tr}}(C^T X). \tag{definition of trace} \end{align} The constraint $\sum\limits_{i=1}^n x_{ij} = 1$ can be rewritten as $\mathbf{1}^T X = \mathbf{1}^T$, and $\sum\limits_{j=1}^n x_{ij} = 1$ as $X \mathbf{1} = \mathbf{1}$. Hence the LP in matrix form is $\min\{\mathop{\mathrm{tr}}(C^T X) \mid \mathbf{1}^T X = \mathbf{1}^T, X \mathbf{1} = \mathbf{1}, X \in \{0,1\}^{n \times n} \}.$
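The trace identity and the matrix-form constraints can be checked numerically for a small instance. A minimal sketch in plain Python (the $3 \times 3$ cost matrix and permutation matrix below are made-up examples):

```python
# Verify tr(C^T X) = sum_{i,j} C[i][j] * x_{ij} for a sample permutation matrix
C = [[4, 1, 3], [2, 0, 5], [3, 2, 2]]  # hypothetical costs
X = [[0, 1, 0], [1, 0, 0], [0, 0, 1]]  # person 1->job 2, person 2->job 1, person 3->job 3
n = 3

double_sum = sum(C[i][j] * X[i][j] for i in range(n) for j in range(n))

# (C^T X)_{jj} = sum_i (C^T)_{ji} X[i][j] = sum_i C[i][j] * X[i][j],
# so the trace is the sum of these diagonal entries over j
trace = sum(sum(C[i][j] * X[i][j] for i in range(n)) for j in range(n))
assert double_sum == trace

# matrix-form constraints: 1^T X = 1^T (column sums) and X 1 = 1 (row sums)
assert all(sum(X[i][j] for i in range(n)) == 1 for j in range(n))
assert all(sum(X[i][j] for j in range(n)) == 1 for i in range(n))
```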
