[Math] Farkas’ lemma application

linear algebra, linear programming

Farkas' lemma: Let $A$ be an $m \times n$ matrix and let $b \in \mathbb{R}^m$ be an $m$-dimensional vector.
Then, exactly one of the following holds:

(a) there exists some $x \in \mathbb{R}^n$, $x \geq 0$, such that $Ax = b$

(b) there exists some vector $y \in \mathbb{R}^m$ such that $y^TA \geq 0$ and $y^Tb < 0$,

where $y^TA \geq 0$ means that every component of $y^TA$ is non-negative.
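As a quick numerical sanity check of the lemma (not a proof), one can test which alternative holds for a small example with an LP solver. The sketch below uses scipy's `linprog`; the matrix $A$ and vector $b$ are made-up illustration data.

```python
import numpy as np
from scipy.optimize import linprog

# Made-up example data.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([3.0, -1.0])
m, n = A.shape

# Alternative (a): is there x >= 0 with Ax = b?  (feasibility LP, zero objective)
res_a = linprog(c=np.zeros(n), A_eq=A, b_eq=b, bounds=[(0, None)] * n)

# Alternative (b): is there y with A^T y >= 0 and b^T y < 0?
# The set of such y is a cone, so y^T b < 0 can be normalized to y^T b <= -1.
res_b = linprog(c=np.zeros(m),
                A_ub=np.vstack([-A.T, b.reshape(1, m)]),   # -A^T y <= 0,  b^T y <= -1
                b_ub=np.concatenate([np.zeros(n), [-1.0]]),
                bounds=[(None, None)] * m)                 # y is free
print("alternative (a) feasible:", res_a.status == 0)
print("alternative (b) feasible:", res_b.status == 0)      # exactly one prints True
```

For this data $Ax = b$ forces $x_2 + x_3 = -1$, which is impossible with $x \geq 0$, so (a) is infeasible, while for instance $y = (0, 1)^T$ certifies (b).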

The task:

Let $A$ be an $m \times n$ matrix, let $C$ be an $m \times k$ matrix, and let $b \in \mathbb{R}^m$. Prove that exactly one of the following holds:

(a) there exist $x \in \mathbb{R}^n$ and $u \in \mathbb{R}^k$ such that $Ax + Cu \leq b$ and $x \geq 0$;

(b) there exists $y \in \mathbb{R}^m$ such that $y \geq 0$, $y^TA \geq 0$, $y^TC=0$ and $y^Tb < 0$.
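The statement to be proved can likewise be checked numerically on small made-up data, which also highlights the roles of the sign constraints ($x \geq 0$, $u$ free in (a); $y \geq 0$, $y^TC = 0$ in (b)). Again this is only an illustrative sketch with `linprog` used as a feasibility oracle.

```python
import numpy as np
from scipy.optimize import linprog

# Made-up example data.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
C = np.array([[1.0], [-1.0], [0.0]])
b = np.array([1.0, 1.0, -1.0])
m, n = A.shape
k = C.shape[1]

# (a): x >= 0, u free, A x + C u <= b.
res_a = linprog(c=np.zeros(n + k),
                A_ub=np.hstack([A, C]), b_ub=b,
                bounds=[(0, None)] * n + [(None, None)] * k)

# (b): y >= 0, A^T y >= 0, C^T y = 0, b^T y < 0 (normalized to b^T y <= -1).
res_b = linprog(c=np.zeros(m),
                A_ub=np.vstack([-A.T, b.reshape(1, m)]),
                b_ub=np.concatenate([np.zeros(n), [-1.0]]),
                A_eq=C.T, b_eq=np.zeros(k),
                bounds=[(0, None)] * m)

print("alternative (a) feasible:", res_a.status == 0)
print("alternative (b) feasible:", res_b.status == 0)   # exactly one prints True
```

Here the third row forces $x_1 + x_2 \leq -1$, impossible for $x \geq 0$, so (a) fails and $y = (0, 0, 1)^T$ certifies (b).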


I think you have to introduce a vector of slack variables $s = [s_1,s_2,\dots,s_m]^T$, $s_i \geq 0$, such that
$$
Ax + Cu + s = b
$$
and then somehow apply Farkas' lemma, but I don't know how. Any help is most welcome.

Edit: With the help of the answer I came up with the following.

(a) there exist $x \in \mathbb{R}^n$ and $u \in \mathbb{R}^k$ such that $Ax + Cu \leq b$ and $x \geq 0$;

is equivalent to

(a') there exist $x \in \mathbb{R}^n$, $u \in \mathbb{R}^k$ and $s \in \mathbb{R}^m$ such that $Ax + Cu + s = b$, $x \geq 0$ and $s \geq 0$;

We define the $m \times (n+2k+m)$ matrix
$$ D = \begin{pmatrix} A & C & -C & I
\end{pmatrix}
$$
and apply the standard Farkas' lemma to the matrix $D$ and the vector $b \in \mathbb{R}^m$. We get that exactly one of the following holds:

(a') there exists $x' \in \mathbb{R}^{n+2k+m}$, $x' \geq 0$, such that $Dx' = b$. We denote the first $n$ components of $x'$ by a vector $x \in \mathbb{R}^n$, the next $k$ components by a vector $u'$, the next $k$ components by a vector $u''$, and the last $m$ components of $x'$ by a vector $s$. Then $Dx' = b$ implies $Ax + Cu' - Cu'' + s = b$. Picking $u = u' - u''$ we get $Ax + Cu + s = b$ with $x \geq 0$ and $s \geq 0$, which is exactly (a').

(b) there exists $y \in \mathbb{R}^m$ such that $y^TD \geq 0$ and $y^Tb < 0$. Now $y^TD \geq 0$ implies $y^TA \geq 0$, $y^TC \geq 0$, $-y^TC \geq 0$ and $y^T \geq 0$; the middle two conditions give $y^TC = 0$, so altogether $y \geq 0$, $y^TA \geq 0$, $y^TC = 0$ and $y^Tb < 0$, which is exactly what we wanted to prove.
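The bookkeeping in this reduction is easy to get wrong, so here is a minimal numpy sketch of it (illustration only, with random made-up data): build $D = \begin{pmatrix} A & C & -C & I \end{pmatrix}$, split a nonnegative $x'$ into $(x, u', u'', s)$, and check that $Dx' = Ax + C(u'-u'') + s$.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 3, 2, 2                            # made-up dimensions
A = rng.standard_normal((m, n))
C = rng.standard_normal((m, k))

D = np.hstack([A, C, -C, np.eye(m)])         # m x (n + 2k + m)

xprime = rng.random(n + 2 * k + m)           # any nonnegative x'
x, u1, u2, s = np.split(xprime, [n, n + k, n + 2 * k])
u = u1 - u2                                  # u = u' - u'' may have any sign

# The identity used in the proof: D x' = A x + C u + s,
# with x >= 0 and s >= 0 inherited from x' >= 0.
assert np.allclose(D @ xprime, A @ x + C @ u + s)
```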

Best Answer

Hint: Your idea almost works, but you will also have to deal with the $C$ part: let $$ D = \begin{pmatrix} A & C & -C & \mathrm{Id} \end{pmatrix} \in \mathbb R^{m \times (n+2k+m)}. $$ Then apply the first form you gave to $D$ and $b$.
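Spelling the hint out for the second alternative: the block structure of $D$ gives
$$
y^TD = \begin{pmatrix} y^TA & y^TC & -y^TC & y^T \end{pmatrix},
$$
so $y^TD \geq 0$ holds if and only if $y^TA \geq 0$, $y^TC = 0$ and $y \geq 0$, which together with $y^Tb < 0$ is exactly condition (b) of the task.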