Linear optimization: Is the shadow price of a constraint a monotone function of the RHS

linear algebra, linear programming

I am working on a linear programming model, and I am specifically interested in sensitivity analysis with respect to the RHS of the constraints.

My model maximizes an objective function, and I am interested in constraints of the form $\le$.

It is easy to obtain a shadow price from a solver, together with the RHS range over which it is valid. I also know it is possible to build a graph/table of the shadow price of a constraint as a function of its RHS.
However, I wonder whether such a graph is monotone in general.

I.e., when maximizing subject to a $\le$ constraint with a positive shadow price, will the shadow price (weakly) decrease towards 0 as I increase the RHS of that constraint?

EDIT:

The general structure of my problem is as follows: the $x_i$ are variables, and all other symbols are parameters/indices.
$$
\begin{aligned}
\text{maximize} \quad & \sum_i c_i x_i \\
\text{s.t.} \quad & \sum_i a_{ij} x_i \ge l_j && \forall j \\
& \sum_i b_{ij} x_i \le u_j && \forall j \\
& x_i \ge 0 && \forall i
\end{aligned}
$$

Given that the system is feasible and bounded, so that a shadow price for the constraint with RHS $u_j$ can be obtained, is that shadow price nonincreasing as $u_j$ itself is increased?
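As a sanity check, the question can be tested numerically on a toy instance (the data below are made up for illustration), using scipy's HiGHS-based `linprog`: maximize $3x_1 + 2x_2$ subject to $x_1 + x_2 \le u$ and $x_1 \le 2$, sweeping the RHS $u$ and recording the shadow price of the first constraint. It should step down monotonically (here from 3 to 2) as $u$ grows.

```python
# Sweep the RHS u of the constraint x1 + x2 <= u in the toy LP
#   max 3*x1 + 2*x2  s.t.  x1 + x2 <= u,  x1 <= 2,  x >= 0
# (illustrative data) and record the shadow price of that constraint.
from scipy.optimize import linprog

c = [-3.0, -2.0]            # linprog minimizes, so negate to maximize
A_ub = [[1.0, 1.0],         # x1 + x2 <= u   (RHS swept below)
        [1.0, 0.0]]         # x1 <= 2

shadow_prices = []
for u in [1.0, 1.5, 3.0, 5.0]:
    res = linprog(c, A_ub=A_ub, b_ub=[u, 2.0], method="highs")
    # HiGHS reports marginals for the minimization; negate to get the
    # shadow price of the original maximization problem.
    shadow_prices.append(-res.ineqlin.marginals[0])

print(shadow_prices)        # nonincreasing: [3.0, 3.0, 2.0, 2.0]
assert all(a >= b for a, b in zip(shadow_prices, shadow_prices[1:]))
```

This only confirms monotonicity on one instance, of course; the answer below gives the general argument.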

A possible answer may be given by *Proof that $\xi^*(b)$ is concave for arbitrary $b$ in a linear program*.

Best Answer

Consider the problem $\max\ c^Tx$ subject to $Ax \leq b$ and $x \geq 0$; its dual is $\min\ b^Ty$ subject to $A^Ty \geq c$ and $y \geq 0$.

Let $b_2 = b_1$ except in the first component, with $b_{21} > b_{11}$, and denote by $y_i^*$ an optimal dual solution at $b_i$. The dual feasible region $\{y : A^Ty \geq c,\ y \geq 0\}$ does not depend on $b$, so $y_1^*$ and $y_2^*$ are each feasible for both duals. Since the dual is minimizing, $b_1^Ty_2^* \geq b_1^Ty_1^*$ and $b_2^Ty_1^* \geq b_2^Ty_2^*$; adding these gives $$ b_1^Ty_2^* + b_2^Ty_1^* \geq b_1^Ty_1^* + b_2^Ty_2^* \ . $$ Since $b_1$ and $b_2$ differ only in their first component, all other terms cancel and we are left with $$ (b_{11}-b_{21})(y_{21}^*-y_{11}^*) \geq 0 \ , $$ which, since $b_{11}-b_{21} < 0$, gives $y_{21}^* \leq y_{11}^*$.
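The inequality chain above can be checked numerically on a small made-up instance, reading the dual solutions off scipy's HiGHS marginals (the data and the helper `dual_solution` are illustrative, not part of the argument):

```python
# Check the weak-duality inequalities on a toy LP (illustrative data):
#   max c^T x  s.t.  Ax <= b,  x >= 0,
# with b2 equal to b1 except for a larger first component.
import numpy as np
from scipy.optimize import linprog

c = np.array([3.0, 2.0])
A = np.array([[1.0, 1.0],
              [1.0, 0.0]])
b1 = np.array([1.0, 2.0])
b2 = np.array([3.0, 2.0])   # only the first component is increased

def dual_solution(b):
    # linprog minimizes -c^T x; negating its marginals recovers the
    # optimal dual solution y* of the maximization problem.
    res = linprog(-c, A_ub=A, b_ub=b, method="highs")
    return -res.ineqlin.marginals

y1, y2 = dual_solution(b1), dual_solution(b2)
tol = 1e-9
assert b1 @ y2 >= b1 @ y1 - tol    # y2* is feasible, y1* optimal, at b1
assert b2 @ y1 >= b2 @ y2 - tol    # y1* is feasible, y2* optimal, at b2
assert (b1[0] - b2[0]) * (y2[0] - y1[0]) >= -tol
assert y2[0] <= y1[0] + tol        # the shadow price did not increase
```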
