The following is a possible workaround for implementing soft constraints.
We have an objective function

min 0.5 x'Hx + f'x (1)

with linear constraints in the form:

Ax <= b (2)
In this form the constraints are hard: the vector "x" must satisfy the inequality. If we introduce a slack variable "y >= 0", rewrite the inequality constraint as

Ax - y <= b, y >= 0 (3)
and penalize "y" in the objective function as

min 0.5 (x'Hx + y'Ry) + f'x (4)
then constraint (2), in the form (3), becomes a soft constraint. Here "R" is a positive penalty weight: the larger "R" is, the more strictly (2) is enforced.
If an "x" exists that satisfies (2), the optimal value of "y" in (3) will be 0. If no "x" satisfies (2), the optimal "y" will be nonzero, but as small as possible while still satisfying (3).
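To see why, note that for any fixed "x" the optimal slack is exactly the constraint violation, so the soft formulation quadratically penalizes violations and costs nothing when (2) holds. A minimal sketch (the values of "A", "b", and "x" below are made up for illustration):

```matlab
% For a fixed x, minimizing 0.5*y'*R*y subject to A*x - y <= b, y >= 0
% is achieved by y = max(0, A*x - b), the elementwise constraint violation.
A = [1 0; 0 1];
b = [1; 1];
x = [2; 0.5];           % violates the first constraint only
y = max(0, A*x - b)     % y = [1; 0]: slack is nonzero only where Ax > b
```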
Starting with "H, f, A, b, x" values corresponding to (1) and (2), we can define the following variables to solve (4) subject to (3):
1. The goal is to find a new optimization variable "X", defined by stacking "x" and "y":

X = [x; y]
2. Define new "H1" matrix as:
H1 = blkdiag(H, R*eye(size(b, 1)));
3. Define new "f1" vector as:
f1 = [f; zeros(size(b, 1), 1)];
4. Define new "A1" matrix as (the first block row encodes "Ax - y <= b"; the second encodes "-y <= 0", i.e. "y >= 0"):

A1 = [A, -eye(size(b, 1)); zeros(size(b, 1), size(x, 1)), -eye(size(b, 1))];
5. Extend "b" with zeros to cover the "y >= 0" rows of "A1":

b1 = [b; zeros(size(b, 1), 1)];

6. Finally, solve the new optimization problem:

X = quadprog(H1, f1, A1, b1);
and "X" will contain both the "x" and "y" values.
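The steps above can be assembled into a minimal end-to-end sketch (Optimization Toolbox required; the "H", "f", "A", "b", and "R" values are made-up examples). The two hard constraints below are mutually infeasible, so the slack "y" must come out nonzero:

```matlab
% Soft-constrained QP via slack variables, following steps 1-6 above.
H = 2;  f = 0;              % original objective: min x^2
A = [1; -1];  b = [1; -2];  % infeasible pair: x <= 1 and x >= 2
R = 100;                    % penalty weight on the slack variables

n = size(A, 2);             % number of original variables
m = size(b, 1);             % number of inequality constraints

H1 = blkdiag(H, R*eye(m));              % step 2
f1 = [f; zeros(m, 1)];                  % step 3
A1 = [A, -eye(m); ...
      zeros(m, n), -eye(m)];            % step 4
b1 = [b; zeros(m, 1)];                  % step 5

X = quadprog(H1, f1, A1, b1);           % step 6
x_opt = X(1:n);       % softly feasible solution
y_opt = X(n+1:end);   % slack shows how much each constraint was relaxed
```

Because no "x" satisfies both constraints, "y_opt" will be nonzero, and increasing "R" pushes the solution closer to the least-violating point.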
Note that similar formulations can be used to incorporate soft constraints into other optimization problems, using other MATLAB optimization functions.