Hi all.
Given three known m-by-m matrices M1, M2, and B, I want to find the matrix X that minimizes the following objective:
Obj = -trace(M1*inv(M2+X)*M1*inv(M2+X))
subject to X - B being positive semidefinite, i.e.,
X - B >= 0.
The matrix B is purely imaginary and antisymmetric (i.e., transpose(B) = -B), while M1, M2, and the solution X are real symmetric matrices.
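One useful observation: since X is real symmetric and B is purely imaginary antisymmetric, X - B equals its own conjugate transpose, so X - B >= 0 is a Hermitian positive-semidefinite condition with real eigenvalues. A quick numerical check of this (a Python/NumPy sketch; the X here is a made-up example, B is the 2-by-2 matrix from the code):

```python
import numpy as np

X = np.array([[2.0, 0.5], [0.5, 2.0]])        # made-up real symmetric X for illustration
B = 1j * np.array([[0.0, 1.0], [-1.0, 0.0]])  # B from the question
H = X - B
print(np.allclose(H, H.conj().T))             # True: H is Hermitian
print(np.linalg.eigvalsh(H))                  # real eigenvalues, approximately [0.882, 3.118]
```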
I tried to use fmincon, but I ran into some issues with fmincon and complex numbers. Example code for dimension m = 2 is provided below.
m = 2;                            % matrix dimension (B and the cov_mat output are 2-by-2)
[M1, M2] = cov_mat(1);
B = 1i*[0 1; -1 0];
fun = @(X) -trace((M1/(M2+X))^2);
X0 = eye(m);
% If I understand correctly, my constraint should be expressed in a
% vectorised manner, i.e., X - B >= 0 should be replaced with
% kron(eye(m),eye(m))*X(:) >= B(:);
A = -eye(m);
A = kron(A, eye(m));
b = B(:);                         % complex right-hand side -- this is where fmincon complains
X = fmincon(fun, X0, A, b);

function [sigma, dsigma] = cov_mat(T)
sigma  = coth(1/T) * eye(2);
dsigma = (coth(1/T)^2 - 1)/T^2 * eye(2);
end
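For what it's worth, one way I considered to keep everything real is to optimise over the independent entries of the symmetric X and enforce X - B >= 0 through the smallest eigenvalue of the Hermitian matrix X - B, as a nonlinear constraint instead of a linear one. A sketch of that idea, translated to Python/SciPy for illustration (the eigenvalue reformulation and the parameterisation are my assumptions, not part of the original code):

```python
# Sketch: optimise over the three independent entries of the real symmetric
# 2x2 matrix X and impose X - B >= 0 (positive semidefinite) via the smallest
# eigenvalue of X - B, so the optimiser itself never touches complex numbers.
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

def cov_mat(T):
    # mirrors the cov_mat helper from the question: coth(1/T)*I and its derivative
    sigma = 1/np.tanh(1/T) * np.eye(2)
    dsigma = ((1/np.tanh(1/T))**2 - 1)/T**2 * np.eye(2)
    return sigma, dsigma

M1, M2 = cov_mat(1.0)
B = 1j * np.array([[0.0, 1.0], [-1.0, 0.0]])   # purely imaginary, antisymmetric

def unpack(p):
    # p = [a, b, c] parameterises X = [[a, b], [b, c]] (real symmetric)
    return np.array([[p[0], p[1]], [p[1], p[2]]])

def objective(p):
    X = unpack(p)
    M = M1 @ np.linalg.inv(M2 + X)
    return -np.trace(M @ M)                    # -trace((M1*inv(M2+X))^2)

def min_eig(p):
    # X - B is Hermitian, so its eigenvalues are real; require the smallest >= 0
    return np.linalg.eigvalsh(unpack(p) - B)[0]

psd = NonlinearConstraint(min_eig, 0.0, np.inf)
res = minimize(objective, x0=[1.0, 0.0, 1.0], constraints=[psd])
X_opt = unpack(res.x)
```

In MATLAB the analogue would presumably be a nonlcon handle returning c = -min(eig(X - B)) (my assumption, untested), which avoids feeding the complex B through the linear-constraint arguments A and b.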
Any hints on how to fix this, or any alternative methods to solve this problem, are appreciated.