[Math] Expectation of random matrix inverse

pr.probability, st.statistics

Given a $K\times M$ matrix $X$, where $M\gg K$, whose entries are independent complex Gaussian random variables, each with mean
$$E[X_{k,m}]=B_{k,m}$$
and variance
$$Var[X_{k,m}]=\Sigma_{k,m}$$
define the random matrix $R(X)$ as
$$R(X)=I +XX^{H}.$$

My problem is now to compute the expected value of $R^{-1}(X)$, i.e., to evaluate
$$E_X[R^{-1}(X)].$$
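(For reference, here is a small Monte Carlo sketch that could be used to sanity-check any closed-form answer numerically; the sizes and the constant choices of $B$ and $\Sigma$ below, as well as the circular-symmetry convention for the complex Gaussians, are illustrative assumptions and not part of the problem.)

```python
import numpy as np

# Monte Carlo estimate of E[R^{-1}(X)] with R(X) = I + X X^H.
# K, M, B, Sigma below are illustrative placeholders, not values from the question.
K, M = 4, 64
B = 0.1 * np.ones((K, M))        # entrywise means B_{k,m}
Sigma = 0.5 * np.ones((K, M))    # entrywise variances Sigma_{k,m}
n_samples = 10_000
rng = np.random.default_rng(0)

acc = np.zeros((K, K), dtype=complex)
for _ in range(n_samples):
    # Assumed circularly symmetric complex Gaussian noise: the variance
    # Sigma_{k,m} is split equally between the real and imaginary parts.
    Z = (rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M))) * np.sqrt(Sigma / 2)
    X = B + Z
    R = np.eye(K) + X @ X.conj().T
    acc += np.linalg.inv(R)

print((acc / n_samples).round(4))   # Monte Carlo estimate of E[R^{-1}(X)]
```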

My idea was perhaps to use a Neumann series expansion of the matrix inverse and then to evaluate the resulting moments.
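Written out, that sketch would be
$$R^{-1}(X)=(I+XX^{H})^{-1}=\sum_{n=0}^{\infty}(-1)^{n}\,(XX^{H})^{n},\qquad E_X[R^{-1}(X)]=\sum_{n=0}^{\infty}(-1)^{n}\,E_X\big[(XX^{H})^{n}\big],$$
so the problem would reduce to the matrix moments $E_X[(XX^{H})^{n}]$; I realize this is only formal, since for $M\gg K$ the eigenvalues of $XX^{H}$ are typically large and the series need not converge as written.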

I noted that someone posted a similar question: Expected mean square error of an estimation problem.

The questions are clearly related.

Best Answer

I'm not sure that I fully understand the notation, but:

(1) If $\cdot^{H}$ denotes the conjugate transpose, then $XX^H$ is a Wishart matrix; its asymptotic spectral law as $K,M\to\infty$ with $M/K\simeq t>0$ fixed is the Marchenko-Pastur law.

(2) If in addition $R^{-1}(X)=R(X)^{-1}$, then you can recover the law of $R^{-1}(X)$ from that of $XX^H$ just by using the formula $\frac{1}{1+x}=1-x+x^2-x^3+\ldots$, under the MP assumptions of course.
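Concretely (a sketch, under those MP assumptions and with whatever normalization of $XX^{H}$ makes its empirical spectral law converge to a limit $\mu$): since $1/(1+x)$ is bounded on $[0,\infty)$,
$$\frac{1}{K}\operatorname{tr}R^{-1}(X)=\frac{1}{K}\sum_{i=1}^{K}\frac{1}{1+\lambda_{i}(XX^{H})}\longrightarrow\int\frac{1}{1+x}\,d\mu(x),$$
which is just the Stieltjes transform of $\mu$ evaluated at $-1$. Note that this gives the trace/spectral information, not directly the full matrix $E_X[R^{-1}(X)]$.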

(3) Now, depending on what exactly your assumption $M\gg K$ means, you might either deduce your formula from (2), or need a variation of it, or of the technical results used in its proof.

In any case, a good first step would probably be to look at the various proofs of the Marchenko-Pastur law (Google them), pick the one that fits you best (e.g. via the moment method, as you suggest), and modify it slightly where needed. You may also try a direct computation with the Wick formula.
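For instance, the lowest-order ingredient of the moment/Wick approach is immediate from the independence of the entries (with the convention $\Sigma_{k,m}=E[|X_{k,m}-B_{k,m}|^{2}]$):
$$E\big[(XX^{H})_{k,k'}\big]=\sum_{m=1}^{M}E\big[X_{k,m}\overline{X_{k',m}}\big]=\sum_{m=1}^{M}\Big(B_{k,m}\overline{B_{k',m}}+\delta_{k,k'}\,\Sigma_{k,m}\Big),$$
i.e. $E[XX^{H}]=BB^{H}+\operatorname{diag}\big(\sum_{m}\Sigma_{k,m}\big)$; the higher moments $E[(XX^{H})^{n}]$ can then be organized as sums over Wick pairings of the centered entries, combined with the means $B_{k,m}$.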
