I came across this matrix expansion, based on a Taylor expansion, which I could not derive: let $A=(\Sigma(\theta')-\Sigma(\theta))\Sigma^{-1}(\theta)$; then
$$\log\det(I+A)=tr(A)-R_3$$
with
$$R_3\le c_3\sum_{i=1}^p\lambda_i^2$$
where $p$ is the dimension of $A$, $c_3$ is some constant, and the $\lambda_i$ are the eigenvalues of $A$.
- Could anyone give a hint on how to derive this expansion, or at least the inequality bounding $\log\det(I+A)$?
- Is this a matter of Taylor expansion of a function of the single variable $\theta$?
I can see that the result is plausible: the first term is first order ($\sum_i\lambda_i$) and the second term is bounded by second-order terms. However, I cannot find a direct formula from which to derive it.
Possibly relevant: Anderson's *An Introduction to Multivariate Statistical Analysis* has Theorem A.4.8, which states that $$\det(I+xC)=1+x\,tr(C)+O(x^2).$$
Could this be useful?
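As a quick numerical sanity check of that theorem (a minimal Python sketch with a hand-picked $C$, not taken from the book): for $C = \begin{pmatrix}1&2\\3&4\end{pmatrix}$ one has $\det(I+xC) = 1 + 5x - 2x^2$ exactly, so the remainder after the first-order term $1 + x\,tr(C)$ is exactly $-2x^2$, i.e. $O(x^2)$.

```python
import numpy as np

# Hand-picked example: C = [[1, 2], [3, 4]], so tr(C) = 5 and, exactly,
# det(I + xC) = (1 + x)(1 + 4x) - 6x^2 = 1 + 5x - 2x^2.
C = np.array([[1.0, 2.0], [3.0, 4.0]])

for x in (1e-1, 1e-2, 1e-3):
    exact = np.linalg.det(np.eye(2) + x * C)
    first_order = 1 + x * np.trace(C)
    # The remainder shrinks like x^2, consistent with the O(x^2) term:
    print(x, exact - first_order)  # -> -2 * x**2, up to rounding
```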
Best Answer
You could use the identities $tr\,A = \sum_k \lambda_k$ and $\det A = \prod_k \lambda_k$.
Note that the eigenvalues of $I + A$ are just $\lambda_k + 1$. (The $\lambda_k$ are real here, since $A$ is similar to the symmetric matrix $\Sigma^{-1/2}(\theta)\,(\Sigma(\theta')-\Sigma(\theta))\,\Sigma^{-1/2}(\theta)$; we also need $\lambda_k > -1$ for the logarithms below to be defined.)
Then we have \begin{equation} \log \det (I+A) = \sum_k \log (1 + \lambda_k) = \sum_k \lambda_k + \sum_k (\log (1+\lambda_k) - \lambda_k) = tr(A) + R_3 \end{equation}
where $0 \geq R_3 = \sum_k (\log (1+\lambda_k) - \lambda_k) \geq \sum_k (\frac{\lambda_k}{1+\lambda_k} - \lambda_k) = -\sum_k \frac{\lambda_k^2}{1+\lambda_k} \geq -\sum_k \lambda_k^2$, using $\log(1+x) \leq x$ and $\log(1+x) \geq \frac{x}{1+x}$ for $x > -1$. (The final inequality assumes $\lambda_k \geq 0$; if some $\lambda_k \in (-1, 0)$, one instead gets $|R_3| \leq c_3 \sum_k \lambda_k^2$ with $c_3 = \max_k (1+\lambda_k)^{-1}$, matching the constant in the question. Also, $R_3$ here is nonpositive, so it equals minus the question's $R_3$.)
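The derivation can be verified numerically (a minimal Python sketch with hypothetical covariances, constructed so that $\Sigma(\theta')-\Sigma(\theta)$ is positive definite and hence all $\lambda_k \geq 0$, the case where the bound holds with $c_3 = 1$):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 5

def random_spd(p):
    """A random symmetric positive definite p x p matrix."""
    M = rng.standard_normal((p, p))
    return M @ M.T + p * np.eye(p)

# Hypothetical covariances with Sigma' - Sigma positive definite,
# so that all eigenvalues of A are real and nonnegative.
Sigma = random_spd(p)
Sigma_prime = Sigma + 0.1 * random_spd(p)
A = (Sigma_prime - Sigma) @ np.linalg.inv(Sigma)

# A is similar to Sigma^{-1/2} (Sigma' - Sigma) Sigma^{-1/2}, hence its
# eigenvalues are real (and here positive).
lam = np.linalg.eigvals(A).real

lhs = np.log(np.linalg.det(np.eye(p) + A))
R3 = lhs - np.trace(A)  # equals sum_k (log(1 + lam_k) - lam_k)

assert np.isclose(lhs, np.sum(np.log(1 + lam)))  # log det = sum of logs
assert R3 <= 0                                   # since log(1+x) <= x
assert -R3 <= np.sum(lam**2)                     # the claimed bound, c_3 = 1
```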