[Math] Resolvent of a matrix

linear algebra

Suppose $X$ is a self-adjoint $n\times n$ matrix. The resolvent of $X$ is defined by $R(z)=(X-zI)^{-1}$, where $I$ denotes the identity matrix and $z$ is a "true" complex number (meaning $z$ has non-zero imaginary part). First, why is this well-defined, i.e. why is $X-zI$ invertible? Second, by the Spectral Theorem we know $X=U\Lambda U^*$, where $U=(u_1,\dots,u_n)$ is a unitary matrix and $\Lambda$ is a diagonal matrix with the eigenvalues $\lambda_1,\dots,\lambda_n$ on the diagonal, and by the rules for multiplication of block matrices we should have

$X=(u_1,…,u_n)\Lambda\begin{pmatrix} u_1^* \\ \vdots \\ u_n^* \end{pmatrix}=\sum_{j=1}^n \lambda_j u_ju_j^*. $

But how can we derive $R(z)=\sum_{j=1}^n(\lambda_j-z)^{-1}u_ju_j^*$?

Best Answer

(1) $\lambda$ is an eigenvalue of $X$ precisely when $\det(X-\lambda I)=0$. But a Hermitian matrix has only real eigenvalues, so $z\in\mathbb{C}\setminus\mathbb{R}$ implies $\det(X-zI)\neq 0$, and hence $X-zI$ is invertible.
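
To spell the determinant argument out (a short sketch, using nothing beyond the spectral decomposition already quoted in the question):

$\det(X-zI)=\det\bigl(U(\Lambda-zI)U^*\bigr)=\det(\Lambda-zI)=\prod_{j=1}^n(\lambda_j-z),$

and each factor satisfies $|\lambda_j-z|\ge|\operatorname{Im}z|>0$ because every $\lambda_j$ is real, so the product is non-zero.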

(2) I didn't quite understand your question, but I'll try to share some thoughts that may be useful to you:

The eigenvectors of a Hermitian matrix can be chosen orthogonal; there are $n$ of them, and after normalizing them to unit length they form a new (orthonormal) basis. In this basis $X$ is diagonal with entries $\{\lambda_1,\dots,\lambda_n\}$, while $zI$ looks the same in any basis, with diagonal $\{z,\dots,z\}$. Hence $X-zI$ is diagonal with entries $\{\lambda_1-z,\dots,\lambda_n-z\}$, and $(X-zI)^{-1}$ is diagonal with entries $\{(\lambda_1-z)^{-1},\dots,(\lambda_n-z)^{-1}\}$. The change to the new basis is conjugation by the unitary matrix $U$: writing $v=Uw$, a quadratic form transforms as $v^*Bv=(Uw)^*B(Uw)=w^*(U^*BU)w$, so its value does not depend on the basis, and for $B=X-zI$ the matrix $U^*(X-zI)U=\Lambda-zI$ is exactly the diagonal matrix above.
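
Putting this in matrix form gives the identity you are after (a short verification, using only $U^*U=I$ and the block-multiplication rule from the question):

$R(z)=(X-zI)^{-1}=\bigl(U(\Lambda-zI)U^*\bigr)^{-1}=U(\Lambda-zI)^{-1}U^*=\sum_{j=1}^n(\lambda_j-z)^{-1}u_ju_j^*,$

where the middle step uses $(UAU^*)^{-1}=UA^{-1}U^*$ for unitary $U$, and the last step expands the diagonal matrix $(\Lambda-zI)^{-1}$ exactly as in the expansion $X=\sum_j\lambda_ju_ju_j^*$ above. Alternatively, one can check directly that $(X-zI)\sum_j(\lambda_j-z)^{-1}u_ju_j^*=\sum_j u_ju_j^*=I$, using $u_j^*u_k=\delta_{jk}$ and the fact that the $u_j$ form an orthonormal basis.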
