Find Eigenvalue Expansion for Perturbed Symmetric Matrix – Linear Algebra

linear algebra, matrices, power series

Let $A$ be a real, symmetric $n\times n$ matrix with $n$ distinct,
non-zero eigenvalues, and let $V$ be a real, symmetric $n\times n$
matrix.

Consider $A_{\varepsilon}=A+\varepsilon V$, a perturbation of $A$, where
$\varepsilon$ is a small number.

  1. Find an expansion as an $\varepsilon$-series for the eigenvalues and
    eigenvectors of $A_{\varepsilon}$ around the eigenvalues and
    eigenvectors of the original matrix, $A$.
  2. Assume that $\varepsilon$ is sufficiently small that one can neglect
    all terms of order $\varepsilon^2$ and higher in the expansion. How
    close are the eigenvalues of $A_{\varepsilon}$ to those of $A$? What
    about the eigenvectors?
  3. What can we say about the eigenvalues and eigenvectors of
    $A_{\varepsilon} $ if $A$ and $V$ are arbitrary $n\times n$ matrices?
    You may assume that $\det A \neq 0$.

Ideas: If $A$ and $V$ commute (and are therefore simultaneously diagonalizable), there is just the linear term $\varepsilon v_{ii}$ when we work in the right basis. I'm getting the feeling I really don't know what they want me to do with this problem. Any ideas?
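A quick numerical check of the commuting case described above (a sketch using NumPy; the matrices and eigenvalues are illustrative choices, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build commuting symmetric A and V by giving them a shared eigenbasis Q.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # random orthogonal basis
d_A = np.array([1.0, 2.0, 3.0, 4.0])   # distinct, non-zero eigenvalues of A
d_V = np.array([0.5, -1.0, 2.0, 0.1])  # eigenvalues of V in the same basis
A = Q @ np.diag(d_A) @ Q.T
V = Q @ np.diag(d_V) @ Q.T

eps = 1e-3
# Since A and V commute, the eigenvalues of A + eps*V are exactly
# d_A + eps*d_V: the series terminates at the linear term.
perturbed = np.linalg.eigvalsh(A + eps * V)
exact = np.sort(d_A + eps * d_V)
print(np.max(np.abs(perturbed - exact)))  # ~ machine precision
```

This confirms that in the shared eigenbasis the shift is exactly linear in $\varepsilon$, with no higher-order corrections.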

Best Answer

Let's consider a particular eigenvector $v_{0}$ of $A$ with corresponding eigenvalue $\lambda_{0}$; that is, $Av_{0} = \lambda_{0}v_{0}$. Under the perturbation $A + \epsilon V$, the eigenvalue problem becomes $\left(A + \epsilon V\right)v = \lambda v$.
We expand $v$ and $\lambda$ in powers of $\epsilon$ as: $$ v = v_{0} + v_{1}\epsilon + v_{2}\epsilon^{2} + \cdots\,, \qquad \lambda = \lambda_{0} + \lambda_{1}\epsilon + \lambda_{2}\epsilon^{2} + \cdots $$ In order to derive the contributions to the eigenvector and eigenvalue $\underline{\mbox{up to order}\ \epsilon}$, it's sufficient to write: $$ \left(A + \epsilon V\right)\left(v_{0} + \epsilon v_{1}\right) = \left(\lambda_{0} + \lambda_{1}\epsilon\right)\left(v_{0} + \epsilon v_{1}\right)\,, \quad \left\vert% \begin{array}{rcl} Av_{0} & = & \lambda_{0}v_{0} \\ Av_{1} + Vv_{0} & = & \lambda_{0}v_{1} + \lambda_{1}v_{0} \end{array}\right. $$ where we equated terms which correspond to the same power of $\epsilon$ $\left(~\underline{\mbox{up to order}\ \epsilon}~\right)$.

Multiplying the second equation on the left by $v_{0}^{\sf T}$: $$ v_{0}^{\sf T}Av_{1} + v_{0}^{\sf T}Vv_{0} = \lambda_{0}v_{0}^{\sf T}v_{1} + \lambda_{1}v_{0}^{\sf T}v_{0} $$ Since $A$ is a symmetric matrix, we'll have $$ \color{#ff0000}{v_{0}^{\sf T}Av_{1}} = \left(v_{1}^{\sf T}Av_{0}\right)^{\sf T} = \left(v_{1}^{\sf T}\lambda_{0}v_{0}\right)^{\sf T} = \color{#ff0000}{\lambda_{0}v_{0}^{\sf T}v_{1}} \quad\mbox{such that}\quad \lambda_{1} = {v_{0}^{\sf T}Vv_{0} \over v_{0}^{\sf T}v_{0}} $$

Then, $\left(~\underline{\mbox{up to order}\ \epsilon}~\right)$: $$\color{#0000ff}{\large% \lambda \approx \lambda_{0} + {v_{0}^{\sf T}Vv_{0} \over v_{0}^{\sf T}v_{0}}\,\epsilon} $$
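A numerical sanity check of this first-order formula (a sketch using NumPy; the matrices, dimension, and seed are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

n = 5
# Random symmetric A (distinct eigenvalues with probability 1) and symmetric V.
M = rng.standard_normal((n, n)); A = (M + M.T) / 2
W = rng.standard_normal((n, n)); V = (W + W.T) / 2

lam0, vecs = np.linalg.eigh(A)
v0 = vecs[:, 0]                    # eigenvector for the smallest eigenvalue
lam1 = (v0 @ V @ v0) / (v0 @ v0)   # first-order coefficient lambda_1

eps = 1e-5
lam_eps = np.linalg.eigvalsh(A + eps * V)[0]  # smallest perturbed eigenvalue
first_order = lam0[0] + eps * lam1
print(abs(lam_eps - first_order))  # residual is O(eps^2)
```

The residual scales like $\epsilon^{2}$, matching the claim that the formula is accurate to first order: the eigenvalues of $A_{\varepsilon}$ sit within $O(\varepsilon)$ of those of $A$, with the coefficient $v_{0}^{\sf T}Vv_{0}/v_{0}^{\sf T}v_{0}$.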