Use the inner product. For an eigenvalue $\lambda$ of $B^*B$ with eigenvector $x \neq 0$,
$\lambda ||x||^2=\langle B^*Bx,x\rangle=\langle Bx,Bx\rangle=||Bx||^2$.
Since $||x||^2 > 0$ and $||Bx||^2 \geq 0$, $\lambda$ is real and nonnegative.
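A quick numerical sanity check of this fact (a NumPy illustration, not part of the proof): for an arbitrary complex matrix $B$, the eigenvalues of $B^*B$ come out real and nonnegative up to rounding.

```python
import numpy as np

# Random complex matrix B; B.conj().T is the conjugate transpose B*.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

eigvals = np.linalg.eigvals(B.conj().T @ B)

print(np.max(np.abs(eigvals.imag)))  # essentially zero: eigenvalues are real
print(np.min(eigvals.real))          # nonnegative, up to rounding error
```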
$\newcommand{\pars}[1]{\left( #1 \right)}%
\newcommand{\ul}[1]{\underline{#1}}$
Let's consider a particular eigenvector $v_{0}$ of $A$ with corresponding eigenvalue $\lambda_{0}$; namely, $Av_{0} = \lambda_{0}v_{0}$. Under the perturbation $A \to A + \epsilon V$, the eigenvalue problem becomes
$\pars{A + \epsilon V}v = \lambda v$. We expand $v$ and $\lambda$ in powers of $\epsilon$ as:
$$
v = v_{0} + v_{1}\epsilon + v_{2}\epsilon^{2} + \cdots\,,
\qquad
\lambda = \lambda_{0} + \lambda_{1}\epsilon + \lambda_{2}\epsilon^{2} + \cdots
$$
In order to derive the contributions to the eigenvector and eigenvalue
$\ul{\mbox{up to order}\ \epsilon}$, it's sufficient to write:
$$
\pars{A + \epsilon V}\pars{v_{0} + \epsilon v_{1}}
=
\pars{\lambda_{0} + \lambda_{1}\epsilon}\pars{v_{0} + \epsilon v_{1}}\,,
\quad
\left\vert%
\begin{array}{rcl}
Av_{0} & = & \lambda_{0}v_{0}
\\
Av_{1} + Vv_{0} & = & \lambda_{0}v_{1} + \lambda_{1}v_{0}
\end{array}\right.
$$
where we equated terms that correspond to the same power of $\epsilon$
$\pars{~\ul{\mbox{up to order}\ \epsilon}~}$.
Multiplying both sides of the last equation on the left by $v_{0}^{\sf T}$:
$$
v_{0}^{\sf T}Av_{1} + v_{0}^{\sf T}Vv_{0} = \lambda_{0}v_{0}^{\sf T}v_{1} + \lambda_{1}v_{0}^{\sf T}v_{0}
$$
Since $A$ is a symmetric matrix, we have
$$
\color{#ff0000}{v_{0}^{\sf T}Av_{1}} = \pars{v_{1}^{\sf T}Av_{0}}^{\sf T}
=
\pars{v_{1}^{\sf T}\lambda_{0}v_{0}}^{\sf T}
=
\color{#ff0000}{\lambda_{0}v_{0}^{\sf T}v_{1}}
\quad\mbox{such that}\quad
\lambda_{1} = {v_{0}^{\sf T}Vv_{0} \over v_{0}^{\sf T}v_{0}}
$$
Then, $\pars{~\ul{\mbox{up to order}\ \epsilon}~}$:
$$\color{#0000ff}{\large%
\lambda
\approx
\lambda_{0} + {v_{0}^{\sf T}Vv_{0} \over v_{0}^{\sf T}v_{0}}\,\epsilon}
$$
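The first-order formula above is easy to verify numerically (a NumPy sketch, not part of the derivation): for a small $\epsilon$, the exact eigenvalue of $A + \epsilon V$ should differ from $\lambda_{0} + \lambda_{1}\epsilon$ only at order $\epsilon^{2}$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, eps = 5, 1e-4

M = rng.standard_normal((n, n))
A = (M + M.T) / 2                 # symmetric A
N = rng.standard_normal((n, n))
V = (N + N.T) / 2                 # symmetric perturbation V

lam0, vecs = np.linalg.eigh(A)    # eigenvalues in ascending order
v0 = vecs[:, 0]                   # eigenvector of the smallest eigenvalue

# First-order coefficient: lambda_1 = v0^T V v0 / (v0^T v0)
lam1 = v0 @ V @ v0 / (v0 @ v0)

exact = np.linalg.eigvalsh(A + eps * V)[0]
approx = lam0[0] + lam1 * eps
print(abs(exact - approx))        # O(eps^2), i.e. tiny compared to eps
```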
Best Answer
Suppose $A$ is an invertible square matrix and $B$ is its inverse.
Eigenvalues of $B$ are inverse to those of $A$ (and vice versa) and $A,B$ share eigenvectors corresponding to the pairs of inverse eigenvalues $(\lambda,\lambda^{-1}).$
To prove it,
let $\lambda$ be an eigenvalue of $A$ (necessarily $\lambda\neq 0$, since $A$ is invertible) and $u$ a right eigenvector corresponding to $\lambda.$ Then $$u=(BA)u=B(Au)=B(\lambda u)=\lambda Bu,$$ and therefore $$Bu={\lambda}^{-1}u.$$
Similarly, if $v$ is a left eigenvector corresponding to the eigenvalue $\lambda$ of $A,$ then $v$ is also a left eigenvector of $B$ corresponding to $\lambda^{-1}.$
This also shows that $\lambda$, as an eigenvalue of $A$, and $\lambda^{-1}$, as an eigenvalue of $B$, have equal algebraic multiplicities and equal geometric multiplicities.
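The eigenvalue/eigenvector correspondence can be checked numerically (a NumPy illustration, not part of the proof): each eigenvector $u$ of $A$ satisfies $Bu = \lambda^{-1}u$ for $B = A^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(2)
# Shifting by 4*I makes the random matrix invertible in practice.
A = rng.standard_normal((4, 4)) + 4 * np.eye(4)
B = np.linalg.inv(A)

lam, U = np.linalg.eig(A)         # eigenvalues and eigenvectors (columns) of A
for k in range(4):
    u = U[:, k]
    # B u should equal (1/lambda_k) u
    print(np.max(np.abs(B @ u - u / lam[k])))   # essentially zero
```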
EDIT
Left (or "row") eigenvectors corresponding to the eigenvalue $\mu$ satisfy the relation $$v^TA=\mu v^T.$$ Let us prove that the eigenvalues corresponding to the left or right eigenvectors are the same. (So, the name "eigenvalue" without specifying the side is legitimate.)
To see this, note that $A$ and its transpose $A^T$ share eigenvalues because they have the same characteristic polynomial.
Let $(\lambda,u)$ be an eigenpair of $A.$ Since $\lambda$ is then also an eigenvalue of $A^T,$ there is a vector $v$ with $A^Tv=\lambda v.$ Transposing, $$\begin{aligned}A^Tv=\lambda v \iff &(A^Tv)^T=(\lambda v)^T\\ \iff & v^T A=\lambda v^T. \end{aligned}$$ Hence every eigenvalue attached to a right eigenvector is also attached to a left eigenvector; the converse is proved analogously.
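Both claims admit a quick numerical check (a NumPy sketch, not part of the argument): the spectra of $A$ and $A^T$ coincide, and an eigenvector of $A^T$ transposes into a left eigenvector of $A$.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))

# A and A^T have the same eigenvalues (same characteristic polynomial).
lam_right = np.sort_complex(np.linalg.eigvals(A))
lam_left = np.sort_complex(np.linalg.eigvals(A.T))
print(np.max(np.abs(lam_right - lam_left)))   # essentially zero

# A concrete left eigenpair: if A^T v = mu v, then v^T A = mu v^T.
mu, W = np.linalg.eig(A.T)
v = W[:, 0]
print(np.max(np.abs(v @ A - mu[0] * v)))      # essentially zero
```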