[Math] Find a two-term expansion for the root of $1+\sqrt{x^2+\epsilon}=e^x$.

asymptotics, perturbation-theory, taylor-expansion

I am trying to find a two-term expansion for the root of $$1+\sqrt{x^2+\epsilon}=e^x.$$ Since $\epsilon \ll 1$, I can tell that this equation behaves like $1+x=e^x$, which has a root close to zero. That suggested that the root might look like $$x \approx x_0 + \epsilon^{\alpha} x_1 + \dots$$ I Taylor expanded both $\sqrt{x^2+\epsilon}$ and $e^x$, plugged my guess into the expanded equation, and balancing the $O(1)$ terms gave me $x_0=0$, as expected. My problem begins at the higher orders of $\epsilon$: long story short, the expansion ends up with some inconsistencies. So I thought maybe $x \approx x_0+\mu(\epsilon)$, but again I got nowhere. I am getting frustrated and would really appreciate some help.

Thank you in advance.

Best Answer

Here's a hint from my instructor. Rearrange the given equation as follows: \begin{align*} x^2 + \varepsilon & = (e^x - 1)^2 \\ \varepsilon & = (e^x - 1)^2 - x^2 \\ & = (e^x - 1 - x)(e^x - 1 + x). \end{align*} You can then Taylor expand $e^x$ around $0$, substitute the assumed asymptotic expansion, and balance terms. Note that for $\varepsilon=0$, $x=0$ solves the equation $1 + \sqrt{x^2} = e^x$, so you may choose $x_0 = 0$.
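For what it's worth, carrying the hint through: Taylor expanding the two factors about $x=0$ gives $e^x-1-x = \tfrac{x^2}{2} + \tfrac{x^3}{6} + \dots$ and $e^x-1+x = 2x + \tfrac{x^2}{2} + \dots$, so $\varepsilon = x^3 + \tfrac{7}{12}x^4 + O(x^5)$, which suggests the two-term expansion $x \approx \varepsilon^{1/3} - \tfrac{7}{36}\varepsilon^{2/3}$. As a sanity check (my own sketch, not part of the answer above), the positive root can be found numerically with plain bisection and compared against this candidate expansion:

```python
import math

def f(x, eps):
    # The original equation rewritten as f(x) = 0.
    return math.exp(x) - 1.0 - math.sqrt(x * x + eps)

def positive_root(eps, lo=0.0, hi=1.0, iters=200):
    # Plain bisection: f(0) = -sqrt(eps) < 0 and f(1) > 0 for small eps,
    # so the sign change brackets the positive root.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(mid, eps) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

eps = 1e-9
root = positive_root(eps)
two_term = eps ** (1 / 3) - (7 / 36) * eps ** (2 / 3)  # candidate expansion
print(root, two_term)
```

For $\varepsilon = 10^{-9}$ the numerical root and the two-term expansion agree to within the expected $O(\varepsilon)$ error, which is consistent with the balancing exercise the hint asks for.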