The reason behind the Pythagorean relation in a hyperbola

algebra-precalculus, analytic-geometry, conic-sections

I am currently (in my Pre-Calculus course) deriving the equations of the conic sections. I understand very well how the relationship between $a, b$, and $c$ is established in an ellipse. By the very locus definition of the ellipse, the sum of the distances from the foci to the point $(0, b)$, one endpoint of the minor axis, is equal to the sum of the distances from the foci to the right-hand vertex $(a, 0)$. Using this fact, I set those two sums equal and, through algebra, arrived at the Pythagorean relationship $b^2 = a^2 - c^2$. (*Note – this is based on an ellipse centered at the origin with major axis lying on the x-axis, but of course I understand how the relationship is maintained no matter how we translate and orient the ellipse. I just used this basic case for the derivation.)
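Spelled out, with foci at $(\pm c, 0)$: the point $(0, b)$ is equidistant from both foci, so its focal-distance sum is $2\sqrt{b^2 + c^2}$, while the vertex $(a, 0)$ gives $(a - c) + (a + c) = 2a$. Setting the two sums equal,
$$2\sqrt{b^2 + c^2} = 2a \quad\Longrightarrow\quad b^2 + c^2 = a^2 \quad\Longrightarrow\quad b^2 = a^2 - c^2.$$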

However, I cannot say that I understand the Pythagorean relation established by the hyperbola. I began the derivation with a hyperbola centered at the origin and transverse axis lying on the x-axis. The point $(0, b)$ is not even on the hyperbola itself, whereas $(a, 0)$ is the right-hand vertex and $(c, 0)$ is the right-hand focus; yet my textbook asserts the relationship $a^2 + b^2 = c^2$ without any justification and uses it to finish the derivation of the equation. I am comfortable with every other part of the derivation, but this Pythagorean relationship comes, to me, completely out of nowhere. I have searched the internet for hours today trying to find a proof of this relationship and have found absolutely nothing. One site I visited, Purplemath, even wrote verbatim that the proof was "long and painful" and said to just "memorize" the relationship and "move on."

My ultimate question is thus: where on earth does this relationship, $a^2 + b^2 = c^2$, for a hyperbola even come from? What is the geometric/algebraic reasoning behind it? I want to fully understand this derivation, and this is the sole hindrance.

Thanks very much!

Best Answer

Consider the following image:

[Figure: the hyperbola with its asymptotes, the green rectangle tangent to the hyperbola at its vertices, and the green circumscribing circle meeting the $x$-axis at the foci.]

Here, I have drawn a hyperbola of the form $$\frac{x^2}{a^2} - \frac{y^2}{b^2} = 1,$$ for some $a, b > 0$. The asymptotes $y/b = \pm x/a$ have also been drawn, and it is easy to check algebraically that these lines are indeed the asymptotes.
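To sketch that check: solving the hyperbola's equation for $y$ gives $y = \pm\frac{b}{a}\sqrt{x^2 - a^2}$, and for $x \geq a$,
$$\left|\frac{b}{a}x - \frac{b}{a}\sqrt{x^2 - a^2}\right| = \frac{b}{a}\cdot\frac{a^2}{x + \sqrt{x^2 - a^2}} \;\longrightarrow\; 0 \quad \text{as } x \to \infty,$$
so each branch approaches the lines $y = \pm\frac{b}{a}x$ (equivalently, $y/b = \pm x/a$); by symmetry the same holds as $x \to -\infty$.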

The green rectangle is drawn so that its horizontal width is $2a$; its left and right sides are therefore tangent to the hyperbola at the vertices $(\pm a, 0)$. Its vertical height is chosen so that the rectangle's corners lie on the asymptotes: at $x = \pm a$ the asymptotes pass through $y = \pm b$, so the height is $2b$ and the corners are $(\pm a, \pm b)$, where the signs may be chosen independently.

The green circle is simply the circumscribing circle of the rectangle. The intersections of this circle with the $x$-axis are the locations of the hyperbola's foci, at $(\pm c, 0)$. It is natural to see, then, that $$c^2 = a^2 + b^2,$$ from our description of the rectangle's corners and the Pythagorean relationship relating the rectangle's half-width and half-height to the circumradius.
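Explicitly: each corner $(\pm a, \pm b)$ lies at distance $\sqrt{a^2 + b^2}$ from the center, so the circumradius is $\sqrt{a^2 + b^2}$, and placing the foci where this circle crosses the $x$-axis forces
$$c = \sqrt{a^2 + b^2}, \qquad \text{i.e.} \qquad c^2 = a^2 + b^2.$$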

The question, then, is why the foci happen to be located at the points where this circle meets the $x$-axis. The reason lies in the definition of the hyperbola: it is the locus of all points whose absolute difference of distances from the foci is constant. For this choice of $c$, it is fairly easy (though tedious) to show that every point $(x, y)$ satisfying the equation of the hyperbola above has a constant absolute difference of distances from $(\pm c, 0)$. What is this constant value?
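Here is a sketch of that verification for a point $(x, y)$ on the right branch (so $x \geq a$). Substituting $y^2 = b^2\left(\frac{x^2}{a^2} - 1\right)$ and $c^2 = a^2 + b^2$ into the squared focal distances,
$$(x \mp c)^2 + y^2 = \frac{c^2}{a^2}x^2 \mp 2cx + a^2 = \left(\frac{c}{a}x \mp a\right)^2.$$
Since $\frac{c}{a}x \geq c > a$, both parenthesized quantities are positive, so the distances to $(c, 0)$ and $(-c, 0)$ are $\frac{c}{a}x - a$ and $\frac{c}{a}x + a$, whose difference is the constant $2a$: the distance between the two vertices, exactly mirroring the constant sum $2a$ for the ellipse.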