[Math] Proof that the angle sum of a triangle is always greater than 180 degrees in elliptic geometry

euclidean-geometry, riemannian-geometry, spherical-geometry, triangles

I've scoured the internet and have found many proofs showing that in Euclidean geometry, the angle sum of a triangle is always 180 degrees. I've also found many proofs showing that in hyperbolic geometry, the angle sum of a triangle is always less than 180 degrees. For some reason I have been unable to find a proof that shows that, in elliptic geometry, the angle sum of a triangle is greater than 180 degrees.

Could anyone state the proof, or even better, provide a link or book where I could read up on it?

SOLUTION – verified

Using the reading Sam suggested, combined with a book I've been reading, I think I've come up with a hybrid proof.

Definition: A lune is the wedge of a sphere bounded by two great semicircles meeting at angle $\theta$; its area is denoted $L(\theta)$ in the proof.

$\alpha$, $\beta$, and $\gamma$ are the three angles of the triangle.

Proof


If you read the page Sam suggested, especially part 3, Triangle on Spheres, it should help with the idea of lunes. Extending the sides of the triangle to full great circles produces three pairs of opposite lunes, one pair for each angle. These six lunes cover the entire sphere, and in addition they cover the triangle (on the front of the sphere) and its antipodal copy (on the rear) two extra times each, i.e. 4 extra coverings of the triangle's area in total. Thus we begin with: the surface area of the sphere plus 4 times the area of the triangle equals the total area of the six lunes.

$4\pi r^2 + 4{\rm area}[\alpha\beta\gamma] = 2L(\alpha) + 2L(\beta) + 2L(\gamma)$

$2(2\pi r^2 + 2{\rm area}[\alpha\beta\gamma]) = 2(L(\alpha) + L(\beta) + L(\gamma))$

$2\pi r^2 + 2{\rm area}[\alpha\beta\gamma] = L(\alpha) + L(\beta) + L(\gamma)$

At this point we need to use a theorem that states that a lune whose corner angle is $\theta$ radians has area $2\theta r^2$.
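To see why (a one-line sketch, my own addition rather than part of the cited reading): a lune of angle $\theta$ makes up the fraction $\frac{\theta}{2\pi}$ of the full sphere, whose surface area is $4\pi r^2$, so

$L(\theta) = \frac{\theta}{2\pi}\cdot 4\pi r^2 = 2\theta r^2$

Substituting this for each lune gives: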

$2\pi r^2 + 2{\rm area}[\alpha\beta\gamma] = 2\alpha r^2 + 2\beta r^2 + 2\gamma r^2$

$2\pi r^2 + 2{\rm area}[\alpha\beta\gamma] = 2 r^2 (\alpha + \beta + \gamma)$

$\pi + \frac{{\rm area}[\alpha\beta\gamma]}{r^2} = \alpha + \beta + \gamma$

At this point it is clear that the sum of the angles equals $\pi$ plus $\frac{{\rm area}[\alpha\beta\gamma]}{r^2}$. Since a spherical triangle has positive area, that second term is strictly positive, so $\alpha + \beta + \gamma > \pi$; that is, the angle sum is greater than 180 degrees, which completes the proof.
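
As a quick numerical sanity check of the final formula (my own addition, not part of the proof; the helper name `spherical_angle` is something I made up for illustration): the sketch below computes the angles of the octant triangle on the unit sphere, whose vertices sit on the three coordinate axes, and compares their sum with $\pi + \frac{{\rm area}[\alpha\beta\gamma]}{r^2}$.

```python
import numpy as np

def spherical_angle(A, B, C):
    """Angle at vertex A of the spherical triangle ABC on the unit sphere,
    measured between the great-circle arcs AB and AC."""
    # Project B and C onto the tangent plane at A and normalize,
    # giving the directions of the two arcs leaving A.
    tB = B - np.dot(A, B) * A
    tC = C - np.dot(A, C) * A
    tB /= np.linalg.norm(tB)
    tC /= np.linalg.norm(tC)
    return np.arccos(np.clip(np.dot(tB, tC), -1.0, 1.0))

# Octant triangle: vertices on the coordinate axes of the unit sphere (r = 1).
A = np.array([1.0, 0.0, 0.0])
B = np.array([0.0, 1.0, 0.0])
C = np.array([0.0, 0.0, 1.0])

angle_sum = (spherical_angle(A, B, C)
             + spherical_angle(B, C, A)
             + spherical_angle(C, A, B))

# The octant is exactly 1/8 of the sphere, so its area is (4*pi*r^2)/8 with r = 1.
area = 4 * np.pi / 8

print(f"angle sum       = {angle_sum:.6f}")    # 3*pi/2, about 4.712389
print(f"pi + area / r^2 = {np.pi + area:.6f}")  # also 3*pi/2
```

Both printed values come out to $3\pi/2$, which is greater than $\pi$, as the formula predicts.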

Best Answer

This is pretty easy. Hint: extend all sides of the triangle to great circles. Proofs can be found in:

Chapter 9 of Jeff Weeks' book "The Shape of Space", and in Section 3 of Calegari's lecture notes: http://lamington.wordpress.com/2010/04/10/hyperbolic-geometry-notes-2-triangles-and-gauss-bonnet/

I'm sure it is also dealt with in Coxeter's book "Introduction to Geometry", but I don't have a copy with me.