[Math] Calculating Distance of a Point from an Ellipse Border

Tags: computer science, conic sections, geometry

I'm thinking about using oriented ellipses to represent curves (dents/bumps etc.) in my physics engine, and have a few questions about working with them:

  1. What methods are there for finding the minimum distance between a point and an ellipse? I need methods of varying cost (in terms of the number of calculations) for different parts of my engine.

    • I'm currently aware of two methods for testing whether a point is inside or outside an ellipse (see the sketch after this list).

      1. In the first, you plug the point's coordinates into the expression (x/a)^2 + (y/b)^2 and check whether the result is >, <, or = 1. (Does the result minus 1 give the minimum distance to the ellipse border?)
      2. In the second, you translate the point into the ellipse's coordinate frame and stretch both the ellipse and the point horizontally/vertically so that the ellipse becomes a circle. (I rarely see this method used… is there a reason I should be aware of?)
  2. How do you test the distance between two ellipses? I figure you could combine the two methods above: transform both ellipses so that one becomes a circle, measure the distance from that circle's center to the other ellipse's edge, and finally compare that distance to the circle's radius.
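
For reference, here is a minimal Python sketch of both containment tests (the function names and the axis-aligned, origin-centered setup are my assumptions). It also illustrates two caveats: the implicit value minus 1 is a dimensionless quantity, not the Euclidean distance to the border, and the stretch-to-a-circle map does not preserve distances, so it is safe for inside/outside tests but not for measuring separation, which also affects the two-ellipse idea in question 2.

```python
def implicit_value(x, y, a, b):
    """Method 1: evaluate (x/a)^2 + (y/b)^2 for an axis-aligned ellipse
    centered at the origin. < 1 inside, == 1 on the border, > 1 outside.
    Caveat: (value - 1) is dimensionless, NOT the Euclidean distance."""
    return (x / a) ** 2 + (y / b) ** 2

def is_inside_via_circle(x, y, a, b):
    """Method 2: scale x by 1/a and y by 1/b so the ellipse becomes the
    unit circle, then run an ordinary circle test. Algebraically this is
    the same predicate as method 1; the scaling distorts distances and
    angles, so only the inside/outside verdict survives the transform."""
    px, py = x / a, y / b
    return px * px + py * py <= 1.0

# Quick checks on the ellipse x^2/4 + y^2 = 1:
print(implicit_value(1.0, 0.0, 2.0, 1.0))        # 0.25 -> inside
print(is_inside_via_circle(3.0, 0.0, 2.0, 1.0))  # False -> outside
```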

Best Answer

Source: Exercise 2.3.18 (p. 54) of J. M. Borwein & J. D. Vanderwerff, Convex Functions: Constructions, Characterizations and Counterexamples (2010).

Consider $E:=\{(x,y):x^2/a^2+y^2/b^2=1\}$ in standard form. The best approximation (nearest point) to a point $(u,v)$ is $$P_E\,(u,v)=\left(\frac{a^2u}{a^2-t},\frac{b^2v}{b^2-t}\right)$$ where $t$ solves $\frac{a^2u^2}{(a^2-t)^2}+\frac{b^2v^2}{(b^2-t)^2}=1$.
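
In practice $t$ is found numerically. Below is a minimal sketch (my own, not from the book): for $u,v>0$ the left-hand side of the root equation is strictly increasing on $(-\infty,\min(a^2,b^2))$ and crosses 1 exactly once there, and that root yields the nearest point; the starting bracket follows from bounding both denominators by $\min(a^2,b^2)-t$. Signs are restored by symmetry, and exact on-axis points (a genuinely degenerate case) are nudged rather than handled exactly.

```python
import math

def closest_point_on_ellipse(u, v, a, b, iters=80):
    """Nearest point on x^2/a^2 + y^2/b^2 = 1 to (u, v), via the formula
    above: bisect for the unique root t < min(a^2, b^2) of
    a^2 u^2/(a^2-t)^2 + b^2 v^2/(b^2-t)^2 = 1, then plug t back in.
    Sketch only: assumes a, b > 0 and nudges exact on-axis points."""
    sx = -1.0 if u < 0.0 else 1.0        # fold into the first quadrant,
    sy = -1.0 if v < 0.0 else 1.0        # remembering the signs
    u, v = abs(u), abs(v)
    eps = 1e-12 * max(a, b)
    u, v = max(u, eps), max(v, eps)      # avoid the degenerate axis case

    def f(t):                            # LHS of the root equation
        return (a * u / (a * a - t)) ** 2 + (b * v / (b * b - t)) ** 2

    m = min(a * a, b * b)
    lo = m - math.hypot(a * u, b * v)    # f(lo) <= 1 by construction
    hi = m                               # f(t) -> +inf as t -> m from below
    for _ in range(iters):               # f is increasing on (lo, m): bisect
        mid = 0.5 * (lo + hi)
        if f(mid) < 1.0:
            lo = mid
        else:
            hi = mid
    t = 0.5 * (lo + hi)
    return sx * a * a * u / (a * a - t), sy * b * b * v / (b * b - t)

# Usage: minimum distance from (3, 4) to the ellipse x^2/4 + y^2 = 1.
u, v = 3.0, 4.0
x, y = closest_point_on_ellipse(u, v, 2.0, 1.0)
print(math.hypot(x - u, y - v))          # ≈ 3.65
```

Bisection is used here rather than Newton's method because a naive Newton step can overshoot the pole at $\min(a^2,b^2)$; a safeguarded Newton iteration would converge faster once the root is bracketed.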
