Find the shortest distance from a point to a curved surface

geometry, linear algebra

Find the shortest distance from the point $(0,0,0)$ to the curved surface $x^2+2y^2-z^2=5$.


What I have done is:
Let a point on the curved surface be $(a,b,c)$, and let $\mathbf v = \begin{bmatrix} a-0 \\ b-0 \\ c-0 \end{bmatrix}$.
A vector $\mathbf w$ perpendicular to the curved surface at $(a,b,c)$ is the gradient
$\mathbf w = \begin{bmatrix} 2a \\ 4b \\ -2c \end{bmatrix}$.
The tangent plane at $(a,b,c)$ is $2ax+4by-2cz = 2a^2+4b^2-2c^2 = 10$, so the distance from the origin to that plane is $\frac{10}{2\sqrt{a^2+4b^2+c^2}}$.

At the closest point $\mathbf v$ is parallel to $\mathbf w$, so
$$\mathbf v \cdot \mathbf w = 2a^2+4b^2-2c^2 = |\mathbf v|\,|\mathbf w|\cos 0 = \sqrt{a^2+b^2+c^2}\cdot 2\sqrt{a^2+4b^2+c^2} = 10,$$
and
$$\mathbf v \times \mathbf w = \begin{bmatrix} -2bc-4bc \\ 2ac+2ac \\ 4ab-2ab \end{bmatrix} = \begin{bmatrix} -6bc \\ 4ac \\ 2ab \end{bmatrix}, \qquad |\mathbf v \times \mathbf w| = |\mathbf v|\,|\mathbf w|\sin 0 = 0.$$
$\quad \to$ For this to hold, $bc=ac=ab=0$, so at least two of $a,b,c$ must be zero. On the surface, $a=b=0$ is impossible (it would require $-c^2=5$); $b=c=0$ gives $a^2=5$ and distance $\sqrt 5$; $a=c=0$ gives $b^2=\frac 52$ and distance $\frac{10}{2\sqrt{10}}=\frac{\sqrt{10}}{2}$.
The smaller of these is the answer: $\frac{\sqrt{10}}{2}$.
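The case analysis above can be checked symbolically. This is a quick sketch (assuming `sympy` is available, not part of the argument itself): it solves the surface equation together with the conditions $bc=ac=ab=0$ from the cross product and compares the resulting distances.

```python
# Symbolic check of the case analysis: the vanishing cross product
# gives bc = ac = ab = 0, combined with the surface equation.
import sympy as sp

a, b, c = sp.symbols("a b c", real=True)

# Real solutions of { a^2 + 2b^2 - c^2 = 5, bc = 0, ac = 0, ab = 0 }.
solutions = sp.solve(
    [a**2 + 2*b**2 - c**2 - 5, b*c, a*c, a*b],
    [a, b, c], dict=True,
)

# Distance from the origin for each real candidate point.
dists = sorted(
    {sp.sqrt(s[a]**2 + s[b]**2 + s[c]**2) for s in solutions},
    key=float,
)
print(dists)  # smallest candidate is sqrt(10)/2; the other is sqrt(5)
```

The case $a=b=0$ produces no real solution, so only the two candidate distances survive, and the minimum is $\frac{\sqrt{10}}{2}$.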

Is this correct? And is there a better way?

Best Answer

Alternatively, you can set up an optimization (minimization) problem.

Let the point $(a,b,c)$ lie on the surface $x^2+2y^2-z^2=5$. The squared distance from the point to the origin is: $$d^2(a,b,c)=(a-0)^2+(b-0)^2+(c-0)^2=a^2+b^2+c^2,$$ which must be minimized subject to the constraint $a^2+2b^2-c^2=5$. Using the method of Lagrange multipliers: $$L(a,b,c,\lambda)=a^2+b^2+c^2+\lambda (5-a^2-2b^2-c^2)\\ \begin{cases}L_a=2a-2a\lambda =0\\ L_b=2b-4b\lambda=0 \\ L_c=2c-2c\lambda =0\\ L_{\lambda}=5-a^2-2b^2-c^2=0\end{cases}$$ The case $\lambda=\frac 12$ forces $a=c=0$ and gives $(a,b,c)=\left(0,\pm \sqrt{\frac 52},0\right)$ with $d^2=\frac 52$; the case $\lambda=1$ forces $b=0$ and gives $d^2=a^2+c^2=5+2c^2\ge 5$, which is larger. Hence $$d=\sqrt{\frac 52}=\frac{\sqrt{10}}{2} \ \text{(min)}$$
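The constrained minimization can also be verified numerically. A minimal sketch using `scipy.optimize.minimize` with an equality constraint (the starting point `x0` is an arbitrary choice near the expected minimizer, not from the answer above):

```python
# Numerical check of the Lagrange-multiplier result: minimize the
# squared distance to the origin subject to a^2 + 2b^2 - c^2 = 5.
import numpy as np
from scipy.optimize import minimize

def sq_dist(p):
    # Squared distance from the origin to p = (a, b, c).
    return float(np.dot(p, p))

# Equality constraint: the point must lie on the surface.
surface = {"type": "eq",
           "fun": lambda p: p[0]**2 + 2*p[1]**2 - p[2]**2 - 5}

# Start near the b-axis so the solver heads for the global minimum.
res = minimize(sq_dist, x0=[0.1, 1.0, 0.1], constraints=[surface])

print(res.x)             # approximately (0, sqrt(5/2), 0)
print(np.sqrt(res.fun))  # approximately sqrt(5/2) = sqrt(10)/2 ≈ 1.581
```

This agrees with the closed-form minimum $d=\sqrt{\frac 52}$.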