[Math] Trying to prove shortest distance between two points

differential-geometry, surfaces

I'm trying to prove that the shortest path between two points in the Euclidean plane is a straight line.

Here is what I've got so far, but I got lost right at the end; could anyone help?

Let $f: [a,b] \rightarrow \mathbb{R}$ be a function whose graph joins the two given points, let $\varphi: [a,b] \rightarrow \mathbb{R}$ be such that $\varphi(a)=\varphi(b)=0$, and consider $L_t = \text{length}(f + t\varphi )$. If $f$ minimises the length, then $\dfrac{d}{dt}L_t \big|_{t=0} = 0$.

Let us begin with the arc length of the graph of a function over $[a,b]$:
\begin{equation}
\text{length}(f) = \int_{a}^{b} \sqrt{1+ (f'(x))^2}\, dx
\end{equation}
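As a quick sanity check of this formula (a side computation with SymPy, not part of the proof; the choice $f(x) = x$ on $[0,1]$ is just an example), the integral should return the straight-line distance $\sqrt{2}$ between $(0,0)$ and $(1,1)$:

```python
import sympy as sp

x = sp.symbols('x')

# Example curve f(x) = x on [0, 1]: its graph joins (0, 0) and (1, 1)
f = x
length = sp.integrate(sp.sqrt(1 + sp.diff(f, x)**2), (x, 0, 1))

print(length)                              # sqrt(2)
print(sp.sqrt((1 - 0)**2 + (1 - 0)**2))    # the distance formula gives the same value
```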

Then the arc length of the graph of $f + t\varphi$ is:
\begin{equation}
L_t= \int_a^b \sqrt{1+(f'+t\varphi')^2}\,dx
\end{equation}
Then we evaluate:
\begin{align}
\dfrac{d}{dt}L_t\bigg|_{t=0} &= \int_a^b \dfrac{2(f'+t\varphi')\varphi'}{2\sqrt{1+(f'+t\varphi')^2}} \bigg|_{t=0} \,dx\\
&= \int_a^b \dfrac{f'\varphi'}{\sqrt{1+(f')^2}}\,dx
\end{align}
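To double-check the differentiation under the integral sign (again just a SymPy side computation, not part of the argument), one can differentiate the integrand with respect to $t$ and set $t = 0$:

```python
import sympy as sp

x, t = sp.symbols('x t')
f = sp.Function('f')(x)
phi = sp.Function('phi')(x)

# Integrand of L_t = length(f + t*phi)
integrand = sp.sqrt(1 + (sp.diff(f, x) + t * sp.diff(phi, x))**2)

# Differentiate with respect to t, then evaluate at t = 0
first_variation = sp.simplify(sp.diff(integrand, t).subs(t, 0))

print(first_variation)
# Prints f'(x)*phi'(x)/sqrt(f'(x)**2 + 1) (in SymPy's Derivative notation),
# which matches the integrand obtained above.
```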
Integrating by parts, the boundary term vanishes because $\varphi(a)=\varphi(b)=0$, so this equals
\begin{equation}
-\int_a^b \varphi \, \dfrac{d}{dx}\bigg(\dfrac{f'}{\sqrt{1+(f')^2}}\bigg)\,dx=0 \quad \text{for every such} \, \varphi
\end{equation}
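If the integration by parts looks suspicious, here is a numerical spot-check (my own aside, using NumPy; the choices $f(x) = x^2$ and $\varphi(x) = \sin(\pi x)$ on $[0,1]$ are arbitrary, picked only because $\varphi(0)=\varphi(1)=0$):

```python
import numpy as np

# Arbitrary test functions on [0, 1]: f(x) = x**2 and phi(x) = sin(pi x),
# chosen only because phi vanishes at both endpoints.
x = np.linspace(0.0, 1.0, 20001)
fp = 2 * x                          # f'(x)
phi = np.sin(np.pi * x)
phip = np.pi * np.cos(np.pi * x)    # phi'(x)

# Left-hand side: integral of f' phi' / sqrt(1 + f'^2)
lhs = np.trapz(fp * phip / np.sqrt(1 + fp**2), x)

# Right-hand side: -integral of phi * d/dx( f'/sqrt(1 + f'^2) )
g = fp / np.sqrt(1 + fp**2)
rhs = -np.trapz(phi * np.gradient(g, x), x)

print(lhs, rhs)   # the two values agree up to discretisation error
```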
Since the integral above vanishes for every such $\varphi$, the fundamental lemma of the calculus of variations gives:
\begin{align}
\dfrac{d}{dx}\bigg(\dfrac{f'}{\sqrt{1+(f')^2}}\bigg) &=0 \\
\Rightarrow \dfrac{f'}{\sqrt{1+(f')^2}} &= \text{constant} \\
f' &= \text{constant} \,(\sqrt{1+(f')^2}) \\
(f')^2 &= \text{constant} \, (1+(f')^2)
\end{align}
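As an aside (a SymPy computation of my own, not part of the question), the derivative on the left simplifies to $\dfrac{f''}{\big(1+(f')^2\big)^{3/2}}$, so the condition $\dfrac{d}{dx}\Big(\dfrac{f'}{\sqrt{1+(f')^2}}\Big) = 0$ is exactly $f'' = 0$:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')(x)

# The quantity whose x-derivative must vanish
slope_term = sp.diff(f, x) / sp.sqrt(1 + sp.diff(f, x)**2)

euler_lagrange = sp.simplify(sp.diff(slope_term, x))
print(euler_lagrange)
# Simplifies to f''(x) / (1 + f'(x)**2)**(3/2), which is zero exactly when f'' = 0
```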

Best Answer

From

$$\dfrac{f'}{\sqrt{1+(f')^2}} = \text{constant}$$

square both sides and take the reciprocal (if the constant is $0$ then $f' \equiv 0$ and we are done already), to get

$$\dfrac{1+(f')^2}{f'^2} = \text{constant}$$

The left-hand side equals $1 + \dfrac{1}{f'^2}$, so subtracting $1$ gives

$$\dfrac{1}{f'^2} = \text{constant}$$

$$f'^2 = \text{constant}$$

$$f' = \text{constant}$$

so $f$ is affine and its graph is a straight line.
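As a final numerical illustration (not needed for the argument; the endpoints $(0,0)$, $(1,1)$ and the perturbation $\varphi(x)=\sin(\pi x)$ below are just example choices vanishing at the endpoints), here is a NumPy comparison of the straight line with the perturbed curves $f + t\varphi$:

```python
import numpy as np

def arc_length(fprime_vals, x):
    # Trapezoidal approximation of the integral of sqrt(1 + f'(x)^2)
    return np.trapz(np.sqrt(1.0 + fprime_vals**2), x)

# Straight line from (0, 0) to (1, 1): f(x) = x, so f'(x) = 1
x = np.linspace(0.0, 1.0, 2001)

# Example perturbation phi(x) = sin(pi x), which vanishes at both endpoints
phi_prime = np.pi * np.cos(np.pi * x)

for t in (0.0, 0.1, 0.5):
    length = arc_length(1.0 + t * phi_prime, x)
    print(f"t = {t:3.1f}: length = {length:.6f}")
# t = 0 gives sqrt(2) ~ 1.414214; every t != 0 gives a strictly longer curve
```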
