Generalization of $\int_0^\alpha \sqrt{1+\cos^2\theta}\,d\theta>\sqrt{\alpha^2+\sin^2\alpha}$

Tags: calculus, definite-integrals, solution-verification

I came across a problem that asked me to prove a specific case and then generalize it. I have no trouble with the first part, but I'd like someone to check my work on the second. Here we go.

The problem statement:

  1. Show that for $\displaystyle 0<\alpha\leq \frac{\pi}{2}$
    $$
    \int_0^\alpha \sqrt{1+\cos^2\theta}\,d\theta>\sqrt{\alpha^2+\sin^2\alpha}\tag{1}
    $$
  2. Generalize the result in part (1).
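Before proving it, the claimed inequality can be sanity-checked numerically. The following is a quick sketch (the midpoint-rule quadrature and the sample values of $\alpha$ are my own choices, not part of the problem):

```python
import math

def arc_length(alpha, n=10000):
    # Composite midpoint rule for ∫_0^alpha sqrt(1 + cos^2 θ) dθ,
    # the arc length of sin θ on [0, alpha].
    h = alpha / n
    return sum(math.sqrt(1 + math.cos((i + 0.5) * h) ** 2) for i in range(n)) * h

for alpha in (0.5, math.pi / 4, math.pi / 2):
    lhs = arc_length(alpha)
    rhs = math.sqrt(alpha ** 2 + math.sin(alpha) ** 2)
    print(f"alpha={alpha:.4f}  LHS={lhs:.6f}  RHS={rhs:.6f}  LHS>RHS: {lhs > rhs}")
```

The gap between the two sides is quite small for small $\alpha$ (the two expansions agree up to third order), so the strict inequality is easy to miss on a crude plot.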

My Answer Attempt:
Clearly, the LHS of $(1)$ represents the arc length of the curve $f(\theta)=\sin\theta$ from $\theta=0$ to $\theta=\alpha$, and the RHS represents the distance from the origin to the point $(\alpha,\sin\alpha)$, i.e. the length of the straight line segment from the origin to that point.

[Figure: the curve in red and the line in blue.]

Since both the curve and the line run from the origin to the point $(\alpha,\sin\alpha)$, and the shortest path between two points is a straight line, $(1)$ is proven.

The generalization that I've come up with:

For any continuous (not necessarily smooth) curve $f(x)$ on $[0,a]$ with $f(0)=0$ and $a\in\mathbb{R}$, we have
$$\left|\int_0^a \sqrt{1+\big[f'(x)\big]^2}\,dx\right|\geq \sqrt{a^2+\big[f(a)\big]^2},$$
with equality holding if the curve is a straight line or $a=0$.

This is basically saying that the shortest path between two points is a straight line, the two points in our case being the origin and the point $(a, f(a))$ on the curve.

The curve has to pass through the origin, because otherwise the relation may fail. Example: for the constant curve $f(x)=3$, the arc length over $[0,a]$ is $a$, which is less than $\sqrt{a^2+9}$.
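The proposed generalization can also be spot-checked numerically. Here is a minimal sketch; the test curves $f(x)=x^2$ (strict inequality) and $f(x)=3x$ (equality) and the quadrature scheme are my own choices:

```python
import math

def arc_len(fprime, a, n=100000):
    # Midpoint-rule approximation of ∫_0^a sqrt(1 + f'(x)^2) dx
    h = a / n
    return sum(math.sqrt(1 + fprime((i + 0.5) * h) ** 2) for i in range(n)) * h

a = 2.0

# Strict inequality for a genuinely curved f, e.g. f(x) = x^2 (so f(0) = 0)
lhs = arc_len(lambda x: 2 * x, a)
rhs = math.sqrt(a ** 2 + (a ** 2) ** 2)
print(lhs, ">", rhs, lhs > rhs)

# Equality (up to quadrature error) for the straight line f(x) = 3x
lhs_line = arc_len(lambda x: 3.0, a)
rhs_line = math.sqrt(a ** 2 + (3 * a) ** 2)
print(lhs_line, "vs", rhs_line)
```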

Is this good?

Best Answer

The inequality naturally only makes sense for continuously differentiable $f$. We will use the following comparison fact: if $F$ and $G$ are continuously differentiable with $F(0)=G(0)$ and $F'(y) \geq G'(y)$ for all $y \in [0,z]$, then $F(y) \geq G(y)$ on that interval.

Equality holds at $a=0$, so by the comparison fact it suffices to compare derivatives of the two sides w.r.t. $a$: $$\sqrt{1+(f'(a))^2} \geq \frac{a+f(a)f'(a)}{\sqrt{a^2+f(a)^2}}.$$ If the right-hand side is negative this is immediate; otherwise, squaring both sides and rearranging reduces it to $$a^2f'(a)^2 + f(a)^2 \geq 2a f(a) f'(a),$$ i.e. $$(f(a)-a f'(a))^2 \geq 0,$$ which always holds. Equality requires $f(a) = a f'(a)$ for all $a$, which is easily seen to imply linearity of $f$.
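The derivative inequality above can be spot-checked numerically; here is a small sketch using the original curve $f(x)=\sin x$ at a few sample points (the sample points are my own choice):

```python
import math

# Pointwise check of sqrt(1 + f'(a)^2) >= (a + f(a) f'(a)) / sqrt(a^2 + f(a)^2)
# for f(x) = sin x, so f'(x) = cos x.
for a in (0.3, 1.0, 1.5):
    f, fp = math.sin(a), math.cos(a)
    lhs = math.sqrt(1 + fp ** 2)
    rhs = (a + f * fp) / math.sqrt(a ** 2 + f ** 2)
    print(f"a={a}  lhs={lhs:.6f}  rhs={rhs:.6f}  lhs>=rhs: {lhs >= rhs}")
```

Note how tight the inequality is near $a=0$, consistent with the equality at $a=0$ that anchors the comparison argument.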