Calculus of Variations with first-derivative boundary conditions

Tags: boundary-value-problem, calculus-of-variations, euler-lagrange-equation, variational-analysis

I am teaching myself the calculus of variations and have a seemingly basic question to which I cannot find the answer. I understand that to obtain the shortest path between two points $A$ and $B$, one can minimise the functional $J[y]$,

$$
J[y] = \int_A^B \sqrt{1 + (y')^2} \, dx,
$$

where $y' = \frac{dy}{dx}$. The integrand goes into the Euler-Lagrange equation and everything works out nicely. However, I have only seen solutions where explicit boundary conditions are given, i.e. where $y(A)$ and $y(B)$ are known, or where, additionally, the area under the curve is used as a constraint via Lagrange multipliers.
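Concretely, since the integrand $L = \sqrt{1 + (y')^2}$ has no explicit $y$-dependence, the Euler-Lagrange equation reduces to a first integral:

$$
\frac{d}{dx}\frac{\partial L}{\partial y'} - \frac{\partial L}{\partial y} = \frac{d}{dx}\!\left(\frac{y'}{\sqrt{1+(y')^2}}\right) = 0 \quad\Longrightarrow\quad y' = \text{const},
$$

so the extremals are straight lines $y = mx + c$, with $m$ and $c$ fixed by the two endpoint values $y(A)$ and $y(B)$.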

My question is: is it possible to find an analytic solution for $y(x)$ if $y'(A)$ and $y'(B)$ are the boundary conditions? If not, what if $y'(A)$, $y'(B)$, $y(A)$ and $y(B)$ are all known? My (perhaps ignorant) intuition says that in the trivial case where $y'(A) = y'(B)$ the shortest path should be a straight line, but that cannot be the case when $y'(A)\neq y'(B)$. However, I do not know how to work with the first-derivative conditions, hence I am seeking help from you fine mathematicians 🙂

Best Answer

  1. In general, a first-order variational problem admits the following consistent boundary conditions (BCs): an essential/Dirichlet BC, a natural BC, or some combination thereof.

  2. In OP's case the strong Neumann BC $y^{\prime}(A)=0=y^{\prime}(B)$ happens to be a natural BC (see the boundary-term sketch after this list). Of course there are then infinitely many constant solutions $y = \text{const}$, since the constant is arbitrary.

  3. A weak Neumann BC (where $y^{\prime}(A)$ or $y^{\prime}(B)$ is a non-zero constant) is inconsistent with the variational principle.
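To see where the natural BC comes from, here is the standard first-variation computation, sketched for this integrand. Integrating by parts,

$$
\delta J = \int_A^B \frac{y'}{\sqrt{1+(y')^2}}\,\delta y'\, dx = \left[\frac{y'}{\sqrt{1+(y')^2}}\,\delta y\right]_A^B - \int_A^B \frac{d}{dx}\!\left(\frac{y'}{\sqrt{1+(y')^2}}\right)\delta y\, dx.
$$

If $y(A)$ or $y(B)$ is not prescribed, then $\delta y$ is arbitrary at that endpoint, so stationarity forces $\frac{y'}{\sqrt{1+(y')^2}} = 0$ there, i.e. $y' = 0$: this is the natural BC of point 2. Prescribing a non-zero endpoint slope instead leaves a boundary term that cannot vanish for all admissible $\delta y$, which is why the weak Neumann BC of point 3 is inconsistent with this variational principle.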
