[Math] Starting with Calculus of Variations and Optimal Control Theory

book-recommendation · calculus-of-variations · control-theory · optimal-control

I want to study the calculus of variations. I understand it to be a "more advanced version" of calculus, in the sense that we maximize functionals (functions of functions) by choosing a particular function, rather than maximizing a function by choosing a particular value of a variable.
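For concreteness, the kind of example I have in mind: among all curves $y(x)$ joining two fixed points, choose the one minimizing the arc-length functional $\int_a^b \sqrt{1+y'(x)^2}\,dx$ (whose minimizer is, of course, the straight line).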

Three questions about this:

  1. I read online that the calculus of variations $(1)$ is a special case of "Optimal Control Theory" $(2)$, which in turn is a part of "control theory" $(3)$. However, I understand control theory to be about dynamical systems (in engineering), and how to set a variable in order to control other variables of that system in some way (is that correct?). So it is not entirely clear to me what the relation between $(1)$, $(2)$ and $(3)$ is.
  2. Also: Is it advisable to study the calculus of variations first, and then study Optimal Control Theory? Or is it old-fashioned to study the calculus of variations in isolation?

  3. What do you think is the best resource for studying the calculus of variations, given my background? (or optimal control theory, depending on your answer to the above).

My background: mostly applied mathematics (multivariable calculus for physics and economics, linear algebra, differential equations, PDEs), plus some rudimentary knowledge of pure math (analysis 101, algebra 101, mathematical logic).

Best Answer

The calculus of variations is a special case of optimal control theory, in a particular sense.

Consider Dido's isoperimetric problem (colloquially said to be the oldest calculus of variations problem), which can be viewed as an optimal control problem in the sense that what you get to control is the 'shape' of the curve, and your objective is to maximize the enclosed area.
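As a rough sketch of one standard formulation (straight shoreline along the $x$-axis, endpoints fixed at $(\pm a, 0)$; the symbols here are just generic notation): maximize the enclosed area

$$A[y] = \int_{-a}^{a} y(x)\,dx \qquad \text{subject to} \qquad \int_{-a}^{a}\sqrt{1+y'(x)^2}\,dx = L, \quad y(\pm a) = 0.$$

Read as a control problem, the 'control' is the slope $y'(x)$ that shapes the curve; the maximizer turns out to be an arc of a circle.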

Similarly, another classic problem in the calculus of variations is the brachistochrone problem, which received much attention from the likes of Newton, the Bernoullis, and Leibniz. Again, we can consider the control to be the shape of the curve, with the objective of minimizing the time of descent.
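In one common formulation (bead released from rest at the origin, $y$ measured downward, $g$ the gravitational acceleration), conservation of energy gives the speed $v = \sqrt{2gy}$, so the time of descent along a curve $y(x)$ is

$$T[y] = \int_0^{x_1}\sqrt{\frac{1+y'(x)^2}{2\,g\,y(x)}}\,dx,$$

and minimizing this functional over admissible curves yields the cycloid.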

Now, as you observed, classical control theory is concerned with transfer functions, root locus, stabilization, and so on. Traditionally, the control was chosen to stabilize a system, drive it from one state to another, etc.; optimality was indeed an afterthought.

The field of optimal control only really took off in the 1960s, thanks to Bellman and Pontryagin, who introduced dynamic programming and the maximum principle respectively. The latter approach in particular is a far-reaching generalization of ideas from the calculus of variations.
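To fix ideas, here is a sketch of the standard problem these tools address (the notation is generic, not taken from any particular text): choose a control $u(t)$ to steer a state $x(t)$,

$$\min_{u(\cdot)} \int_0^{t_f} L\bigl(x(t),u(t)\bigr)\,dt \qquad \text{subject to} \qquad \dot x(t) = f\bigl(x(t),u(t)\bigr), \quad x(0) = x_0.$$

In one common sign convention, the maximum principle introduces a costate $p(t)$ and the Hamiltonian $H(x,u,p) = p^\top f(x,u) - L(x,u)$, and asserts that along an optimal trajectory the control $u(t)$ must maximize $H$ pointwise, while $\dot x = \partial H/\partial p$ and $\dot p = -\partial H/\partial x$.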

Very simplistically, in the calculus of variations we take a function from a space of admissible functions, 'perturb it a bit' (that is, take its variation), and then derive conditions that the function and the variation must satisfy if the function is to be optimal. In general this yields necessary conditions (and indeed the maximum principle is a necessary condition, whereas the Hamilton-Jacobi-Bellman (HJB) equation provides a necessary and sufficient one).
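As a minimal illustration of this 'perturbation' idea: to minimize $J[y] = \int_a^b F\bigl(x, y(x), y'(x)\bigr)\,dx$ over curves with fixed endpoints, one replaces $y$ by $y + \epsilon\eta$ with $\eta(a) = \eta(b) = 0$, requires $\frac{d}{d\epsilon}J[y+\epsilon\eta]\big|_{\epsilon=0} = 0$ for every such $\eta$, and after an integration by parts obtains the Euler-Lagrange equation

$$\frac{\partial F}{\partial y} - \frac{d}{dx}\frac{\partial F}{\partial y'} = 0,$$

which is exactly a first-order necessary condition of this kind.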

The great leap from the calculus of variations to optimal control was a broad generalization of the kinds of variations one is allowed to consider, which is why we say the calculus of variations is a special case of optimal control theory.
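Concretely, the basic variational problem can be re-encoded as an optimal control problem by taking the derivative itself as the (unconstrained) control, $u = \dot x$:

$$\min \int_a^b F\bigl(t, x(t), \dot x(t)\bigr)\,dt \quad\Longleftrightarrow\quad \min_{u(\cdot)} \int_a^b F\bigl(t, x(t), u(t)\bigr)\,dt \ \text{ subject to } \dot x = u.$$

Applying the maximum principle to the right-hand problem recovers the Euler-Lagrange equation, while genuine optimal control also allows constraints such as $u(t) \in U$ (e.g. bounded thrust) that the classical variational setup cannot handle directly.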

As a side note, another topic that connects the calculus of variations and optimal control is the principle of least action.
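(In one common form, for a mechanical system with kinetic energy $T$ and potential energy $V$: the trajectory actually followed is a stationary point of the action functional

$$S[q] = \int_{t_0}^{t_1}\bigl(T(q,\dot q) - V(q)\bigr)\,dt,$$

and the Euler-Lagrange equations of this functional are precisely the equations of motion.)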


As mentioned in the comments, Dr. Liberzon's book is an excellent introductory resource that combines both the calculus of variations and optimal control in a very concise and readable form. There are a couple of chapters introducing the calculus of variations before moving into optimal control theory. So yes, studying the calculus of variations first is recommended, but it needn't be a very deep study to get to optimal control. If you have a background in real and functional analysis, that should be sufficient for the Liberzon text.

If you want to study just the calculus of variations, I found Gelfand and Fomin to be pretty good. For a very deep study of optimal control, Athans and Falb is a classic.

Other resources, in no particular order: Lectures on the Calculus of Variations and Optimal Control Theory by L. C. Young, Mathematical Control Theory by E. D. Sontag, Calculus of Variations and Optimal Control by G. Leitmann, and lecture notes by H. Sussmann.