[Math] nested integrals in functional derivative

calculus-of-variations

Unlike the usual situation in variational calculus, where one has a functional like

$$J[f]=\int_{a}^{b}L[x,f(x)]dx$$

I find myself in the position where I have

$$J[f]=\int_{a}^{b}L[x,f(x),G[f(y)](x)]dx$$

such that I have, for example

$$L[x,f(x),G[f(y)](x)]=f(x)G[f(y)](x)$$

and

$$G[f(y)](x)=\int_{c}^{d}f(y)g(y,x)dy$$

How does one go about calculating $\frac{\delta J}{\delta f}$?

Naively treating $G[f(y)](x)$ as 'just' a function of $x$, for which $\partial G[f(y)]/\partial f(x)=0$, would give $\frac{\delta J}{\delta f}=\int_{c}^{d}f(y)g(y,x)dy$, but this seems wrong, since $G[f]$ itself changes when $f$ does.

The next obvious thing would be to utilise the chain rule for functional derivatives, but it doesn't seem to apply directly, since I can't write the integrand in the form $L[G[f]]$ but instead have to write $L[f,G[f]]$, unless I'm missing something.
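For reference, the chain rule I have in mind is the standard one for a functional depending on $f$ only through $G$,

$$\frac{\delta}{\delta f(x)}L[G[f]]=\int dy\,\frac{\delta L}{\delta G(y)}\,\frac{\delta G(y)}{\delta f(x)},$$

which is exactly the form my $L[f,G[f]]$ does not have.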

How does one proceed?

EDIT: a more complete (helpful?) way of phrasing the question:

Given

$$ J[f]=\int dx L[x,f(x),G[f](x)]$$

and

$$ G[f](x) = \int dy\, M[x,y,f(y)]$$

I can write (I assume no derivatives of $f$ appear anywhere)

$$\frac{\partial J}{\partial f(x)}=\frac{\partial L}{\partial f(x)}$$

which gives me a relation in $x$

and I can write (again assuming no derivatives of $f$)

$$\frac{\partial J}{\partial f(y)}=\int dx'\frac{\partial L}{\partial G}(x')\frac{\partial M}{\partial f(y)}(x',y)$$

which gives me a relation in $y$

But how do I combine these results so that I have a single expression for $\delta J/\delta f$ as a function of $x$?

The answer below seems to me to suggest

$$\frac{\delta J}{\delta f}(x)=\frac{\delta J}{\delta f(x)}(x)+\frac{\delta J}{\delta f(y)}(y)\Bigg |_{y=x}$$

but I don't understand how to justify this.

Is it as simple as considering a 'total variational derivative' of the form

$$\delta J=\frac{\delta J}{\delta f(x)}\delta f(x)+\int dy\frac{\delta J}{\delta f(y)}\frac{\delta f(y)}{\delta f(x)}\delta f(x)$$

where the integral over $y$ is introduced by the variational chain rule,

and with

$$\frac{\delta f(y)}{\delta f(x)}=\delta(y-x)$$

such that

$$\frac{\delta J}{\delta f}(x)=\frac{\delta J}{\delta f(x)}+\int dy\frac{\delta J}{\delta f(y)}\delta(y-x)$$

so that the sifting property kicks in, effectively swapping the variables $x$ and $y$ (incidentally leading to the transpose terms in the matrix-style answer below).

So the total variational derivative actually obeys, for any test function $\phi$,

$$\int dx \frac{\delta J}{\delta f}(x)\phi(x) = \int dx \left(\frac{\delta J}{\delta f(x)}(x)+\int dy\frac{\delta J}{\delta f(y)}(y)\delta(y-x)\right)\phi(x)$$

Is the above correct?

This seems to give me the result below, but I'm unsure whether the above 'inspired by regular calculus' steps are legitimate here.
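For concreteness, here is my own working for the original example (so it may well contain the same unjustified steps): with $L[x,f(x),G[f](x)]=f(x)G[f](x)$ and $M[x,y,f(y)]=f(y)g(y,x)$, I get $\frac{\partial L}{\partial f(x)}=G[f](x)$, $\frac{\partial L}{\partial G}(x')=f(x')$ and $\frac{\partial M}{\partial f(y)}(x',y)=g(y,x')$, so combining the two relations as above gives

$$\frac{\delta J}{\delta f}(x)=\underbrace{\int_{c}^{d}f(y)\,g(y,x)\,dy}_{\text{explicit dependence on }f(x)}+\underbrace{\int_{c}^{d}f(x')\,g(x,x')\,dx'}_{\text{dependence through }G[f]}=\int_{c}^{d}f(y)\left[g(y,x)+g(x,y)\right]dy,$$

which matches the result in the answer below.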

Best Answer

I find it helpful in cases such as this to discretize the integrals, so that $g$ becomes a matrix with elements $g_{ij}=g(y_i,x_j)$ and $f$ becomes a vector with elements $f_i=f(x_i)$. Then

$$J[f]=\sum_{i,j}f_i f_j g_{ji}$$

and if we perturb $f\mapsto f+\delta f$, with $\delta f_i =\epsilon\delta_{i,i_0}$, we obtain to first order in $\epsilon$ the variation

$$\delta J=\sum_{ij}(\delta f_i f_j + f_i\delta f_j)g_{ji}=\sum_{j}\delta f_{i_0} f_jg_{ji_0} + \sum_i f_i\delta f_{i_0}g_{i_0i}=\delta f_{i_0}\sum_{j}f_j(g_{ji_0}+g_{i_0j})$$

$$\Rightarrow \frac{\delta J}{\delta f_{i}}=\sum_{j}f_j(g_{ji}+g_{ij}).$$

Returning from a discrete index $i$ to a continuous variable $x$, with an integral instead of a sum, we thus have

$$\frac{\delta J}{\delta f}(x)=\int f(y)[g(y,x)+g(x,y)]dy$$
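A quick numerical sanity check of this result (my own sketch, not part of the original answer): discretize $J[f]=\int\!\!\int f(x)f(y)g(y,x)\,dy\,dx$ on a uniform grid and compare a finite-difference estimate of $\delta J/\delta f(x_i)$ against the closed-form expression. The grid size, test function $f$, and kernel $g$ below are arbitrary choices.

```python
import numpy as np

# Check  δJ/δf(x) = ∫ f(y) [g(y, x) + g(x, y)] dy  for
# J[f] = ∫∫ f(x) f(y) g(y, x) dy dx, discretized on a uniform grid.

n = 200
x = np.linspace(0.0, 1.0, n)       # same grid for x and y
dx = x[1] - x[0]

f = np.sin(2 * np.pi * x)                       # arbitrary test function f(x)
Y, X = np.meshgrid(x, x, indexing="ij")         # Y[j, i] = y_j, X[j, i] = x_i
g = np.exp(-(Y - X) ** 2) + 0.3 * Y * X**2      # arbitrary non-symmetric kernel g(y, x)

def J(fvec):
    """Discretized J[f] ≈ Σ_ij f_j g(y_j, x_i) f_i dx²."""
    return fvec @ g @ fvec * dx**2

# Closed-form result, discretized:  Σ_j f_j [g(y_j, x_i) + g(x_i, y_j)] dx
analytic = (g + g.T) @ f * dx

# Finite-difference estimate:  δJ/δf(x_i) ≈ (∂J/∂f_i) / dx  via central differences
eps = 1e-6
fd = np.empty(n)
for i in range(n):
    e = np.zeros(n)
    e[i] = eps
    fd[i] = (J(f + e) - J(f - e)) / (2 * eps * dx)

print("max abs difference:", np.max(np.abs(fd - analytic)))  # should be ~rounding level
```

Because the discretized $J$ is quadratic in the vector $f$, the central difference should agree with the closed-form expression up to rounding error.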
