Functional Analysis – Looking for a Specific Function

functional-analysis, functional-equations, functions

There was this question on one of the whiteboards at my company, and I found it intriguing. Maybe it's a dumb thing to ask. Maybe there is a simple answer that I couldn't see. Anyway, here it is:

Does there exist a non-trivial, monotonically increasing function $f : \mathbb{R} \to \mathbb{R}$ such that $f'(x) = f(f(x))$?

I checked a bunch of functions, from elementary to special (gamma, digamma, the Riemann zeta, Lambert $W$, …) and none seems to work (not surprisingly). A few dead ends so far:

- I managed to convince myself that a function expressible as a power series would not work, regardless of convergence issues, because differentiation lowers the degree of a polynomial while composition raises it (see the worked power example below).
- The Dirac delta, or some sort of generalized function, looked promising for a while, but the Dirac delta is not monotonically increasing anyway, and I'm not very familiar with generalized functions.
- I tried taking the Fourier transform of both sides, but the Fourier transform of $f(f(x))$ seems hard to handle (at least for me).
- I thought about viewing differentiation as a linear operator, finding its (infinite) matrix in some basis (which one?), doing the same for the right-hand side, and showing that the two matrices cannot be equal, thereby reducing the problem to linear algebra. That didn't work.
- Nothing on the geometric front either.
- I tried to prove that no such function exists by deriving a contradiction, but didn't manage it.

My hunch is that no such function exists, based on the completely invalid and semi-meaningless idea that differentiation pulls $f$ in one direction and composition pulls it in the other. Any idea?
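As a quick sanity check of that degree-counting heuristic, try a single power $f(x) = c\,x^a$ with $c, a > 0$:

$$f'(x) = ca\,x^{a-1}, \qquad f(f(x)) = c\,(c\,x^a)^a = c^{a+1}\,x^{a^2},$$

so matching exponents would force $a^2 = a - 1$, i.e. $a^2 - a + 1 = 0$, whose discriminant is $1 - 4 = -3 < 0$: no real exponent works.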

Best Answer

FYI this was problem B5 in the 2010 Putnam exam, so you can find it here: http://amc.maa.org/a-activities/a7-problems/putnamindex.shtml

They had a pretty succinct solution. Suppose $f$ is strictly increasing. Then $f$ has an inverse function $g$ on its range, with $x = g(f(x))$; fix some $y_0$ in the range and consider $g(y)$ for $y > y_0$. Differentiating $x = g(f(x))$, we get $1 = g'(f(x))\,f'(x) = g'(f(x))\,f(f(x))$, so that $g'(y) = \dfrac{1}{f(y)}$. Since $g(f(x)) = x$ and $f$ is defined for all $x$, $g$ takes arbitrarily large values, which means $g(z) - g(y_0) = \displaystyle \int_{y_0}^{z} g'(y)\,dy = \int_{y_0}^{z} \frac{dy}{f(y)}$ must diverge as $z \rightarrow \infty$.
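To make explicit where the functional equation enters: for a generic increasing $f$, the chain rule alone only gives $g'(y) = 1/f'(g(y))$. It is the hypothesis $f' = f \circ f$ that collapses this to

$$g'(y) = \frac{1}{f'(g(y))} = \frac{1}{f(f(g(y)))} = \frac{1}{f(y)},$$

using $f' = f \circ f$ in the middle step and $f(g(y)) = y$ in the last.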

Now all we have to do is show that $f$ is bounded below by a function that makes the integral converge. First pick $y_0$ with $f(y_0) > 0$; this is possible because if $f$ were negative everywhere we would have $f'(x) = f(f(x)) < 0$, contradicting that $f$ is increasing. For $x > g(y_0) \equiv g_0$ we have $f(x) > y_0$, hence $f'(x) = f(f(x)) > f(y_0) > 0$, so for some $\beta > 0$ and $x$ large enough, $f(x) > \beta x$. Feeding that bound back into the equation ($f'(x) = f(f(x)) > \beta f(x) > \beta^2 x$ for $x$ large enough), we get $f(x) > \alpha x^2$ for some $\alpha > 0$ and $x$ large enough. But then the integral above converges (see the explicit bound below), contradicting that $g(z)$ is unbounded as $z \rightarrow \infty$. Thus, we conclude that $f$ cannot be strictly increasing.
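Concretely, if $f(y) > \alpha y^2$ for all $y > y_1 > 0$, then for every $z > y_1$

$$\int_{y_1}^{z} \frac{dy}{f(y)} < \int_{y_1}^{\infty} \frac{dy}{\alpha y^2} = \frac{1}{\alpha y_1} < \infty,$$

so $g(z) = g(y_1) + \displaystyle\int_{y_1}^{z} \frac{dy}{f(y)}$ stays bounded as $z \rightarrow \infty$, which is the contradiction.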