Naive way to calculate the minimum of the Gamma function.


Inspired by a question from user Claude Leibovici, I propose a simple method to calculate the first digits of the minimum of the Gamma function for positive values ($x>0$).

The first idea is to shift the Gamma function a little and work with the function:

$$f(x)=\Gamma(x+1)$$

The second natural question is: since the minimum lies relatively near the abscissa $x=0.5$, how can we transform the function to get $x_{\min}\simeq 0.5$?

After some hours I found a good candidate, which is:

$$g(x)=\Gamma(x+1)^{x^x}$$
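As a quick numerical check (not part of the original post), the minimum of $g$ can be located with Python's standard library alone, using `math.gamma` and a golden-section search; the bracket $[0.3,\,0.7]$ is an assumption based on the discussion above.

```python
import math

def g(x):
    # g(x) = Gamma(x+1)^(x^x); the exponent x^x rescales the function
    # so that its minimum lands near x = 0.5
    return math.gamma(x + 1.0) ** (x ** x)

def golden_min(f, a, b, tol=1e-12):
    # Standard golden-section search for the minimum of a unimodal f on [a, b].
    phi = (math.sqrt(5.0) - 1.0) / 2.0
    c, d = b - phi * (b - a), a + phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - phi * (b - a)
        else:
            a, c = c, d
            d = a + phi * (b - a)
    return (a + b) / 2.0

x_min = golden_min(g, 0.3, 0.7)
print(x_min)     # ≈ 0.50081252
print(g(x_min))  # ≈ 0.91813935
```

Both numbers agree with the values quoted in the answer below, which supports the claim that this rescaling pushes the minimizer close to $0.5$.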

Now the third step: how do we find a relatively good approximation of the function $g(x)$ around $x=0.5$?

A natural way, for me, is to use the exponential function, and we have:

$$g(x)\simeq \exp\left(\alpha x^2-\alpha x\right)=h(x)$$ around $x=0.5$, where $\alpha$ satisfies $e^{-0.25\alpha}=\Gamma(1.5)^{\sqrt{0.5}}$.

So we have the approximation:

$$f(x)\simeq h(x)^{\frac{1}{x^x}}$$

around the minimum of the Gamma function.
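Solving the constraint for the constant gives $\alpha=-4\sqrt{0.5}\,\ln\Gamma(1.5)\approx 0.3416$, and the approximation can then be checked numerically. The following is a sketch (not from the original post) using only the standard library; by construction the approximation is exact at $x=0.5$, and it stays within a few $10^{-3}$ of $\Gamma(x+1)$ nearby.

```python
import math

# Solve e^{-0.25*alpha} = Gamma(1.5)^{sqrt(0.5)} for alpha.
alpha = -4.0 * math.sqrt(0.5) * math.log(math.gamma(1.5))

def h(x):
    # The exponential approximant h(x) = exp(alpha*x^2 - alpha*x).
    return math.exp(alpha * x * x - alpha * x)

def f_approx(x):
    # f(x) = Gamma(x+1) is approximated by h(x)^(1/x^x).
    return h(x) ** (1.0 / x ** x)

for x in (0.4, 0.45, 0.5, 0.55, 0.6):
    exact = math.gamma(x + 1.0)
    print(x, exact, f_approx(x), abs(exact - f_approx(x)))
```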

Question :

Can we improve the situation using a naive or a more sophisticated method?

Thanks in advance!

Best Answer

I think that we can do it without approximation, using the Taylor expansion of the function $$F(x)=f(x)^{x^x} \qquad \text{where} \qquad f(x)=\Gamma(1+x)$$ up to $O\left(\left(x-\frac{1}{2}\right)^3\right)$.

Expanded as a series around $x=a$, we are left with $$A+a^aB(x-a)+O\left(\left(x-a\right)^2\right)$$ with $$A=\frac{a^a f'(a)}{f(a)}+a^a (\log (a)+1) \log (f(a))$$ $$B=\frac{\log (f(a)) \left(a \left(a^a (\log (a)+1)^2 \log (f(a))+\log (a) (\log (a)+2)\right)+a+1\right)}{a}$$ $$+\frac{\left(a^a-1\right) f'(a)^2+f(a) \left(2 (\log (a)+1) f'(a) \left(a^a \log (f(a))+1\right)+f''(a)\right)}{f(a)^2}$$

For the present case where $f(x)=\Gamma(x+1)$, after simplification, the problem is to find the zero of $$g(x)=\psi (x+1)+(\log (x)+1) \log (\Gamma (x+1))$$
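A hedged sketch of this root-finding step in Python: the standard library has `math.lgamma` but no digamma, so $\psi$ is approximated below by a central difference of $\ln\Gamma$ (an implementation choice, not from the answer), and plain bisection is used in place of Newton's method.

```python
import math

def digamma(x, h=1e-6):
    # psi(x) via a central difference of log-Gamma; a numerical
    # stand-in, since the stdlib provides lgamma but not digamma.
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2.0 * h)

def g(x):
    # Stationarity condition: g(x) = 0 at the minimum of F.
    return digamma(x + 1.0) + (math.log(x) + 1.0) * math.lgamma(x + 1.0)

# Bisect on a bracket around x = 1/2 (g changes sign on [0.4, 0.6]).
a, b = 0.4, 0.6
for _ in range(60):
    m = 0.5 * (a + b)
    if g(a) * g(m) <= 0:
        b = m
    else:
        a = m
print(0.5 * (a + b))  # ≈ 0.50081252
```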

Using the first iteration of Newton's method with $x_0=\frac 12$, we have $$x_1=\frac 12+\frac{-4+2 \gamma +\log \left(\frac{64}{\pi }\right)+\log (2) \log \left(\frac{\pi }{4}\right)}{-4+\pi ^2+4\log ^2(2)+2\gamma (\log (2)-1)+2 \log \left(\frac{\pi }{64}\right)}$$ Evaluated numerically, this is $x_{\text{min}}=\color{red}{0.500812}76$ while its value computed by a full optimization is $x_{\text{min}}=\color{red}{0.50081252}$.
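The closed form for $x_1$ can be evaluated directly; a minimal check using only the standard library (the value of the Euler-Mascheroni constant $\gamma$ is hard-coded, since the stdlib does not expose it):

```python
import math

g_ = 0.5772156649015329  # Euler-Mascheroni constant gamma
ln2, pi = math.log(2.0), math.pi

num = -4.0 + 2.0 * g_ + math.log(64.0 / pi) + ln2 * math.log(pi / 4.0)
den = (-4.0 + pi ** 2 + 4.0 * ln2 ** 2
       + 2.0 * g_ * (ln2 - 1.0) + 2.0 * math.log(pi / 64.0))
x1 = 0.5 + num / den
print(x1)  # ≈ 0.50081256
```

The result matches the $n=2$ (Newton) row of the table below.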

Using the approximation, the minimum value of $F(x)$ is numerically $F_{\text{min}}=\color{red}{0.9181393488698}94$ while, from optimization, it is $F_{\text{min}}=\color{red}{0.918139348869881}$.

We can still do better using one iteration of Newton-type methods of order $n$. This would give $$\left( \begin{array}{ccc} n & x_1^{(n)} & \text{method} \\ 2 & 0.5008125609311657357119337 & \text{Newton} \\ 3 & 0.5008125200962838796408148 & \text{Halley} \\ 4 & 0.5008125195407779789522950 & \text{Householder} \\ 5 & 0.5008125195413028999125796 & \text{no name} \\ 6 & 0.5008125195413028258583701 & \text{no name} \\ 7 & 0.5008125195413028255893431 & \text{no name} \\ \cdots &\cdots &\cdots \\ \infty &0.5008125195413028255895594 & \text{no name} \end{array} \right)$$
