I have a trigonometric function; for instance
$$f(x)=\left(\cos{\frac{33}{x}\pi}\right) (\cos{x \pi})-1$$
I wanted to know the zeroes of this particular function, so I thought I could look into some root-finding algorithms (Newton's, Halley's, secant…). However, they don't seem to work here: $f'(x)=0$ at the roots of $f(x)$, so none of those methods is guaranteed to converge.
So, I was wondering: is there some type of root-finding algorithm for this particular trigonometric equation? Or at least a way to transform the equation into one whose graph crosses the x-axis at the roots, rather than "bouncing" off it, so that Newton's method would apply?
Also, I am focused on roots $>1$ and $<33$.
Note: Although the given example can be solved with trigonometric techniques, I am specifically looking for numerical methods; the example was chosen to make the roots easy to check. To generalize: for any $$f(x)=\left(\cos{\frac{n}{x}\pi}\right) (\cos{x \pi})-1$$ and an interval $$[a,b]$$ containing exactly one root, is there a numerical method that is guaranteed to converge to that root?
Best Answer
The roots have multiplicity
The situation for the given function is that the roots are at the same time maxima of the function, that is, they have multiplicity $2$: as $$ f(x)=\left(1-2\sin^2\frac{33\pi}{2x}\right)\left(1-2\sin^2\frac{\pi x}{2}\right)-1, $$ expanding gives $$ -f(x)=2\sin^2\frac{33\pi}{2x}+2\sin^2\frac{\pi x}{2}-4\sin^2\frac{33\pi}{2x}\,\sin^2\frac{\pi x}{2}, $$ a sum of squares minus the product of these squares. Methods that are developed for finding simple roots will either slow down or fail to converge at roots of higher multiplicity; Newton's and Halley's methods slow down.
There are many local extrema
Another problem with applying Newton's method is that this function has many local maxima and minima at small $x$, due to the first factor. The derivative vanishes at each of these extrema, so the Newton step, considered as a function of $x$, has just as many poles. Any improved method based on Newton's method will have as many poles or more, even if its convergence close to the roots of $f$ is better.
Note that at a double root, where locally $f(x)=c(x-r)^2$, the Newton step maps $x$ to $\frac{x+r}2$ and the Halley step to $\frac{x+2r}3$. In the plots, this is somewhat visible around the roots $x=3$ and $x=11$.
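These step formulas are easy to check numerically; a small sketch for the local model $f(x)=c(x-r)^2$, with arbitrarily chosen constants $c$, $r$ and start value $x$:

```python
# Check the double-root step formulas for f(x) = c*(x - r)**2
# (c, r and the start value x are arbitrary choices).
c, r, x = 2.0, 3.0, 5.0
f, f1, f2 = c*(x - r)**2, 2*c*(x - r), 2*c          # f, f', f''
newton = x - f/f1                                   # -> (x + r)/2 = 4.0
halley = x - 2*f*f1/(2*f1**2 - f*f2)                # -> (x + 2r)/3 ≈ 3.6667
print(newton, halley)
```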
Modifying Newton's method
Knowing this, and anticipating the possibility of a double root, one can modify Newton's method to alternate steps of single and double step size. At a simple root, the single step reduces the distance to the root quadratically, while the following double step overshoots the root, but by a smaller amount. At a double root, the single step halves the distance to the root, while the following double step restores quadratic convergence. In each case the "wrong" step does not make the situation worse, while the "right" step proceeds with the expected quadratic convergence.
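As a sketch of this alternating scheme applied to the example function (the helper names are mine, not from the original answer):

```python
import math

def f_df(x):
    """f(x) = cos(33*pi/x)*cos(pi*x) - 1 and its first derivative."""
    u, du = 33*math.pi/x, -33*math.pi/x**2
    f  = math.cos(u)*math.cos(math.pi*x) - 1
    df = (-math.sin(u)*du*math.cos(math.pi*x)
          - math.pi*math.cos(u)*math.sin(math.pi*x))
    return f, df

def alternating_newton(x, steps=40):
    """Alternate a plain Newton step with a double-length Newton step,
    so that both simple and double roots are handled gracefully."""
    for k in range(steps):
        f, df = f_df(x)
        if df == 0.0:                    # exactly at a critical point: stop
            return x
        x -= f/df if k % 2 == 0 else 2*f/df
    return x

print(alternating_newton(3.02))          # close to the double root x = 3
```

Started close enough to $x=3$ or $x=11$, this settles on the double root with the expected quadratic behavior of the doubled step.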
Finding roots inside intervals
If an interval is small enough for a given function, then it either contains no root or lies entirely inside the basin of attraction of the root it contains. Finding a sufficiently fine subdivision of a given interval is again a heuristic task.
As Python code, this could look like
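the following condensed sketch (the original listing is not included in this copy of the answer; all names and the exact signature are guesses, and only the plain Newton step is used as the method here):

```python
import math

def f_df(x):
    """f(x) = cos(33*pi/x)*cos(pi*x) - 1 and its first derivative."""
    u, du = 33*math.pi/x, -33*math.pi/x**2
    f  = math.cos(u)*math.cos(math.pi*x) - 1
    df = (-math.sin(u)*du*math.cos(math.pi*x)
          - math.pi*math.cos(u)*math.sin(math.pi*x))
    return f, df

def newton_step(x):
    """One plain Newton step."""
    f, df = f_df(x)
    return x - f/df

def find_roots(method, a, b, segments=10, iters=60, tol=1e-8):
    """Iterate the step map `method` from the midpoint of each of the
    `segments` subintervals of [a, b]; keep the distinct limits that
    are actual roots."""
    roots, h = [], (b - a)/segments
    for k in range(segments):
        x = a + (k + 0.5)*h
        try:
            for _ in range(iters):
                x = method(x)
        except (ValueError, ZeroDivisionError, OverflowError):
            continue                      # the iteration left the domain
        if a <= x <= b and abs(f_df(x)[0]) < tol \
           and all(abs(x - r) > 1e-5 for r in roots):
            roots.append(x)
    return sorted(roots)
```

Starting points whose iteration diverges or converges outside $[a,b]$ are simply discarded by the filter at the end.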
Called as

```python
find_roots(method, 2, 12, segments=14)
```
this returns the results for each method.

Note that in the last method, each iteration contains two Newton steps. If one counts the effort in function evaluations, then Newton gets a factor of $2$, Halley a factor of $3$, and the double-step method a factor of $4$, giving the first two methods a similar complexity.
Appendix: More code
The method steps are standard implementations
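The original code is not preserved in this copy; standard one-step implementations, written against a function `f3` that returns the triple $(f(x), f'(x), f''(x))$ (this interface is my assumption), might look like:

```python
def newton_step(f3, x):
    """Newton: x - f/f'."""
    f, df, _ = f3(x)
    return x - f/df

def halley_step(f3, x):
    """Halley: x - 2*f*f' / (2*f'**2 - f*f'')."""
    f, df, ddf = f3(x)
    return x - 2*f*df/(2*df*df - f*ddf)

def double_newton_step(f3, x):
    """A plain Newton step followed by a double-length Newton step."""
    f, df, _ = f3(x)
    x -= f/df
    f, df, _ = f3(x)
    return x - 2*f/df
```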
The function implementation also provides the first and second derivatives, in the style of forward-mode algorithmic differentiation (AD)
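A hand-written forward-mode sketch for the example function, propagating the value and the first two derivatives through each elementary operation (the name `f3` is an assumption):

```python
import math

def f3(x):
    """Return f(x), f'(x), f''(x) for f(x) = cos(33*pi/x)*cos(pi*x) - 1,
    propagating (value, first, second derivative) jets in forward mode."""
    # inner function u = 33*pi/x with its first two derivatives
    u, du, ddu = 33*math.pi/x, -33*math.pi/x**2, 66*math.pi/x**3
    # first factor a = cos(u), by the chain rule
    a   = math.cos(u)
    da  = -math.sin(u)*du
    dda = -math.cos(u)*du*du - math.sin(u)*ddu
    # second factor b = cos(pi*x)
    b, db = math.cos(math.pi*x), -math.pi*math.sin(math.pi*x)
    ddb = -math.pi**2*math.cos(math.pi*x)
    # product rule (Leibniz) for f = a*b - 1
    return a*b - 1, da*b + a*db, dda*b + 2*da*db + a*ddb
```

At the double root $x=3$ this gives $f(3)=f'(3)=0$ and $f''(3)<0$, confirming that the root is a maximum.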
The call of the root finder procedure is
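The full listing is not preserved here. A self-contained reconstruction that assembles the pieces, the function jet `f3`, the three step maps, and the subdivision driver (the names and the exact structure are my guesses), could read:

```python
import math

def f3(x):
    """f(x) = cos(33*pi/x)*cos(pi*x) - 1 together with f'(x) and f''(x)."""
    u, du, ddu = 33*math.pi/x, -33*math.pi/x**2, 66*math.pi/x**3
    a, da = math.cos(u), -math.sin(u)*du
    dda = -math.cos(u)*du*du - math.sin(u)*ddu
    b, db = math.cos(math.pi*x), -math.pi*math.sin(math.pi*x)
    ddb = -math.pi**2*math.cos(math.pi*x)
    return a*b - 1, da*b + a*db, dda*b + 2*da*db + a*ddb

def newton_step(x):
    f, df, _ = f3(x)
    return x - f/df

def halley_step(x):
    f, df, ddf = f3(x)
    return x - 2*f*df/(2*df*df - f*ddf)

def double_newton_step(x):
    x = newton_step(x)            # plain step ...
    f, df, _ = f3(x)
    return x - 2*f/df             # ... followed by a double-length step

def find_roots(method, a, b, segments=14, iters=60, tol=1e-8):
    """Iterate `method` from the midpoint of each subinterval of [a, b];
    keep the distinct limits that are actual roots of f."""
    roots, h = [], (b - a)/segments
    for k in range(segments):
        x = a + (k + 0.5)*h
        try:
            for _ in range(iters):
                x = method(x)
        except (ValueError, ZeroDivisionError, OverflowError):
            continue              # the iteration left the domain
        if a <= x <= b and abs(f3(x)[0]) < tol \
           and all(abs(x - r) > 1e-5 for r in roots):
            roots.append(x)
    return sorted(roots)

for method in (newton_step, halley_step, double_newton_step):
    print(method.__name__, find_roots(method, 2, 12, segments=14))
```

All three methods locate the double roots $x=3$ and $x=11$ in $[2,12]$; they differ in how many iterations (and function evaluations) they need to do so.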