$\def\R{\mathbb{R}}\def\N{\mathbb{N}}\def\d{\mathrm{d}}\def\e{\mathrm{e}}\def\peq{\mathrel{\phantom{=}}{}}$The proposition is not necessarily true.
Lemma 1: If $f \in C^∞((0, +∞))$ and $g(x) := f(\e^x)$, then $g \in C^∞(\R)$ and the following conditions are equivalent:
- $\lim\limits_{s → +∞} s^m f^{(m)}(s) = 0$ for all $m \in \N$;
- $\lim\limits_{x → +∞} g^{(m)}(x) = 0$ for all $m \in \N$.
(In fact, $g^{(m)}(x)$ is a linear combination of $f(s), s f'(s), \cdots, s^m f^{(m)}(s)$ with $s = \e^x$.)
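For instance, writing $s = \e^x$,$$
g'(x) = \e^x f'(\e^x) = s f'(s), \qquad g''(x) = \e^x f'(\e^x) + \e^{2x} f''(\e^x) = s f'(s) + s^2 f''(s),
$$and the general pattern follows by induction.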
Lemma 2: If $h \in C^∞((0, +∞))$, then for each $m \in \N_+$, there exist polynomials $P_m, Q_m \in \R[h_1, \cdots, h_m]$ with zero constant term such that\begin{align*}
(\sin(h(x)))^{(m)} &= P_m(h'(x), \cdots, h^{(m)}(x)) \sin(h(x))\\
&\peq + Q_m(h'(x), \cdots, h^{(m)}(x)) \cos(h(x)), \quad \forall x > 0.
\end{align*}
(This is easily proved by induction on $m$.)
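For instance, the first two cases read$$
(\sin(h(x)))' = h'(x) \cos(h(x)), \qquad (\sin(h(x)))'' = h''(x) \cos(h(x)) - (h'(x))^2 \sin(h(x)),
$$so $P_1 = 0$, $Q_1 = h_1$, $P_2 = -h_1^2$ and $Q_2 = h_2$.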
Now define $G(x) = \sin(h(x))$, where $h(x) = x^a$ and $a \in (0, 1)$ is a constant, and take $δ(s) = G'(\ln s)$ (defined for $s > 1$, which is all that matters here). Since $\lim\limits_{x → +∞} h^{(m)}(x) = 0$ for every $m \in \N_+$, Lemma 2 yields $\lim\limits_{x → +∞} G^{(m)}(x) = 0$ for all $m \in \N_+$ (the polynomials $P_m, Q_m$ have zero constant term and $\sin$, $\cos$ are bounded), and applying Lemma 1 to $f = δ$, so that $g(x) = δ(\e^x) = G'(x)$, shows that $\lim\limits_{s → +∞} s^m δ^{(m)}(s) = 0$ for all $m \in \N$.
For $s > r > 1$, making the substitution $u = \e^x$ yields\begin{gather*}
\left| \int_r^s \frac{δ(u)}{u} \,\d u \right| = \left| \int_{\ln r}^{\ln s} δ(\e^x) \,\d x \right| = \left| \int_{\ln r}^{\ln s} G'(x) \,\d x \right|\\
= |G(\ln s) - G(\ln r)| \leqslant |G(\ln s)| + |G(\ln r)| \leqslant 2.
\end{gather*}
For $B > A > 1$ and any $s > 1$, we have\begin{align*}
&\peq |G(\ln(Bs)) - G(\ln(As))| = |\sin(h(\ln(Bs))) - \sin(h(\ln(As)))|\\
&= 2 \left| \sin\left( \frac{1}{2} (h(\ln(Bs)) - h(\ln(As))) \right) \right| · \left| \cos\left( \frac{1}{2} (h(\ln(As)) + h(\ln(Bs))) \right) \right|\\
&\leqslant 2 \left| \frac{1}{2} (h(\ln(Bs)) - h(\ln(As))) \right| = |(\ln s + \ln B)^a - (\ln s + \ln A)^a|
\end{align*}
Since $\lim\limits_{s → +∞} ((\ln s + \ln B)^a - (\ln s + \ln A)^a) = 0$, it follows that$$
\lim_{s → +∞} \int_{As}^{Bs} \frac{δ(u)}{u} \,\d u = \lim_{s → +∞} (G(\ln(Bs)) - G(\ln(As))) = 0.
$$
However, fix $s_0 > 1$. Since, for every $s > s_0$,$$
\int_{s_0}^s \frac{δ(u)}{u} \,\d u = G(\ln s) - G(\ln s_0)
$$
and $\lim\limits_{x → +∞} G(x) = \lim\limits_{x → +∞} \sin(x^a)$ does not exist (since $x^a → +∞$, $G$ takes each of the values $1$ and $-1$ at arbitrarily large $x$), the limit $\lim\limits_{s → +∞} \displaystyle \int_{s_0}^s \frac{δ(u)}{u} \,\d u$ does not exist either.
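For what it's worth, here is a quick numerical sketch of this counterexample (not part of the argument); the concrete choices $a = \frac{1}{2}$, $A = 2$, $B = 5$, $s_0 = \e$ are just for illustration.

```python
# Numerical illustration of the counterexample with a = 1/2, h(x) = x^a,
# G(x) = sin(h(x)) and delta(s) = G'(ln s).
import numpy as np
from scipy.integrate import quad

a = 0.5

def G(x):
    return np.sin(x**a)

def delta(s):
    # G'(x) = a * x**(a - 1) * cos(x**a), evaluated at x = ln(s)
    x = np.log(s)
    return a * x**(a - 1) * np.cos(x**a)

# int_{2s}^{5s} delta(u)/u du tends to 0 as s grows ...
for s in [1e3, 1e6, 1e12]:
    val, _ = quad(lambda u: delta(u) / u, 2 * s, 5 * s)
    print(f"s = {s:.0e}: integral over [2s, 5s] = {val:+.6f}")

# ... while int_{s0}^{s} delta(u)/u du = G(ln s) - G(ln s0) keeps oscillating.
s0 = np.e
for s in [1e3, 1e6, 1e12, 1e24]:
    print(f"s = {s:.0e}: G(ln s) - G(ln s0) = {G(np.log(s)) - G(np.log(s0)):+.6f}")
```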
Your solution isn't right. If $\sin (nx)<0$, then the product $f_k(x)\sin(nx)$ no longer increases to $f(x)\sin(nx)$ as $k\to\infty$ (it decreases), so monotone convergence cannot be applied anymore.
Note that the clause "for any fixed $M>0$" is actually a red herring; you can completely ignore it. In other words, we can prove the following: for any $f\in L^1(\Bbb{R})$,
\begin{align}
\lim\limits_{n\to\infty}\int_{\Bbb{R}}f(x)\sin(nx)\,dx&=0.
\end{align}
The version you wrote about is a special case obtained by replacing $f$ with $f\cdot \chi_{[-M,M]}$. Actually, one can prove the stronger statement that for any $f\in L^1(\Bbb{R})$, we have $\lim\limits_{t\to\infty}\int_{\Bbb{R}}f(x)e^{itx}\,dx=0$ (by looking at the imaginary part of this integral and considering $t=n\in\Bbb{N}$, we recover the version you're asking about).
Since the more general statement doesn't require any extra effort, we shall prove that instead. Verify first that for $f=\chi_{[a,b]}$, we have $\lim\limits_{t\to\infty}\int_{\Bbb{R}}f(x)e^{itx}\,dx=0$. This is almost obvious since the integral can be explicitly evaluated, and once you do, it's clear that it decays like $\frac{1}{t}$. By linearity, the statement is also true for finite linear combinations of such functions.
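For the record, the explicit evaluation is
\begin{align}
\int_{\Bbb{R}}\chi_{[a,b]}(x)e^{itx}\,dx=\int_a^b e^{itx}\,dx=\frac{e^{itb}-e^{ita}}{it},\qquad \left|\frac{e^{itb}-e^{ita}}{it}\right|\leq\frac{2}{|t|}\xrightarrow[t\to\infty]{}0.
\end{align}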
Next is a slightly technical measure-theoretic approximation result: the space of "step functions", i.e. finite linear combinations of characteristic functions of compact intervals, is dense in $L^1(\Bbb{R})$. In other words, the space
\begin{align}
\text{Step}(\Bbb{R}):= \left\{\sum_{i=1}^nc_i\chi_{[a_i,b_i]}\,\bigg| \text{ $n\in\Bbb{N}$, $a_i,b_i\in\Bbb{R}, c_i\in\Bbb{C}$ for all $1\leq i \leq n$}\right\}
\end{align}
is dense in $L^1(\Bbb{R})$. To prove this, note first that the space of simple functions (finite linear combinations of characteristic functions of Lebesgue-measurable sets of finite measure) is dense in $L^1(\Bbb{R})$. So, to prove that step functions are also dense, it is enough to show that any Lebesgue-measurable set $A\subset\Bbb{R}$ of finite measure can be approximated (in the $L^1$ norm) by step functions.
This follows pretty much from regularity of Lebesgue measure. Suppose $A\subset\Bbb{R}$ has finite measure, and let $\epsilon>0$. By inner regularity of Lebesgue measure, there exists a compact $K\subset A$ such that $m(A)-m(K)<\epsilon$. Also, by definition (if you use the Carathéodory construction) of Lebesgue measure, there exist countably many open intervals $\{I_j\}_{j=1}^{\infty}$ which cover $A$ and such that $\sum_{j=1}^{\infty}m(I_j)<m(A)+\epsilon$. Since these intervals cover the compact set $K$ as well, finitely many of them, say $I_1,\dots, I_n$ (relabel indices if necessary), will cover $K$. Then the symmetric difference satisfies $\left(\bigcup_{j=1}^nI_j\right)\triangle A\subset \left(\bigcup_{i=1}^{\infty}I_i\right)\setminus K$, and the RHS has measure $\leq 2\epsilon$. This shows $\left\|\chi_{\bigcup_{j=1}^nI_j}-\chi_A\right\|_1\leq 2\epsilon$, and $\chi_{\bigcup_{j=1}^nI_j}$ is a step function (a finite union of intervals is a finite disjoint union of intervals, and open versus closed endpoints are irrelevant in $L^1$), which proves the required density.
Can you now conclude that the theorem holds for all $f\in L^1(\Bbb{R})$? (you only need the triangle inequality; no need for monotone/dominated convergence).
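(In case it helps: given $\epsilon>0$, pick a step function $g$ with $\|f-g\|_1<\epsilon$; then
\begin{align}
\left|\int_{\Bbb{R}}f(x)e^{itx}\,dx\right|&\leq \|f-g\|_1+\left|\int_{\Bbb{R}}g(x)e^{itx}\,dx\right|<\epsilon+\left|\int_{\Bbb{R}}g(x)e^{itx}\,dx\right|,
\end{align}
and the second term is also smaller than $\epsilon$ once $t$ is large enough, since the theorem is already established for step functions.)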
There are also several other proofs for the Riemann-Lebesgue lemma. If you know that $C^{\infty}_c(\Bbb{R})$ is dense in $L^1(\Bbb{R})$, then you can first prove the theorem for $f\in C^{\infty}_c(\Bbb{R})$ using integration by parts (this gives you the $\frac{1}{t}$ decay factor; and the boundary terms vanish due to compact support). Now another density argument allows you to conclude for all $f\in L^1(\Bbb{R})$.
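Concretely, for $f\in C^{\infty}_c(\Bbb{R})$ and $t\neq 0$, integration by parts gives
\begin{align}
\int_{\Bbb{R}}f(x)e^{itx}\,dx&=\left[f(x)\,\frac{e^{itx}}{it}\right]_{-\infty}^{\infty}-\frac{1}{it}\int_{\Bbb{R}}f'(x)e^{itx}\,dx,
\end{align}
where the boundary term vanishes by compact support, so the integral is bounded in modulus by $\frac{\|f'\|_1}{|t|}$.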
The other proof I know makes use of continuity of translation in $L^1(\Bbb{R})$ (which itself is often proved using density of $C_c(\Bbb{R})$ in $L^1(\Bbb{R})$). To use this method (which I believe is how things are done in Stein and Shakarchi's text), note that due to translation invariance of Lebesgue measure, for each $\alpha\in\Bbb{R}$ we have
\begin{align}
\int_{\Bbb{R}}f(x)e^{itx}\,dx&=\int_{\Bbb{R}}f(x-\alpha) e^{it(x-\alpha)}\,dx
\end{align}
So,
\begin{align}
\left|\int_{\Bbb{R}}f(x)e^{itx}\,dx\right| &=\left|\frac{1}{2}\int_{\Bbb{R}}[f(x)+ f(x-\alpha)e^{-it\alpha}]e^{itx}\,dx\right|\\
&\leq \frac{1}{2}\int_{\Bbb{R}}\left|f(x)+e^{-it\alpha}f(x-\alpha)\right|\,dx.
\end{align}
In particular, this holds when $\alpha=\frac{\pi}{t}$, in which case $e^{-it\alpha}=-1$, so we have
\begin{align}
\left|\int_{\Bbb{R}}f(x)e^{itx}\,dx\right| &\leq \frac{1}{2}\int_{\Bbb{R}}\left|f(x)-f\left(x-\frac{\pi}{t}\right)\right|\,dx,
\end{align}
and the RHS vanishes as $t\to\infty$, due to continuity of translation in $L^1$.
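As a quick numerical sanity check (a sketch, not a proof), here is the decay of $\left|\int_{\Bbb{R}}f(x)e^{itx}\,dx\right|$ for the discontinuous integrable function $f=\chi_{[0,1]}+\chi_{[2,3]}$ (just an example choice), using SciPy's oscillatory-weight quadrature:

```python
# Numerical check of the Riemann-Lebesgue lemma for f = chi_[0,1] + chi_[2,3]:
# |int f(x) e^{itx} dx| should decay like 1/t even though f is discontinuous.
from scipy.integrate import quad

pieces = [(0.0, 1.0), (2.0, 3.0)]  # f is 1 on these intervals, 0 elsewhere

for t in [10.0, 100.0, 1000.0]:
    # weight='cos'/'sin' integrates g(x)*cos(t*x) resp. g(x)*sin(t*x) (QAWO rule)
    re = sum(quad(lambda x: 1.0, a, b, weight='cos', wvar=t)[0] for a, b in pieces)
    im = sum(quad(lambda x: 1.0, a, b, weight='sin', wvar=t)[0] for a, b in pieces)
    print(f"t = {t:6.0f}:  |integral| = {abs(complex(re, im)):.6f}")
```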
Best Answer
In general, $f$ will be “almost” a power function. More precisely, it will be regularly varying with index $\beta$, i.e., of the form $f(x)=x^\beta\cdot\ell(x)$ where $\ell$ is a slowly varying function, such as $\ell(x):=\log(1+x)$. This is in essence the content of Karamata's theorem (see Theorems 1.5.11 and 1.6.1 of Bingham, Goldie, and Teugels' book “Regular Variation”), which states that a positive, locally bounded function $f:[0,\infty)\to[0,\infty)$ is a regularly varying function with index $\beta\ge0$ if and only if $$\frac{x^{\sigma+1}f(x)}{\int_0^xt^\sigma\,f(t)\,\mathrm dt}\xrightarrow[x\to\infty]{}\sigma+\beta+1\tag{$C_\sigma$}$$ holds for some $\sigma>-(\beta+1)$. In this case, $(C_\sigma)$ holds for all $\sigma\ge-(\beta+1)$.
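As a quick numerical illustration of $(C_\sigma)$ (a minimal sketch; the choices $\beta=2$, $\sigma=0$ and the slowly varying factor $\log(1+x)$ are just examples):

```python
# Numerical check of Karamata's criterion (C_sigma) for f(x) = x^beta * log(1+x):
# the ratio x^(sigma+1) f(x) / int_0^x t^sigma f(t) dt should tend to sigma+beta+1.
import numpy as np
from scipy.integrate import quad

beta, sigma = 2.0, 0.0
f = lambda x: x**beta * np.log1p(x)

for x in [1e2, 1e4, 1e6]:
    denom, _ = quad(lambda t: t**sigma * f(t), 0.0, x, limit=200)
    ratio = x**(sigma + 1) * f(x) / denom
    print(f"x = {x:.0e}:  ratio = {ratio:.4f}   (limit should be {sigma + beta + 1:.0f})")
```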
The representation theorem (Theorem 1.3.1 in the book) says that we can write any slowly varying function $\ell$ in the form $$\ell(x)=c(x)\exp\left(\int_0^x\varepsilon(u)\frac{\mathrm du}u\right)$$ for some measurable $c$ with $c(x)\to c\in(0,\infty)$ and some $\varepsilon$ with $\varepsilon(x)\to0$ as $x\to\infty$.
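For instance, for the slowly varying function $\ell(x)=\log x$ (essentially the example above), one can take $\varepsilon(u)=\frac{1}{\log u}\to0$ for $u\ge e$ (and $\varepsilon(u)=0$ below), so that for $x\ge e$ $$\exp\left(\int_e^x\frac{1}{\log u}\,\frac{\mathrm du}u\right)=\exp(\log\log x)=\log x,$$ i.e. $c(x)\equiv1$ there; the behaviour on $(0,e)$ is absorbed into the factor $c(x)$.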