Is Young's inequality useful here?

convolution, functional-analysis, young-inequality

I want to prove that for a given $0<\alpha<N$ and for all $0<\varepsilon< N-\alpha$ there exists $C>0$ s.t.
$$ \Vert\vert x\vert^{-\alpha}\ast\vert u \vert^2\Vert_{L^\infty(\mathbb{R}^N)} \leq C\Vert u \Vert_{L^{\frac{2N}{N-\alpha-\varepsilon}}(\mathbb{R}^N)}\Vert u \Vert_{L^{\frac{2N}{N-\alpha+\varepsilon}}(\mathbb{R}^N)} $$
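As a quick sanity check (not part of the problem itself), the pair of exponents is consistent with scaling: for $u_\lambda(x):=u(\lambda x)$ one has
$$\bigl(\vert x\vert^{-\alpha}\ast\vert u_\lambda\vert^2\bigr)(x)=\lambda^{\alpha-N}\bigl(\vert x\vert^{-\alpha}\ast\vert u\vert^2\bigr)(\lambda x) \quad\text{and}\quad \frac{N-\alpha+\varepsilon}{2}+\frac{N-\alpha-\varepsilon}{2}=N-\alpha,$$
so both sides of the claimed inequality scale like $\lambda^{\alpha-N}$.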

This looks similar to Young's convolution inequality; the problem is the singularity of $\vert x \vert ^{-\alpha}$. A teacher suggested writing out the definition of the convolution, splitting it into two pieces (inside and outside a ball $B(0,R)$), and applying Hölder's inequality.
$$(\vert x\vert^{-\alpha}\ast\vert u \vert^2)(x) = \int_{\mathbb{R}^N} \frac{\vert u(y)\vert^2}{\vert x-y\vert^\alpha}\ dy = \int_{B(0,R)} \frac{\vert u(y)\vert^2}{\vert x-y\vert^\alpha}\ dy + \int_{\mathbb{R}^N\setminus B(0,R)} \frac{\vert u(y)\vert^2}{\vert x-y\vert^\alpha}\ dy $$
According to my teacher, after applying Hölder's inequality I should get something like
$$(\vert x\vert^{-\alpha}\ast\vert u \vert^2)(x) \leq A(R)\cdot(?) + B(R)\cdot(?)$$
Then he suggested minimizing the resulting function of $R$, $F(R)=A(R)\cdot(?) + B(R)\cdot(?)$.
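If the two terms end up having the form $A(R)=A\,R^{-\varepsilon}$ and $B(R)=B\,R^{\varepsilon}$ (only a guess at this point), that minimization is elementary calculus:
$$\min_{R>0}\bigl(A R^{-\varepsilon}+B R^{\varepsilon}\bigr)=2\sqrt{AB},\qquad\text{attained at } R^{\varepsilon}=\sqrt{\tfrac{A}{B}}.$$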
But I am stuck. Any help would be appreciated. Thanks in advance.

Best Answer

For every $x$ we may write
$$ \int_{\mathbb{R}^N} \frac{\vert u(y)\vert^2}{\vert x-y\vert^\alpha}\,\mathrm dy = \int_{|x-y|>R} \frac{\vert u(y)\vert^2}{\vert x-y\vert^\alpha}\,\mathrm dy + \int_{|x-y|\le R} \frac{\vert u(y)\vert^2}{\vert x-y\vert^\alpha}\,\mathrm dy. $$
The first term can be estimated by Hölder's inequality with exponents $\frac{N}{N-\alpha+\varepsilon}$ and $\frac{N}{\alpha-\varepsilon}$:
$$\begin{align*} \int_{|x-y|>R} \frac{\vert u(y)\vert^2}{\vert x-y\vert^\alpha}\,\mathrm dy &\le \left( \int_{|x-y|>R}\vert u(y)\vert^{\frac{2N}{N-\alpha+\varepsilon}}\,\mathrm dy\right)^{\frac{N-\alpha+\varepsilon}{N}}\left(\int_{|x-y|>R} \vert x-y\vert^{-\frac{\alpha N}{\alpha-\varepsilon}}\,\mathrm dy\right)^{\frac{\alpha-\varepsilon}N}\\ &\le \|u\|_{L^{\frac{2N}{N-\alpha+\varepsilon}}}^{2}\left(|\mathbb S^{N-1}|\int_R^\infty r^{-1-\frac{N\varepsilon}{\alpha-\varepsilon}}\,\mathrm dr\right)^{\frac{\alpha-\varepsilon}{N}}=\left(\frac{(\alpha-\varepsilon)\,|\mathbb S^{N-1}|}{N\varepsilon}\right)^{\frac{\alpha-\varepsilon}{N}}R^{-\varepsilon}\,\|u\|_{L^{\frac{2N}{N-\alpha+\varepsilon}}}^{2}. \end{align*}$$
Similarly, for the second term we use Hölder's inequality with exponents $\frac{N}{N-\alpha-\varepsilon}$ and $\frac{N}{\alpha+\varepsilon}$:
$$\begin{align*} \int_{|x-y|\le R} \frac{\vert u(y)\vert^2}{\vert x-y\vert^\alpha}\,\mathrm dy &\le \left( \int_{|x-y|\le R}\vert u(y)\vert^{\frac{2N}{N-\alpha-\varepsilon}}\,\mathrm dy\right)^{\frac{N-\alpha-\varepsilon}{N}}\left(\int_{|x-y|\le R} \vert x-y\vert^{-\frac{\alpha N}{\alpha+\varepsilon}}\,\mathrm dy\right)^{\frac{\alpha+\varepsilon}N}\\ &\le \|u\|_{L^{\frac{2N}{N-\alpha-\varepsilon}}}^{2}\left(|\mathbb S^{N-1}|\int_0^R r^{-1+\frac{N\varepsilon}{\alpha+\varepsilon}}\,\mathrm dr\right)^{\frac{\alpha+\varepsilon}{N}}=\left(\frac{(\alpha+\varepsilon)\,|\mathbb S^{N-1}|}{N\varepsilon}\right)^{\frac{\alpha+\varepsilon}{N}}R^{\varepsilon}\,\|u\|_{L^{\frac{2N}{N-\alpha-\varepsilon}}}^{2}. \end{align*}$$
Finally, choosing
$$R^\varepsilon =\frac{ \|u\|_{L^{\frac{2N}{N-\alpha+\varepsilon}}}}{\|u\|_{L^{\frac{2N}{N-\alpha-\varepsilon}}}},$$
we get the desired inequality.
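To spell out the final step (immediate from the two bounds above): with this choice of $R$, each term becomes a constant multiple of $\Vert u\Vert_{L^{\frac{2N}{N-\alpha+\varepsilon}}}\Vert u\Vert_{L^{\frac{2N}{N-\alpha-\varepsilon}}}$, and summing them gives, uniformly in $x$,
$$ \int_{\mathbb{R}^N} \frac{\vert u(y)\vert^2}{\vert x-y\vert^\alpha}\,\mathrm dy \le \left[\left(\frac{(\alpha-\varepsilon)\,|\mathbb S^{N-1}|}{N\varepsilon}\right)^{\frac{\alpha-\varepsilon}{N}}+\left(\frac{(\alpha+\varepsilon)\,|\mathbb S^{N-1}|}{N\varepsilon}\right)^{\frac{\alpha+\varepsilon}{N}}\right]\Vert u\Vert_{L^{\frac{2N}{N-\alpha+\varepsilon}}}\,\Vert u\Vert_{L^{\frac{2N}{N-\alpha-\varepsilon}}}, $$
so the claim holds with a constant $C$ depending only on $N$, $\alpha$ and $\varepsilon$.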
