$\newcommand\ol\overline$The steps were:
Step 1: Show that $f$ is infinitely differentiable, that $\xi \mapsto f(x,\xi)$ is convex, and that $f_{\xi \xi} (x,\xi) > 0$ holds for all $x$ except $x=0$.
Step 2: Show that the function
\begin{equation}
\overline{u}(x) = \begin{cases} x^2 \sin \frac{\pi}{x}, & x \ne 0,\\ 0, & x=0\end{cases} \tag{0}
\end{equation}
attains the minimum in the minimization problem, and confirm that this function is Lipschitz continuous on $[-1,1]$.
Step 3: Show that there is no minimizer other than the function $\overline{u}$ above.
Step 4: Show that $\overline{u}$ does not belong to $C^1([-1,1])$.
Step 1: You have already checked that $f$ is infinitely differentiable.
Next,
\begin{equation*}
f(x,\xi)=w(x)^2(\xi - g(x))^2, \tag{1}
\end{equation*}
where
\begin{equation*}
g(x):=2 x \sin\frac{\pi}{x} - \pi \cos\frac{\pi}{x}.
\end{equation*}
Note that $f(x,\xi)$ has not yet been defined at $x=0$, since $g(x)$ has not yet been defined there. So, set $g(0):=0$ and $f(0,\xi):=0$, so that (1) holds even for $x=0$.
Then clearly $\xi \mapsto f(x,\xi)$ is convex and $f_{\xi \xi} (x,\xi)=2w(x)^2 > 0$ for all $x\ne0$.
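For a quick sanity check of this computation, here is a minimal SymPy sketch (the symbols `w` and `g` simply stand in for the values $w(x)$ and $g(x)$ at a fixed $x$; this is an illustration, not part of the proof):

```python
# Symbolic check (sketch): with f(x, xi) = w(x)^2 (xi - g(x))^2,
# the second derivative in xi is 2 w(x)^2, as claimed in Step 1.
import sympy as sp

xi, w, g = sp.symbols('xi w g', real=True)  # w, g stand in for w(x), g(x)
f = w**2 * (xi - g)**2
assert sp.simplify(sp.diff(f, xi, 2) - 2 * w**2) == 0
```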
Steps 2 and 3: We have $I[\ol u]=0$, since
\begin{equation*}
I[u] = \int_{-1}^1 w(x)^2(u'(x) - g(x))^2\, dx
\end{equation*}
and $\ol u'=g$. Also, if $u\in X\setminus\{\ol u\}$, then the Lebesgue measure of the set $\{x\colon u'(x)\ne g(x)\}=\{x\colon u'(x)\ne\ol u'(x)\}$ is $>0$; since $w(x)\ne0$ for all $x\ne0$, the integrand $w(x)^2(u'(x)-g(x))^2$ is then positive on a set of positive measure, and hence $I[u]>0$.
So, $\ol u$ is the only minimizer of $I[u]$.
Also, $|\ol u'(x)|=|g(x)|\le2|x|+\pi\le2+\pi$ for $x\in[-1,1]\setminus\{0\}$ and $\ol u$ is continuous. So, $\ol u$ is Lipschitz continuous on $[-1,1]$.
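For readers who want to double-check the identity $\ol u'=g$ off the origin and the bound $|g|\le2+\pi$ numerically, here is a short sketch (it uses SymPy and NumPy and is not part of the proof):

```python
# Sketch: verify u_bar'(x) = g(x) for x != 0 symbolically, and sample
# the bound |g(x)| <= 2|x| + pi <= 2 + pi on [-1, 1] \ {0} numerically.
import numpy as np
import sympy as sp

x = sp.symbols('x', nonzero=True)
u_bar = x**2 * sp.sin(sp.pi / x)
g = 2 * x * sp.sin(sp.pi / x) - sp.pi * sp.cos(sp.pi / x)
assert sp.simplify(sp.diff(u_bar, x) - g) == 0   # u_bar' = g away from 0

xs = np.linspace(1e-6, 1.0, 10001)
xs = np.concatenate([-xs, xs])                   # grid in [-1,1] avoiding 0
gs = 2 * xs * np.sin(np.pi / xs) - np.pi * np.cos(np.pi / xs)
assert np.all(np.abs(gs) <= 2 + np.pi)           # Lipschitz bound on the grid
```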
Step 4: By the definition of the derivative and (0), $\ol u'(0)=0$. However, $\ol u'(x)$ does not converge as $x\to0$. Indeed, otherwise $-\pi \cos\frac{\pi}{x}=g(x)-2 x \sin\frac{\pi}{x}=\ol u'(x)-2 x \sin\frac{\pi}{x}$ would converge as $x\to0$, which is not the case: along $x_k=1/k$ we have $-\pi\cos\frac{\pi}{x_k}=-(-1)^k\pi$, which alternates between $\pi$ and $-\pi$. So, $\ol u\notin C^1 ([-1,1])$.
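For a concrete illustration of this non-convergence (not needed for the argument), one can evaluate $\ol u'=g$ along $x_k=1/k$:

```python
# Sketch: along x_k = 1/k, u_bar'(x_k) = g(1/k) = -(-1)^k * pi, so the
# values alternate between +pi and -pi and cannot converge as x -> 0.
import numpy as np

for k in range(1, 9):
    xk = 1.0 / k
    gk = 2 * xk * np.sin(np.pi / xk) - np.pi * np.cos(np.pi / xk)
    print(k, round(gk, 6))
```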
This completes the steps.
$\newcommand{\ep}{\varepsilon}$This conjecture is not true in general.
Indeed, suppose the "convex" part of your conjecture is true. Then (letting $x:=\phi$, $t:=\theta_1$, and $\theta_2\downarrow\theta_1=t$) we see that for any strictly increasing convex smooth function $g$ and all $x$ and $t$ in $(0,\pi/2)$ we would have $h_2(g;x,t):=\partial_x\partial_t\,\ln\bigl(g(\cos^2(x-t))-g(\cos^2(x+t))\bigr)\ge0$. (Note that for all $x$ and $t$ in $(0,\pi/2)$ we have $\cos^2(x-t)-\cos^2(x+t)=\sin2x\,\sin2t>0$, so that $h_2(g;x,t)$ is well defined.)
For real $\ep>0$ and real $u$, let $u_{+;\ep}:=\frac12(u+\sqrt{\ep^2+u^2})$, an "$\ep$-smoothed" version of $u_+:=\max(0,u)$. For $c$ and $c_*$ in $[0,\infty)$, let $g_{c_*,\ep}(c):=(c-c_*)_{+;\ep}$.
Then the function $g_{c_*,\ep}$ is strictly increasing, convex, and smooth on $[0,\infty)$. However, $h_2(g;x,t)=-44051.358\ldots\not\ge0$ if $g=g_{c_*,\ep}$, $c_*=\frac12$, $\ep=\frac1{1000}$, $x=\frac{39}{100}$, and $t=\frac{118}{100}$. So, the "convex" part of your conjecture is not true in general.
Suppose now the "concave" part of your conjecture is true. Then for any strictly increasing concave smooth function $g$ and all $x$ and $t$ in $(0,\pi/2)$ we would have $h_2(g;x,t)\le0$.
For $c$ and $c_*$ in $[0,\infty)$, let $G_{c_*,\ep}(c):=c-\sqrt{\ep^2+(c-c_*)^2}$.
Then the function $G_{c_*,\ep}$ is strictly increasing, concave, and smooth on $[0,\infty)$. However, $h_2(G;x,t)=32614.565\ldots\not\le0$ if $G=G_{c_*,\ep}$, $c_*=\frac12$, $\ep=\frac1{1000}$, and $x=\frac{39}{100}=t$. So, the "concave" part of your conjecture is not true in general either.
$\quad\Box$
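Both numerical values above can be reproduced with a short computer-algebra computation. Here is a minimal sketch (it assumes SymPy, which is not necessarily the tool used originally, and takes $h_2$ to be exactly the mixed partial defined above; the printed numbers should match $-44051.358\ldots$ and $32614.565\ldots$):

```python
# Sketch: evaluate h2(g; x, t) = d_x d_t ln( g(cos^2(x-t)) - g(cos^2(x+t)) )
# for the two smoothed test functions defined above.
import sympy as sp

x, t = sp.symbols('x t', positive=True)
eps = sp.Rational(1, 1000)
cs = sp.Rational(1, 2)                                      # c_*

def h2(g):
    expr = sp.log(g(sp.cos(x - t)**2) - g(sp.cos(x + t)**2))
    return sp.diff(expr, x, t)

smoothed_plus = lambda u: (u + sp.sqrt(eps**2 + u**2)) / 2  # eps-smoothed u_+
g_convex = lambda c: smoothed_plus(c - cs)                  # g_{c_*, eps}
g_concave = lambda c: c - sp.sqrt(eps**2 + (c - cs)**2)     # G_{c_*, eps}

# "Convex" counterexample: expect a large negative value.
print(h2(g_convex).subs({x: sp.Rational(39, 100), t: sp.Rational(118, 100)}).evalf())
# "Concave" counterexample: expect a large positive value.
print(h2(g_concave).subs({x: sp.Rational(39, 100), t: sp.Rational(39, 100)}).evalf())
```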
Best Answer
Call the sum $S(f)$. Let $0<b<1$, and let $f$ and $g$ be two functions mapping $X$ to $\Bbb R$. Then
$$
S(bf+(1-b)g)=b^2S(f)+(1-b)^2S(g)+2b(1-b)\sum_{i,j}w_{i,j}\,[f(x_i)-f(x_j)]\cdot[g(x_i)-g(x_j)].
$$
By the Cauchy-Schwarz inequality,
$$
\sum_{i,j}w_{i,j}\,[f(x_i)-f(x_j)]\cdot[g(x_i)-g(x_j)]\le \sqrt{S(f)S(g)}.
$$
(The non-negativity of $w_{i,j}$ is used here: $w_{i,j} = \sqrt{w_{i,j}}\cdot \sqrt{w_{i,j}}$.) Therefore
$$
\begin{aligned}
S(bf+(1-b)g) &\le b^2S(f)+(1-b)^2S(g)+2b(1-b)\sqrt{S(f)S(g)}\\
&= \left[b\sqrt{S(f)}+(1-b)\sqrt{S(g)}\right]^2\\
&\le bS(f)+(1-b)S(g),
\end{aligned}
$$
the final inequality holding because the square function is convex. This shows that $f\mapsto S(f)$ is convex. (Intuitively, $f\mapsto [f(x_i)-f(x_j)]^2$ is convex because the square is convex, and then $S$ is convex because it is a nonnegative linear combination of such functions of $f$.)
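As a quick sanity check, here is a small randomized test of the inequality (a sketch only; as above, the only assumption on the weights $w_{i,j}$ is non-negativity):

```python
# Sketch: sample random nonnegative weights and random f, g, and check
# S(b f + (1-b) g) <= b S(f) + (1-b) S(g) over a grid of b in [0, 1].
import numpy as np

rng = np.random.default_rng(0)
n = 6
w = rng.random((n, n))              # nonnegative weights w_{i,j}
f = rng.standard_normal(n)          # values f(x_i)
g = rng.standard_normal(n)          # values g(x_i)

def S(h):
    # S(h) = sum_{i,j} w_{i,j} [h(x_i) - h(x_j)]^2
    d = h[:, None] - h[None, :]
    return float(np.sum(w * d**2))

for b in np.linspace(0.0, 1.0, 11):
    assert S(b * f + (1 - b) * g) <= b * S(f) + (1 - b) * S(g) + 1e-9
print("convexity inequality holds at all sampled b")
```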