$\renewcommand\bar\overline$Indeed, it is not obvious why "$u'_{n_k} \to \overline{u}'$ in the sense of $L^r[a,b]$".
Look at this example: $[a,b]=[0,2\pi]$, $u_n(x)=\dfrac{\sin nx}n$, $\bar u=0$. Then $u_n\to\bar u$ uniformly, but $u_{n_k}'\not\to\bar u'$ in $L^r$ for any increasing sequence $(n_k)$ of natural numbers, because $u_n'(x)=\cos nx$ and hence $\|u_n'\|_r^r=c_r:=\int_0^{2\pi}|\cos u|^r\,du>0$ for all $n$.
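This counterexample can be checked numerically. Below is a minimal sketch, assuming $r=2$ for concreteness (any $r\ge1$ behaves the same way): the sup norm of $u_n$ tends to $0$ while the $L^r$ norm of $u_n'$ stays near $c_2^{1/2}=\sqrt\pi$.

```python
import numpy as np

# Counterexample sketch: u_n(x) = sin(n x)/n on [0, 2*pi].
# sup|u_n| = 1/n -> 0 (uniform convergence to 0), while
# ||u_n'||_r stays equal to c_r^(1/r) > 0, since u_n'(x) = cos(n x).
# r = 2 is an assumption for concreteness; then c_2 = pi.
r = 2
x = np.linspace(0.0, 2 * np.pi, 200_001)
for n in (1, 10, 100):
    du_n = np.cos(n * x)
    sup_norm = np.max(np.abs(np.sin(n * x) / n))  # ~ 1/n -> 0
    # Riemann-sum approximation of the integral over [0, 2*pi]:
    lr_norm = ((np.abs(du_n) ** r).mean() * 2 * np.pi) ** (1 / r)
    print(n, sup_norm, lr_norm)  # lr_norm stays ~ sqrt(pi) for every n
```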
Note also that your argument does not use the condition that $g$ is convex.
(Also, your post seems to have hardly anything to do with the Tonelli theorem.)
Here is how to fix this. Using, as you did, the Arzelà–Ascoli theorem and then passing to a subsequence, without loss of generality (wlog) we may assume that $u_n\to \bar u$ uniformly. Also, you showed that the sequence $(u_n')$ is bounded in (the reflexive Banach space) $L^r$.
So, by the Eberlein–Shmulyan theorem (Kôsaku Yosida, Functional Analysis, Springer 1980, Chapter V, Appendix, section 4; alternatively, see e.g. this version), passing again to a subsequence, wlog we may assume that $u_n'\to v$ for some $v\in L^r$ in the weak topology of $L^r$.
Further, by Mazur's lemma, for each natural $n$ there exist a natural $N_n\ge n$ and nonnegative real numbers $a_{n,k}$ for $k\in\{n,\dots,N_n\}$ such that $\sum_{k=n}^{N_n}a_{n,k}=1$ and
\begin{equation*}
v_n:=\sum_{k=n}^{N_n}a_{n,k} u_k'\to v \tag{0}
\end{equation*}
in $L^r$.
For $x\in[a,b]$, now let
\begin{equation*}
w_n(x):=u_n(a)+\int_a^x v_n(t)\,dt
=u_n(a)-\sum_{k=n}^{N_n}a_{n,k}u_k(a)+\sum_{k=n}^{N_n}a_{n,k}u_k(x). \tag{1}
\end{equation*}
Since $u_n\to \bar u$ uniformly and the $u_n$'s are uniformly bounded, we see that $w_n\to \bar u$ uniformly and the $w_n$'s are uniformly bounded. Therefore, since $f$ is continuous, we have
\begin{equation*}
J_1[w_n] := \int_a^b f(x,w_n(x))\, dx\to J_1[\bar u]=\lim_n J_1[u_n].
\end{equation*}
Also, by the convexity of $g(x,\xi)$ in $\xi$,
\begin{equation*}
J_2[w_n] := \int_a^b g(x,w_n'(x))\, dx
\le\sum_{k=n}^{N_n}a_{n,k}J_2[u_k].
\end{equation*}
Also, $J[w_n]=J_1[w_n]+J_2[w_n]$. So,
\begin{equation*}
\begin{aligned}
\limsup_n J[w_n]&\le \lim_n J_1[w_n]+\limsup_n J_2[w_n] \\
&\le \lim_n J_1[u_n]+\limsup_n \sum_{k=n}^{N_n}a_{n,k}J_2[u_k] \\
&\le \lim_n J_1[u_n]+\limsup_n \sup_{k\ge n} J_2[u_k] \\
&= \lim_n J_1[u_n]+\limsup_n J_2[u_n] \\
&= \limsup_n (J_1[u_n]+J_2[u_n]) \\
&= \limsup_n J[u_n]= \lim_n J[u_n]=\inf_{u\in X} J[u].
\end{aligned}
\end{equation*}
So, passing to a subsequence, wlog we may assume that
\begin{equation*}
J[w_n]\to\inf_{u\in X} J[u].
\end{equation*}
Recall that $w_n\to \bar u$ uniformly. So, in view of (1) and (0),
\begin{equation*}
\bar u(x)=\bar u(a)+\int_a^x v(t)\,dt
\end{equation*}
for $x\in[a,b]$, so that $\bar u\in AC$ and $\bar u'=v$ almost everywhere (a.e.).
It also follows that $w_n'=v_n\to v=\bar u'$ in $L^r$ and hence in measure. So, by the continuity of $f$ and $g$ and the Fatou lemma,
\begin{equation*}
J[\bar u]=J[\lim_n w_n]\le\liminf_n J[w_n]=\lim_n J[w_n]=\inf_{u\in X} J[u].
\end{equation*}
It is also easy to see that $\bar u\in X$. Thus, $\bar u$ is a minimizer of $J[u]$ over $u\in X$.
$\newcommand\ol\overline$The steps were:
Step 1: Show that $f$ is infinitely differentiable, that $\xi \mapsto f(x,\xi)$ is convex, and that $f_{\xi \xi} (x,\xi) > 0$ holds for all $x$ except $x=0$.
Step 2: Show that the function
\begin{equation}
\overline{u}(x) = \begin{cases} x^2 \sin \frac \pi x,& x \not = 0\\ 0, & x=0\end{cases} \tag{0}
\end{equation}
yields the minimum of the minimization problem. Also confirm that this function is Lipschitz continuous on $[-1,1]$.
Step 3: Show that there is no minimizer other than the function $\overline{u}$ above.
Step 4: Show that $\overline{u}$ does not belong to $C^1([-1,1])$.
Step 1: You have already checked that $f$ is infinitely differentiable.
Next,
\begin{equation*}
f(x,\xi)=w(x)^2(\xi - g(x))^2, \tag{1}
\end{equation*}
where
\begin{equation*}
g(x):=2 x \sin\frac{\pi}{x} - \pi \cos\frac{\pi}{x}.
\end{equation*}
Note that $f(x,\xi)$ is so far undefined at $x=0$ (since $g(x)$ is so far undefined at $x=0$). So, let $g(0):=0$ and $f(0,\xi):=0$, so that (1) holds even for $x=0$.
Then clearly $\xi \mapsto f(x,\xi)$ is convex and $f_{\xi \xi} (x,\xi)=2w(x)^2 > 0$ for all $x\ne0$.
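Since $f(x,\cdot)$ is quadratic, the central second difference in $\xi$ recovers $f_{\xi\xi}=2w(x)^2$ exactly (up to rounding). A small sanity check of this computation; here $w(x)=x$ is an assumption for illustration only (the argument needs nothing beyond $w$ vanishing only at $x=0$):

```python
import math

def g(x):
    # g from (1); g(0) := 0 as defined above
    if x == 0:
        return 0.0
    return 2 * x * math.sin(math.pi / x) - math.pi * math.cos(math.pi / x)

def f(x, xi):
    # f(x, xi) = w(x)^2 * (xi - g(x))^2, with the illustrative choice w(x) = x
    return x * x * (xi - g(x)) ** 2

x, xi, h = 0.3, 1.7, 1e-4
# Central second difference in xi; equals f_xixi = 2*w(x)^2 = 2*x^2 here
second_diff = (f(x, xi + h) - 2 * f(x, xi) + f(x, xi - h)) / h ** 2
print(second_diff)  # ~ 0.18 = 2 * 0.3**2 > 0
```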
Steps 2 and 3: We have $I[\ol u]=0$, since
\begin{equation*}
I[u] = \int_{-1}^1 w(x)^2(u'(x) - g(x))^2\, dx
\end{equation*}
and $\ol u'=g$. Also, if $u\in X\setminus\{\ol u\}$, then the Lebesgue measure of the set $\{x\colon u'(x)\ne g(x)\}=\{x\colon u'(x)\ne\ol u'(x)\}$ is $>0$ and hence $I[u]>0$.
So, $\ol u$ is the only minimizer of $I[u]$.
Also, $|\ol u'(x)|=|g(x)|\le2|x|+\pi\le2+\pi$ for $x\in[-1,1]\setminus\{0\}$ and $\ol u$ is continuous. So, $\ol u$ is Lipschitz continuous on $[-1,1]$.
Step 4: By the definition of the derivative and (0), $\ol u'(0)=0$. However, $\ol u'(x)$ does not converge as $x\to0$. Indeed, otherwise, $-\pi \cos\frac{\pi}{x}=g(x)-2 x \sin\frac{\pi}{x}=\ol u'(x)-2 x \sin\frac{\pi}{x}$ would converge as $x\to0$, which is clearly not so. So, $\ol u\notin C^1 ([-1,1])$.
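The oscillation of $\ol u'$ near $0$ is easy to see numerically: along the points $x_k=1/k$ we have $\sin(k\pi)=0$ and $\cos(k\pi)=(-1)^k$, so $g(1/k)=-\pi(-1)^k$ jumps between about $+\pi$ and $-\pi$ arbitrarily close to $0$. A quick sketch:

```python
import math

# g(x) = 2x sin(pi/x) - pi cos(pi/x) = u'(x) for x != 0, as above
def g(x):
    return 2 * x * math.sin(math.pi / x) - math.pi * math.cos(math.pi / x)

# Along x_k = 1/k: g(1/k) = -pi*(-1)^k, so the values alternate in sign
vals = [g(1.0 / k) for k in (10, 11, 12, 13)]
print(vals)  # approximately [-pi, +pi, -pi, +pi]: no limit as x -> 0
```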
This completes the steps.
The answer to (3) is yes. Indeed, then all the conditions of what you call "special version of Tonelli’s theorem" (proved in this answer) are satisfied.
The answer to (2) is no. Indeed, for natural $n$ and $x\in[0,1]$, let $$u_n(x):=\sqrt a\,d(x,E_n),$$ where $E_n$ denotes the set $\{0,\frac1n,\frac2n,\dots,\frac nn\}$ and $d(x,E_n)$ denotes the distance from $x$ to the set $E_n$. Then $u_n\in AC[0,1]$ and $$J[u_n]=b\int_0^1\ln(1+u_n^2(x))\, dx\le b\ln(1+a/(4n^2))\to0$$ as $n\to\infty$. Since $J[\cdot]\ge0$, it follows that $\inf J[\cdot]=0$. However, this zero infimum is not attained at any $u$: if $J[u]=0$, then $u=0$ and hence $J[u]=a^2>0$, a contradiction.
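The key point of the construction is that the sawtooth $u_n$ peaks halfway between grid points, so $\sup u_n=\sqrt a/(2n)$ and the bound $b\ln(1+a/(4n^2))\to0$ follows. A numerical sketch; the concrete values of the positive constants $a,b$ from the problem are assumptions for illustration only:

```python
import math

a, b = 2.0, 3.0  # illustrative values; any a, b > 0 work the same way
n = 100

def u_n(x):
    # sqrt(a) * d(x, E_n), where E_n = {0, 1/n, 2/n, ..., 1}:
    # the nearest grid point to x is round(x*n)/n
    return math.sqrt(a) * abs(x - round(x * n) / n)

# sup of u_n, sampled on a fine grid of [0, 1]
sup = max(u_n(i / 100_000) for i in range(100_001))
print(sup)  # = sqrt(a)/(2n): peaks sit halfway between grid points

# hence b * ln(1 + u_n(x)^2) <= b * ln(1 + a/(4 n^2)) -> 0 as n -> infinity
bound = b * math.log(1 + a / (4 * n ** 2))
print(bound)
```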
The answer to (1) is no as well. This follows because the $u_n$'s as above can be appropriately approximated by $C^1$ functions.