Here's a proof of a weaker result. Let $r_n = 2 - s_n$.
$$
r_{n+1} = 2 - s_{n+1}=
2 - \sqrt{2}^{s_n}=
2\left(1 - \sqrt{2}^{s_n-2}\right)=
2\left(1 - \sqrt{2}^{\,-r_n}\right)
$$
Now, let $\alpha = \ln 2$.
$$
\sqrt{2}^{\,-r_n}=
\left(e^{\ln \sqrt{2}}\right)^{-r_n}=
e^{-\frac{1}{2}\alpha r_n}=
1-\frac{1}{2}\alpha r_n + O\left(r_n^2\right)
$$
and so
$$
r_{n+1}=\alpha\,r_n + \phi\left(r_n\right),\quad
\phi\left(s\right) = O\left(s^2\right)
$$
Hence,
$$
\ln r_{n+1}=
\ln\left(\alpha\,r_n + O\left(r^2_n\right)\right)=
\ln\alpha + \ln r_n + \ln\left(1+O\left(r_n\right)\right)
$$
$$
=\ln\alpha + \ln r_n + \psi\left(r_n\right),\quad \psi(s)=O\left(s\right)
$$
and
$$
\ln r_n = \ln r_1 + \left(n-1\right) \ln \alpha + \sum_{k=1}^{n-1}\psi\left(r_k\right)
$$
This last sum is convergent (proof below), so the rate of convergence is determined by the behaviour of $\psi\left(r_n\right)$. Since $\psi\left(s\right)=O(s)$, it is in fact determined by the behaviour of $r_n$.
Lemma. For any $\varepsilon > 0$, we have
$\left|r_n\right|=O\left(\left(\alpha + \varepsilon\right)^n \right)$.
Proof. Let $\varepsilon > 0$. Since $r_n\to 0$ and $\phi\left(s\right)=O\left(s^2\right)$,
for sufficiently large $n$ we have $\left|\phi\left(r_n\right)\right|<\varepsilon\, \left|r_n\right|$, and so
$$
\left|r_{n+1}\right|=
\left|\alpha\,r_n+\phi\left(r_n\right)\right| <
\left(\alpha + \varepsilon\right)\left|r_n\right|
$$
and it is clear that we can bound $\left|r_n\right|$ by a geometric sequence with ratio $\alpha + \varepsilon$.
Using this, we can bound the tail of the sum by $O\left(\left(\alpha + \varepsilon\right)^n\right)$, and so conclude that
$$
\ln\left(2 - s_n\right) = n\ln\ln 2 + c + O\left(\left(\ln 2 + \varepsilon\right)^n\right)
$$
for every $\varepsilon > 0$, where $c=\ln\left(2 - \sqrt{2}\right) - \ln\ln 2 + S$ and $S$ is the sum of the series discussed above.
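The asymptotics above are easy to check numerically. A minimal sketch (the starting value $s_1=\sqrt{2}$, matching $r_1 = 2-\sqrt{2}$ in the formula for $c$, and the loop length are my own choices):

```python
import math

# Iterate s_{n+1} = sqrt(2)^{s_n} from s_1 = sqrt(2) and watch two things:
#   * the ratio r_{n+1}/r_n, which should approach alpha = ln 2, and
#   * d_n = ln(2 - s_n) - n*ln(ln 2), which should settle to the constant c.
alpha = math.log(2.0)
s = math.sqrt(2.0)
r_prev = 2.0 - s
for n in range(1, 31):
    d = math.log(2.0 - s) - n * math.log(alpha)
    s = math.sqrt(2.0) ** s
    ratio = (2.0 - s) / r_prev
    r_prev = 2.0 - s
    if n % 5 == 0:
        print(n, ratio, d)
```

The printed ratio approaches $\ln 2\approx 0.693$, and the second column stabilises, consistent with $\ln\left(2-s_n\right)=n\ln\ln 2 + c + o(1)$.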
Of course, that's strictly weaker than the conjecture from the question. Still, I believe it's close enough to be at least relevant.
Assume neither of the bottom bases is 1. If one is, comparing the expressions is trivial, since $1^x=1<2^y$ for any $x,y\in[1,10]$.
Let's start by tackling 3 powers. As you noted, $a^{b^c}$ can be easily compared to $f^{g^h}$ by taking the log, which massively reduces the size of the numbers we are working with. We get $b^c\log(a)$ compared to $g^h\log(f)$, which are now reasonable to try and compute.
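As a sketch of the one-log reduction (the particular towers $5^{6^7}$ and $7^{6^5}$ are my own made-up example, not from the question):

```python
import math

# Compare a^(b^c) = 5^(6^7) with f^(g^h) = 7^(6^5) after one log:
# the towers themselves are astronomically large, the logs are manageable.
a, b, c = 5, 6, 7
f, g, h = 7, 6, 5
lhs = b**c * math.log(a)  # = ln(5^(6^7)), about 4.5e5
rhs = g**h * math.log(f)  # = ln(7^(6^5)), about 1.5e4
print(lhs > rhs)          # → True, so 5^(6^7) is the larger tower
```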
For 4 powers, we again take the log. This reduces the problem to comparing $b^{c^d}\log(a)$ and $g^{h^i}\log(f)$. By taking the log again, we get $c^d\log(b)+\log(\log(a))$ and $h^i\log(g)+\log(\log(f))$. This is also very reasonable to compute. One can also see that $\log(\log(x))$ is nearly irrelevant: the difference between $\log(\log(2))$ and $\log(\log(10))$ is only about $0.5$ if we use base ten for our logarithms.
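A sketch of the two-log reduction (the towers $2^{3^{4^5}}$ and $10^{5^{4^4}}$ are invented for illustration, using natural logs):

```python
import math

# Compare a^(b^(c^d)) = 2^(3^(4^5)) with f^(g^(h^i)) = 10^(5^(4^4)).
# Two logs reduce this to c^d*log(b) + log(log(a)) vs h^i*log(g) + log(log(f)).
a, b, c, d = 2, 3, 4, 5
f, g, h, i = 10, 5, 4, 4
lhs = c**d * math.log(b) + math.log(math.log(a))   # about 1124.6
rhs = h**i * math.log(g) + math.log(math.log(f))   # about 412.9
print(lhs > rhs)   # → True: 2^(3^(4^5)) is the larger tower
```

Note how small the $\log(\log(\cdot))$ contributions ($-0.37$ and $0.83$ here) are next to the leading terms, in line with the remark above.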
For 5 powers, taking the log twice gets us to $c^{d^e}\log(b)+\log(\log(a))$ and $h^{i^j}\log(g)+\log(\log(f))$. If we ignore the double-log term, we can take another log to get $d^e\log(c)+\log(\log(b))$ and $i^j\log(h)+\log(\log(g))$. And as long as these two are far enough apart, we can ignore the dropped terms. How far apart? Assume wlog that $d^e\log(c)+\log(\log(b))\le i^j\log(h)+\log(\log(g))$. Then we let $y=\log(\log(f))-\log(\log(a))=\log(\log(f)/\log(a))$. The question is then a matter of what $\log(x)-\log(x+y)=-\log(1+y/x)$ is, i.e. how far off we are when we ignore this term when taking the log of both sides. For simplicity, we use the natural log, base $e$. We then have the fairly tight bounds
$$\frac y{x+y}\le\ln\left(1+\frac yx\right)\le\frac yx$$
which is closest to zero for large values of $x$. I strongly doubt that this will come into play, unless everything except the dropped terms comes out equal.
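The sandwich bound can be sanity-checked numerically (the sample values of $x$ and $y$ are my own; natural log throughout):

```python
import math

# Verify y/(x+y) <= ln(1 + y/x) <= y/x for a few positive samples,
# and print the middle term to see it shrink as x grows.
for x, y in [(3.0, 1.0), (10.0, 0.5), (100.0, 2.0), (1000.0, 0.8)]:
    mid = math.log(1.0 + y / x)
    assert y / (x + y) <= mid <= y / x
    print(x, y, round(mid, 6))
```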
Best Answer
It suffices to see that $3^3>6\times4$ and that
$$3^{6n}>6\times4^n$$
for all $n\ge1$. By induction this gives us:
$$3\uparrow\uparrow(n+1)>6(4\uparrow\uparrow n)$$
for all $n\ge1$.
Of course much better bounds can be given, but this suffices.
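The claimed inequalities can be checked directly with exact integer arithmetic (the range of $n$ and the helper `tower` are my own choices; only tiny tower heights are feasible to evaluate exactly):

```python
# Check 3^3 > 6*4 and 3^(6n) > 6*4^n for the first few n,
# plus the resulting tower inequality 3↑↑(n+1) > 6*(4↑↑n) at small heights.
assert 3**3 > 6 * 4

for n in range(1, 21):
    assert 3**(6 * n) > 6 * 4**n

def tower(base, height):       # base ↑↑ height
    out = 1
    for _ in range(height):
        out = base**out
    return out

for n in (1, 2):               # heights beyond this are astronomically large
    assert tower(3, n + 1) > 6 * tower(4, n)

print("all checks pass")
```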