I cannot comment yet, so I'm posting this as an answer.$\def\ci{\perp\!\!\!\perp}$
This is probably not what you were asking, but I think it's interesting and relevant enough to post.
It is well known that one can construct arbitrary distributions from uniform random variables. Furthermore, given a single $\mathcal U[0,1]$ variable, it's possible to produce from it an i.i.d. sequence of such variables, which can then be used to obtain more general distributions. We can always extend a probability space to obtain such variables by$$\hat{\Omega}=\Omega\times[0,1]\text{, }\hat{\mathscr{A}}=\mathscr{A}\otimes\mathscr{B}\text{, }\hat{P}=P\otimes\lambda $$
in which case $\vartheta(\omega,t):= t$ is $\mathcal{U}[0,1]$ and $\vartheta\ci \mathscr{A}$.
For more details see Kallenberg, Foundations of Modern Probability (2002), in particular the discussion preceding Theorem 6.10 (the transfer theorem).
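As a concrete illustration of both ingredients, here is a minimal Python/NumPy sketch (the function names are my own, and the digit-splitting is a finite-precision stand-in for the exact measure-theoretic construction): de-interleaving the binary digits of one uniform draw yields two independent uniforms, and the inverse CDF turns a uniform into a prescribed distribution.

```python
import numpy as np

def split_uniform(u, bits=52):
    """De-interleave the binary digits of one U[0,1] draw to get two
    independent U[0,1] draws -- a finite-precision sketch of the
    measure-theoretic construction."""
    digits = [int(u * 2 ** (k + 1)) % 2 for k in range(bits)]
    u1 = sum(d / 2 ** (i + 1) for i, d in enumerate(digits[0::2]))
    u2 = sum(d / 2 ** (i + 1) for i, d in enumerate(digits[1::2]))
    return u1, u2

def exponential_from_uniform(u, lam=1.0):
    """Inverse-transform sampling: if U ~ U[0,1], then -log(1-U)/lam
    has the Exponential(lam) distribution."""
    return -np.log(1.0 - u) / lam

rng = np.random.default_rng(0)
u = rng.random()               # one uniform variable...
u1, u2 = split_uniform(u)      # ...becomes two independent uniforms
print(u1, u2, exponential_from_uniform(u1))
```

Iterating `split_uniform` on its outputs produces the i.i.d. sequence mentioned above.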
For any permutation $\sigma$ on $\{1,\cdots,n\}$, let $E_{\sigma} = \{ X_{\sigma(1)} > \cdots > X_{\sigma(n)}\}$. Then by symmetry and continuity of the $X_i$'s, we have
$$\mathbb{P}(E_\sigma) = \frac{1}{n!}. $$
Indeed, symmetry tells us that the value of $\mathbb{P}(E_\sigma)$ does not depend on the choice of $\sigma$, and continuity tells us that ties occur with probability zero, so the events $E_\sigma$ cover the sample space up to a null set and $\sum_{\sigma} \mathbb{P}(E_\sigma) = 1$.
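As a quick sanity check, here is a hypothetical Monte Carlo sketch in Python/NumPy (any continuous i.i.d. choice for the $X_i$'s works; I use standard normals) estimating $\mathbb{P}(E_\sigma)$ for the identity permutation:

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(1)
n, trials = 4, 200_000
X = rng.standard_normal((trials, n))  # i.i.d. continuous, hence exchangeable

# Estimate P(X_1 > X_2 > ... > X_n), i.e. E_sigma for the identity permutation.
descending = np.all(X[:, :-1] > X[:, 1:], axis=1)
print(descending.mean(), 1 / factorial(n))  # both close to 1/24 ≈ 0.0417
```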
This is precisely what allows us to play with the order of the indices: the probability of any event expressible purely in terms of the ordering of the $X_i$'s can be converted into a corresponding counting problem over permutations. For instance, in OP's problem,
$$ \mathbb{P}(X_n > X_1 > \max\{X_2,\cdots,X_{n-1}\})
= \sum_{\substack{\sigma(1) = n \\ \sigma(2) = 1}} \mathbb{P}(E_{\sigma}), $$
where the sum on the RHS runs over all the permutations $\sigma$ on $\{1,\cdots,n\}$ such that $\sigma(1) = n$ and $\sigma(2) = 1$. Since there are exactly $(n-2)!$ such permutations, we obtain
$$ \mathbb{P}(X_n > X_1 > \max\{X_2,\cdots,X_{n-1}\})
= \frac{(n-2)!}{n!} = \frac{1}{n(n-1)}. $$
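The same kind of simulation sketch (again hypothetical, with i.i.d. uniforms standing in for the $X_i$'s) agrees with this count:

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 6, 500_000
X = rng.random((trials, n))  # i.i.d. U[0,1]

# Event X_n > X_1 > max(X_2, ..., X_{n-1}), using 0-based columns.
event = (X[:, n - 1] > X[:, 0]) & (X[:, 0] > X[:, 1:n - 1].max(axis=1))
print(event.mean(), 1 / (n * (n - 1)))  # both close to 1/30 ≈ 0.0333
```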
Remark. Notice that the entire argument depends only on the following two observations:
1. The joint distribution of $(X_1, \cdots, X_n)$ is continuous.
2. $(X_1, \cdots, X_n)$ is exchangeable, i.e. for any permutation $\sigma$, $(X_{\sigma(1)}, \cdots, X_{\sigma(n)})$ has the same distribution as $(X_1, \cdots, X_n)$.
So the above argument extends without modification whenever these two conditions hold.
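To illustrate the remark with dependent variables, here is one more hedged sketch (my own example): an equicorrelated Gaussian vector is exchangeable and continuous but not independent, and the estimate still matches $1/(n(n-1))$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, rho, trials = 5, 0.5, 500_000

# Equicorrelated Gaussian: exchangeable and continuous, but NOT independent.
cov = (1 - rho) * np.eye(n) + rho * np.ones((n, n))
X = rng.multivariate_normal(np.zeros(n), cov, size=trials)

event = (X[:, n - 1] > X[:, 0]) & (X[:, 0] > X[:, 1:n - 1].max(axis=1))
print(event.mean(), 1 / (n * (n - 1)))  # both close to 1/20 = 0.05
```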
Since $P(X_i=X_j)=0$ for every $i \ne j$, we infer that $$\mathbb{P}(X_{1}>X_{2}>\cdots>X_{n-1}<X_{n}) = \mathbb{P}(X_{1}> \cdots>X_{n-1})-\mathbb{P}(X_{1}> \cdots>X_{n-1}>X_{n}) =\frac{1}{(n-1)!}-\frac{1}{n!}=\frac{n-1}{n!} \,.$$
Another way to reach the same conclusion is to observe that the event $X_{1}>X_{2}>\cdots>X_{n-1}<X_{n}$ comprises exactly $n-1$ of the $n!$ equally likely orderings of the $n$ variables, one for each choice of the first index $k \in \{1,\cdots,n-1\}$ such that $X_n>X_k$; hence its probability is $(n-1)/n!$.
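A final hypothetical simulation (same NumPy setup as the sketches above) confirms the value $(n-1)/n!$:

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(4)
n, trials = 5, 500_000
X = rng.random((trials, n))

# Event X_1 > X_2 > ... > X_{n-1} and X_{n-1} < X_n, using 0-based columns.
descending = np.all(X[:, :n - 2] > X[:, 1:n - 1], axis=1)
event = descending & (X[:, n - 1] > X[:, n - 2])
print(event.mean(), (n - 1) / factorial(n))  # both close to 4/120 ≈ 0.0333
```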