Let $f(x)=\arcsin(1-x)$ for $x\in[0,2]$.
Since $f'(x)=O\left(x^{-1/2}\right)$ as $x\to 0$, $f$ has no Taylor series at $x=0$; we therefore substitute $t=x^{1/2}$ and set $g(t)=\arcsin(1-t^2)$.
We will now develop the first few terms of the Taylor series for $g(t)$ around $t=0$.
We have for the first derivative $g^{(1)}(t)$
$$\begin{align}
g^{(1)}(t)&=-\frac{2t}{\sqrt{1-(1-t^2)^2}}\\
&=-\frac{2}{\sqrt{2-t^2}}\tag 1
\end{align}$$
Differentiating the right-hand side of $(1)$, we obtain the second derivative, $g^{(2)}(t)$
$$\begin{align}
g^{(2)}(t)&=-\frac{2t}{(2-t^2)^{3/2}}\tag 2
\end{align}$$
Continuing, we have for $g^{(3)}(t)$
$$\begin{align}
g^{(3)}(t)&=-\frac{4(t^2+1)}{(2-t^2)^{5/2}}\tag 3
\end{align}$$
And finally, we have for $g^{(4)}(t)$
$$\begin{align}
g^{(4)}(t)&=-\frac{12t(t^2+3)}{(2-t^2)^{7/2}}\tag 4
\end{align}$$
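Before substituting, formulas $(1)$ and $(2)$ can be sanity-checked numerically with central differences; the test point $t=0.5$ and step $h$ below are arbitrary choices, not part of the derivation:

```python
from math import asin, sqrt

def g(t):
    # g(t) = arcsin(1 - t^2), as defined above
    return asin(1 - t * t)

t, h = 0.5, 1e-4

# central-difference estimates of the first and second derivatives
g1_num = (g(t + h) - g(t - h)) / (2 * h)
g2_num = (g(t + h) - 2 * g(t) + g(t - h)) / h**2

# closed forms (1) and (2)
g1 = -2 / sqrt(2 - t**2)
g2 = -2 * t / (2 - t**2)**1.5

print(g1_num - g1, g2_num - g2)  # both differences are tiny
```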
We evaluate $(1)-(4)$ at $t=0$ (note that $g^{(2)}(0)=g^{(4)}(0)=0$, so the $t^2$ and $t^4$ terms drop out), and form the expansion
$$\bbox[5px,border:2px solid #C0A000]{\arcsin(1-x)=\frac{\pi}{2}-\sqrt{2}x^{1/2}-\frac{\sqrt{2}}{12}x^{3/2}+O\left(x^{5/2}\right)}$$
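A quick numeric check of the boxed expansion (the sample points are arbitrary): the error should shrink like $x^{5/2}$ as $x\to 0$.

```python
from math import asin, sqrt, pi

def approx(x):
    # three-term expansion of arcsin(1 - x) from the box above
    return pi / 2 - sqrt(2) * sqrt(x) - (sqrt(2) / 12) * x**1.5

for x in (0.1, 0.01, 0.001):
    print(x, abs(asin(1 - x) - approx(x)))  # error ~ x**2.5
```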
I think it's a combination of a lot of things. There is always simply the possibility that some terms cancel out, or that the higher order derivatives don't exist for the function you are trying to approximate.
Many distributions only consider second-order (quadratic) interactions between variables, so derivatives higher than the second evaluate to $0$ anyway. Your example of the Fisher information as the covariance of the asymptotic distribution of the MLE is a good one. Suppose you were looking at the asymptotic distribution of the mean in a multivariate normal with known precision: $X_i \overset{iid}{\sim} N(\mu,\Omega)$.
If you calculate the Fisher information matrix, you end up getting rid of the exponential through the log, and you're left with quadratic interactions as a function of $\Omega$:
$\mathcal{I}(\mu)_{rs} = \frac{\partial^2}{\partial\mu_r\,\partial\mu_s} \frac{1}{2}\sum_{j,k}\mu_j\mu_k\Omega_{jk} = \Omega_{rs}$.
So you can see that if you consider higher-order derivatives, you end up with terms that vanish anyway. This isn't explicitly a Taylor series, but the observation that most models and distributions involve only second-order interactions gives intuition about why second-order approximations come up so much. The fact that second-order interactions are so easy to work with also motivates stopping at a second-order approximation: it's usually sufficient, and the problem can rapidly get more difficult to solve if one expands to a higher order.
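To make the "higher derivatives vanish" point concrete, here is a small sketch in code; the $2\times2$ precision matrix `Omega` is made up purely for illustration. For a quadratic form, a finite-difference Hessian entry is the same at every point, so all third- and higher-order derivatives are identically zero.

```python
# hypothetical 2x2 precision matrix (illustrative values only)
Omega = [[2.0, 0.5],
         [0.5, 1.0]]

def q(mu):
    # quadratic form (1/2) mu^T Omega mu -- the piece of the Gaussian
    # log-likelihood whose mu-derivatives give the Fisher information
    return 0.5 * sum(Omega[r][s] * mu[r] * mu[s]
                     for r in range(2) for s in range(2))

h = 1e-3

def d2(r, s, mu):
    # mixed second difference, approximating d^2 q / (d mu_r d mu_s)
    def shift(m, i, d):
        m = list(m)
        m[i] += d
        return m
    return (q(shift(shift(mu, r, h), s, h)) - q(shift(mu, r, h))
            - q(shift(mu, s, h)) + q(mu)) / h**2

# the Hessian of a quadratic is constant: every entry matches Omega
# no matter where it is evaluated, so third derivatives vanish
print(d2(0, 1, [0.3, -0.2]))                         # ~ Omega[0][1] = 0.5
print(d2(0, 1, [0.3, -0.2]) - d2(0, 1, [5.0, 7.0]))  # ~ 0
```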
There are also situations in which clever choices of the argument, and of the point about which you expand your Taylor series, make higher-order terms cancel out. You see this a lot with people expanding a function $f(x)$ about a local maximum or minimum $x^*$, where the first-order term $(x-x^*)f'(x^*)$ falls out. It's very possible that higher-order derivatives do exist in some regions of the domain (unlike in the previous Fisher information example), but if you choose $x^*$ conveniently you don't have to worry about them. This also comes up in limits: if you're interested in the behavior of $f(\epsilon)$ for small $\epsilon$, an expansion around $0$ gives:
$f(\epsilon) = f(0) + \epsilon f'(0) + \sum_{n = 2}^\infty \frac{\epsilon^n f^{(n)}(0)}{n!}$
And you can see that when examining the limiting behavior, with $\epsilon$ very small, $\epsilon^n$ rapidly becomes negligible, so those higher-order terms can mostly be dismissed.
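To see how fast those terms die off, take $f=\exp$ (chosen here only because all its derivatives at $0$ equal $1$, so the $n$-th Taylor term is just $\epsilon^n/n!$):

```python
from math import exp, factorial

eps = 0.01
# n-th Taylor term of exp at 0, evaluated at eps, is eps**n / n!
terms = [eps**n / factorial(n) for n in range(5)]
print(terms)  # each successive term is ~100x smaller

# truncating after the first-order term leaves an error ~ eps^2 / 2
partial = terms[0] + terms[1]
print(abs(exp(eps) - partial))
```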
As others have said, people definitely do consider the accuracy of the approximation as well, whether deterministically or probabilistically. You can see that on the Wikipedia page for the delta method: https://en.wikipedia.org/wiki/Delta_method. It briefly goes into the order of the approximation, and most advanced textbooks do this in general when using Taylor series.
In general, I think a lot of it also has to do with what people are interested in for a given problem. There are obviously a lot of connections between Taylor series and moments, because a Taylor expansion turns what can be a very complicated function into a polynomial, for which moments are easier to calculate. You can see this used, for example, on these two pages:
https://en.wikipedia.org/wiki/First-order_second-moment_method
https://en.wikipedia.org/wiki/Taylor_expansions_for_the_moments_of_functions_of_random_variables
So, going back to the fact that most interesting investigations focus on second-order interactions, or on asymptotic normality (where you only need the first two moments), it makes sense that people stop at the quadratic term.
Best Answer
Let $y=f(x)=\arccos(1-x)$. Then $$1-x=\cos y=1-\frac{y^2}2+\frac{y^4}{24}+O(y^6),$$ so $$x=\frac{y^2}2-\frac{y^4}{24}+O(y^6).$$ Now clearly there is no actual Taylor series for $y$ about $x=0$ because $f'(0)$ does not exist. However, a generalized power series solution can be written down, known variously as the Frobenius method or the asymptotic expansion of $y=f(x)$ near $x=0$. Solving this equation formally:
$$2x=y^2\left(1-\frac{y^2}{12}+O(y^4)\right)\Rightarrow y=\sqrt{2x}\left(1-\frac{y^2}{12}+O(y^4)\right)^{-1/2}$$
Since $0<y\ll1$, we have $\frac{y^2}{12}\ll1$, so we can use the binomial series $(1+u)^p=1+pu+\cdots$ to get the next-to-leading-order term:
$$y=\sqrt{2x}+\sqrt{2x}\frac{y^2}{24}+O(y^4)=\sqrt{2x}+\frac{\sqrt{2x}}{24}\left(\sqrt{2x}+\frac{\sqrt{2x}}{24}y^2+O(y^4)\right)^2+O(y^4)$$ $$=\sqrt{2x}+\frac{(2x)^{3/2}}{24}\left(1+\frac{1}{12}y^2+O(y^4)\right)+O(y^4)=\sqrt{2x}+\frac{(2x)^{3/2}}{24}+\frac{(2x)^{3/2}}{24\cdot 12}y^2+O(x^{3/2}y^4)+O(y^4)$$
Now, since $y=O(\sqrt x)$ (which follows from the leading order term), we can simplify all that to get $$y=\sqrt{2x}+\frac{(2x)^{3/2}}{24}+O(x^2).$$
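As a quick numeric check (sample points arbitrary), the error of the two-term expansion stays well within the stated $O(x^2)$:

```python
from math import acos, sqrt

def approx(x):
    # two-term expansion y = sqrt(2x) + (2x)^(3/2) / 24
    return sqrt(2 * x) + (2 * x)**1.5 / 24

for x in (0.1, 0.01, 0.001):
    print(x, abs(acos(1 - x) - approx(x)))  # error within O(x**2)
```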