As mentioned in the comment, the key is to show $$\limsup_{k \to \infty} A_k = \Big\{\limsup_{k \to \infty} |X_k - \mu| \geq \epsilon \Big\}.$$ You can do that in a couple of steps, showing that $\cup_{k \geq n} A_k = \{\omega \, : \,\, \sup_{k \geq n} |X_k(\omega) - \mu| \geq \epsilon \}$ holds for all $n$ via a double-inclusion argument.
Then the step of pulling the limit outside the probability is OK, as you said, from monotone limit properties of probability measures.
For the other direction, take $U_N = \{\limsup_{k \to \infty} |X_k - \mu| \geq 1/N \}$. Then you get $$P(U_N) = \lim_n P\Big(\sup_{k \geq n} |X_k - \mu| \geq 1/N\Big) = 0$$ for every $N$, by assumption.
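For completeness, the first equality there rests on writing $U_N$ as a decreasing intersection: since $n \mapsto \sup_{k \geq n} |X_k - \mu|$ is nonincreasing, its limit is $\geq 1/N$ exactly when every term is, i.e.

$$U_N = \Big\{\lim_{n \to \infty} \sup_{k \geq n} |X_k - \mu| \geq 1/N\Big\} = \bigcap_{n=1}^{\infty} \Big\{\sup_{k \geq n} |X_k - \mu| \geq 1/N\Big\},$$

and continuity from above of $P$ along this decreasing sequence of events gives $P(U_N) = \lim_n P(\sup_{k \geq n} |X_k - \mu| \geq 1/N)$.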
$U_N \uparrow \{\limsup_{k \to \infty} |X_k - \mu| > 0 \} = U$, so $P(U) = \lim_N P(U_N) = 0$. That means
$$0 \leq \liminf_{k \to \infty} |X_k - \mu| \leq \limsup_{k \to \infty} |X_k - \mu| = 0$$ with probability one, so the limit exists and equals zero.
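As a sanity check (not part of the proof), a quick simulation suggests how $P(\sup_{k \geq n} |X_k - \mu| \geq \epsilon)$ shrinks as $n$ grows when $X_k$ is the running mean of iid uniforms; the distribution, $\epsilon$, horizon, and path count here are all arbitrary choices for illustration.

```python
import random

# Monte Carlo estimate of P(sup_{k >= n} |Xbar_k - mu| >= eps) for a few n,
# where Xbar_k is the running mean of iid Uniform(0,1) draws (mu = 0.5).
# Purely illustrative; eps, horizon, and path count are arbitrary choices.
random.seed(0)
mu, eps = 0.5, 0.1
n_paths, horizon = 500, 200

def tail_sup_exceeds(devs, n):
    """True if sup_{k >= n} |Xbar_k - mu| >= eps (truncated at the horizon)."""
    return max(devs[n - 1:]) >= eps

paths = []
for _ in range(n_paths):
    total, devs = 0.0, []
    for k in range(1, horizon + 1):
        total += random.random()
        devs.append(abs(total / k - mu))
    paths.append(devs)

probs = {n: sum(tail_sup_exceeds(d, n) for d in paths) / n_paths
         for n in (1, 20, 100)}
print(probs)  # the estimates are non-increasing in n by construction
```

The estimates are non-increasing in $n$ pathwise, since the tail supremum over $k \geq 100$ is bounded by the one over $k \geq 1$.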
About your question in italics: for a sequence $\{f_k\}$ of real-valued random variables and any $a \geq 0$, one always has the inclusion $$\limsup_{k \to \infty} \{|f_k| \geq a \} \subseteq \Big\{\limsup_{k \to \infty} |f_k| \geq a \Big\},$$
which is the direction proved in the step mentioned in the comment. The reverse inclusion can fail (take $|f_k| = a(1-1/k)$ with $a > 0$: the left side is empty while the right side is everything), and the identity likewise need not hold if you switch the direction of the inequality defining the sets.
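To see concretely how the switched inequality direction fails, take the deterministic sequence $f_k$ alternating between $0$ and $2$ with $a = 1$: then $|f_k| \leq 1$ infinitely often, yet $\limsup_k |f_k| = 2 > 1$. A throwaway numeric check on a truncation of the sequence (all names are invented for illustration):

```python
# Counterexample sketch for the switched direction: f_k alternates 0, 2, ...
# and a = 1. Then {|f_k| <= a} occurs infinitely often (every even k), so
# this outcome lies in limsup_k {|f_k| <= a}, yet limsup_k |f_k| = 2 > a.
a = 1.0
f = [0.0 if k % 2 == 0 else 2.0 for k in range(1000)]  # truncated sequence

# "infinitely often" checked along a grid of tail starting points
in_level_set_io = all(any(abs(x) <= a for x in f[n:]) for n in range(0, 1000, 50))
limsup_f = max(f[500:])  # tail max equals the limsup for this periodic sequence

print(in_level_set_io, limsup_f)
```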
This is a consequence of the continuity of measures. (As pointed out in a different answer, see Section 2.6 of Ross.)
Continuity from below
Let $\mu$ be a measure on $(\mathcal{X},\mathcal{E})$, and let $E,E_1,E_2,\ldots$ be sets in $\mathcal{E}$. If $E_n \nearrow E$, then $\mu(E_n) \nearrow \mu(E)$.
Application
Let $(\Omega,\mathcal{F},\mathbb{P})$ be a background probability space, and let $X : (\Omega,\mathcal{F}) \to (\mathbb{R},\mathbb{B}(\mathbb{R}))$ be a random variable. Denote by $X(\mathbb{P})$ the pushforward measure on $(\mathbb{R},\mathbb{B}(\mathbb{R}))$ given by $X(\mathbb{P})(A) = \mathbb{P}(X \in A)$ for $A\in\mathbb{B}(\mathbb{R})$.
Let $b\in\mathbb{R}$, let $B = (-\infty,b)$, and let $B_n = (-\infty,b-1/n]$ for $n\in\mathbb{N}$. Clearly $B, B_1, B_2, \ldots$ are elements of $\mathbb{B}(\mathbb{R})$, and $B_n \nearrow B$. Then by the continuity from below of the probability measure $X(\mathbb{P})$, we find that
$$
X(\mathbb{P})(B_n) \nearrow X(\mathbb{P})(B).
$$
Rewriting according to the definition of the pushforward measure yields
\begin{align*}
\mathbb{P}(X \leq b-1/n) \nearrow \mathbb{P}(X < b)
\end{align*}
as desired.
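A case where $\mathbb{P}(X < b)$ genuinely differs from $\mathbb{P}(X \leq b)$, so the limit matters, is a distribution with an atom at $b$. A small exact computation (the discrete distribution below is an arbitrary choice):

```python
from fractions import Fraction

# X takes values 0, b - 1/3, and b, with an atom at b, to exhibit
# P(X <= b - 1/n) increasing to P(X < b), which is strictly below
# P(X <= b). All probabilities are exact Fractions.
b = Fraction(1)
pmf = {Fraction(0): Fraction(1, 2),
       b - Fraction(1, 3): Fraction(1, 4),
       b: Fraction(1, 4)}  # atom at b

def prob(pred):
    """Exact probability of the event {pred(X)} under the pmf."""
    return sum(p for x, p in pmf.items() if pred(x))

approx = [prob(lambda x, n=n: x <= b - Fraction(1, n)) for n in (1, 2, 3, 4, 10)]
p_strict = prob(lambda x: x < b)
p_weak = prob(lambda x: x <= b)

print(approx, p_strict, p_weak)
```

The sequence `approx` is nondecreasing and reaches $\mathbb{P}(X < b)$, which stays strictly below $\mathbb{P}(X \leq b)$ because of the atom.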
Direct proof from axioms
Recall Kolmogorov's axioms: $\mathbb{P}$ is non-negative, $\mathbb{P}(\Omega)=1$, and if $E_1,E_2,\ldots$ is a sequence of mutually exclusive events, then
\begin{align*}
\mathbb{P}\!\left(\cup_{n=1}^\infty E_n\right) = \sum_{n=1}^\infty \mathbb{P}(E_n).
\end{align*}
This last axiom is known as countable additivity.
Let $F_n = (X \leq b-1/n)$ for $n\in\mathbb{N}$, and note that $\cup_{n=1}^\infty F_n = (X < b)$. Define $E_n$ recursively by $E_1 = F_1$, $E_2 = F_2 \setminus F_1$, $E_3 = F_3 \setminus F_2$, and so on; for example, $E_2 = (b-1 < X \leq b-1/2)$. Then $E_1,E_2,\ldots$ is a sequence of mutually exclusive events with $\cup_{n=1}^\infty E_n = \cup_{n=1}^\infty F_n = (X < b)$. Applying first countable additivity and then finite additivity (both instances of Kolmogorov's last axiom),
\begin{align*}
\mathbb{P}(X < b) &= \mathbb{P}\!\left(\cup_{n=1}^\infty E_n\right) \\
&= \sum_{n=1}^\infty \mathbb{P}(E_n) \\
&= \lim_{N\to\infty} \sum_{n=1}^N \mathbb{P}(E_n) \\
&= \lim_{N\to\infty} \mathbb{P}(\cup_{n=1}^N E_n) \\
&= \lim_{N\to\infty} \mathbb{P}(F_N) = \lim_{N \to \infty}\mathbb{P}(X \leq b - 1/N).
\end{align*}
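The disjointification $E_1 = F_1$, $E_n = F_n \setminus F_{n-1}$ can be checked mechanically on a toy discrete distribution (chosen arbitrarily here): the $E_n$ are pairwise disjoint, their union is $(X < b)$, and summing their probabilities recovers $\mathbb{P}(X < b)$.

```python
from fractions import Fraction

# Toy check of the disjointification E_1 = F_1, E_n = F_n \ F_{n-1},
# with F_n = (X <= b - 1/n). Discrete pmf chosen arbitrarily.
b = Fraction(1)
pmf = {Fraction(-1): Fraction(1, 4),
       Fraction(1, 2): Fraction(1, 4),
       Fraction(2, 3): Fraction(1, 4),
       b: Fraction(1, 4)}

def F(n):
    """The event F_n = (X <= b - 1/n) as a set of support points."""
    return {x for x in pmf if x <= b - Fraction(1, n)}

events = []
prev = set()
for n in range(1, 20):
    events.append(F(n) - prev)  # E_n = F_n \ F_{n-1}
    prev = F(n)

union = set().union(*events)                          # should be (X < b)
total = sum(pmf[x] for E in events for x in E)        # sum of P(E_n)
p_strict = sum(p for x, p in pmf.items() if x < b)    # P(X < b)

print(total, p_strict)
```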
Best Answer
Consider the complements: by De Morgan's law, the complement of the intersection is the union of the complements. Each complement has measure $0$, so by countable subadditivity their union also has measure $0$. Hence the intersection has measure $1$.
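In symbols: if $\mathbb{P}(A_i) = 1$ for each $i$, then $\mathbb{P}(\cap_i A_i) = 1 - \mathbb{P}(\cup_i A_i^c) \geq 1 - \sum_i \mathbb{P}(A_i^c) = 1$. The De Morgan step can be sanity-checked with Python sets on a finite sample space (the sets below are arbitrary choices):

```python
# De Morgan on a finite sample space: the complement of an intersection
# equals the union of the complements. Sample space chosen arbitrarily.
omega = set(range(10))
A = [{0, 1, 2, 3, 4, 5}, {2, 3, 4, 5, 6}, {3, 4, 5, 6, 7}]

intersection = set.intersection(*A)
lhs = omega - intersection                    # complement of the intersection
rhs = set().union(*(omega - a for a in A))    # union of the complements

print(lhs == rhs)
```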