The $\limsup$ of a sequence of measurable functions is defined pointwise. (I could show how to get from "pointwise limits of sequences of measurable functions are always measurable" to "the limsup of a sequence of measurable functions is always measurable" if you want me to.)
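That reduction is standard; for completeness, a sketch: for each $k$, the countable supremum $g_k := \sup_{n \ge k} f_n$ is measurable, since $\{g_k > a\} = \bigcup_{n \ge k} \{f_n > a\}$, and then

```latex
% limsup as a pointwise limit of measurable functions:
\limsup_{n\to\infty} f_n
  \;=\; \inf_{k\ge 1} \sup_{n\ge k} f_n
  \;=\; \lim_{k\to\infty} g_k,
\qquad g_k := \sup_{n\ge k} f_n,
```

where the infimum is a genuine pointwise limit because the $g_k$ are decreasing, so the quoted fact about pointwise limits applies.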
$\displaystyle\lim_{n\to \infty} \log n \;=\; +\infty$
For all members $\omega$ of the domain of your random variables,
$\displaystyle\limsup_{n\to \infty} \frac{X_n(\omega)}{\log n} \,=\, 1 \;\; \implies \;\; \displaystyle\limsup_{n\to \infty} \frac{X_n(\omega)}{\log n} \,>\, 0,$

so $X_n(\omega) > \tfrac{1}{2} \log n$ for infinitely many $n$; since $\log n \to +\infty$, this gives

$\displaystyle\limsup_{n\to \infty} X_n(\omega) \,=\, +\infty \;\; \implies \;\; \displaystyle\sup_n X_n(\omega) \,=\, +\infty.$

Hence

$\left\{\omega \,:\, \displaystyle\limsup_{n\to \infty} \frac{X_n(\omega)}{\log n} \,=\, 1 \right\} \;\;\subseteq\;\; \left\{\omega \,:\, \displaystyle\sup_n X_n(\omega) \,=\, +\infty \right\},$

and therefore

$1 \;=\; \mathbb{P}\left(\displaystyle\limsup_{n\to \infty} \frac{X_n(\omega)}{\log n} \,=\, 1 \right) \;\leq\; \mathbb{P}\left(\displaystyle\sup_n X_n(\omega) \,=\, +\infty\right) \;\leq\; 1.$

The same inclusion holds with $\left\{ \limsup_{n\to\infty} X_n(\omega) = +\infty \right\}$ in place of the supremum event, so also

$\mathbb{P}\left(\displaystyle\limsup_{n\to \infty} X_n(\omega) \,=\, +\infty\right) \;=\; 1.$

Therefore $\;\displaystyle\sup_n X_n(\omega) \,=\, +\infty\;$ almost surely.
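As a sanity check (not part of the proof), here is a quick simulation under an additional assumption of mine, not stated above: take the $X_n$ i.i.d. standard exponential, a standard example in which $\limsup_n X_n/\log n = 1$ almost surely. The running maximum is then unbounded and tracks $\log n$:

```python
import numpy as np

# Hypothetical concrete example: X_n i.i.d. Exp(1), for which
# limsup_n X_n / log n = 1 almost surely.
rng = np.random.default_rng(0)
N = 100_000
X = rng.exponential(1.0, size=N)          # X_1, ..., X_N
running_max = np.maximum.accumulate(X)    # sup_{m <= n} X_m

# The running maximum keeps growing, roughly like log n.
print(running_max[-1], running_max[-1] / np.log(N))
```

The final ratio hovers around $1$, consistent with $\sup_n X_n = +\infty$ a.s. in this example.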
Since the augmented filtration is right-continuous, we may assume that $(M_t)_{t \geq 0}$ has càdlàg sample paths. Since $(M_t)_{t \geq 0}$ is a local martingale, there is a sequence of stopping times $(\tau_k)$ such that $\tau_k \uparrow \infty$ and $(M_{t \wedge \tau_k})_{t \geq 0}$ is a martingale. Set
$$Y := M_{T \wedge \tau_k}$$
for fixed $k \in \mathbb{N}$ and $T>0$. Since $Y \in L^1$ and $L^2(\mathcal{F}_T)$ is dense in $L^1(\mathcal{F}_T)$, we can find a sequence $(Y_n)_{n \in \mathbb{N}} \subseteq L^2(\mathcal{F}_T)$ such that $Y_n \to Y$ in $L^1$. Obviously,
$$M_t^n := \mathbb{E}(Y_n \mid \mathcal{F}_t), \qquad t \leq T,$$
is an $L^2$-bounded $\mathcal{F}_t$-martingale. By the martingale representation theorem for $L^2$-martingales, there exists $\phi_n \in L^2(\lambda_T \otimes \mathbb{P})$ such that
$$M_t^n = \int_0^t \phi_n(s) \, dW_s, \qquad t \leq T.$$
In particular, $(M_t^n)_{t \leq T}$ has continuous sample paths. By the maximal inequality,
$$\mathbb{P} \left( \sup_{t \leq T} |M_{t \wedge \tau_k}-M_t^n| > \epsilon \right) \leq \epsilon^{-1} \mathbb{E}|Y-Y_n| \to 0,$$
i.e. $\sup_{t \leq T} |M_{t \wedge \tau_k}-M_t^n|$ converges to $0$ in probability. Passing to a subsequence along which this convergence holds almost surely, $(M_{t \wedge \tau_k})_{t \leq T}$ is an a.s. uniform limit of the continuous processes $(M_t^n)_{t \leq T}$, hence has continuous sample paths. Since both $k$ and $T$ are arbitrary, we find that $(M_t)_{t \geq 0}$ has a.s. continuous sample paths. Now the claim follows by the argument described in the question.
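The maximal inequality invoked above can be checked numerically. The sketch below uses a simple symmetric random walk as a stand-in martingale (my choice, not the setting of the proof) and compares $\mathbb{P}\left(\sup_{t \leq T} |M_t| > a\right)$ with $a^{-1}\,\mathbb{E}|M_T|$:

```python
import numpy as np

# Monte Carlo check of Doob's maximal inequality
#   P(sup_{t<=T} |M_t| > a) <= E|M_T| / a
# for a simple symmetric random walk (a martingale).
rng = np.random.default_rng(1)
paths, T, a = 20_000, 200, 10.0
steps = rng.choice([-1.0, 1.0], size=(paths, T))
M = np.cumsum(steps, axis=1)                   # M_1, ..., M_T per path

lhs = np.mean(np.max(np.abs(M), axis=1) > a)   # P(sup_{t<=T} |M_t| > a)
rhs = np.mean(np.abs(M[:, -1])) / a            # E|M_T| / a
print(lhs, rhs)                                # lhs <= rhs
```

In the proof the same bound is applied to the difference martingale $M_{t \wedge \tau_k} - M_t^n$, whose terminal value is $Y - Y_n$.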
For martingales with not necessarily continuous sample paths (which are not adapted to a filtration generated by a Brownian motion), we need more general representation results; the following result is due to Ikeda-Watanabe.
Let $(M_t)_{t \geq 0} \in \mathcal{M}_2$ be a martingale with respect to a filtration $(\mathcal{F}_t)_{t \geq 0}$ generated by a Lévy process. Then there exist predictable processes $f,g$ as well as a Brownian motion $(W_t)_{t \geq 0}$ and a Poisson random measure $N$ such that $$M_t - M_0 = \int_0^t f(s) \, dW_s + \int_0^t g(s) \, d\tilde{N}_s,$$ where $\tilde{N}$ denotes the compensated Poisson random measure.
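The second integral is driven by the compensated measure $\tilde{N}$. The basic fact behind it, that a compensated Poisson process is a centered martingale, can be illustrated as follows (the intensity $\lambda = 3$ and horizon $t = 5$ are arbitrary choices of mine):

```python
import numpy as np

# A compensated Poisson process M_t = N_t - lambda*t is a martingale;
# in particular E[M_t] = 0 for every t.
rng = np.random.default_rng(2)
lam, t, paths = 3.0, 5.0, 100_000
N_t = rng.poisson(lam * t, size=paths)   # N_t ~ Poisson(lam * t)
M_t = N_t - lam * t                      # compensated value at time t

print(M_t.mean())                        # close to 0
```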
See also this question.
I'll elaborate on my comment.
To make the notation more explicit, I'll write $N_K = \min \{ n : X_n \le -K \}$, and abbreviate $\{\omega : N_K(\omega) = \infty\}$ as $\{N_K = \infty\}$. Since $X_{n \wedge N_K} \ge -K-M$ for all $n$, $X_{n \wedge N_K}$ is a martingale which is bounded below, thus it converges a.s. But on the set $\{N_K = \infty\}$, $X_n = X_{n \wedge N_K}$ for all $n$, so $X_n$ converges a.s. on $\{N_K = \infty\}$.
$X_n$ need not converge on the set $\{N_K < \infty\}$. If $X_n$ is a simple random walk, then $P(N_K < \infty) = 1$, and $X_n$ converges with probability $0$. (This is why the theorem reads $P(C \cup D) = 1$ instead of $P(C) = 1$.)
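To illustrate the simple-random-walk point numerically (a sketch with $K = 5$ and a finite horizon $T = 5000$, values chosen by me): the level $-K$ is hit with probability $1$, and already within a moderate horizon the vast majority of paths have reached it:

```python
import numpy as np

# For a simple symmetric random walk, P(N_K < infinity) = 1.
# A finite horizon only gives a lower bound on this probability.
rng = np.random.default_rng(3)
paths, T, K = 2_000, 5_000, 5
steps = rng.choice([-1.0, 1.0], size=(paths, T))
S = np.cumsum(steps, axis=1)
hit_frac = np.mean(S.min(axis=1) <= -K)   # fraction of paths with N_K <= T

print(hit_frac)                           # most paths have already hit -K
```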
To complete the proof, since $K$ was arbitrary, we have that $X_n$ converges a.s. on $\bigcup_K \{N_K = \infty\}$. This latter set is precisely $\{\liminf X_n > -\infty\}$. Replacing $X_n$ by $-X_n$ we can also show that $X_n$ converges a.s. on $\{\limsup X_n < \infty\}$. Together, these are equivalent to the statement $P(C \cup D) = 1$.