It seems alright, up to some typos. But it would be simpler to say that $1_{\{X_n>\alpha\}}$ converges to $1$ almost surely, and hence so does $\mathbb E[1_{\{X_n>\alpha\}}]=\mathbb P(X_n>\alpha)$ by the dominated convergence theorem.
EDIT: apparently I was a bit too quick, so let me give more details.
We suppose that $\mathbb P(X_n\underset{n\to+\infty}{\longrightarrow}\beta)=1$, or in other words, $X_n$ converges almost surely to $\beta$ as $n\to+\infty$. Let $\alpha<\beta$. We want to show that $\mathbb P(X_n>\alpha)\underset{n\to+\infty}{\longrightarrow}1$.
To do so, we can first observe that $\mathbb P(X_n>\alpha)=\mathbb E[1_{\{X_n>\alpha\}}]$. The point of this trick is that, now that the claim is formulated as the convergence of an expected value, we can use all the theorems we know about convergence of expected values, such as dominated convergence, monotone convergence, etc. As a reminder, the dominated convergence theorem states that if $(Y_n)_{n\in\mathbb N}$ is a sequence of real-valued random variables such that:
(i) $Y_n$ converges almost surely to some random variable $Y$ ;
(ii) There exists an integrable random variable $Z$ such that for all $n\in\mathbb N$, $\vert Y_n\vert\le Z$ (almost surely),
then $\mathbb E[Y_n]\underset{n\to+\infty}{\longrightarrow}\mathbb E[Y]$ (and we even have that $Y_n$ converges in $L^1$ to $Y$, but the convergence of the expected values is sufficient for our purpose, as is often the case).
Now just apply the theorem with $Y_n=1_{\{X_n>\alpha\}}$. For almost all $\omega\in\Omega$, $X_n(\omega)$ converges to $\beta$ as $n\to+\infty$, which implies that for $n$ large enough, $X_n(\omega)>\alpha$, or equivalently, $Y_n(\omega)=1$. Therefore, $Y_n$ converges almost surely to $1$. Moreover, $Y_n$ is clearly dominated by $1$ (so we take $Z=1$, see item (ii)). Therefore, by the dominated convergence theorem,
$$
\mathbb P(X_n>\alpha)=\mathbb E[1_{\{X_n>\alpha\}}]=\mathbb E[Y_n]\underset{n\to+\infty}{\longrightarrow}\mathbb E[Y]=1.
$$
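As a purely numerical sanity check (not part of the proof), one can simulate a sequence that converges almost surely and watch $\mathbb P(X_n>\alpha)$ approach $1$. The setup below is my own choice of example: $X_n$ is the sample mean of i.i.d. Uniform$(0,1)$ variables, so $X_n\to\beta=1/2$ almost surely by the strong law of large numbers, and we estimate $\mathbb P(X_n>\alpha)$ for $\alpha=0.4<\beta$ by the empirical frequency over many independent sample paths.

```python
import numpy as np

rng = np.random.default_rng(0)

beta = 0.5      # a.s. limit of the sample means (SLLN)
alpha = 0.4     # any alpha < beta
n_paths = 2000  # number of independent sample paths
n_max = 5000    # horizon

# X_n = (1/n) * sum_{i=1}^n U_i with U_i ~ Uniform(0, 1), so X_n -> 1/2 a.s.
u = rng.random((n_paths, n_max))
x = np.cumsum(u, axis=1) / np.arange(1, n_max + 1)

# Empirical estimate of P(X_n > alpha) for a few values of n:
# it should increase towards 1 as n grows.
for n in (10, 100, n_max):
    p_hat = np.mean(x[:, n - 1] > alpha)
    print(n, p_hat)
```

The monotone increase towards $1$ is only a tendency for finite $n$, of course; the theorem guarantees the limit, not monotonicity.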
In fact, one should not use Borel-Cantelli here. Write $S_n=\sum_{i=1}^n X_i$. (As usual in probability theory, we suppress the argument $\omega$.) The hypothesis is
$$\lim_{n\rightarrow \infty} \frac{S_n}{n}=Y \quad (*) \,.$$
Rewriting this with the index $n-1$ replacing $n$ gives
$$\lim_{n\rightarrow \infty} \frac{S_{n-1}}{n-1}=Y \,.$$
Multiplying this by $ (n-1)/n$ gives
$$\lim_{n\rightarrow \infty} \frac{S_{n-1}}{n}=Y \,.$$
Subtracting this from (*), we obtain
$$\lim_{n\rightarrow \infty} \frac{X_n}{n}=0 \,,$$
which implies
$$\Bbb{P}(|X_n|>n \text{ i.o.})=0 \,,$$
since $X_n/n\to 0$ almost surely means that, almost surely, $|X_n|/n>1$ for only finitely many $n$.
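A quick numerical illustration of this step, under an assumed concrete model of my own choosing: take $X_i$ i.i.d. exponential with mean $1$, so $S_n/n\to 1$ almost surely by the strong law, hence $X_n/n\to 0$ almost surely and the event $\{|X_n|>n\}$ should occur at most a few times, all early in the sequence.

```python
import numpy as np

rng = np.random.default_rng(1)

# X_i i.i.d. exponential(1), so S_n/n -> 1 a.s. (SLLN),
# hence X_n/n -> 0 a.s. and |X_n| > n should happen only finitely often.
N = 100_000
x = rng.exponential(1.0, size=N)
n = np.arange(1, N + 1)

ratios = np.abs(x) / n
exceed = np.flatnonzero(np.abs(x) > n) + 1  # indices n with |X_n| > n

print("exceedance indices:", exceed)
print("max of |X_n|/n over the last half:", ratios[N // 2:].max())
```

Here $\mathbb P(|X_n|>n)=e^{-n}$, so the expected total number of exceedances is $\sum_n e^{-n}<1$, consistent with Borel–Cantelli applied in the easy (first) direction.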
Here is my answer to the exercise.
Let us assume, for contradiction, that $\sum_{n=1}^\infty\Bbb{P}(\Lambda_n)=\infty$, where $\Lambda_n=\{|X_n|>n\}$. Since the $X_i$ are independent, the events $\Lambda_n$ are independent as well. Hence we can apply the second Borel–Cantelli lemma and get that $\Bbb{P}(\limsup_n \Lambda_n)=1$. But this contradicts the assumption that $\Bbb{P}(\Lambda_n~\text{infinitely often})=0$. Hence $\sum_{n=1}^\infty\Bbb{P}(\Lambda_n)<\infty$.
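To see the second Borel–Cantelli lemma in action numerically, here is a small sketch with an example of my own choosing: independent events $A_n$ with $\mathbb P(A_n)=1/n$, so $\sum_n \mathbb P(A_n)=\infty$ and the lemma predicts that, almost surely, the events keep occurring no matter how far out we look.

```python
import numpy as np

rng = np.random.default_rng(2)

# Independent events A_n with P(A_n) = 1/n: the sum of probabilities
# diverges (harmonic series), so by Borel-Cantelli 2 the events occur
# infinitely often almost surely. On a finite horizon we just check
# that occurrences do not die out early.
N = 200_000
n = np.arange(1, N + 1)
hits = rng.random(N) < 1.0 / n  # indicator of A_n along one sample path

occurrences = np.flatnonzero(hits) + 1
print("number of occurrences:", occurrences.size)
print("largest occurrence index:", occurrences[-1])
```

The expected number of occurrences up to $N$ is the harmonic sum $\sum_{n\le N} 1/n \approx \log N$, so it grows without bound, slowly, which is exactly the "infinitely often" behaviour truncated to a finite horizon.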