No, this isn't correct.
Chebyshev's inequality says that for any non-negative random variable $Z$ and any $\epsilon > 0$, we have $$P(Z \ge \epsilon) \le \frac{E[Z]}{\epsilon}.$$
You are applying this with $Z = |X_n - 1|$, so you get
$$P(|X_n - 1| \ge \epsilon) \le \frac{E[|X_n - 1|]}{\epsilon}.$$
Note the absolute value bars on the right side, which you apparently dropped. Now $E[|X_n - 1|] = 1/n$, not $-1/n$, and you get $P(|X_n - 1| \ge \epsilon) \le \frac{1}{n \epsilon}$. This is true but a bit silly, since you can get a better bound without Chebyshev: for $\epsilon \le 1$ you have $P(|X_n - 1| \ge \epsilon) = P(X_n = 0) = 1/n$, and for $\epsilon > 1$ you have $P(|X_n - 1| \ge \epsilon) = 0$. But this also doesn't help you apply Borel-Cantelli.
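As a quick sanity check, here is a sketch comparing the exact probability with the Chebyshev-style bound, assuming (as the computations above suggest) that $X_n$ takes the value $0$ with probability $1/n$ and $1$ otherwise:

```python
# Compare the exact probability P(|X_n - 1| >= eps) with the bound
# E[|X_n - 1|] / eps = 1/(n*eps), assuming X_n = 0 w.p. 1/n and X_n = 1 otherwise
# (this distribution is an assumption consistent with the computations above).
for n in [2, 10, 100]:
    for eps in [0.5, 1.0, 2.0]:
        exact = 1 / n if eps <= 1 else 0.0   # P(|X_n - 1| >= eps) = P(X_n = 0) or 0
        bound = (1 / n) / eps                # the (corrected) Chebyshev bound
        assert exact <= bound
        print(f"n={n}, eps={eps}: exact={exact:.4f} <= bound={bound:.4f}")
```

For $\epsilon \le 1$ the bound is off by exactly the factor $1/\epsilon$, which is why the direct computation is both easier and sharper.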
As an immediate sign that something was wrong, note that your argument "showed" that the probability of an event, which by definition is between 0 and 1, was less than or equal to a negative number. Uh oh.
In fact, you cannot prove from the given information that $X_n \to 1$ a.s., because that can be false. Suppose that the random variables $X_n$ were independent (the statement of the problem doesn't assume this, but also doesn't rule it out). Then you can use the second Borel-Cantelli lemma to show that $P(X_n = 0 \text{ i.o.}) = 1$ and also $P(X_n = 1 \text{ i.o.}) = 1$. Hence the sequence diverges almost surely.
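A Monte Carlo illustration (not a proof) of the independent case: if $P(X_n = 0) = 1/n$ independently, the expected number of zeros among $X_1, \dots, X_N$ is the harmonic number $H_N \approx \ln N$, which diverges as $N \to \infty$. This divergence of $\sum_n P(X_n = 0)$ is exactly the hypothesis of the second Borel-Cantelli lemma.

```python
import math
import random

# Count how many of X_1, ..., X_N equal 0, assuming independent draws with
# P(X_n = 0) = 1/n (an assumption; the original problem does not specify the
# joint distribution). The count keeps growing like ln N, illustrating that
# zeros never stop appearing.
random.seed(0)
N = 100_000
zeros = sum(1 for n in range(1, N + 1) if random.random() < 1 / n)
print(f"zeros among first {N}: {zeros}, ln(N) = {math.log(N):.1f}")
```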
It is true that $X_n \to 1$ in probability. You can use Chebyshev for this (if you use it correctly) but the "better bound" I mention above seems easier.
The interchange $E\big[\sum_i X_i\big] = \sum_i E[X_i]$ holds if $E\big[\big|\sum_i X_i\big|\big] < \infty$ or $E\big[\sum_i |X_i|\big] < \infty$.
Consider $X_1, X_2, ...$ in $(\Omega, \mathscr F, \mathbb P) = ([0,1], \mathscr B([0,1]), \lambda)$ where
$$X_n = 2^n 1_{A_n} - 2^{n} 1_{B_n} + 0 \cdot 1_{A_n^C \cap B_n^C},$$ where $\lambda(A_n) = \lambda(B_n) = \frac{1}{2^{n+1}}$ and the sets $A_1, B_1, A_2, B_2, \dots$ are pairwise disjoint.
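Such sets do exist in $[0,1]$: since $\lambda(A_n) = \lambda(B_n) = 2^{-(n+1)}$, the total measure is $\sum_n 2^{-n} = 1$, so the sets fit side by side. Here is a sketch with one concrete choice of intervals (the particular layout is my assumption, not forced by the answer):

```python
from fractions import Fraction

# Lay A_1, B_1, A_2, B_2, ... end to end as subintervals of [0, 1].
# Each has length 2^{-(n+1)}, and the total measure sum_n 2^{-n} is at most 1,
# so they are pairwise disjoint by construction.
intervals = {}
left = Fraction(0)
for n in range(1, 6):
    length = Fraction(1, 2 ** (n + 1))
    intervals[f"A_{n}"] = (left, left + length); left += length
    intervals[f"B_{n}"] = (left, left + length); left += length

for name, (a, b) in intervals.items():
    print(f"{name} = [{a}, {b}), length {b - a}")
print("total measure used:", left)  # -> 31/32 after n = 5
```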
We have:
$$\sum_{n=1}^{\infty} X_n \text{ converges } \lambda\text{-a.s.}$$
(indeed, since the sets are pairwise disjoint, at each $\omega$ at most one term of the sum is nonzero)
$$\sum_{n=1}^{\infty} E[X_n] = \sum_{n=1}^{\infty} 0 = 0$$
But the expectation
$$E \Big[\sum_{n=1}^{\infty} X_n \Big]$$
is not defined, because we have
$$ E [| \sum_{n=1}^{\infty} X_n |]$$
$$= E \Big[\Big|\sum_{n=1}^{\infty} \big(2^n 1_{A_n} - 2^{n} 1_{B_n}\big)\Big|\Big] $$
$$= E \Big[\sum_{n=1}^{\infty} \big(2^n 1_{A_n} + 2^{n} 1_{B_n}\big)\Big], $$
where the absolute value passes inside the sum because the sets $A_n, B_n$ are pairwise disjoint, so at each $\omega$ at most one term is nonzero.
Note that $2^n 1_{A_n} + 2^{n} 1_{B_n} \ge 0$, so by the monotone convergence theorem:
$$= \sum_{n=1}^{\infty} E [(2^n 1_{A_n} + 2^{n} 1_{B_n})] $$
$$= \infty$$
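The divergence is easy to check term by term: each summand contributes $2^n \big(\lambda(A_n) + \lambda(B_n)\big) = 2^n \cdot 2^{-n} = 1$, so the partial sums grow without bound, even though each $E[X_n] = 0$.

```python
# Each term of the last series equals 2^n * (lambda(A_n) + lambda(B_n)) = 1,
# so the partial sums are N after N terms and the series diverges.
partial = 0.0
for n in range(1, 11):
    term = 2 ** n * (2 ** -(n + 1) + 2 ** -(n + 1))  # E[2^n 1_{A_n} + 2^n 1_{B_n}]
    partial += term
print(partial)  # -> 10.0 after ten terms
# Meanwhile E[X_n] = 2^n * 2^{-(n+1)} - 2^n * 2^{-(n+1)} = 0 for every n.
```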
I think the above holds whether or not the random variables $X_n$ are independent.
Best Answer
Here is a simple proof. By the monotone convergence theorem, $$ \sum_j E|X_j| = E \big[ \sum_{j} |X_j| \big]. $$ It then follows from the assumption that $E \big[ \sum_j |X_j| \big] < \infty$. Any random variable with finite expectation must be finite almost surely, so $\sum_j |X_j| < \infty$ almost surely. Since absolute convergence of a series implies convergence, $\sum_j X_j$ converges almost surely.
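To illustrate the conclusion with a concrete example (my choice, not part of the answer): take $X_j = \pm 2^{-j}$ with independent fair signs. Then $\sum_j E|X_j| = \sum_j 2^{-j} = 1 < \infty$, so by the argument above $\sum_j X_j$ converges almost surely, and in fact every sample path has $\big|\sum_j X_j\big| < 1$.

```python
import random

# Sketch: X_j = +/- 2^{-j} with independent fair signs satisfies
# sum_j E|X_j| = 1 < infinity, so the series should converge on every
# sample path; indeed each partial sum is bounded in absolute value by 1.
random.seed(1)
for trial in range(3):
    s = sum(random.choice([-1, 1]) * 2 ** -j for j in range(1, 60))
    assert abs(s) < 1  # |sum_j X_j| <= sum_j 2^{-j} < 1
    print(f"trial {trial}: sum = {s:.6f}")
```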