By the strong law of large numbers, we have
$$\frac{S_n}{n} \to \mathbb{E}(X_1) = -1 \quad \text{a.s.}$$
This implies, in particular,
$$S_n = n \frac{S_n}{n} \xrightarrow[]{n \to \infty} - \infty \quad \text{a.s.}$$
and so
$$\{n \in \mathbb{N}; S_n(\omega) \geq 0\}$$
is a finite set for almost every $\omega \in \Omega$. Hence, $\mathbb{P}(T=\infty)=0$.
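As a sanity check (not part of the proof), one can simulate a walk with i.i.d. steps of mean $-1$. The step distribution below, $\mathbb{P}(X_i=1)=1/3$, $\mathbb{P}(X_i=-2)=2/3$, is only an illustrative choice; any integrable distribution with negative mean behaves the same way. Along the sampled path, $S_n/n$ settles near $-1$ and only finitely many $n$ satisfy $S_n \geq 0$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Illustrative step distribution with E[X_1] = 1/3 - 4/3 = -1.
x = rng.choice([1, -2], size=n, p=[1 / 3, 2 / 3])
s = np.cumsum(x)

nonneg = np.nonzero(s >= 0)[0]  # indices n with S_n >= 0
print("S_n / n at n = 2*10^5:", s[-1] / n)       # close to -1
print("number of n with S_n >= 0:", len(nonneg))  # a finite (small) count
```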
Note that the usual strong law of large numbers might look like a particular case of this result, namely by taking $a_{n,i}=1/n$ for each $1\leqslant i\leqslant n$ (these weights indeed satisfy $\sum_{i=1}^n a_{n,i}^2=1/n$). But the SLLN only requires $X_1$ to be integrable.
The difficulty here is that we cannot use a maximal inequality, because the weights $a_{n,i}$ change with $n$ and need not behave nicely as $n$ grows.
But we can use a truncation argument: for a fixed $R$, let $X_{i,\leqslant R}:=X_i\mathbf{1}_{\{\lvert X_i\rvert\leqslant R\}}$ and $X_{i,\gt R}:=X_i\mathbf{1}_{\{\lvert X_i\rvert\gt R\}}$. Then
$$
\tag{1}
S_n=\sum_{i=1}^na_{n,i}\left(X_{i,\leqslant R}-\mathbb E\left[X_{i,\leqslant R}\right]\right)+\sum_{i=1}^n a_{n,i}\mathbb E\left[X_{i,\leqslant R}\right]+
\sum_{i=1}^na_{n,i}X_{i,\gt R}.$$
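The decomposition (1) is an exact algebraic identity, valid with any constant in place of $\mathbb E[X_{i,\leqslant R}]$. A quick numeric sketch (illustrative setup: weights $a_{n,i}=1/n$, which satisfy $\sum_i a_{n,i}^2=1/n$, and centered exponential steps) confirms the three pieces reassemble $S_n$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, R = 1_000, 2.0

x = rng.exponential(1.0, size=n) - 1.0   # centered i.i.d. sample (illustrative)
a = np.full(n, 1.0 / n)                  # weights with sum(a**2) == 1/n

x_le = np.where(np.abs(x) <= R, x, 0.0)  # X_{i, <= R}
x_gt = x - x_le                          # X_{i, > R}
# Empirical stand-in for E[X_{i, <= R}]; the identity holds for ANY constant here.
m = x_le.mean()

s_direct = np.sum(a * x)
s_split = np.sum(a * (x_le - m)) + np.sum(a * m) + np.sum(a * x_gt)
print("difference:", abs(s_direct - s_split))  # zero up to rounding error
```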
Observe that since $X_i$ is centered and has the same distribution as $X_1$,
$$
\sum_{i=1}^n a_{n,i}\mathbb E\left[X_{i,\leqslant R}\right]=-\sum_{i=1}^n a_{n,i}\mathbb E\left[X_{i,\gt R}\right]=-\sum_{i=1}^na_{n,i}\mathbb E\left[X_{1,\gt R}\right]
$$
hence, by the Cauchy-Schwarz inequality $\sum_{i=1}^n\lvert a_{n,i}\rvert\leqslant\sqrt{n\sum_{i=1}^n a_{n,i}^2}=1$, $$\tag{2}\left\lvert \sum_{i=1}^n a_{n,i}\mathbb E\left[X_{i,\leqslant R}\right]\right\rvert\leqslant \mathbb E\left[\lvert X_{1}\rvert \mathbf{1}_{\{\lvert X_1\rvert>R\}}\right].$$
Another application of the Cauchy-Schwarz inequality gives
$$\tag{3}
\left\lvert \sum_{i=1}^na_{n,i}X_{i,\gt R}\right\rvert\leqslant \sqrt{\sum_{i=1}^na_{n,i}^2\sum_{j=1}^nX_{j,>R}^2}=\sqrt{\frac 1n\sum_{j=1}^nX_{j,>R}^2}.
$$
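Inequality (3) is plain Cauchy-Schwarz; a numeric sketch under the same illustrative setup ($a_{n,i}=1/n$, so $\sum_i a_{n,i}^2=1/n$, with a heavy-tailed sample) verifies it:

```python
import numpy as np

rng = np.random.default_rng(2)
n, R = 1_000, 2.0

x = rng.standard_t(df=3, size=n)          # heavy-tailed centered sample (illustrative)
a = np.full(n, 1.0 / n)                   # sum(a**2) == 1/n
x_gt = np.where(np.abs(x) > R, x, 0.0)    # X_{i, > R}

lhs = abs(np.sum(a * x_gt))
rhs = np.sqrt(np.sum(x_gt**2) / n)        # = sqrt(sum(a**2) * sum(X_{j,>R}**2))
print(f"lhs = {lhs:.6f} <= rhs = {rhs:.6f}")
```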
The combination of (1), (2) and (3) with the strong law of large numbers applied to the i.i.d. integrable sequence $\left(X_{j,>R}^2\right)_{j\geqslant 1}$ gives the almost sure inequality
$$\tag{4}
\limsup_{n\to\infty}\left\lvert S_n \right\rvert\leqslant
\limsup_{n\to\infty}\left\lvert \sum_{i=1}^na_{n,i}\left(X_{i,\leqslant R}-\mathbb E\left[X_{i,\leqslant R}\right]\right)\right\rvert+\mathbb E\left[\lvert X_{1}\rvert \mathbf{1}_{\{\lvert X_1\rvert>R\}}\right]+\sqrt{\mathbb E\left[X_1^2\mathbf{1}_{\{\lvert X_1\rvert>R\}}\right]}.
$$
By Hoeffding's inequality (each summand $a_{n,i}\left(X_{i,\leqslant R}-\mathbb E\left[X_{i,\leqslant R}\right]\right)$ takes values in an interval of length $2Ra_{n,i}$), the following estimate takes place:
$$
\mathbb P\left(\left\lvert \sum_{i=1}^na_{n,i}\left(X_{i,\leqslant R}-\mathbb E\left[X_{i,\leqslant R}\right]\right)\right\rvert>\varepsilon\right)
\leqslant 2\exp\left(-\frac{\varepsilon^2}{2R^2\sum_{i=1}^na_{n,i}^2}\right)
= 2\exp\left(-\frac{n\varepsilon^2}{2R^2}\right).
$$
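A Monte Carlo sketch of this tail estimate, under assumed illustrative choices (weights $a_{n,i}=1/n$ and symmetric uniform steps, so $\mathbb E[X_{i,\leqslant R}]=0$ and no centering constant is needed), compares the empirical tail probability against the standard two-sided Hoeffding bound $2\exp\bigl(-2\varepsilon^2/\sum_i c_i^2\bigr)$, where $c_i$ is the length of the interval containing the $i$-th summand:

```python
import numpy as np

rng = np.random.default_rng(3)
n, R, eps, reps = 200, 1.5, 0.15, 10_000

# Symmetric steps: E[X_{i, <= R}] = 0, so the truncated sum is already centered.
x = rng.uniform(-2.0, 2.0, size=(reps, n))
x_le = np.where(np.abs(x) <= R, x, 0.0)   # X_{i, <= R}
weighted = x_le.mean(axis=1)              # sum_i a_{n,i} X_{i, <= R} with a = 1/n

emp = np.mean(np.abs(weighted) > eps)     # Monte Carlo tail probability
# Each summand lies in an interval of length c_i = 2R/n, so the bound is
# 2 * exp(-2 * eps^2 / (n * (2R/n)^2)).
bound = 2.0 * np.exp(-2.0 * eps**2 / (n * (2.0 * R / n) ** 2))
print(f"empirical tail {emp:.4f} <= Hoeffding bound {bound:.4f}")
```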
Since these bounds are summable in $n$, the Borel-Cantelli lemma (applied for each $\varepsilon>0$ in a countable set decreasing to $0$) shows that the convergence
$$
\lim_{n\to\infty}\left\lvert \sum_{i=1}^na_{n,i}\left(X_{i,\leqslant R}-\mathbb E\left[X_{i,\leqslant R}\right]\right)\right\rvert=0
$$
takes place almost surely for each $R>0$, hence (4) gives
$$
\limsup_{n\to\infty}\left\lvert S_n \right\rvert\leqslant
\mathbb E\left[\lvert X_{1}\rvert \mathbf{1}_{\{\lvert X_1\rvert>R\}}\right]+\sqrt{\mathbb E\left[X_1^2\mathbf{1}_{\{\lvert X_1\rvert>R\}}\right]}.
$$
Since $R$ is arbitrary, letting $R\to\infty$ makes both expectations vanish by the monotone or dominated convergence theorem, and the conclusion follows.
Best Answer
Let $E_n := \{ X_n = 1 \}$. Since the $X_n$ are independent, so are the events $E_n$, and $$\sum_{n=1}^{\infty} \mathbb{P}(E_n) = \sum_{n=1}^{\infty}\mathbb{P}(X_n = 1) = \infty \implies \mathbb{P}(E_n \text{ i.o.}) = \mathbb{P}(T = \infty) = 1,$$ by the second Borel-Cantelli lemma.
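A quick simulation illustrates the second Borel-Cantelli lemma here; the value $\mathbb{P}(X_n=1)=1/3$ is only an assumed concrete choice, and any fixed positive probability works. Along a sampled path, the number of indices with $X_n=1$ keeps growing, so the events occur infinitely often:

```python
import numpy as np

rng = np.random.default_rng(4)
p = 1 / 3        # assumed P(X_n = 1); any fixed p > 0 gives a divergent sum
n = 100_000

hits = rng.random(n) < p        # indicators of E_n = {X_n = 1}
counts = np.cumsum(hits)        # running count of occurrences
print("occurrences of X_n = 1 by n = 10^5:", counts[-1])  # grows like p * n
```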