The sigma-algebra generated by $1_{[0,1/2]}$ is simply
$$
\bigl\{\emptyset,[0,1],[0,1/2],(1/2,1]\bigr\}.
$$
It consists of the preimages under $1_{[0,1/2]}$ of all Borel sets in the codomain of this function, namely $(\mathbb R,B(\mathbb R))$. (Notice that the preimage $1_{[0,1/2]}^{-1}(M)$ is completely determined by whether $0$ and $1$ do or do not belong to $M$, respectively.)
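The four cases can be enumerated in a short sketch (the helper name `preimage` and the string representation of the sets are mine, not from the text):

```python
# Sketch: X = 1_{[0,1/2]} only takes the values 0 and 1, so the preimage
# X^{-1}(M) of a Borel set M depends only on whether 0 and 1 belong to M.
def preimage(zero_in_M, one_in_M):
    if zero_in_M and one_in_M:
        return "[0,1]"      # every point of [0,1] is mapped into M
    if one_in_M:
        return "[0,1/2]"    # exactly the points where X = 1
    if zero_in_M:
        return "(1/2,1]"    # exactly the points where X = 0
    return "empty set"      # no point is mapped into M

# Running through the four cases recovers the four-element sigma-algebra:
sigma = {preimage(z, o) for z in (False, True) for o in (False, True)}
print(sigma)  # the four sets listed above, in some order
```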
The situation for $1_{[1/4,3/4]}$ is similar.
The random variables $1_{[0,1/2]}$ and $1_{[1/4,3/4]}$ on $([0,1],B[0,1],L)$ are indeed independent: For this you have to check that $L(A\cap B)=L(A)\cdot L(B)$ for all $A\in 1_{[0,1/2]}^{-1}(B(\mathbb R))$ and $B\in 1_{[1/4,3/4]}^{-1}(B(\mathbb R))$.
The most interesting case is $L([0,1/2]\cap [1/4,3/4])=L([0,1/2])\cdot L([1/4,3/4])$.
Check that both sides are equal!
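A quick exact-arithmetic sketch (the interval representation and helper names are my own) confirms the equality:

```python
from fractions import Fraction as F

# Intervals are (left, right) pairs; single endpoints carry no Lebesgue
# measure, so open vs. closed endpoints do not matter here.
def length(iv):
    """Lebesgue measure of an interval; None stands for the empty set."""
    if iv is None:
        return F(0)
    a, b = iv
    return b - a

def intersect(iv1, iv2):
    """Intersection of two intervals (None = empty set)."""
    if iv1 is None or iv2 is None:
        return None
    a, b = max(iv1[0], iv2[0]), min(iv1[1], iv2[1])
    return (a, b) if a <= b else None

A = (F(0), F(1, 2))       # [0, 1/2]
B = (F(1, 4), F(3, 4))    # [1/4, 3/4]

lhs = length(intersect(A, B))  # L([0,1/2] ∩ [1/4,3/4]) = L([1/4,1/2])
rhs = length(A) * length(B)    # L([0,1/2]) · L([1/4,3/4])
print(lhs == F(1, 4) and rhs == F(1, 4))  # True: both sides equal 1/4
```

The remaining pairs of sets from the two sigma-algebras reduce to this case by passing to complements.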
Also think about the following question: Are the random variables $1_{[0,1/2]}$ and $1_{[1/4,1]}$ on $([0,1],B[0,1],L)$ also independent?
By definition, $\sigma(Z)$ denotes the smallest $\sigma$-algebra $\Sigma$ on $\Omega$ such that $Z: (\Omega, \Sigma) \to (\mathbb{R},\mathcal{B}(\mathbb{R}))$ is measurable.
This means in particular that $X:(\Omega,\sigma(X)) \to (\mathbb{R},\mathcal{B}(\mathbb{R}))$ is measurable. Since $\sigma(Y) = \sigma(X)$, we also have that
$$Y: (\Omega,\sigma(X)) \to (\mathbb{R},\mathcal{B}(\mathbb{R}))$$
is measurable. Consequently, the sum
$$X+Y: (\Omega,\sigma(X)) \to (\mathbb{R},\mathcal{B}(\mathbb{R}))$$
is measurable as a sum of two measurable random variables. Since $\sigma(X+Y)$ is the smallest $\sigma$-algebra $\Sigma$ such that
$$X+Y: (\Omega,\Sigma) \to (\mathbb{R},\mathcal{B}(\mathbb{R}))$$
is measurable, this shows $\sigma(X+Y) \subseteq \sigma(X)$.
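On a finite $\Omega$ this inclusion can be checked mechanically, since there $\sigma(Z)$ is generated by the level sets of $Z$. The toy space and functions below are my own illustration, not part of the argument:

```python
# Toy check on a finite Omega: sigma(W) ⊆ sigma(Z) holds iff every level
# set of Z is contained in a single level set of W. Here Y = 1 - X, so
# sigma(Y) = sigma(X), and X + Y is constant.
Omega = [0, 1, 2, 3]
X = {0: 1, 1: 1, 2: 0, 3: 0}          # X = indicator of {0, 1}
Y = {w: 1 - X[w] for w in Omega}      # sigma(Y) = sigma(X)
S = {w: X[w] + Y[w] for w in Omega}   # S = X + Y (constant = 1)

def blocks(f):
    """Level-set partition {f = c} of Omega."""
    part = {}
    for w in Omega:
        part.setdefault(f[w], set()).add(w)
    return list(part.values())

def sigma_subset(W, Z):
    """True iff sigma(W) ⊆ sigma(Z) on the finite space Omega."""
    return all(any(zb <= wb for wb in blocks(W)) for zb in blocks(Z))

print(sigma_subset(S, X))  # True: sigma(X+Y) ⊆ sigma(X)
print(sigma_subset(X, S))  # False: here the inclusion is strict
```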
The discussion in the comments to the OP indicates that one is supposed to consider a single random variable $X$ defined on $\Omega=[0,+\infty)$ by $X(t)=\mathbf 1_{t>t_0}\mathbf 1_A(t)$ for every $t$ in $\Omega$, or, equivalently, $$X=\mathbf 1_{A\cap(t_0,+\infty)}.$$ User @saz answered that question perfectly. By contrast, the answer below addresses the literal text of the question, which states that, for every $t$, one considers a random variable $X_t$ defined on $\Omega=[0,+\infty)$ by $X_t(\omega)=\mathbf 1_{t>t_0}\mathbf 1_A(\omega)$ for every $\omega$ in $\Omega$, hence it is not relevant to the intended question.
This explains why I deleted my answer, before undeleting it due to a comment by the OP.
Let $X_t=\mathbf 1_{t\gt t_0}\mathbf 1_A$, and let $\sigma(X_t)$ denote the sigma-algebra generated by the random variable $X_t$.
The process $X=(X_t)_{t\geqslant0}$ is quite degenerate since there exist two random variables $Y$ and $Z$ such that $X_t=Y$ for every $t\leqslant t_0$ and $X_t=Z$ for every $t\gt t_0$. Thus, the sigma-algebra $\sigma(X)$ generated by the whole process $X$ is $\sigma(Y,Z)$.
In the present case, $Y=0$ and $Z=\mathbf 1_A$ hence $\sigma(X)=\{\varnothing,A,\Omega\setminus A,\Omega\}$.
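A finite stand-in (my own toy choice of $\Omega$ and $A$, just to illustrate the claim) confirms that the unions of the joint level sets of $(Y,Z)$ give exactly $\{\varnothing,A,\Omega\setminus A,\Omega\}$:

```python
from itertools import combinations

# Toy finite stand-in for Omega and A (my own choice, for illustration):
# with Y = 0 and Z = 1_A, sigma(Y, Z) is generated by the joint level sets
# of (Y, Z), which here are just A and its complement.
Omega = frozenset(range(4))
A = frozenset({1, 2})
Y = {w: 0 for w in Omega}
Z = {w: 1 if w in A else 0 for w in Omega}

# Joint level-set partition of (Y, Z):
part = {}
for w in Omega:
    part.setdefault((Y[w], Z[w]), set()).add(w)
blocks = [frozenset(b) for b in part.values()]

# The generated sigma-algebra is the family of all unions of blocks:
sigma = set()
for r in range(len(blocks) + 1):
    for combo in combinations(blocks, r):
        sigma.add(frozenset().union(*combo))

print(sigma == {frozenset(), A, Omega - A, Omega})  # True
```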