The sigma-algebra generated by $1_{[0,1/2]}$ is simply
$$
\bigl\{\emptyset,[0,1],[0,1/2],(1/2,1]\bigr\}.
$$
It consists of the preimages under the function $1_{[0,1/2]}$ of all Borel sets in the codomain of the function, namely $(\mathbb R,B(\mathbb R))$. (Notice that the preimage $1_{[0,1/2]}^{-1}(M)$ is completely determined by whether $0\in M$ and whether $1\in M$.)
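Since the preimage depends only on whether $0$ and $1$ lie in $M$, the whole generated sigma-algebra can be enumerated by four cases. A minimal sketch (the string labels for the four sets are just illustrative stand-ins):

```python
# Sketch: the preimage 1_{[0,1/2]}^{-1}(M) depends only on whether
# 0 and 1 belong to the Borel set M, so four cases exhaust sigma(1_{[0,1/2]}).
def preimage(zero_in_M: bool, one_in_M: bool) -> str:
    """Return 1_{[0,1/2]}^{-1}(M) as a set description."""
    if zero_in_M and one_in_M:
        return "[0,1]"       # both function values lie in M: whole space
    if one_in_M:
        return "[0,1/2]"     # only the value 1 lies in M
    if zero_in_M:
        return "(1/2,1]"     # only the value 0 lies in M
    return "emptyset"        # neither value lies in M

sigma_algebra = {preimage(a, b) for a in (False, True) for b in (False, True)}
print(sorted(sigma_algebra))
```

The four resulting sets are exactly the ones in the display above.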
The situation for $1_{[1/4,3/4]}$ is similar.
The random variables $1_{[0,1/2]}$ and $1_{[1/4,3/4]}$ on $([0,1],B[0,1],L)$ are indeed independent: For this you have to check that $L(A\cap B)=L(A)\cdot L(B)$ for all $A\in 1_{[0,1/2]}^{-1}(B(\mathbb R))$ and $B\in 1_{[1/4,3/4]}^{-1}(B(\mathbb R))$.
The most interesting case is $L([0,1/2]\cap [1/4,3/4])=L([0,1/2])\cdot L([1/4,3/4])$.
Check that both sides are equal!
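If you want to verify all sixteen product conditions at once, here is a sketch: partition $[0,1]$ at the quarter points into four atoms of Lebesgue measure $1/4$ each (endpoints are null sets and can be ignored), so every event in either generated sigma-algebra is a union of atoms.

```python
from fractions import Fraction
from itertools import product

# Atoms 0,1,2,3 stand for [0,1/4), [1/4,1/2), [1/2,3/4), [3/4,1];
# each has Lebesgue measure 1/4.
L = lambda S: Fraction(len(S), 4)  # measure of a union of atoms

# sigma(1_{[0,1/2]}): generated by [0,1/2] = atoms {0,1}
sigmaA = [set(), {0, 1}, {2, 3}, {0, 1, 2, 3}]
# sigma(1_{[1/4,3/4]}): generated by [1/4,3/4] = atoms {1,2}
sigmaB = [set(), {1, 2}, {0, 3}, {0, 1, 2, 3}]

independent = all(L(A & B) == L(A) * L(B) for A, B in product(sigmaA, sigmaB))
print(independent)  # True
```

In particular the "most interesting case" is $A=\{0,1\}$, $B=\{1,2\}$, where both sides equal $1/4$.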
Also think about the following question: Are the random variables $1_{[0,1/2]}$ and $1_{[1/4,1]}$ on $([0,1],B[0,1],L)$ also independent?
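If you want to check this question numerically before reasoning it out, the same atom bookkeeping works; here is a self-contained sketch using the partition of $[0,1]$ at the quarter points into four atoms of measure $1/4$.

```python
from fractions import Fraction

# Atoms 0,1,2,3 stand for [0,1/4), [1/4,1/2), [1/2,3/4), [3/4,1];
# endpoints are null sets and can be ignored.
A = {0, 1}       # [0,1/2]  = first two atoms
B = {1, 2, 3}    # [1/4,1]  = last three atoms
L = lambda S: Fraction(len(S), 4)

print(L(A & B), L(A) * L(B))  # compares L([1/4,1/2]) with L([0,1/2])*L([1/4,1])
```

Comparing the two printed values answers the independence question for these generators.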
The first definition is the informal one, but at the same time seems rather convoluted to me.
I'd prefer: X and Y are conditionally independent with respect to a given Z iff
$P(X, Y \mid Z) = P(X \mid Z)\, P(Y \mid Z)$
Recall that conditioning one (or several) variables on the value of another is (informally) the same as restricting the whole universe to a part of it.
Then, if you are given the value of $Z$, you can think of it as defining new variables that are the same as the unconditioned ones but restricted to the new (smaller) universe: $X' \equiv X \mid Z$ and $Y' \equiv Y \mid Z$.
The above formula simply states that $X'$ and $Y'$ are independent.
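To make the formula concrete, here is a sketch on a hypothetical discrete example (not from the original post): $Z$ is a fair coin, and given $Z=z$, $X$ and $Y$ are i.i.d. Bernoulli with a parameter depending on $z$, so they are conditionally independent given $Z$ by construction.

```python
from fractions import Fraction as F
from itertools import product

# Hypothetical example: Z is a fair coin; given Z=z, X and Y are
# i.i.d. Bernoulli(p[z]).
p = {0: F(1, 4), 1: F(3, 4)}

def bern(x, q):
    """P(X=x) for a Bernoulli(q) variable."""
    return q if x == 1 else 1 - q

# Joint distribution P(X=x, Y=y, Z=z).
P = {(x, y, z): F(1, 2) * bern(x, p[z]) * bern(y, p[z])
     for x, y, z in product((0, 1), repeat=3)}

def P_given_z(event, z):
    """P(event(X,Y) | Z=z) computed from the joint distribution."""
    pz = sum(pr for (x, y, zz), pr in P.items() if zz == z)
    return sum(pr for (x, y, zz), pr in P.items()
               if zz == z and event(x, y)) / pz

# Verify P(X=x, Y=y | Z=z) = P(X=x | Z=z) * P(Y=y | Z=z) for all values.
ci = all(P_given_z(lambda a, b: (a, b) == (x, y), z)
         == P_given_z(lambda a, b: a == x, z)
         * P_given_z(lambda a, b: b == y, z)
         for x, y, z in product((0, 1), repeat=3))
print(ci)
```

Note that $X$ and $Y$ here are *not* unconditionally independent (both lean the same way as $Z$), which is exactly what the conditioning buys you.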
The first definition says the same thing, but it applies (in words) the property that two variables are independent iff conditioning one on the other leaves its probability unchanged: $A$ is independent of $B$ iff $P(A \mid B) = P(A)$.
Best Answer
A random variable $X$ is independent of a $\sigma$-algebra $\mathcal{G}$ if the $\sigma$-algebras $\sigma(X)$ and $\mathcal{G}$ are independent. Recall that $\sigma(X)$ consists of all sets of the form $\{X\in B\}$, where $B$ is a Borel subset of $\mathbb{R}$.
If $X$ has finite expectation and is independent of $\mathcal{G}$, then in fact $$ \mathbb{E}[X|\mathcal{G}]=\mathbb{E}[X]. $$ On the other hand, if $X$ is $\mathcal{G}$-measurable (and has finite expectation), then $$ \mathbb{E}[X|\mathcal{G}]=X. $$
This fits with the intuitive description of conditional expectation as an average based on the information contained in $\mathcal{G}$.
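Both extreme cases can be seen on a finite probability space, where conditional expectation given a sub-sigma-algebra is just averaging over its atoms. A sketch on a hypothetical four-point uniform space, with $\mathcal{G}$ generated by the two-block partition $\{\omega_0,\omega_1\}\,|\,\{\omega_2,\omega_3\}$:

```python
from fractions import Fraction as F

# Four equally likely outcomes; G is generated by the partition below.
omega = [0, 1, 2, 3]
blocks = [[0, 1], [2, 3]]  # the atoms of G

def cond_exp(X):
    """E[X|G] as a function on omega: average of X over each atom of G."""
    E = {}
    for b in blocks:
        avg = sum(F(X[w]) for w in b) / len(b)
        for w in b:
            E[w] = avg
    return E

# X independent of G: X has the same distribution on each atom,
# so E[X|G] is the constant E[X] = 1/2.
X_indep = {0: 0, 1: 1, 2: 0, 3: 1}
print(cond_exp(X_indep))

# X measurable w.r.t. G: X is constant on each atom, so E[X|G] = X.
X_meas = {0: 5, 1: 5, 2: 7, 3: 7}
print(cond_exp(X_meas))
```

Averaging within each atom washes out everything $\mathcal{G}$ cannot see (first case) and changes nothing $\mathcal{G}$ already determines (second case).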