All the theory you need
Let $Z\sim F$ with $\sigma^2=\mathrm{Var}[Z]<\infty$, and suppose you want to estimate $\mathrm{E}[Z]$.
Simple Monte Carlo
Construct $X_1,X_2,\dots$ IID with $X_1\sim F$.
Define $\bar{X}_n=\frac{1}{n}\sum_{i=1}^n X_i$.
Result: $\mathrm{Var}[\bar{X}_n]=\sigma^2/n$.
Strong Law: $\bar{X}_n\to\mathrm{E}[Z]$ a.s.
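A minimal R sketch of simple Monte Carlo (my own toy example, not from the text: estimating $\mathrm{E}[e^U]$ for $U\sim \mathrm{U}[0,1]$, whose exact value is $e-1\approx 1.718$):

```r
# Simple Monte Carlo: estimate E[Z] for Z = exp(U), U ~ U[0,1]
# (illustrative toy example; exact answer is e - 1)
n <- 10^5
x <- exp(runif(n))
mean(x)          # estimate of E[Z]
sqrt(var(x)/n)   # estimated standard error, sigma-hat / sqrt(n)
```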
Antithetic Variables
Construct $X'_1,X'_2,\dots$ such that, for $i\geq 1$,
1. $X'_i\sim F$;
2. $\mathrm{Cov}[X'_{2i-1},X'_{2i}]<0$;
3. the pairs $(X'_1,X'_2),(X'_3,X'_4),\dots$ are IID.
Define $Y_i=(X'_{2i-1}+X'_{2i})/2$ and $\bar{Y}_n=\frac{1}{n}\sum_{i=1}^n Y_i$.
Result: $$\mathrm{Var}[\bar{Y}_n] =\frac{\sigma^2+\mathrm{Cov}[X'_1,X'_2]}{2n} < \frac{\sigma^2}{2n}<\mathrm{Var}[\bar{X}_n].$$ (Note that $\bar{Y}_n$ uses $2n$ draws, so the fair comparison is with $\bar{X}_{2n}$, whose variance is exactly $\sigma^2/(2n)$; the antithetic estimator still wins.)
Strong Law: $\bar{Y}_n\to\mathrm{E}[Z]$ a.s.
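The variance formula is a one-line computation, using $\mathrm{Var}[X'_j]=\sigma^2$ and the fact that the pairs are IID:

```latex
\mathrm{Var}[Y_i]
  = \frac{\mathrm{Var}[X'_{2i-1}] + \mathrm{Var}[X'_{2i}]
          + 2\,\mathrm{Cov}[X'_{2i-1},X'_{2i}]}{4}
  = \frac{\sigma^2 + \mathrm{Cov}[X'_1,X'_2]}{2},
\qquad
\mathrm{Var}[\bar{Y}_n] = \frac{\mathrm{Var}[Y_1]}{n}
  = \frac{\sigma^2 + \mathrm{Cov}[X'_1,X'_2]}{2n}.
```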
Your application
Define $U_1,U_2,\dots$ IID $\mathrm{U}[0,1]$.
For $i\geq 1$, define $X'_{2i-1}=(\log(1-U_i))^2$ and $X'_{2i}=(\log(U_i))^2$.
Prove 1, 2, 3 above.
Remember that $U_i\sim 1-U_i$, and $\mathrm{Cov}[U_i,1-U_i]<0$.
Also, $x\mapsto (\log x)^2$ is monotonically decreasing for $x\in(0,1]$.
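Putting the two hints together, a sketch of the argument (with $g(x)=(\log x)^2$):

```latex
% 1. U_i \sim 1-U_i, so X'_{2i-1} = g(1-U_i) \sim g(U_i) = X'_{2i},
%    and that common distribution is F, the distribution of Z here.
% 2. g is decreasing on (0,1], so g(1-u) is increasing in u and g(u) is
%    decreasing in u; the covariance of an increasing and a decreasing
%    function of the same random variable is non-positive (Chebyshev's
%    covariance inequality), and strictly negative here since g is
%    strictly monotone and non-degenerate:
%    \mathrm{Cov}[X'_{2i-1}, X'_{2i}] = \mathrm{Cov}[g(1-U_i), g(U_i)] < 0.
% 3. The pair (X'_{2i-1}, X'_{2i}) is a function of U_i alone, and the
%    U_i are IID, so the pairs are IID.
```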
Simulation
n <- 10^6
u <- runif(n)
# simple
x <- (log(1-u))^2
mean(x)
sqrt(var(x)/n)
# antithetic
y <- ((log(1-u))^2 + (log(u))^2)/2
mean(y)
sqrt(var(y)/n)
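As a sanity check (my addition, not part of the original): here $\mathrm{E}[Z]=\int_0^1(\log u)^2\,du=\Gamma(3)=2$ and $\sigma^2=\Gamma(5)-2^2=20$, which can be confirmed numerically:

```r
# Exact moments of Z = (log U)^2 for U ~ U[0,1]:
# E[Z] = Gamma(3) = 2,  E[Z^2] = Gamma(5) = 24,  Var[Z] = 24 - 4 = 20
integrate(function(u) log(u)^2, 0, 1)$value   # approx. 2
integrate(function(u) log(u)^4, 0, 1)$value   # approx. 24
sqrt(20/10^6)   # theoretical SE of the simple estimator with n = 10^6
```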
Best Answer
First, note that the page compares the crude ($M$) and hit-or-miss ($H$) estimators there; it doesn't mention 'improvements' until after that comparison is done.
The page apparently contains a typographical error.
The crude estimate ($M$) has a smaller variance, $\sigma^2_M$, than the hit-or-miss estimate ($H$) has, $\sigma^2_H$; that is, $\sigma^2_M<\sigma^2_H$.
As requested, I will illustrate this with a simple example. Consider using both methods on the following integration problem: estimate $\int_0^1 f(x)\,dx$ where $f(x)=x$ for $0<x<1$ (so the true value is $1/2$).
Here are the results of 1000 simulations, each using 1000 points, for each kind of estimator:
As you see, $M$ has the smaller variance, so the page's claim that $\sigma^2_M-\sigma^2_H>0$ is a mistake.
I did the simulations in R:
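The code itself isn't reproduced above; here is a minimal sketch of what such a simulation could look like (my reconstruction, not necessarily the author's code):

```r
# Crude (M) vs hit-or-miss (H) for the integral of f(x) = x over (0,1)
# 1000 simulations, 1000 points each
nsim <- 1000; npts <- 1000
crude <- replicate(nsim, mean(runif(npts)))  # M: average of f(U) = U
hitmiss <- replicate(nsim, {                 # H: fraction of darts under the curve
  u <- runif(npts); v <- runif(npts)
  mean(v < u)                                # indicator of V < f(U) = U
})
sd(crude)    # should be near sqrt(1/12000) = 0.00913
sd(hitmiss)  # should be near sqrt(1/4000)  = 0.0158
```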
We can also calculate the variance of the two algebraically.
The variance of the crude estimate for my example problem is simply $\frac{1}{n}\text{Var}(f(X))$ where $X$ is uniform on $(0,1)$; since $f(x)=x$, this is $\text{Var}(X)/n = 1/(12n)$.
The variance of the hit-or-miss estimate is given here. For our problem, $\theta=1/2$, so the variance for this problem should be 1/(4n).
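That hit-or-miss variance comes from Bernoulli sampling: each point contributes an indicator of landing under the curve.

```latex
I_j = \mathbf{1}\{V_j < f(U_j)\} \sim \mathrm{Bernoulli}(\theta),
\qquad
\mathrm{Var}[H] = \frac{\theta(1-\theta)}{n}
  = \frac{(1/2)(1/2)}{n} = \frac{1}{4n}.
```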
The standard deviations for $n=1000$ should then be about 0.00913 for the crude and about 0.0158 for the hit-or-miss. The above simulations came out pretty close to those values (0.00911 vs 0.0162).
Note that the integral they give there, $\frac{1}{n}\int_0^1 f(u)(1-f(u))\,du$, works out to $\frac{1}{n}\int_0^1 u(1-u)\,du = 1/(6n)$ for our toy problem here... which happens to equal $1/(4n) - 1/(12n)$ ... which is $\sigma^2_H-\sigma^2_M$ in our example. That's no accident, and, as I said at the start, apparently it's just a typo.
That is, they just got the subscripts backward; what they actually prove is that $\sigma^2_H-\sigma^2_M>0$. (Edit: their result isn't true in all circumstances; they must restrict the class of functions elsewhere.)