Hints: First consider $B^2 \geq 4AC$. Now, if $U$ is uniform$(0,1)$, then $-\log(U)$ is exponential$(1)$; further, the sum of two independent exponential$(1)$ random variables has pdf $x e^{-x}$, $x > 0$. Thus, using the law of total probability, the answer can be found by solving an elementary one-dimensional integral. I've confirmed the result by simulation.
EDIT: Specifically,
$$
{\rm P}(B^2 - 4AC \ge 0) = \int_{\log 4}^\infty {[1 - e^{ - (x - \log 4)/2} ]xe^{ - x} \,{\rm d}x} = \frac{{5 + 3\log 4}}{{36}} \approx 0.2544134.
$$
The answer was confirmed using Monte Carlo simulation: $N=10^8$ runs gave an average of $0.25444043$.
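A quick Monte Carlo check along these lines can be sketched in a few lines of NumPy (the sample size and seed below are arbitrary choices, not those of the run above):

```python
import numpy as np

# Estimate P(B^2 - 4AC >= 0) for independent uniform(0,1) coefficients.
# Sample size and seed are arbitrary (smaller than the N = 10^8 run above).
rng = np.random.default_rng(0)
n = 10**7
a, b, c = rng.random((3, n))
p_hat = np.mean(b**2 - 4*a*c >= 0)

# Closed form from the integral: (5 + 3 log 4)/36 ≈ 0.2544134
exact = (5 + 3*np.log(4)) / 36
print(p_hat, exact)
```

With $10^7$ draws the standard error is about $1.4\times 10^{-4}$, so the estimate should agree with the closed form to roughly three decimal places.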
EDIT: Note that it is quite easy to determine, moreover, the distribution of the product of $n$ independent uniform$(0,1)$ random variables. Indeed, let $U_1,\ldots,U_n$ be independent uniform$(0,1)$ variables. Write
$$
U_1 \cdots U_n = \exp \Big[ - \sum\nolimits_{i = 1}^n {( - \log U_i )} \Big].
$$
Since the $-\log U_i$ are independent exponential$(1)$ variables, $U_1 \cdots U_n$ is merely distributed as $e^{-X}$, where $X$ has gamma pdf $x^{n-1}e^{-x}/(n-1)!$, $x > 0$.
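This is easy to check empirically. A minimal sketch (sample size and seed are arbitrary) verifies that $-\log(U_1 \cdots U_n)$ has the gamma moments, i.e. mean $n$ and variance $n$:

```python
import numpy as np

# If U_1*...*U_n is distributed as exp(-X) with X gamma-distributed
# with shape n and rate 1, then X = -log(U_1*...*U_n) should have
# mean n and variance n.  Sample size and seed are arbitrary.
rng = np.random.default_rng(1)
n, m = 5, 100_000
x = -np.log(rng.random((m, n)).prod(axis=1))
print(x.mean(), x.var())  # both should be close to n = 5
```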
EDIT: Elaborating in response to the OP's request (see the first comment below).
Actually, the hint was supposed to send you in a slightly different direction, namely to consider the probability
$$
{\rm P}\bigg( - \log B \le \frac{{( - \log A) + ( - \log C) - \log 4}}{2}\bigg),
$$
or
$$
{\rm P}\bigg(X \le \frac{{Y - \log 4}}{2}\bigg),
$$
where $X$ is exponential$(1)$ and $Y$, independent of $X$, has gamma pdf $f_Y (x) = xe^{-x}$, $x > 0$.
Then, by the law of total probability (and using that $X$ and $Y$ are independent), the above probability is given by
$$
\int_0^\infty {{\rm P}\bigg(X \le \frac{{Y - \log 4}}{2}\bigg|Y = x\bigg)f_Y (x)\,{\rm d}x} = \int_0^\infty {{\rm P}\bigg(X \le \frac{{x - \log 4}}{2}\bigg)xe^{ - x} \,{\rm d}x},
$$
and substituting the exponential$(1)$ cdf $1 - e^{-u}$, which vanishes for $u \le 0$ (hence the lower limit $\log 4$), yields the desired integral.
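As a sanity check, the resulting integral can be evaluated numerically and compared against the closed form; a sketch using `scipy.integrate.quad`:

```python
import numpy as np
from scipy.integrate import quad

# The one-dimensional integral from the hint, with lower limit log 4
# (below which the exponential(1) cdf term vanishes).
log4 = np.log(4)
integrand = lambda x: (1 - np.exp(-(x - log4)/2)) * x * np.exp(-x)
val, err = quad(integrand, log4, np.inf)

# Closed form: (5 + 3 log 4)/36
exact = (5 + 3*log4) / 36
print(val, exact)
```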
This is such a natural problem to study that I would check to see if it has been done before, and it seems it has:
"On the distribution of the distance between two points in a cube" by
Antanas Žilinskas,
Random Operators and Stochastic Equations, Volume 11, Issue 1, Pages 21–24, March 2003.
Abstract: We are interested in the distribution of the distance between two random points in a cube. It is well known that the derivation of the formula for the distribution function of interest involves integration problems which are almost intractable. We show that the problem may be successfully solved using a symbolic computation tool.
I was unable to find the paper for free online.
I also found what looks like a proof of the exact same thing at http://www.degruyter.com/view/j/rose.2000.8.issue-4/rose.2000.8.4.339/rose.2000.8.4.339.xml . It is also behind a paywall. Tantalizingly, you can see part of the answer on the first page of the paper, which is displayed at that URL.
This paper was free online:
"The Probability Distribution of the Distance Between Two Random Points in a Box", posted at http://www.math.kth.se/~johanph/habc.pdf , which actually answers the more difficult problem of a box with not necessarily equal sides. This would be harder to use, but easier to get for free right away.
Best Answer
There are lots of approaches one could take, but the simplest one I can think of is to consider the shape of the CDF of $R^2$.
What is the probability that $R^{2}<t$? This is clearly $0$ for $t<0$ and $1$ for $t>1$. For $t\in [0,1]$, the probability is the area of the disk of radius $\sqrt{t}$ divided by $\pi$, since the joint density on the unit disk is $1/\pi$. This ratio is: $$\frac{\pi (\sqrt{t}\,)^{2}}{\pi} = t.$$ And we're done: this is the CDF of the uniform distribution on $[0,1]$, as required!
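The argument above is easy to confirm by simulation; here is a sketch that draws points uniformly in the unit disk by rejection sampling and checks that $R^2$ has the uniform$(0,1)$ moments (seed and sample size are arbitrary):

```python
import numpy as np

# Draw points uniformly in the unit disk by rejection from the square
# [-1, 1]^2, then check that R^2 has mean 1/2 and variance 1/12,
# matching the uniform(0,1) distribution.
rng = np.random.default_rng(2)
pts = rng.uniform(-1, 1, size=(4_000_000, 2))
r2 = (pts**2).sum(axis=1)
r2 = r2[r2 <= 1]  # keep only points inside the unit disk
print(r2.mean(), r2.var())  # ~0.5 and ~0.0833
```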