It turns out that the independence of $\frac{B_{t_1}}{t_1} - \frac{B_{s_1}}{s_1}$ and $\frac{B_{t_2}}{t_2} - \frac{B_{s_2}}{s_2}$ relies crucially on the fact that these two random variables are jointly Gaussian; independence of the increments alone does not imply that these random variables are independent.
For example, consider a rate-$1$ Poisson process $(N_t)_{t \geq 0}$. This process has independent increments, and for $s < t$, $N_t - N_s$ has a Poisson distribution with parameter $t-s$. It also has the same covariance function as Brownian motion, namely $\mathrm{Cov}(N_s, N_t) = \min\{s,t\}$ (for $s < t$, $\mathrm{Cov}(N_s, N_t) = \mathrm{Var}(N_s) + \mathrm{Cov}(N_s, N_t - N_s) = s + 0$), so that
$$ \mathrm{Cov} \left( \frac{N_{t_1}}{t_1} - \frac{N_{s_1}}{s_1}, \frac{N_{t_2}}{t_2} - \frac{N_{s_2}}{s_2} \right) = 0 $$
just as above. However, I claim that $\frac{N_{t_1}}{t_1} - \frac{N_{s_1}}{s_1}$ and $\frac{N_{t_2}}{t_2} - \frac{N_{s_2}}{s_2}$ are not in general independent. Take $(s_1, t_1) = (\frac{1}{2}, 1)$ and $(s_2, t_2) = (1, 2)$, i.e. $U = N_1 - 2N_{1/2}$ and $V = \frac{1}{2} N_2 - N_1$. Denote the increments $X = N_{1/2} - N_{0}$, $Y = N_{1} - N_{1/2}$, and $Z = N_2 - N_1$. These are independent Poisson random variables, the first two with parameter $1/2$ and the last with parameter $1$. Since $N_0 = 0$, write
$$ U = Y - X, \qquad V = \frac{1}{2}(-X - Y + Z) ,$$
and use the characteristic function for Poisson random variables to find a value $(s,t)$ for which
$$ E(e^{isU + itV}) \neq E(e^{isU}) \, E(e^{itV}) . $$
For $(s,t) = (\pi, 2\pi)$ this is easy to check by hand: $\pi U + 2\pi V = \pi Z - 2\pi X$, so the left-hand side is $E(e^{-2\pi i X}) \, E(e^{i\pi Z}) = 1 \cdot e^{-2} = e^{-2}$, while $E(e^{i\pi U}) = e^{-1} \cdot e^{-1} = e^{-2}$ and $E(e^{2\pi i V}) = e^{-1} \cdot e^{-1} \cdot e^{-2} = e^{-4}$, so the right-hand side is $e^{-6}$. Since $e^{-2} \neq e^{-6}$, $U$ and $V$ are not independent.
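For anyone who wants to double-check this numerically, here is a short Python sketch (purely illustrative) that evaluates both sides exactly from the Poisson characteristic function $E(e^{i\theta N}) = \exp(\lambda(e^{i\theta} - 1))$:

```python
import cmath

def poisson_cf(theta, lam):
    # Characteristic function of Poisson(lam): E[e^{i*theta*N}] = exp(lam*(e^{i*theta} - 1))
    return cmath.exp(lam * (cmath.exp(1j * theta) - 1))

def cf_UV(s, t):
    # E[e^{i(sU + tV)}] with U = Y - X and V = (Z - X - Y)/2, where
    # X, Y ~ Poisson(1/2) and Z ~ Poisson(1) are independent, so the
    # joint characteristic function factors over the increments.
    return (poisson_cf(-(s + t / 2), 0.5)   # coefficient of X is -(s + t/2)
            * poisson_cf(s - t / 2, 0.5)    # coefficient of Y is s - t/2
            * poisson_cf(t / 2, 1.0))       # coefficient of Z is t/2

s, t = cmath.pi, 2 * cmath.pi
print(cf_UV(s, t))                 # e^{-2} ~ 0.1353
print(cf_UV(s, 0) * cf_UV(0, t))   # e^{-2} * e^{-4} = e^{-6} ~ 0.0025
```

Both outputs have (numerically) zero imaginary part, and $e^{-2} \neq e^{-6}$ confirms that $U$ and $V$ are uncorrelated but not independent.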
Recall that $X \in \sigma(Y_1, \ldots, Y_k)$ (i.e., $X$ is measurable with respect to this $\sigma$-algebra) iff there is a measurable function $g : \mathbb{R}^k \to \mathbb{R}$ such that $X = g(Y_1, \ldots, Y_k)$. (This is the Doob–Dynkin lemma.)
First, note that $\sigma(B_{t_0} - B_{t_1}, B_{t_0} - B_{t_2}, B_{t_1} - B_{t_2}) = \sigma(B_{t_0} - B_{t_1}, B_{t_1} - B_{t_2}) =: F$, since the second generator is the sum of the other two.
Let's note that $B_{t_0} \not\in F$ (for $t_0 < t_1 < t_2$): $B_{t_0}$ is independent of $F$ by independence of increments, and as long as $B_{t_0}$ is not deterministic it cannot be independent of itself. (The exceptional case is $t_0 = 0$, where $B_{t_0} = 0$.)
Thus, suppose $X = B_{t_0} - a B_{t_1} - b B_{t_2} \in F$. Since $X - a(B_{t_0} - B_{t_1}) - b(B_{t_0} - B_{t_2}) = (1 - a - b) B_{t_0}$, it would follow that $(1-a-b)B_{t_0} \in F$, and hence $B_{t_0} \in F$ whenever $a + b \neq 1$. This is a contradiction.
On the other hand, if you adjoin $B_{t_0}$ to the $\sigma$-algebra, it becomes $\sigma( B_{t_0} , B_{t_1}, B_{t_2})$, and then $X$ can be expressed.
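Here is a minimal numerical sketch of the pathwise identity used above (the times and the coefficients $a, b$ are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
t0, t1, t2 = 1.0, 2.0, 3.0   # any 0 < t0 < t1 < t2 works
a, b = 0.3, 0.5              # illustrative coefficients with a + b != 1

# Simulate (B_{t0}, B_{t1}, B_{t2}) from independent Gaussian increments.
increments = rng.normal(size=3) * np.sqrt([t0, t1 - t0, t2 - t1])
B0, B1, B2 = np.cumsum(increments)

# Pathwise identity:
# X - a*(B_{t0} - B_{t1}) - b*(B_{t0} - B_{t2}) = (1 - a - b)*B_{t0}
X = B0 - a * B1 - b * B2
residual = X - a * (B0 - B1) - b * (B0 - B2)
print(np.isclose(residual, (1 - a - b) * B0))  # True
```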
Best Answer
Fix $s < t$. By symmetry of Brownian motion, we have $$\mathbb{P}(B_s B_t \geq 0) = \mathbb{P}(B_s \geq 0, B_t \geq 0) + \mathbb{P}(-B_s \geq 0, -B_t \geq 0) = 2 \mathbb{P}(B_s \geq 0, B_t \geq 0).$$ Consequently, it suffices to compute the probability on the right-hand side. Conditioning on $\mathcal{F}_s := \sigma(B_r; r \leq s)$, we find from the Markov property that
$$\mathbb{P}(B_s \geq 0, B_t \geq 0)= \mathbb{E}\left( 1_{\{B_s \geq 0\}} \mathbb{P}(B_{t-s}+x \geq 0) \bigg|_{x=B_s}\right). \tag{1}$$
Denote by $\Phi$ the cdf of the standard Gaussian distribution. Since $B_{t-s}$ is Gaussian with mean $0$ and variance $t-s$, we have
$$\mathbb{P}(B_{t-s} + x \geq 0) = \mathbb{P}(B_{t-s} \geq -x) = \mathbb{P}(B_{t-s} \leq x) = \Phi(x/\sqrt{t-s}),$$
where the middle step uses the symmetry $B_{t-s} \stackrel{d}{=} -B_{t-s}$.
Hence, by $(1)$,
$$\mathbb{P}(B_s \geq 0,B_t \geq 0) = \mathbb{E}(1_{\{B_s \geq 0\}} \Phi(B_s/\sqrt{t-s})).$$
Since $B_s \stackrel{d}{=} \sqrt{s}\, B_1$, if we denote by $\varphi$ the density of the standard Gaussian distribution, then we can write this equivalently as
$$\mathbb{P}(B_s \geq 0,B_t \geq 0) = \int_0^{\infty} \Phi(\sqrt{s} x/\sqrt{t-s}) \varphi(x) \, dx.$$
The latter integral can be calculated explicitly (see e.g. this answer), and we get
$$\mathbb{P}(B_s \geq 0,B_t \geq 0) = \frac{1}{4} + \frac{1}{2\pi} \arctan \sqrt{\frac{s}{t-s}}.$$
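Since the closed form is quoted from an external answer, here is a quick quadrature check in Python (illustrative only; the times $s, t$ are arbitrary choices):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

s, t = 1.0, 3.0  # any 0 < s < t works

# Left-hand side: the integral from the previous display.
integral, _ = quad(lambda x: norm.cdf(np.sqrt(s) * x / np.sqrt(t - s)) * norm.pdf(x),
                   0, np.inf)

# Right-hand side: the claimed closed form.
closed_form = 0.25 + np.arctan(np.sqrt(s / (t - s))) / (2 * np.pi)
print(integral, closed_form)  # both ~ 0.34797
```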
Hence,
$$\mathbb{P}(B_s B_t \geq 0)= \frac{1}{2} + \frac{1}{\pi} \arctan \sqrt{\frac{s}{t-s}}.$$
A short sanity check: for $s=t$, both sides equal $1$, as they should.
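For a final end-to-end check, a short Monte Carlo sketch (again illustrative; the times, seed, and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)
s, t, n = 1.0, 3.0, 10**6  # any 0 < s < t works

# Sample (B_s, B_t) via independent increments.
Bs = rng.normal(scale=np.sqrt(s), size=n)
Bt = Bs + rng.normal(scale=np.sqrt(t - s), size=n)

empirical = np.mean(Bs * Bt >= 0)
exact = 0.5 + np.arctan(np.sqrt(s / (t - s))) / np.pi
print(empirical, exact)  # agree to about three decimal places
```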