More random points on a circle

independence, mathematical modeling, probability, probability distributions, uniform distribution

Please assist with this problem.

Suppose 3 (distinct) points are uniformly and independently distributed on a circle of unit circumference (smaller than a unit circle!). This is really a circle and not a disc. Call one of these points $B$. Let $W$ be the distance from $B$ to its nearest neighbour, measured along the circle either clockwise or anti-clockwise, whichever is smaller.

Find the pdf of $W$. (Well, there's no measure theory needed for this problem, but I assume this pdf exists. Of course we can check for ourselves by computing the cdf $F_W(w) = P(W \le w)$ first and then hoping the cdf is absolutely continuous.)

A. My model:

  • A.1. The circle is in bijection with $[0,1)$, so let's call these 3 points $A,B,C$ s.t. they are iid $\sim \operatorname{Unif}(0,1)$ (or $[0,1)$, it doesn't really matter).

  • A.2. (Not sure if any measure theory needed here, but they are all distinct $\mathbb P$-almost surely.)

  • A.3. Let $A$ be the anti-clockwise neighbour of $B$ and $C$ the clockwise neighbour. (A small simulation sketch of this setup follows the list.)
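To make the model concrete, here is a rough sketch of one draw of it (my own illustration, assuming NumPy; the names `arc_acw`, `Z`, `U` are just placeholders for the quantities described above, and relabelling the two other points as the anti-clockwise/clockwise neighbours of $B$ is a separate step):

```python
import numpy as np

rng = np.random.default_rng(0)

# A.1: identify the circle with [0, 1) and draw three iid Unif[0, 1) points.
A, B, C = rng.random(3)   # almost surely distinct (A.2)

def arc_acw(frm, to):
    """Anti-clockwise arc length from `frm` to `to` on the unit-circumference circle."""
    return (to - frm) % 1.0

Z = arc_acw(B, A)   # anti-clockwise arc from B to A
U = arc_acw(C, B)   # anti-clockwise arc from C to B, i.e. the clockwise arc from B to C
```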

Question:

Well, I hope to find the pdf of $W$ via its cdf, which I think I can compute once I know what $W$ is. Is it $W = \min\{B-A, A-C\}$? Or is it $W = \min\{Z, U\}$, where $Z$ is the distance from $B$ to $A$ (anti-clockwise, of course) and $U$ is the distance from $B$ to $C$ (clockwise, of course; which I think is equivalent to saying 'the distance from $C$ to $B$ anti-clockwise')? Or something else? (Note: I asked about $Z$ here, but I hope I made this post self-contained.)


These questions are all related, but I hope I made each self-contained.

Best Answer

Let $X$ be the anticlockwise distance along the circle's circumference from $B$ to $A$. Let $Y$ be the anticlockwise distance from $B$ to $C$.

Since the three points are independently uniformly distributed along the circumference of the circle, $X$ and $Y$ are iid variables with uniform distributions on $[0,1).$
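In symbols, with the circle identified with $[0,1)$ as in the question's model: $X = (A - B) \bmod 1$ and $Y = (C - B) \bmod 1$. For any fixed value $B = b$, the map $a \mapsto (a - b) \bmod 1$ preserves the uniform distribution on $[0,1)$, so $X$ and $Y$ are indeed iid $\operatorname{Unif}[0,1)$.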

Since the clockwise arc from $B$ to $A$ is $1 - X,$ the distance from $B$ to $A$ is the smaller of $X$ and $1-X$. That is, if we let $X_m = \min\{X, 1 - X\}$, then $X_m$ is the distance from $B$ to $A$.

Similarly, let $Y_m = \min\{Y, 1 - Y\}$, then $Y_m$ is the distance from $B$ to $C$.
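As a quick illustration of this "shorter arc" computation (a sketch only; the function name is mine):

```python
def circle_dist(x, y):
    """Distance between x and y on a circle of circumference 1: the shorter of the two arcs."""
    d = abs(x - y) % 1.0
    return min(d, 1.0 - d)
```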

By symmetry, $X_m \in [0,\frac12]$ and $X_m$ is uniformly distributed (glossing over the fact that the zero-probability events $X_m=0$ and $X_m=\frac12$ can each happen in only one way). Likewise, $Y_m$ also is uniform on $[0,\frac12]$. Moreover, $X_m$ and $Y_m$ are iid.
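To spell out the step being glossed over: for $0 \le t \le \frac12$,
$$ P(X_m \le t) = P(X \le t) + P(X \ge 1 - t) = t + t = 2t, $$
which is the cdf of the uniform distribution on $[0,\frac12]$. Independence of $X_m$ and $Y_m$ holds because $X_m$ is a function of $X$ alone and $Y_m$ a function of $Y$ alone.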

The nearest neighbor of $B$ is $A$ if $X_m < Y_m$, but is $C$ if $X_m > Y_m$. So $W = \min\{ X_m, Y_m \}.$

So we want the pdf (or cdf) of the minimum of two iid variables uniformly distributed on $[0,\frac12]$. For $0 \leq w \leq \frac12$ we have

\begin{align} P(W < w) &= 1 - P(W \geq w)\\ & = 1 - P(X_m \geq w) P(Y_m \geq w) \\ & = 1 - \left(2\left(\frac 12 - w\right)\right)^2 \end{align}

with $F_W(w) = 0$ for $w < 0$ and $F_W(w) = 1$ for $w \geq \frac12$. The cdf on $[0,\frac12]$ is therefore $$ F_W(w) = 1 - (1 - 2w)^2. $$

The pdf is easily found from this.
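Explicitly, differentiating on $(0,\frac12)$ gives $f_W(w) = 4(1 - 2w)$ for $0 \le w \le \frac12$, and $f_W(w) = 0$ otherwise. If you want to double-check the cdf numerically, here is a rough Monte Carlo sketch (my own illustration, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Three iid Unif[0,1) points per trial; call the second one B.
pts = rng.random((n, 3))
A, B, C = pts[:, 0], pts[:, 1], pts[:, 2]

def circle_dist(x, y):
    """Shorter-arc distance on a circle of circumference 1 (vectorized)."""
    d = np.abs(x - y) % 1.0
    return np.minimum(d, 1.0 - d)

# W = distance from B to its nearest neighbour.
W = np.minimum(circle_dist(B, A), circle_dist(B, C))

# Empirical cdf vs F_W(w) = 1 - (1 - 2w)^2 at a few points.
for w in (0.1, 0.25, 0.4):
    print(w, (W <= w).mean(), 1 - (1 - 2 * w) ** 2)
```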