Statistical Estimation – Finding the MVUE of a Circle’s Center with Unknown Location

self-study, unbiased-estimator, uniform-distribution

Is there a known analytic solution for the minimum variance unbiased estimator (MVUE) of the center of a disk at an unknown location, given that a sample of $n$ points is drawn uniformly at random from the disk of known radius $r$; and, if so, what is it?

Intuitively, I would expect it to be the midpoint of the longest segment among all pairs of sample points, but that estimator might discard information contributed by the added dimension.
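To make that intuition checkable, here is a quick Monte Carlo sketch (the helper names `sample_disk` and `farthest_pair_midpoint` are mine, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_disk(n, center, r):
    """Draw n points uniformly from the disk of radius r around center."""
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    s = r * np.sqrt(rng.uniform(0.0, 1.0, n))  # sqrt gives uniform area density
    return center + np.column_stack((s * np.cos(phi), s * np.sin(phi)))

def farthest_pair_midpoint(pts):
    """Midpoint of the longest segment over all pairs of sample points."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(d), d.shape)
    return (pts[i] + pts[j]) / 2.0

center, r, n = np.array([1.0, -2.0]), 1.0, 20
estimates = np.array([farthest_pair_midpoint(sample_disk(n, center, r))
                      for _ in range(5000)])
print("average estimate:", estimates.mean(axis=0))  # close to the true center
print("empirical MSE:   ", ((estimates - center) ** 2).sum(axis=1).mean())
```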

EDIT
Unfortunately, I am stuck rather early in the process, and I am aware that an MVUE may not exist; a unique maximum likelihood estimator certainly does not.

First, I define $$\hat{\theta}_x=\hat{\theta}_x(x_1,x_2,\dots,x_n)$$ and $$\hat{\theta}_y=\hat{\theta}_y(y_1,y_2,\dots,y_n).$$

Then, I am trying to solve for $$\min_{\hat{\theta}}\mathbb{E}[(\theta_x-\hat{\theta}_x)^2+(\theta_y-\hat{\theta}_y)^2]$$ subject to $$\mathbb{E}(\theta_x-\hat{\theta}_x)=0$$ and $$\mathbb{E}(\theta_y-\hat{\theta}_y)=0$$ and $$(x_i-\theta_x)^2+(y_i-\theta_y)^2\le{r}^2$$ and $$f((x_i,y_i)|(\theta_x,\theta_y))=\frac{1}{\pi{r^2}}.$$

I had hoped that if I built a Lagrangian, the inequality constraints would help determine $\hat{\theta}$, even if only through some envelope condition; but, of course, since $\hat{\theta}$ does not appear in the inequalities, they drop out entirely.

The other thought I had was to use, in some way, the fact that the variances of $x$ and $y$ are known in order to constrain the estimator.
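For reference, those variances follow directly from the uniform-disk model in polar coordinates: $$\operatorname{Var}(x_i)=\mathbb{E}[(x_i-\theta_x)^2]=\frac{1}{\pi r^2}\int_0^{2\pi}\!\int_0^r (s\cos\varphi)^2\, s\,\mathrm{d}s\,\mathrm{d}\varphi=\frac{1}{\pi r^2}\cdot\pi\cdot\frac{r^4}{4}=\frac{r^2}{4},$$ and likewise $\operatorname{Var}(y_i)=r^2/4$.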

Best Answer

Here is a homework problem from Mark Schervish's Theory of Statistics that addresses a similar question:

  1. Let $(X_1, Y_1),\dots,(X_n, Y_n)$ be conditionally IID with the uniform distribution on the disk of radius $r$ centered at $(\theta_1, \theta_2)$ in $\mathbb R^2$, given $(\Theta_1, \Theta_2, R)=(\theta_1, \theta_2, r)$.

    a. If $(\Theta_1, \Theta_2)$ is known, find a minimal sufficient statistic for $R$.

    b. If all parameters are unknown, show that the convex hull of the sample points is a sufficient statistic.

When only the center $\theta$ is unknown, the likelihood is constant over the set $$\mathfrak O = \bigcap_{i=1}^n \{\theta; d(x_i,\theta)\le r\}$$ which is therefore a (set-valued) "sufficient statistic". There is, however, no sufficient statistic in the classical sense, and any value in $\mathfrak O$ is an MLE.
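In code, membership in $\mathfrak O$, and with it the flat likelihood, is immediate; a minimal sketch with illustrative helper names:

```python
import numpy as np

def in_mle_region(theta, pts, r):
    """True iff theta lies in O, the intersection of the closed disks of
    radius r centered at the sample points; every such theta is an MLE."""
    return np.max(np.linalg.norm(pts - np.asarray(theta), axis=1)) <= r

def log_likelihood(theta, pts, r):
    """The likelihood equals (1/(pi r^2))^n on O and is zero outside it."""
    n = len(pts)
    return -n * np.log(np.pi * r**2) if in_mle_region(theta, pts, r) else -np.inf
```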

The problem can be considered from a different perspective, namely as a location model, since $$Z_1,\ldots,Z_n\sim\mathcal U(\mathcal B(\theta,r))$$ is the translation by $\theta$ of an [ancillary] sample $$U_1,\ldots,U_n\sim\mathcal U(\mathcal B(0,r)).$$ One can therefore consider the best equivariant estimator¹ associated with this problem (and squared error loss), $$\hat\theta^{\text{P}} = \dfrac{\int \theta\,\mathbb I_{\max_i \vert\theta-z_i\vert<r}\,\text d\theta}{\int \mathbb I_{\max_i \vert\theta-z_i\vert<r}\,\text d\theta},$$ as established by Pitman (1939); a numerical sketch of this estimator is given below. This best equivariant estimator is

  1. unbiased
  2. unique
  3. the center of the "sufficient" region mentioned above
  4. with constant risk (by construction)
  5. minimax (because of 4.)
  6. the MVUE, if the latter exists
  7. admissible under squared error loss (since the Stein phenomenon only occurs in dimensions 3 and higher for spherically symmetric distributions), which means that it cannot be dominated everywhere by another estimator
  8. but not sufficient.

Both 5. and 7. indicate that this estimator is minimum variance in a weak sense: there is no other estimator with a smaller maximal MSE (minimaxity), nor one with a strictly smaller MSE everywhere (admissibility).
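As a numerical check, the Pitman estimator, i.e. the centroid of $\mathfrak O$, can be approximated by rejection sampling over a bounding box of $\mathfrak O$; a minimal sketch, assuming the uniform-disk setup above (helper names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_disk(n, center, r):
    """Draw n points uniformly from the disk of radius r around center."""
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    s = r * np.sqrt(rng.uniform(0.0, 1.0, n))
    return center + np.column_stack((s * np.cos(phi), s * np.sin(phi)))

def pitman_estimate(pts, r, n_draws=100_000):
    """Monte Carlo centroid of O = {theta : max_i |theta - z_i| <= r}."""
    lo = pts.max(axis=0) - r  # theta must be within r of every sample point,
    hi = pts.min(axis=0) + r  # so O sits inside the box [lo, hi] x [lo, hi]
    theta = rng.uniform(lo, hi, size=(n_draws, 2))
    dists = np.linalg.norm(theta[:, None, :] - pts[None, :, :], axis=-1)
    return theta[dists.max(axis=1) <= r].mean(axis=0)

# One simulated sample: compare with the (also unbiased) sample mean
pts = sample_disk(20, np.array([0.5, 0.5]), 1.0)
print("Pitman estimate:", pitman_estimate(pts, 1.0))
print("sample mean:    ", pts.mean(axis=0))
```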


¹See Theory of Point Estimation, by Lehmann and Casella, for a textbook entry on equivariance and best equivariant estimators.