Probability Theory – Unbiased Estimator of Exponential Measure of a Set: Detailed Explanation

measure-theory, monte-carlo, probability, unbiased-estimator

Suppose we have a (measurable and suitably well-behaved) set $S\subseteq B\subset\mathbb R^n$, where $B$ is compact. Moreover, suppose we can draw samples from the uniform distribution over $B$ with respect to the Lebesgue measure $\lambda(\cdot)$, and that we know the measure $\lambda(B)$. For example, perhaps $B$ is a box $[-c,c]^n$ containing $S$.

For fixed $\alpha\in\mathbb R$, is there a simple unbiased way to estimate $e^{-\alpha \lambda(S)}$ by uniformly sampling points in $B$ and checking if they are inside or outside of $S$?

As an example of something that doesn't quite work, suppose we sample $k$ points $p_1,\ldots,p_k\sim\textrm{Uniform}(B)$. Then we can use the Monte Carlo estimate $$\lambda(S)\approx \hat\lambda:= \frac{\#\{p_i\in S\}}{k}\lambda(B).$$
But, while $\hat\lambda$ is an unbiased estimator of $\lambda(S)$, the plug-in estimator $e^{-\alpha\hat\lambda}$ is not unbiased for $e^{-\alpha\lambda(S)}$: since $x\mapsto e^{-\alpha x}$ is strictly convex for $\alpha\neq 0$, Jensen's inequality gives $\mathbf{E}\left[e^{-\alpha\hat\lambda}\right] > e^{-\alpha\lambda(S)}$ whenever $\hat\lambda$ is non-degenerate. Is there some way to modify this algorithm?
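To make the bias concrete, here is a small numerical sketch in an assumed one-dimensional setup (not from the question itself): $B=[-c,c]$ with $c=2$ and $S=[0,1]$, so $\lambda(S)=1$ and the target is $e^{-\alpha}$. The function name `naive_plugin_estimate` is illustrative.

```python
import math
import random

def naive_plugin_estimate(alpha, k, trials, c=2.0, seed=0):
    """Average exp(-alpha * lambda_hat) over many independent lambda_hat.

    Assumed setup: B = [-c, c], S = [0, 1], so lambda(S) = 1, lambda(B) = 2c.
    """
    rng = random.Random(seed)
    lam_B = 2 * c
    total = 0.0
    for _ in range(trials):
        # Hit-or-miss count of uniform samples landing in S = [0, 1].
        hits = sum(0.0 <= rng.uniform(-c, c) <= 1.0 for _ in range(k))
        lam_hat = hits / k * lam_B            # unbiased for lambda(S)
        total += math.exp(-alpha * lam_hat)   # plug-in: biased for the target
    return total / trials
```

With, say, `naive_plugin_estimate(1.0, 20, 20000)`, the average lands noticeably above the true value $e^{-1}\approx 0.368$, consistent with the upward bias that Jensen's inequality predicts.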

Best Answer

Suppose that you have the following resources available to you:

  1. You have access to an estimator $\hat{\lambda}$.
  2. $\hat{\lambda}$ is unbiased for $\lambda ( S )$.
  3. $\hat{\lambda}$ is almost surely bounded above by $C$.
  4. You know the constant $C$, and
  5. You can form independent realisations of $\hat{\lambda}$ as many times as you'd like.

Now, note that for any $u > 0$, the following holds (by the Taylor expansion of $\exp x$):

\begin{align} e^{-\alpha \lambda ( S ) } &= e^{-\alpha C} \cdot e^{\alpha \left( C - \lambda ( S ) \right)} \\ &= e^{- \alpha C} \cdot \sum_{k \geqslant 0} \frac{ \left( \alpha \left[ C - \lambda ( S ) \right] \right)^k}{ k! } \\ &= e^{- \alpha C} \cdot e^u \cdot \sum_{k \geqslant 0} \frac{ e^{-u} \cdot \left( \alpha \left[ C - \lambda ( S ) \right] \right)^k}{ k! } \\ &= e^{u -\alpha C} \cdot \sum_{k \geqslant 0} \frac{ u^k e^{-u} }{ k! } \left(\frac{ \alpha \left[ C - \lambda ( S ) \right]}{u} \right)^k \end{align}

Now, do the following:

  1. Sample $K \sim \text{Poisson} ( u )$.
  2. Form $\hat{\lambda}_1, \ldots, \hat{\lambda}_K$ as iid copies of the unbiased estimator $\hat{\lambda}$ (if $K = 0$, skip this step and take the product below to be empty, i.e. equal to $1$).
  3. Return the estimator

$$\hat{\Lambda} = e^{u -\alpha C} \cdot \left(\frac{ \alpha }{u} \right)^K \cdot \prod_{i = 1}^K \left\{ C - \hat{\lambda}_i \right\}.$$
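The three steps above can be sketched in Python. This is a hedged illustration under an assumed one-dimensional setup ($B=[-c,c]$ with $c=2$, $S=[0,1]$, so $\lambda(S)=1$ and $C=\lambda(B)=4$ is a valid almost-sure bound); all function names are invented for the example. The Poisson draw uses Knuth's multiplication method so the sketch needs only the standard library.

```python
import math
import random

def poisson_sample(u, rng):
    """Draw K ~ Poisson(u) via Knuth's multiplication method."""
    threshold, k, p = math.exp(-u), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def sample_lambda_hat(rng, c=2.0, k_points=20):
    """Hit-or-miss estimate of lambda(S), unbiased and a.s. bounded by 2c.

    Assumed setup: B = [-c, c], S = [0, 1].
    """
    hits = sum(0.0 <= rng.uniform(-c, c) <= 1.0 for _ in range(k_points))
    return hits / k_points * (2 * c)

def exp_measure_estimator(alpha, u, C, rng):
    """One realisation of Lambda-hat, unbiased for exp(-alpha * lambda(S))."""
    K = poisson_sample(u, rng)
    prod = 1.0                       # empty product when K = 0
    for _ in range(K):
        prod *= C - sample_lambda_hat(rng)
    return math.exp(u - alpha * C) * (alpha / u) ** K * prod
```

Any $u > 0$ gives an unbiased estimator; $u$ only affects the variance and the expected number of inner estimates ($\mathbf{E}[K] = u$), so it can be tuned. Averaging many realisations of `exp_measure_estimator(1.0, 4.0, 4.0, rng)` should concentrate near $e^{-1}$.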

$\hat{\Lambda}$ is then an unbiased estimator of $e^{-\alpha \lambda ( S )}$; for $\alpha \geqslant 0$ it is also non-negative, since each factor $C - \hat{\lambda}_i$ is almost surely non-negative. This is because

\begin{align} \mathbf{E} \left[ \hat{\Lambda} | K \right] &= e^{u -\alpha C} \cdot \left(\frac{ \alpha }{u} \right)^K \mathbf{E} \left[ \prod_{i = 1}^K \left\{ C - \hat{\lambda}_i \right\} | K \right] \\ &= e^{u -\alpha C} \cdot \left(\frac{ \alpha }{u} \right)^K \prod_{i = 1}^K \mathbf{E} \left[ C - \hat{\lambda}_i \right] \\ &= e^{u -\alpha C} \cdot \left(\frac{ \alpha }{u} \right)^K \prod_{i = 1}^K \left[ C - \lambda ( S ) \right] \\ &= e^{u -\alpha C} \cdot \left(\frac{ \alpha }{u} \right)^K \left[ C - \lambda ( S ) \right]^K \end{align}

and thus

\begin{align} \mathbf{E} \left[ \hat{\Lambda} \right] &= \mathbf{E}_K \left[ \mathbf{E} \left[ \hat{\Lambda} | K \right] \right] \\ &= \mathbf{E}_K \left[ e^{u -\alpha C} \cdot \left(\frac{ \alpha }{u} \right)^K \left[ C - \lambda ( S ) \right]^K \right] \\ &= e^{u -\alpha C} \cdot \sum_{k \geqslant 0} \mathbf{P} ( K = k ) \left(\frac{ \alpha }{u} \right)^k \left[ C - \lambda ( S ) \right]^k \\ &= e^{u -\alpha C} \cdot \sum_{k \geqslant 0} \frac{ u^k e^{-u} }{ k! } \left(\frac{ \alpha \left[ C - \lambda ( S ) \right]}{u} \right)^k \\ &= e^{-\alpha \lambda ( S ) } \end{align}

by the earlier calculation.
