[Math] Linear least squares estimation with a random sum

estimation-theory, probability-theory

Let $N$ be a geometric r.v. with mean $1/p$; let $A_1, A_2, \dots$ be a sequence of i.i.d. random variables, all independent of $N$, with mean $1$ and variance $1$; let $B_1, B_2, \dots$ be another sequence of i.i.d. random variables, all independent of $N$ and of $A_1, A_2, \dots$, also with mean $1$ and variance $1$. Let $A=\sum_{i=1}^N A_i$ and $B=\sum_{i=1}^N B_i$.

1) Find ${\bf E}[AB]$ and ${\bf E}[NA]$ using the law of iterated expectations. Express each answer in terms of $p$.

2) Let $\hat{N} = c_1 A + c_2$ be the linear least mean squares (LLMS) estimator of $N$ given $A$. Find $c_1$ and $c_2$ in terms of $p$.

I am stuck here and don't know how to apply the law of iterated expectations from the very beginning. Any hint would be helpful, thanks a lot.

Best Answer

I just figured out the answer; it is a good drill for understanding the basic idea behind the law of iterated expectations.

1) $E[AB]=E[E[AB\mid N]]=E[E[A\mid N]\,E[B\mid N]]$, since $A$ and $B$ are conditionally independent given $N$. Because $E[A\mid N]=E[B\mid N]=N$, we get $E[AB]=E[N^2]$. We know $E[N]=\frac{1}{p}$ and $\operatorname{Var}(N)=\frac{1-p}{p^2}$, so $E[N^2]=\operatorname{Var}(N)+(E[N])^2=\frac{2-p}{p^2}$. Similarly, $E[NA]=E[E[NA\mid N]]=E[N\,E[A\mid N]]=E[N^2]=\frac{2-p}{p^2}$.
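If you want to convince yourself numerically, here is a minimal Monte Carlo sketch. The choices `p = 0.25` and Exponential(1) summands are arbitrary illustrations (any distribution with mean 1 and variance 1 would do); NumPy's `geometric` draws $N$ on $\{1,2,\dots\}$ with mean $1/p$, matching the problem.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.25            # arbitrary choice for the check
trials = 100_000

# N ~ Geometric(p) on {1, 2, ...}, so E[N] = 1/p.
N = rng.geometric(p, size=trials)

# A and B are sums of N i.i.d. terms; Exponential(1) has mean 1, variance 1.
AB = np.empty(trials)
NA = np.empty(trials)
for t in range(trials):
    A = rng.exponential(1.0, N[t]).sum()
    B = rng.exponential(1.0, N[t]).sum()   # independent of the A-summands
    AB[t] = A * B
    NA[t] = N[t] * A

print("E[AB] ~", AB.mean(), " theory:", (2 - p) / p**2)
print("E[NA] ~", NA.mean(), " theory:", (2 - p) / p**2)
```

Both empirical averages should settle near $(2-p)/p^2$, which is $28$ for $p = 0.25$.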

2) After computing the moments in part 1, this part is smooth sailing:
$$\hat{N}=E[N] + \frac{\operatorname{Cov}(N,A)}{\operatorname{Var}(A)}\,(A-E[A]).$$
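For completeness, spelling out the remaining moments (the variance of $A$ comes from the law of total variance, with $E[\operatorname{Var}(A\mid N)] = E[N]$ because each $A_i$ has variance $1$):

$$\begin{aligned}
E[A] &= E\big[E[A\mid N]\big] = E[N] = \frac{1}{p},\\
\operatorname{Var}(A) &= E\big[\operatorname{Var}(A\mid N)\big] + \operatorname{Var}\big(E[A\mid N]\big) = \frac{1}{p} + \frac{1-p}{p^2} = \frac{1}{p^2},\\
\operatorname{Cov}(N,A) &= E[NA] - E[N]\,E[A] = \frac{2-p}{p^2} - \frac{1}{p^2} = \frac{1-p}{p^2},
\end{aligned}$$

so $c_1 = \frac{\operatorname{Cov}(N,A)}{\operatorname{Var}(A)} = 1-p$ and $c_2 = E[N] - c_1 E[A] = \frac{1}{p} - \frac{1-p}{p} = 1$, i.e. $\hat{N} = (1-p)A + 1$. As a sanity check, when $p \to 1$ we have $N = 1$ almost surely and the estimator collapses to $\hat{N} = 1$.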
