Solved – Expectation of Sufficient Statistic

expectation-maximization, expected-value, exponential-family, self-study

Consider $X \sim B(n,p)$ with pmf $P(X=x) = {{n}\choose{x}} p^x (1-p)^{n-x}$.

The general form of an exponential-family distribution is $p(x|\theta) = f(x)\, g(\theta)\, e^{\phi(\theta)^T T(x)}$. Writing the binomial distribution in this form, I get $f(x) = {n \choose x}$, $g(p)=(1-p)^n$, $\phi(p)=\log{\frac{p}{1-p}}$, $T(x) = x$.
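Spelled out, the factorization is just

$$\binom{n}{x}\, p^x (1-p)^{n-x} = \binom{n}{x}\,(1-p)^n \left(\frac{p}{1-p}\right)^{x} = \binom{n}{x}\,(1-p)^n\, e^{x \log\frac{p}{1-p}},$$

which matches $f(x)\,g(p)\,e^{\phi(p) T(x)}$ term by term.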

In other words, $x$, the number of successes, is a sufficient statistic for the probability of success on each Bernoulli trial, as is to be expected.

I am asked to find:

(1) the expected value of the sufficient statistic, $\langle T(x)\rangle_{p(x|\theta)}$ (often called the "mean" or "moment" parameter of the distribution).

(2) the maximum likelihood value of the mean parameter in (1).

I have to be honest, I don't really understand the difference between the two questions. So far I have attempted this:

(1) $\theta=p$, so the likelihood $P(x|p)$ is the probability of getting $x$ successes in $n$ trials given a per-trial success probability $p$, i.e. $P(x|p) = {n \choose x} p^x(1-p)^{n-x}$.

Then $\langle T(x)\rangle = \sum_x T(x)P(x|p) = \sum_x x {n \choose x} p^x(1-p)^{n-x}$.

At this point, I'm not sure what to do.

(2) I know that for the MLE I should find the maximum of $\log L(p) = \log P(x|p) = \log{n \choose x} + n \log(1-p) + x \log{\frac{p}{1-p}}$. If I differentiate and set the result to zero, I get $x = T(x) = np$, which seems right, as this is the expected value of $X$ under a binomial distribution.
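For reference, the differentiation step is

$$\frac{d}{dp}\log L(p) = -\frac{n}{1-p} + \frac{x}{p(1-p)} = \frac{x - np}{p(1-p)},$$

so setting the derivative to zero gives the stationarity condition $x = np$.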

Am I at all on the right lines here?

Best Answer

Hint #0: computing the expectation of a sufficient statistic (of which there are infinitely many if there is one) is not equivalent, or even related, to finding the maximum likelihood estimate of $\theta$ (which, for one thing, depends on the choice of parameterisation $\theta$)

Hint #1: check Wikipedia for the expectation of a Binomial $\mathcal B(n,p)$ random variable
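As a sanity check, here is a minimal numerical sketch (plain Python 3.8+, standard library only; the function name is just for illustration) that evaluates the sum from the question directly:

```python
from math import comb

def mean_sufficient_stat(n: int, p: float) -> float:
    """Evaluate E[T(X)] = sum_x x * C(n, x) * p**x * (1 - p)**(n - x) directly."""
    return sum(x * comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1))

# The direct sum agrees with the closed form n * p.
n, p = 10, 0.3
print(mean_sufficient_stat(n, p))  # ~3.0, i.e. n * p
```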

Hint #2: solve the likelihood equation in $p$, not in $x$
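(A sketch of where hint #2 leads, to connect the two parts: solving the stationarity condition $x = np$ in $p$ gives

$$\hat p = \frac{x}{n},$$

and by invariance of the MLE under reparameterisation, the maximum likelihood value of the mean parameter $\mathbb{E}[T(X)] = np$ is then $n\hat p = x$.)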
