Solved – Find a complete sufficient statistic

estimation, inference, self-study, sufficient-statistics

Let $X_1,\dots,X_n$ be iid observations. Find a complete sufficient statistic for

i) $f(x|\theta)=\frac{\theta}{(1+x)^{1+\theta}}I_{[0,\infty)}(x), \quad \theta>0$

What I did
$$\frac{\theta}{(1+x)^{1+\theta}}=\theta (1+x)^{-(1+\theta)}=\theta e^{\log (1+x)^{-(1+\theta)}}=\theta e^{-(1+\theta)\log(1+x)}$$
$$f_n(x|\theta)=\theta^n e^{-(1+\theta)\sum \log(1+x_i)}$$ so $T(X)=\sum \log(1+X_i)$ is a sufficient statistic. But in general, how do I check whether a statistic is complete?
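As a quick sanity check on the factorization above, one can verify numerically that the joint density computed as a product of marginals matches the factored form depending on the data only through $T$. A minimal sketch using numpy; the sampler uses the inverse CDF $F(x)=1-(1+x)^{-\theta}$ implied by the density, and the parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n = 2.0, 5

# Inverse-CDF sampling: F(x) = 1 - (1 + x)^(-theta) on [0, inf),
# so X = (1 - U)^(-1/theta) - 1 with U ~ Uniform(0, 1).
u = rng.uniform(size=n)
x = (1 - u) ** (-1 / theta) - 1

# Joint density computed directly as a product of marginals...
direct = np.prod(theta / (1 + x) ** (1 + theta))

# ...and via the factored form theta^n * exp(-(1 + theta) * T),
# where T = sum(log(1 + x_i)) is the sufficient statistic.
T = np.log(1 + x).sum()
factored = theta**n * np.exp(-(1 + theta) * T)

print(np.isclose(direct, factored))  # True
```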

Can anyone give me a brief explanation of the completeness theorem for exponential families? That theorem, as stated at http://cseweb.ucsd.edu/~elkan/291winter2005/lect08.pdf, requires the parameter space to contain an open set in $\mathbb{R}^k$. But in an exercise in Casella, the theorem is applied when the parameter is restricted to the range $0\leq \theta \leq 1$.

Best Answer

I'll sketch out an approach but I'll leave the details up to you.

You can show that $\log(1 + X) \sim \Gamma(1, \theta^{-1})$. Use this to find the distribution of your sufficient statistic.

Then you need to suppose that $E_\theta(g(T)) = 0$ for all $\theta \in \Theta$, where $g$ is an arbitrary function, i.e. $$ \int \limits_0^\infty g(t) f_T(t)\,dt = 0 $$

where I'm ignoring parameters. You'll need to fill those in appropriately.

Use this to make a statement about $g$ so that you can conclude that $P(g(T) = 0) = 1$ for all $\theta \in \Theta$.

Edit: Because you know the exponential family result, I'll go through this proof in more detail.

Let $Y = \log(1+X)$. Note that this transformation is one-to-one. The inverse transformation is $X = \exp(Y) - 1$, so the Jacobian is $e^Y$. Putting this together we have $$ f_Y(y|\theta) = f_X(e^y - 1|\theta) \times e^y $$ $$ = \frac{\theta e^y}{(1 + e^y - 1)^{\theta + 1}} \times I(0 < y < \infty) $$ $$ = \theta e^{-\theta y} =_d \Gamma(1, \theta^{-1}) $$ where I'm dropping the indicator because the support doesn't depend on $\theta$, so it's not too important.
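This transformation result is easy to check by simulation. A sketch using numpy, again sampling $X$ through the inverse CDF $F(x)=1-(1+x)^{-\theta}$ (parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, N = 3.0, 200_000

# Sample X via the inverse CDF F(x) = 1 - (1 + x)^(-theta),
# then transform: Y = log(1 + X).
u = rng.uniform(size=N)
x = (1 - u) ** (-1 / theta) - 1
y = np.log(1 + x)

# If Y ~ Gamma(1, 1/theta), i.e. Exponential with rate theta,
# then E[Y] = 1/theta and Var(Y) = 1/theta^2.
print(y.mean())  # should be close to 1/3
print(y.var())   # should be close to 1/9
```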

This means that $T \sim \Gamma(n, \theta^{-1})$, and therefore $$ E(g(T)) = \int \limits_0^\infty g(t)\, t^{n-1} e^{-\theta t} \frac{\theta^n}{\Gamma(n)}\, dt =_{set} 0 $$

$$ \implies \int \limits_0^\infty g(t)\, t^{n-1} e^{-\theta t}\, dt = 0. $$

The left-hand side is the Laplace transform of $g(t)\, t^{n-1}$, and it vanishes for every $\theta > 0$. By uniqueness of the Laplace transform, $g(t)\, t^{n-1} = 0$ for almost every $t$; since $t^{n-1} > 0$ for all $t > 0$, it must be that $g(t) = 0$ almost surely, i.e. $P_\theta(g(T) = 0) = 1$ for all $\theta > 0$.
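The distributional claim $T \sim \Gamma(n, \theta^{-1})$ can also be checked by simulation. A sketch with numpy, comparing the simulated moments of $T$ to those of the Gamma distribution (sample sizes and parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 2.0, 10, 100_000

# Draw `reps` samples of size n from the original density via the
# inverse CDF, and form T = sum(log(1 + X_i)) for each sample.
u = rng.uniform(size=(reps, n))
x = (1 - u) ** (-1 / theta) - 1
t = np.log(1 + x).sum(axis=1)

# Gamma(n, 1/theta) has mean n/theta and variance n/theta^2.
print(t.mean())  # should be close to n/theta = 5.0
print(t.var())   # should be close to n/theta^2 = 2.5
```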