Difference between the factorization theorem and the Fisher-Neyman theorem for $t$ to be a sufficient statistic for $\theta$

parameter-estimation, self-learning, statistical-inference, statistics


The factorization theorem says that for a statistic $t$ to be sufficient for $\theta$, the joint density must equal the product of two functions: one that is a function of the random sample alone, and another that is a function of $t$ and $\theta$.

The Fisher-Neyman theorem, meanwhile, seems to say the same thing, except that instead of the second factor being just any function of $t$ and $\theta$, it adds that this factor must be the pdf (or pmf) of $t$ as a function of $\theta$.

Can you please explain the difference?

Best Answer

It seems you are quite confused. The Fisher-Neyman factorization theorem says that if the density $f$ can be factored into nonnegative functions $g$ and $h$ such that $f(x) = h(x)\,g(T(x), \theta)$, then $T(x)$ is a sufficient statistic for $\theta$. Usually people say "factorization theorem" for short; the two names refer to the same result, not two different theorems.
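To make this concrete, here is a standard textbook example (the Bernoulli model is chosen here purely for illustration). Let $X_1, \dots, X_n$ be i.i.d. Bernoulli($\theta$). The joint pmf factors as

$$
f(x \mid \theta) = \prod_{i=1}^{n} \theta^{x_i} (1-\theta)^{1-x_i}
= \underbrace{\theta^{\sum_i x_i} (1-\theta)^{\,n-\sum_i x_i}}_{g(T(x),\,\theta)} \cdot \underbrace{1}_{h(x)},
\qquad T(x) = \sum_{i=1}^{n} x_i,
$$

so the factorization theorem gives that $T(x) = \sum_i x_i$ is sufficient for $\theta$. Note that $g$ need not be the pmf of $T(x)$: here $T(x)$ is Binomial$(n, \theta)$, and $g$ is missing the $\binom{n}{t}$ factor that the Binomial pmf would carry. That is exactly why the extra condition you describe ("the factor must be the pdf or pmf of $t$") is not part of the theorem.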

By contrast, the definition of sufficiency is that $T(x)$ is a sufficient statistic for $\theta$ if $f(x \mid T(x))$ does not depend on $\theta$.
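Continuing the Bernoulli illustration above, you can check the definition directly: for any sample $x$ with $\sum_i x_i = t$,

$$
f(x \mid T(x) = t)
= \frac{\theta^{t} (1-\theta)^{n-t}}{\binom{n}{t}\, \theta^{t} (1-\theta)^{n-t}}
= \binom{n}{t}^{-1},
$$

which does not depend on $\theta$. The practical value of the factorization theorem is that it lets you establish sufficiency without ever computing this conditional distribution.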