Solved – Entropy and Likelihood Relationship

entropy

This is a theoretical question.

Suppose I have a sample $s_1$ coming from a distribution $K$ and a sample $s_2$ coming from a distribution $M$, but I don't know what $K$ or $M$ are. I hypothesize that $s_1$ and $s_2$ both come from a distribution $T$. I plug $s_1$ and $s_2$ into the pdf of $T$ and compute their likelihoods, say $L(s_1)$ and $L(s_2)$. If I know the differential entropies of $K$ and $M$, namely $H(K)$ and $H(M)$, can I find a relation between $L(s_1)$ and $L(s_2)$ in terms of $H(K)$ and $H(M)$? Basically, what I am trying to show is something along the lines of this:

$\frac{L(s_1)}{L(s_2)} \sim \frac{e^{H(K)}}{e^{H(M)}}$

Is this even meaningful?
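
For concreteness, here is a minimal numerical sketch of this setup (the Gaussian choices for $K$, $M$, and $T$ are purely illustrative assumptions, not part of the question): it draws $s_1$ and $s_2$, evaluates their average log-likelihoods under $T$, and prints $H(K)$ and $H(M)$ so the proposed relation can be checked empirically.

```python
import numpy as np
from scipy import stats

# Illustrative (assumed) choices for the unknown K and M and the hypothesis T.
K = stats.norm(loc=0.0, scale=1.0)   # true distribution of s1
M = stats.norm(loc=0.0, scale=2.0)   # true distribution of s2
T = stats.norm(loc=0.0, scale=1.5)   # hypothesized distribution

rng = np.random.default_rng(0)
n = 50_000
s1 = K.rvs(size=n, random_state=rng)
s2 = M.rvs(size=n, random_state=rng)

# Average log-likelihood per observation under T, i.e. log L(s)/n.
logL1 = T.logpdf(s1).mean()
logL2 = T.logpdf(s2).mean()

# Differential entropies of the true distributions.
H_K, H_M = K.entropy(), M.entropy()

print("per-observation log L under T:", logL1, logL2)
print("observed log-ratio per observation:", logL1 - logL2)
print("H(K) - H(M) (log of the proposed ratio):", H_K - H_M)
```

Nothing here is specific to Gaussians; scipy's frozen distributions are used only because `.logpdf` and `.entropy` give the likelihoods and the differential entropies directly.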

Thanks

Best Answer

It is known that maximizing the likelihood of the data is equivalent to minimizing the Kullback-Leibler divergence between the hypothesized distribution family $T$ and the true distribution ($K$ or $M$). The Kullback-Leibler divergence is related to differential entropy, but the relationship is not as simple as the proportionality you propose: the expected log-likelihood depends on a KL term as well as the entropy. See also link.
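
To make that explicit, here is the standard identity behind the answer, with $t$ and $k$ denoting the densities of $T$ and $K$ (notation mine):

$$\mathbb{E}_{X \sim K}\big[\log t(X)\big] = \int k(x)\,\log t(x)\,dx = -H(K) - D_{\mathrm{KL}}(K \,\|\, T),$$

and likewise with $M$ in place of $K$. If $s_1$ and $s_2$ are i.i.d. samples of the same size $n$, the law of large numbers then gives

$$\log\frac{L(s_1)}{L(s_2)} \approx n\Big[\big(H(M) + D_{\mathrm{KL}}(M \,\|\, T)\big) - \big(H(K) + D_{\mathrm{KL}}(K \,\|\, T)\big)\Big],$$

so the ratio depends on the KL terms as well as the entropies, and it collapses to a pure entropy comparison only when $D_{\mathrm{KL}}(K \,\|\, T) = D_{\mathrm{KL}}(M \,\|\, T)$ (and even then with $e^{-H}$ rather than $e^{H}$).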
