Solved – Maximum likelihood estimation of the multivariate parameters of a Dirichlet distribution

dirichlet distribution, maximum likelihood

Is it necessary to find the maximum likelihood estimates of the prior Dirichlet parameters, after finding their initial values through the method of moments, in order to find posterior probabilities through Bayesian analysis (using the Dirichlet–multinomial conjugate pair)? If yes, I need an easy way to find the maximum likelihood estimates of the Dirichlet parameters. I did the Bayesian analysis with the initial guess made through the method of moments, without finding the MLE. Is it correct to do so?
E.g., let A, B, C be three dependent variables with prior probabilities of occurrence ($P_{ij}$'s), so j = 1, 2, 3 indexes the three variables.
I have three observations, from three different time periods, for each of A, B, C, so i = 1, 2, 3 indexes the observations. (The data given below is only an example, not the original.)
j=1 implies the variable A.
j=2 implies the variable B.
j=3 implies the variable C.
Let 0.1, 0.4, and 0.2 be the three observations (for i = 1, 2, 3) of variable A.
Then 0.6, 0.5, and 0.7 are the three observations of variable B.
And 0.3, 0.1, and 0.1 are the three observations of variable C.
I fitted this data with a Dirichlet distribution.
With these prior $P_{ij}$'s I made an initial guess of the Dirichlet parameters (alphas) by the method of moments, following a reference. In the reference, this initial guess of the alphas is used to find their maximum likelihood estimates.
But I skipped the MLE step and did the Bayesian analysis with multinomial data as my likelihood function (frequency counts of each variable), using the initial guess of the alphas to estimate the posterior probabilities of occurrence of the three dependent variables.
Is MLE necessary for the prior alphas?

Best Answer

You seem to be confusing many things in your question.

Is there any quick solution (either in statistical software or by manual workout) to find the maximum likelihood estimates of the alphas of the three independent variables of a Dirichlet distribution

First of all, the Dirichlet distribution is a multivariate distribution; I assume that you mean a trivariate distribution here. The individual variables are clearly not independent: if $(x_1, x_2, \dots, x_k)$ is a draw from a Dirichlet distribution, then $\sum_{i=1}^k x_i = 1$, so the components must be dependent to satisfy the constraint.
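A quick numerical check of this constraint (a sketch assuming NumPy; the concentration parameters below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
# One trivariate Dirichlet draw; the alphas here are arbitrary examples.
x = rng.dirichlet([2.0, 5.0, 3.0])
# The components always sum to 1 (up to floating-point error),
# so they cannot be independent.
print(x, x.sum())
```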

provided that the initial values of the three parameters are found by the method of moments.

What do you mean by "method of moments" here? There are many ways of estimating the parameters of a Dirichlet distribution (see, e.g., Minka, 2000; Huang, 2005); most of them maximize the likelihood numerically, and there is no simple closed-form solution.
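For concreteness, here is one common method-of-moments initialization applied to the example data from the question, followed by a generic numerical maximization of the Dirichlet log-likelihood. This is a sketch assuming NumPy and SciPy, not the procedure from any particular reference:

```python
import numpy as np
from scipy.special import gammaln
from scipy.optimize import minimize

# Example data from the question: rows are observations i = 1..3,
# columns are the variables A, B, C (each row sums to 1).
X = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.5, 0.1],
              [0.2, 0.7, 0.1]])

# Method-of-moments initial guess: with m_j = E[x_j] and
# a0 = sum of alphas, Var[x_j] = m_j (1 - m_j) / (a0 + 1),
# so the first component's mean and variance give an estimate of a0.
m = X.mean(axis=0)
v = X.var(axis=0, ddof=1)
a0 = m[0] * (1.0 - m[0]) / v[0] - 1.0
alpha_init = a0 * m

# Negative Dirichlet log-likelihood, to be minimized numerically.
def neg_loglik(alpha):
    n = X.shape[0]
    return -(n * (gammaln(alpha.sum()) - gammaln(alpha).sum())
             + ((alpha - 1.0) * np.log(X).sum(axis=0)).sum())

res = minimize(neg_loglik, alpha_init, method="L-BFGS-B",
               bounds=[(1e-6, None)] * 3)
alpha_mle = res.x
```

The method-of-moments guess is then refined by the optimizer; with only three observations, as here, the two estimates can differ noticeably.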

Or, Is the initial values sufficient to obtain posterior estimates in multinomial-dirichlet bayesian analysis?

To obtain the posterior in Bayesian analysis you do not need to find the maximum likelihood estimates of the parameters. Maximum likelihood and Bayesian estimation are two different approaches to estimating parameters: maximum likelihood finds the combination of parameters that maximizes the likelihood function, while in the Bayesian case you estimate the parameters in terms of the likelihood function and the priors. In the Dirichlet–multinomial model (which is not the same as the Dirichlet distribution), this is straightforward, since the Dirichlet is a conjugate prior for the multinomial distribution and we have a closed-form solution: the posterior value of the $k$-th $\alpha$ is $\alpha_k + y_k$, where $\alpha_k$ is your prior guess for the $k$-th parameter and $y_k$ is the observed number of successes for the $k$-th category in the multinomial distribution.
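The conjugate update is therefore a one-liner. A minimal sketch (the prior alphas and the counts below are made-up illustrative numbers, not values from the question):

```python
import numpy as np

alpha_prior = np.array([1.56, 4.00, 1.11])  # e.g. a method-of-moments guess
y = np.array([3, 9, 2])                     # hypothetical multinomial counts
alpha_post = alpha_prior + y                # conjugate Dirichlet update
post_mean = alpha_post / alpha_post.sum()   # posterior mean of each category's probability
```

No maximum likelihood step is involved anywhere in this update, which is the point of the answer above.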

Huang, J. (2005). Maximum likelihood estimation of Dirichlet distribution parameters. CMU Technical Report.

Minka, T. (2000). Estimating a Dirichlet distribution. Online draft.