Solved – Bayesian posterior for Geometric Distribution

bayesian · geometric-distribution · self-study

I have the following homework problem I am trying to solve, but I am stuck on the posterior part. Note that the geometric distribution is a discrete distribution with a probability mass function of the form

$$\text{Pr}(X = k) = (1 - p)^k \cdot p;\quad k = 0,1,2,\dots;\, 0 < p < 1$$

Perform a Bayesian analysis for the parameter $p$. For the
prior distribution, I use a beta distribution with hyperparameters $a = 9$ and
$b = 3$.

Part a) What is E[p], the prior expected value of p?

Part b) Now suppose that I obtain 5 observations with values {6, 9, 8, 12, 7}

Calculate the posterior hyperparameters a* and b*.

Part c) What is E[ p | Data ], the posterior expectation of p?

Part d) Is the beta distribution a conjugate prior for the geometric distribution?

Essentially I am trying to do a beta-geometric model with the following information…

I figured out part (a) simply by using the mean of the beta prior: E[p] = a/(a+b) = 9/(9+3) = 0.75 (hope that's right). Now I just need help with parts (b)–(d).
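As a quick sanity check on part (a), the prior mean follows directly from the beta hyperparameters (a minimal sketch, using only the values given in the question):

```python
# Mean of a Beta(a, b) prior: E[p] = a / (a + b)
a, b = 9, 3
prior_mean = a / (a + b)
print(prior_mean)  # 0.75
```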

Best Answer

The prior is $\displaystyle \Big( \text{constant} \times r^{9-1} (1-r)^{3-1} \, dr \Big).$

The likelihood function is $$ L(r) = \Big( (1-r)^6 r\Big) \Big( (1-r)^9 r\Big) \Big( (1-r)^8 r\Big) \Big((1-r)^{12} r\Big) \Big( (1-r)^7 r\Big). $$ Notice that in the prior I included $\text{“}dr\text{''}$ since it is a measure, and in the likelihood I did not, since it is not a measure. Multiplying these and then normalizing by multiplying by the appropriate constant yields the posterior distribution, in which you will see $\text{“}dr\text{''}$ since it is a measure.
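Collecting the factors of $r$ and $(1-r)$ in the product above makes the structure explicit: each of the five observations contributes one factor of $r$, and the exponents of $(1-r)$ simply add:

$$L(r) = r^{5} (1-r)^{6+9+8+12+7} = r^{5} (1-r)^{42}.$$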

You will get $\left( \text{constant} \times r^{\text{something}\,-\,1} (1-r)^{\text{something} \, - \, 1} \, dr \right)\!,$ so it's a beta distribution, and you can use what you know about the beta distribution to find the posterior expected value.
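The whole conjugate update can be sketched in a few lines. This assumes the parameterization in the question, where each observation $k_i$ counts failures before the first success, so the update is $a^* = a + n$ and $b^* = b + \sum_i k_i$ (the numbers below are the data from part (b); work through the algebra yourself to confirm):

```python
# Beta-geometric conjugate update (sketch, assuming PMF (1-p)^k * p with k = 0, 1, 2, ...)
a, b = 9, 3                    # prior hyperparameters from the question
data = [6, 9, 8, 12, 7]        # observed failure counts

a_star = a + len(data)         # each observation contributes one factor of p
b_star = b + sum(data)         # failure counts accumulate in the (1-p) exponent

posterior_mean = a_star / (a_star + b_star)
print(a_star, b_star, posterior_mean)
```

Since the posterior is again a beta distribution, this also answers part (d): the beta prior is conjugate for the geometric likelihood.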
