However, I am stuck on how to manipulate equation [1] into the form of a Beta distribution.
The function $p$ defined by $p(\theta) = \theta^{\mu}(1-\theta)^{\tau-\mu}$ is proportional to the $\operatorname{Beta}(a,b)$ density with $a=\mu+1$ and $b=\tau-\mu+1$.
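This proportionality is easy to verify numerically. The sketch below (with made-up values for $\mu$ and $\tau$, not from the original post) integrates $p$ over $(0,1)$ and compares the result with the Beta normalising constant $B(a,b)=\Gamma(a)\Gamma(b)/\Gamma(a+b)$:

```python
import math

# Hypothetical example values for the exponents (chosen for illustration).
mu, tau = 3, 10
a, b = mu + 1, tau - mu + 1  # claimed Beta parameters

# Unnormalised density p(theta) = theta^mu * (1 - theta)^(tau - mu).
def p(theta):
    return theta**mu * (1 - theta)**(tau - mu)

# Numerically integrate p over (0, 1) with a midpoint rule.
n = 50_000
integral = sum(p((k + 0.5) / n) for k in range(n)) / n

# The Beta(a, b) normalising constant B(a, b) = Gamma(a)Gamma(b)/Gamma(a+b).
beta_const = math.gamma(a) * math.gamma(b) / math.gamma(a + b)

# If p is proportional to the Beta(a, b) density, the two constants agree.
print(abs(integral - beta_const) < 1e-8)  # True
```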
The density for each observation is $\displaystyle f_{X_i}(x) = \begin{cases} 1/2a & \text{if } -a\le x\le a, \\ 0 & \text{if } x<-a \text{ or } x>a. \end{cases}$
You didn't say your observations were independent, but I will assume that was intended. The joint density is therefore
$$
f_{X_1,\ldots,X_n} (x_1,\ldots,x_n) = \begin{cases} 1/(2a)^n & \text{if for every } i\in\{1,\ldots,n\} \text{ we have } -a\le x_i \le a, \\ 0 & \text{otherwise}. \end{cases}
$$
The condition that for every $i\in\{1,\ldots,n\}$ we have $-a\le x_i\le a$ is the same as $\min\{x_1,\ldots,x_n\} \ge -a$ and $\max\{x_1,\ldots,x_n\}\le a.$ That condition on $\min$ is the same as $-\min\{x_1,\ldots,x_n\} \le a.$ So we need $a\ge\max$ and $a\ge -\min.$ I leave it as an exercise to show that
$$
\Big( a\ge \max\{x_1,\ldots,x_n\} \text{ and } a\ge -\min\{x_1,\ldots,x_n\} \Big) \text{ if and only if } a \ge \max\{|x_1|,\ldots,|x_n|\}.
$$
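The claimed equivalence can also be spot-checked empirically. The sketch below (the sample size, ranges, and number of trials are arbitrary choices) compares both forms of the condition on many random draws:

```python
import random

# Empirical check of the exercise: (a >= max x_i and a >= -min x_i)
# holds exactly when a >= max |x_i|.
random.seed(0)

def both_forms_agree(xs, a):
    lhs = a >= max(xs) and a >= -min(xs)
    rhs = a >= max(abs(x) for x in xs)
    return lhs == rhs

all_ok = all(
    both_forms_agree([random.uniform(-5, 5) for _ in range(7)],
                     random.uniform(0, 6))
    for _ in range(10_000)
)
print(all_ok)  # True
```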
Therefore the likelihood function is
$$
L(a) = \begin{cases} 1/(2a)^n & \text{if } a \ge \max\{|x_1|,\ldots,|x_n|\}, \\ 0 & \text{otherwise.} \end{cases}
$$
Now notice that $L(a)$ increases as $a$ decreases, until $a$ gets down to $\max\{|x_1|,\ldots,|x_n|\},$ below which $L(a)=0.$ Therefore the MLE is $\hat a = \max\{|x_1|,\ldots,|x_n|\}.$
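As a sanity check, here is a minimal sketch (the sample size and the true value of $a$ are made up) that draws Uniform$(-a,a)$ data and confirms the likelihood is maximised at $\max\{|x_1|,\ldots,|x_n|\}$:

```python
import random

# Simulate Uniform(-a, a) data and check that the likelihood
# L(a) = 1/(2a)^n on [max|x_i|, infinity) is maximised at max|x_i|.
random.seed(1)
true_a = 2.0
xs = [random.uniform(-true_a, true_a) for _ in range(50)]
a_hat = max(abs(x) for x in xs)  # the MLE derived above

def likelihood(a, xs):
    # 1/(2a)^n if a >= max|x_i|, else 0 (some x_i falls outside [-a, a]).
    n = len(xs)
    return (2 * a) ** -n if a >= max(abs(x) for x in xs) else 0.0

# Any a below a_hat has zero likelihood; any a above it has a smaller one.
assert likelihood(a_hat, xs) > likelihood(a_hat + 0.1, xs) > 0.0
assert likelihood(0.9 * a_hat, xs) == 0.0
```

Note that $\hat a$ can never exceed the true $a$, since every $|x_i|$ is at most $a$; the MLE approaches $a$ from below as $n$ grows.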
Best Answer
Find $f(\theta \mid x)$, the posterior. By Bayes' theorem we know that $f(\theta \mid x) = C\, f(x \mid \theta)\, f(\theta)$, where $C$ is just a normalisation constant that makes the posterior integrate to $1$.
$f(\theta)$ is the PDF of the prior distribution, i.e. a Beta distribution with some parameters $(\alpha, \beta)$. Here $f(\theta) = C' \theta^{\alpha-1}(1-\theta)^{\beta-1}$.
$f(x \mid \theta)$ is the likelihood of the data $x$ given $\theta$, where each observation follows a geometric distribution with parameter $\theta$ (counting failures before the first success). The likelihood is $f(x \mid \theta) = \prod_{i=1}^n (1-\theta)^{x_i}\theta = (1-\theta)^{\sum_{i=1}^n x_i}\theta^n$.
Now we're going to take the product of the likelihood and the prior; we expect it to have the same form as the Beta prior, but with new parameters $\alpha', \beta'$, which we will identify.
So $f(x \mid \theta)f(\theta) = C' \theta^{\alpha+n-1}(1-\theta)^{\sum_{i=1}^n x_i+\beta-1}$. We can see the new parameters are $\alpha' = \alpha+n$ and $\beta' = \beta + \sum_{i=1}^n x_i$. Mission accomplished.
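The conjugate update can be verified numerically. The sketch below (with made-up prior parameters and data) applies the update $\alpha' = \alpha + n$, $\beta' = \beta + \sum_i x_i$ and cross-checks the resulting posterior mean against a grid approximation of the unnormalised posterior:

```python
import math

# Beta(alpha, beta) prior with geometric data counting failures before
# the first success (pmf (1 - theta)^x * theta). Values are illustrative.
alpha, beta = 2.0, 3.0
xs = [0, 2, 1, 4, 0, 3]          # observed failure counts
alpha_post = alpha + len(xs)      # alpha' = alpha + n
beta_post = beta + sum(xs)        # beta'  = beta + sum(x_i)

# Unnormalised posterior: likelihood times prior, evaluated pointwise.
def unnorm_posterior(theta):
    lik = math.prod((1 - theta) ** x * theta for x in xs)
    prior = theta ** (alpha - 1) * (1 - theta) ** (beta - 1)
    return lik * prior

# Grid approximation of the posterior mean via a midpoint rule.
n = 20_000
grid = [(k + 0.5) / n for k in range(n)]
w = [unnorm_posterior(t) for t in grid]
post_mean = sum(t * wi for t, wi in zip(grid, w)) / sum(w)

# The Beta(alpha', beta') mean is alpha' / (alpha' + beta').
assert abs(post_mean - alpha_post / (alpha_post + beta_post)) < 1e-6
```

The assertion passing confirms that the grid posterior and the closed-form Beta$(\alpha', \beta')$ posterior agree, as the conjugacy argument predicts.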