Yes. The Hamiltonian is your $H$. $p_r=0$ because $\frac{\partial L}{\partial \dot r}=0$ with $L:=\frac{m}{2}(R^2\dot\theta^2+R^2\sin^2\theta\dot\phi^2)$.
The integral is over the 4-dimensional phase space: $(\theta,\phi,p_{\theta},p_{\phi})$, because the particles just move over the 2D surface of the sphere.
$J=1$ regardless of the coordinate system in which you express the Hamiltonian. This follows from Liouville's theorem, which states that phase-space volume is conserved under canonical transformations; in particular, a mere coordinate (point) transformation is a canonical transformation. So, in 3 dimensions:
$$Z=\frac{1}{h^3}\int d^3q d^3p \, \, \exp[-\beta H(\bar q,\bar p)] \qquad (*)$$
where $\bar q = (x,y,z)$ or $\bar q=(r,\theta,\phi)$, etc.
*Note.* The constant $h$ is introduced in the integral (*) in order to keep $Z$ dimensionless. So $h$ is just a constant (as yet unknown) with units of action, i.e. units of angular momentum.
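As a quick sanity check of the integral (*) in the sphere case (my own sketch, not part of the answer above): with $H = \frac{p_\theta^2}{2mR^2} + \frac{p_\phi^2}{2mR^2\sin^2\theta}$, the two Gaussian momentum integrals can be done analytically, the $\phi$ integral gives $2\pi$, and the remaining $\theta$ integral can be done numerically and compared against the closed form $Z = 8\pi^2 mR^2/(\beta h^2)$.

```python
import numpy as np

# Numerical check of the phase-space integral for a particle on a sphere:
#   H = p_theta^2 / (2 m R^2) + p_phi^2 / (2 m R^2 sin^2(theta)).
# The p_theta and p_phi Gaussian integrals are done analytically; only the
# theta integral is done numerically (midpoint rule), and phi gives 2*pi.
m = R = beta = h = 1.0                        # arbitrary units for the check

n = 200000
theta = (np.arange(n) + 0.5) * np.pi / n      # midpoint grid on (0, pi)
dtheta = np.pi / n
# p_theta integral: sqrt(2 pi m R^2 / beta); p_phi integral: the same times
# |sin(theta)|, so their product is (2 pi m R^2 / beta) * sin(theta).
integrand = (2 * np.pi * m * R**2 / beta) * np.sin(theta)
Z_numeric = 2 * np.pi * np.sum(integrand) * dtheta / h**2

Z_exact = 8 * np.pi**2 * m * R**2 / (beta * h**2)
print(Z_numeric, Z_exact)                     # agree to high accuracy
```

Note that only two powers of $h$ appear here, one per conjugate pair $(\theta,p_\theta)$, $(\phi,p_\phi)$, consistent with the phase space being 4-dimensional.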
Okay, this is actually pretty straightforward, but I don't know where to start.
Review: What's a partition function?
Let's step back and derive what we're talking about: what is a partition function? So we have a system which takes on a set of energy levels with degeneracies $\left \{(E_i, g_i) \right \}.$
We know that your system $s$ is in contact with a reservoir $r$, but together they are sealed up in a microcanonical ensemble with $S = S_s + S_r$, $U = U_s + U_r$. Now that reservoir is big and complicated, so its entropy over the (to it) smallish changes in $U_r$ can be linearized as $S_r(U - U_s) = S_r(U) - U_s/T$, where $T$ is its (effectively constant) thermodynamic temperature, $T^{-1} = \left(\frac{\partial S_r}{\partial U_r}\right)_{N_r,~V_r}$. Therefore the entropy of the reservoir when the system is in state $i$ is $S_r(i) = S_0 - E_i/T$ for some constant $S_0$. But we know that the definition of entropy is $S = k_B \ln W$, where $W$ is the multiplicity of the state, so accounting for the degeneracy, the total multiplicity of the state is simply:
$$W_i = g_i ~ W_r(i) = g_i ~ e^{S_0/k_B - E_i / (k_B T)} $$and the probability is therefore $$p_i = \frac{W_i}{\sum_k W_k} =\frac {g_i ~ e^{-E_i / (k_B T)}} Z$$ for some constant $Z$ independent of the state index $i$, incorporating both $\sum_k W_k$ and $e^{S_0/k_B}$. Since the probabilities sum to 1, we can say that:$$Z = \sum_i g_i ~ \exp\left(\frac{-E_i} {k_B T}\right). $$If the system is continuous then we need a density-of-states $g(E)$ so that the number of states with energies between $E$ and $E + dE$ is roughly $g(E) ~ dE$, then we convert the above to an integral.
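Here is a toy numerical illustration of that sum (mine, with made-up levels): given a handful of $(E_i, g_i)$ pairs, compute $Z$ and verify the $p_i$ sum to 1.

```python
import numpy as np

# Toy check of Z = sum_i g_i * exp(-E_i / (k_B T)) and the probabilities
# p_i = g_i * exp(-E_i / (k_B T)) / Z for a small discrete spectrum.
kB = 1.0
T = 1.0
levels = [(0.0, 1), (1.0, 3), (2.5, 2)]   # (E_i, g_i), arbitrary values

Z = sum(g * np.exp(-E / (kB * T)) for E, g in levels)
probs = [g * np.exp(-E / (kB * T)) / Z for E, g in levels]
print(Z, probs, sum(probs))               # the p_i sum to 1 by construction
```

The normalization check is trivially true by construction; the interesting output is how strongly the Boltzmann factor suppresses the higher levels at this $T$.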
From particles to complex systems
Okay, now that we're both on the same page about what it is, what happens if your system has a bunch of parts? Then each $i$ now labels a configuration of the parts. It potentially gets complicated! The first easy thing to do is to ditch the degeneracies $g_i$ and instead store all of the energies in a multiset: a set which can hold the same number multiple times. That might be confusing, so let's proceed formally a different way.
Let's talk now about a set $C = \left\{c_i\right\}$, where each $c_i$ is some mathematical object telling me the configuration of the state $i$, and we'll assume that this is distinct for each $i$. Now we have to transition from a set of energies $E_i$ to a function $E(c_i)$ which gives the energy of a configuration of the parts. As a side effect, now $g_i = 1$ for each $i$, since each configuration is treated independently, but the same result holds:$$Z = \sum_i \exp\left(\frac{-E(c_i)} {k_B T}\right).$$
Non-interacting systems
If you're with me so far, there's just one more step! What is the form of $c_i$ and $E(c)$?
Well, for a system of $N$ identical noninteracting particles, we have the single-particle energies $E_k$ from before, and the total energy is the sum of the energies of the occupied states. That is, the ideal form for $c_i$ is an occupation function, $c_i = \left\{n_{i,k}\right\}$, which tells us, in configuration $i$, how many particles are in the state with energy $E_k$. Then the energy of the state is:$$E(c_i) = \sum_k n_{i,k} ~ E_k,$$hence,$$Z = \sum_i \exp\left(\frac{-\sum_k n_{i,k} ~ E_k} {k_B T}\right).$$So that is where the sum up top comes from: we now have a complicated multi-particle state, but as long as the particles themselves are noninteracting we can use the sum of single-particle energies to get the overall energy.
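A small brute-force check of this factorization (my own sketch; note that labeling the particles as I do here makes them distinguishable, so there is no $1/N!$ counting factor): summing $e^{-\beta E(c)}$ over every configuration $c$ of $N$ noninteracting particles reproduces $z^N$, where $z$ is the single-particle partition function.

```python
import numpy as np
from itertools import product

# Enumerate every configuration of N distinguishable noninteracting
# particles over the single-particle levels E_k, and check that
# sum_c exp(-beta * E(c)) equals z**N with z the single-particle Z.
beta = 1.0
E = [0.0, 0.7, 1.3]      # single-particle energies E_k (arbitrary)
N = 3

Z_config = sum(np.exp(-beta * sum(c)) for c in product(E, repeat=N))
z_single = sum(np.exp(-beta * Ek) for Ek in E)
print(Z_config, z_single**N)   # equal: Z factorizes for noninteracting parts
```

For indistinguishable particles one would instead sum over occupation-number multisets $\{n_k\}$ (with $\sum_k n_k = N$), which no longer factorizes so neatly; that is where Bose and Fermi statistics enter.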
Why in the heck are you worrying about the difference between 3/2 and 5/2? Your formulae differ not just by 3/2 versus 5/2 but also by $\ln(N)$. In the thermodynamic limit, $\ln(N)$ diverges, and so is a lot bigger than 1.
The number 3/2 has no meaning in this context, except that it is $5/2-1$. The number 5/2 gives the correct entropy for the ideal, non-relativistic gas composed of indistinguishable particles with a single spin state and no degrees of freedom other than position and momentum. However, you should understand that the entropy is defined only up to a constant; it is the third law of thermodynamics that fixes that constant. This third law, in turn, would not be valid if there were not an underlying quantum mechanics. So this is an intrinsically QUANTUM notion, and cannot really be understood without quantum mechanics.
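For concreteness (quoting the standard Sackur–Tetrode result, which is not derived in this answer), the 5/2 is the constant in

$$\frac{S}{N k_B} = \ln\!\left[\frac{V}{N}\left(\frac{2\pi m k_B T}{h^2}\right)^{3/2}\right] + \frac{5}{2},$$

where the $5/2$ splits as $3/2$ (from the three Gaussian momentum integrals, via $U = \tfrac{3}{2}N k_B T$) plus $1$ (from the $1/N!$ indistinguishability factor through Stirling's approximation $\ln N! \approx N\ln N - N$). Dropping the $1/N!$ is exactly what produces a formula with $3/2$ in place of $5/2$ and a spurious extra $\ln N$.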