1) As OP basically notes, an $n$-dimensional delta function transforms under a change of variables $f:\mathbb{R}^n \to \mathbb{R}^n$ with (the absolute value of) an inverse Jacobian determinant
$$ \tag{1} \delta^n(f(x))~=~ \sum_{x_{(0)},f(x_{(0)})=0 }\frac{1}{|\det(\partial f(x_{(0)}))|} \delta^n(x-x_{(0)}), $$
where the sum $\sum$ is over all zeroes $x_{(0)}$ of $f$, i.e. points where the function vanishes, $f(x_{(0)})=0$. It is implicitly assumed that the zeroes are isolated and that the Jacobian determinants $\det(\partial f(x_{(0)}))$ are non-vanishing.
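As a sanity check, eq. (1) can be verified numerically in one dimension by smearing the delta function into a narrow normalized Gaussian. The function $f(x)=x^2-1$, the test function $g$, and the width `eps` below are illustrative choices, not anything fixed by the text:

```python
import numpy as np

# Nascent delta function: a narrow normalized Gaussian (the width eps is an arbitrary choice).
def delta(u, eps=1e-3):
    return np.exp(-u**2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))

# Example f(x) = x^2 - 1, with zeros x0 = +1, -1 and |f'(x0)| = |2 x0| = 2.
x = np.linspace(-3.0, 3.0, 2_000_001)
dx = x[1] - x[0]
g = np.exp(-x**2)                      # arbitrary smooth test function

lhs = np.sum(delta(x**2 - 1.0) * g) * dx               # integral of delta(f(x)) g(x) dx
rhs = sum(np.exp(-x0**2) / 2.0 for x0 in (-1.0, 1.0))  # sum over zeros of g(x0)/|f'(x0)|
print(lhs, rhs)   # both approx exp(-1)
```

Shrinking `eps` (with a correspondingly finer grid) makes the agreement arbitrarily good.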
2) A typical situation arises when one reparametrizes $m$ constraints
$$ \tag{2} \chi^{a}(x) ~\longrightarrow~ \chi^{\prime a}(x)~=~ \sum_{b=1}^{m} R^{a}{}_{b}(x) \chi^{b}(x) $$
in an $n$-dimensional space (say $\mathbb{R}^n$ for simplicity) with $m\leq n$. Here both sets of constraints
$$\tag{3} \chi^{1}(x)~=~0, \ldots, \chi^{m}(x)~=~0, \qquad \text{and} \qquad \chi^{\prime 1}(x)~=~0, \ldots, \chi^{\prime m}(x)~=~0,$$
are supposed to describe the same $(n-m)$-dimensional zero-locus set $\Sigma\subseteq\mathbb{R}^n$. Technically, let us also impose that the square matrix $R^{a}{}_{b}$ and both rectangular matrices $\partial_i \chi^{a}$ and $\partial_i \chi^{\prime a}$ have maximal rank $m$ on the zero-locus set $\Sigma$. Then the delta function of the constraints transforms with an inverse Jacobian
$$ \tag{4} \delta^m(\chi^{\prime }(x))
~=~ \frac{1}{|\det(R(x))|}\delta^m(\chi(x)). $$
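In the simplest case $n=m=1$, eq. (4) can likewise be checked numerically with a smeared delta function. The constraint $\chi(x)=x-2$, the everywhere-positive factor $R(x)=3+x^2$, the test function, and the width `eps` are hypothetical choices for illustration:

```python
import numpy as np

def delta(u, eps=1e-3):
    # narrow normalized Gaussian standing in for the delta function
    return np.exp(-u**2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))

x = np.linspace(0.0, 4.0, 4_000_001)
dx = x[1] - x[0]
chi = x - 2.0                  # constraint with zero locus x = 2
R = 3.0 + x**2                 # invertible reparametrization factor, R(2) = 7
g = np.cos(x)                  # arbitrary smooth test function

lhs = np.sum(delta(R * chi) * g) * dx             # integral against delta(chi'(x))
rhs = np.sum(delta(chi) * g / np.abs(R)) * dx     # integral against delta(chi(x)) / |R(x)|
print(lhs, rhs)   # both approx cos(2)/7
```

Both integrals localize on the zero locus $x=2$ and pick up the same factor $1/|R(2)|=1/7$.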
3) In all branches of physics it is important to keep track of such Jacobian factors.
E.g. in the path integral formulation of gauge theories, one imposes gauge-fixing conditions $\chi$ (also known as a choice of gauge) via a delta function $\delta(\chi)$. To keep the path integral independent of $\chi$, it is important to insert a compensating Faddeev-Popov determinant, which transforms with the opposite Jacobian factor under reparametrizations of the gauge-fixing conditions.
4) As far as we can tell, Ref. 1 is careful to keep the same convention throughout the book. Rather than speculate about which convention is most natural, the important point is to be consistent. Concretely, Ref. 1 discusses volume elements for distributions of the form
$$\tag{5} d\tau~=~dV d\Gamma, \qquad dV~:=~dx~dy~dz,$$
where $d\Gamma$ denotes the volume element of the additional variables.
For a monoatomic gas, one may choose $d\Gamma=d^3p$, but one could also choose $d\Gamma=d^3v=m^{-3} d^3p$, where ${\bf p}=m{\bf v}$.
In the approximation in which Ref. 1 works, a diatomic molecule has no moment of inertia about its (symmetry) axis, whose direction is the unit vector ${\bf n}$, $|{\bf n}|=1$; hence the angular momentum component along the axis vanishes, ${\bf M}\cdot {\bf n}=0$. It is therefore natural to choose
$$\tag{6} d\Gamma~=~d^3p~d^3M \iint_{{\bf n}}\! d^2n~\delta({\bf M}\cdot {\bf n}). $$
With spherical decomposition of ${\bf M}=M{\bf m}$, where ${\bf m}\cdot {\bf n}=\cos\theta$, this becomes
$$ d\Gamma~=~d^3p~M^2dM~d^2m \iint_{{\bf n}}\! d^2n ~\delta(M\cos\theta)$$
$$~=~d^3p~MdM~d^2m \int_{0}^{2\pi}\! d\varphi \int_{-1}^{1}\!d\cos\theta ~\delta(\cos\theta)$$
$$\tag{7}~=~2\pi ~d^3p~MdM~d^2m, $$
which is eq. (1.1) in Ref. 1.
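The key angular integral in the step from eq. (6) to eq. (7), $\iint_{\bf n} d^2n\,\delta({\bf M}\cdot{\bf n}) = 2\pi/M$, can also be checked numerically with a smeared delta function; the value of $M$ and the Gaussian width `eps` below are arbitrary illustrative choices:

```python
import numpy as np

def delta(u, eps=1e-3):
    # narrow normalized Gaussian standing in for the delta function
    return np.exp(-u**2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))

M = 2.5                                   # arbitrary magnitude of the angular momentum
c = np.linspace(-1.0, 1.0, 2_000_001)     # c = cos(theta), with d^2 n = dphi d(cos theta)
dc = c[1] - c[0]

# the phi integral is trivial and gives the factor 2*pi
integral = 2 * np.pi * np.sum(delta(M * c)) * dc
print(integral, 2 * np.pi / M)
```

This is just eq. (1) in one dimension again: the zero of $M\cos\theta$ at $\cos\theta=0$ contributes the inverse Jacobian $1/M$.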
References:
- Pitaevskii and Lifshitz, Physical Kinetics.
While ekardnam's answer is completely correct, I think it's worth seeing a slightly more direct proof as well. I'll look at the one-dimensional version, since your three-dimensional version is just a product of three one-dimensional sums of the form
$$
\frac{1}{L} \sum_p e^{i p(x-y)}, \quad p \in \frac{2\pi}{L}\mathbb{Z}
$$
First, note that since the desired sum is invariant under $x-y \rightarrow x-y + L$, the result must be periodic in $x-y$ with period $L$. If you allow $x$ and $y$ to take any real values (rather than being constrained to $[0,L]$), then the correct result for the sum is actually
$$
\frac{1}{L} \sum_p e^{ip(x-y)} = \sum_n \delta(x-y-nL), \quad n \in \mathbb{Z}
$$
From this expression, you can see that the sum is essentially just the Poisson summation formula. (To get the version on Wikipedia, just integrate both sides against an arbitrary function.)
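Integrating both sides against a Gaussian test function gives a concrete numerical check (with $y=0$; the period $L$ and the choice of test function are illustrative). For $g(u)=e^{-u^2/2}$ one has $\int g(u)\,e^{ipu}\,du = \sqrt{2\pi}\,e^{-p^2/2}$, so the identity becomes $\frac{1}{L}\sum_p \hat g(p) = \sum_n g(nL)$:

```python
import numpy as np

L = 2.0
n = np.arange(-50, 51)          # truncation of both sums; the terms decay like a Gaussian
p = 2 * np.pi * n / L           # allowed momenta, p in (2*pi/L) * Z

rhs = np.sum(np.exp(-(n * L)**2 / 2))                      # delta side: sum_n g(nL)
lhs = np.sum(np.sqrt(2 * np.pi) * np.exp(-p**2 / 2)) / L   # plane-wave side: (1/L) sum_p g_hat(p)
print(lhs, rhs)   # both approx 1.2713415
```

The two truncated sums agree to machine precision, which is exactly the Poisson summation formula in action.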
Best Answer
Consider the Fourier series expansion of the periodic function
$$\sum_{\ell,m,n}\delta(x-\ell L)\delta(y-m M)\delta(z-nN)=\sum_{\ell,m,n}c_{\ell mn}e^{i2\pi\ell x/L}e^{i2\pi m y/M}e^{i2\pi n z/N}.$$
Multiply both sides by $e^{-i2\pi\ell' x/L}e^{-i2\pi m' y/M}e^{-i2\pi n' z/N}$, integrate within a single "unit cell" of volume $LMN$, and use the orthogonality of these functions to conclude
$$c_{\ell'm'n'}=\frac{1}{LMN},$$
and thus
$$\sum_{\ell,m,n}\delta(x-\ell L)\delta(y-m M)\delta(z-nN)=\frac{1}{LMN}\sum_{\ell,m,n}e^{i2\pi\ell x/L}e^{i2\pi m y/M}e^{i2\pi n z/N},$$
or in your language
$$\sum_{\mathbf R}\delta(\mathbf r - \mathbf R) = \frac 1 V \sum_{\mathbf k}e^{i\mathbf k \cdot \mathbf r}.$$
If we are only evaluating this function within a single "unit cell", e.g. the one containing $\mathbf r = 0$, then only one term of the sum on the left-hand side is non-zero and we are left with
$$\delta(\mathbf r) = \frac 1 V \sum_{\mathbf k}e^{i\mathbf k \cdot \mathbf r}.$$
I've skipped some steps, and you can ask reasonable questions like "does a periodic impulse train even have a Fourier series?", but this is one way to obtain this relation.
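The orthogonality relation used to extract the coefficients, $\frac{1}{L}\int_0^L e^{i2\pi(\ell-\ell')x/L}\,dx = \delta_{\ell\ell'}$, is easy to confirm numerically; the values of $L$, $\ell$, $\ell'$ below are arbitrary illustrative choices:

```python
import numpy as np

L = 3.0
x = np.linspace(0.0, L, 1_000_000, endpoint=False)   # uniform grid over one period
dx = x[1] - x[0]

def overlap(l, lp):
    # (1/L) * integral over [0, L) of e^{i 2 pi (l - l') x / L} dx, as a Riemann sum
    return np.sum(np.exp(2j * np.pi * (l - lp) * x / L)) * dx / L

same = overlap(2, 2)       # expected: 1
diff = overlap(5, -1)      # expected: 0
print(same, diff)
```

On a uniform grid covering exactly one period, this Riemann sum reproduces the Kronecker delta essentially exactly, by the same discrete orthogonality that underlies the DFT.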
A more direct approach is to note that $$\sum_{\ell = -{\ell_0}}^{\ell_0} e^{i2\pi\ell x/L}=\frac{\sin[(\ell_0 + \frac 1 2)2\pi x/L]}{\sin(\pi x/L)}.$$ This follows from the partial sum of the geometric series and the fact that $\sin z = (e^{iz}-e^{-iz})/(2i)$. For large $\ell_0$, this is an oscillatory function bounded by the envelope provided by the denominator, with a very sharp peak at $x = 0$. It can be shown that its integral on any interval containing $x = 0$ converges to $L$ as $\ell_0\to\infty$, consistent with $L\,\delta(x)$.
While the value of the sum does not strictly converge at $x\ne 0$, the sum gets increasingly oscillatory for large $\ell_0$ such that its integral on any interval not containing $x = 0$ converges to zero. As such, in a distribution sense we are justified in writing $$\frac 1 L\sum_{\ell = -\infty}^{\infty} e^{i2\pi\ell x/L}=\delta(x).$$
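Both claims, the closed form for the partial sum (the Dirichlet kernel) and the behaviour of its integral, are easy to check numerically; the values of $L$, $\ell_0$, and the sample point below are arbitrary:

```python
import numpy as np

L, l0 = 1.0, 40
l = np.arange(-l0, l0 + 1)

# 1) the closed form at a generic point x != 0
x0 = 0.137
partial = np.sum(np.exp(2j * np.pi * l * x0 / L)).real
closed = np.sin((l0 + 0.5) * 2 * np.pi * x0 / L) / np.sin(np.pi * x0 / L)
print(partial, closed)          # the two values agree

# 2) the integral over one period: only the l = 0 term survives, giving L
xs = np.linspace(-L / 2, L / 2, 10_000, endpoint=False)
dx = xs[1] - xs[0]
kernel = np.sum(np.exp(2j * np.pi * np.outer(l, xs) / L), axis=0).real
area = np.sum(kernel) * dx
print(area)                     # approx L, so (1/L) * sum -> a unit-weight delta at x = 0
```

Increasing `l0` sharpens the central peak while leaving the total area pinned at $L$, which is the distributional statement above.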
This can be extended in the straightforward way to 3D to arrive at the result obtained by the Fourier series approach.
In your last line of equations, you are evaluating $\int_V \delta(0) d^3\mathbf r'$, so it is no surprise you get an infinity.