The definition of temperature through the Maxwellian and Boltzmann distributions has certain problems in quantum mechanics.
In thermodynamics temperature is usually defined through the derivative of entropy as you say:
$$
\frac{1}{T} = \frac{\partial S(E,\mathbf{V})}{\partial E}. \qquad (1)
$$
The division of the system into different parts (or different degrees of freedom) can be understood from the microcanonical distribution. Let the system have a Hamiltonian of the form
$$
H = H(\mathbf{q}, \mathbf{p}, \mathbf{V});
$$
where $\mathbf{q}$ and $\mathbf{p}$ are the vectors of microscopic generalized coordinates and momenta respectively, and $\mathbf{V}$ is the vector of macroscopic parameters that are constant (on average) in equilibrium.
The dimension of $\mathbf{q}$ and $\mathbf{p}$ is the number of the degrees of freedom of the system. Note that degrees of freedom of the same type (e.g. translation along $x$ axis) of different particles are different degrees of freedom. The set of $(\mathbf{q},\mathbf{p})$ pairs is the phase space of the system.
The distribution function for the system is
$$
f(\mathbf{q},\mathbf{p}) =
\frac{
\delta\bigl( E - H(\mathbf{q}, \mathbf{p}, \mathbf{V}) \bigr)
}{\Omega(E, \mathbf{V})};
$$
where $E$ is the internal energy and $\Omega(E, \mathbf{V})$ is the phase density of states or the number of accessible microscopic states for given $E$ and $\mathbf{V}$:
$$
\Omega(E, \mathbf{V}) =
\int \delta\bigl( E - H(\mathbf{q}, \mathbf{p}, \mathbf{V}) \bigr) d\mathbf{q} d\mathbf{p}.
$$
The entropy is
$$
S(E, \mathbf{V}) = \ln \Omega(E, \mathbf{V});
$$
here and below we use units in which $k_{B} = 1$.
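As a quick sanity check of definition (1), one can take the classical monatomic ideal gas, for which $\Omega(E,\mathbf{V}) \propto V^N E^{3N/2}$ up to $E$-independent factors, and verify numerically that the derivative reproduces the equipartition result $E = \tfrac{3}{2} N T$. This is a minimal sketch; the helper names are mine, not from the text:

```python
import math

# Numerical check of definition (1) for a classical monatomic ideal gas,
# for which Omega(E, V) ~ V**N * E**(3*N/2) up to E-independent factors.

N = 10          # number of particles (arbitrary for this check)
E = 7.0         # internal energy, in units with k_B = 1

def entropy(E):
    # S = ln Omega; the V**N factor is E-independent, so we omit it.
    return 1.5 * N * math.log(E)

h = 1e-6
inv_T = (entropy(E + h) - entropy(E - h)) / (2 * h)   # 1/T = dS/dE, eq. (1)
T = 1.0 / inv_T
print(T, 2 * E / (3 * N))   # both ~ 0.4667, i.e. E = (3/2) N T
```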
Temperature of a subsystem
Let the system consist of two independent (non-interacting) subsystems. Then
$$
\mathbf{q} = (\mathbf{q}_1, \mathbf{q}_2); \quad \mathbf{p} = (\mathbf{p}_1, \mathbf{p}_2);
$$
$$
H(\mathbf{q}, \mathbf{p}, \mathbf{V}) =
H_1(\mathbf{q}_1, \mathbf{p}_1, \mathbf{V}) +
H_2(\mathbf{q}_2, \mathbf{p}_2, \mathbf{V}). \qquad (2)
$$
NB:
The subsystems need not be separated spatially. They need not even consist of different particles. The only requirement is that the Hamiltonian have the form (2). We can put all translational coordinates into $\mathbf{q}_1$, rotational ones into $\mathbf{q}_2$, oscillatory ones into $\mathbf{q}_3$, and so on. If the energy transfer (interaction) between the subsystems is negligible during some period of time, then expression (2) is valid for that period.
We can introduce distribution functions for each subsystem:
$$
f_i(\mathbf{q}_i,\mathbf{p}_i) =
\frac{
\delta\bigl( E_i - H_i(\mathbf{q}_i, \mathbf{p}_i, \mathbf{V}) \bigr)
}{\Omega_i(E_i, \mathbf{V})};
$$
where $E_i$ is the internal energy of the subsystem.
The entropy of the subsystem then is
$$
S_i(E_i, \mathbf{V}) = \ln \Omega_i(E_i, \mathbf{V})
$$
and the temperature is
$$
T_i = \left( \frac{\partial S_i(E_i, \mathbf{V})}{\partial E_i} \right)^{-1} \qquad (3)
$$
This is the definition of the temperature of a subsystem (degree of freedom).
Temperatures in the equilibrium
Since the subsystems are independent, the distribution function of the whole system is the product
$$
f(\mathbf{q},\mathbf{p}) = f_1(\mathbf{q}_1,\mathbf{p}_1)f_2(\mathbf{q}_2,\mathbf{p}_2);
$$
and the total number of accessible states is
$$
\Omega(E_1, E_2, \mathbf{V}) = \Omega_1(E_1, \mathbf{V})\Omega_2(E_2, \mathbf{V}).
$$
Hence the total entropy is
$$
S(E_1, E_2, \mathbf{V}) = S_1(E_1, \mathbf{V}) + S_2(E_2, \mathbf{V}) \qquad (4)
$$
If there is an interaction between the subsystems, internal energy will be transferred from one subsystem to the other until equilibrium is reached. During this process the total energy is constant:
$$
E = E_1 + E_2 = \text{const}
$$
The energies of the subsystems change with time and settle to definite values in equilibrium. According to the second law of thermodynamics, the total entropy is maximal in this state. The condition for the extremum is
$$
\frac{\partial S(E_1, E_2(E, E_1), \mathbf{V})}{\partial E_1} = 0.
$$
From (4), using $E_2 = E - E_1$ so that $\partial E_2 / \partial E_1 = -1$, we get:
$$
\frac{\partial S(E_1, E_2(E, E_1), \mathbf{V})}{\partial E_1} =
\frac{\partial S_1(E_1, \mathbf{V})}{\partial E_1} +
\frac{\partial S_2(E_2, \mathbf{V})}{\partial E_2}\frac{\partial E_2}{\partial E_1} =
\frac{1}{T_1} - \frac{1}{T_2} = 0
$$
or
$$
T_1 = T_2.
$$
One can prove that these temperatures are equal to $T$ as defined by (1).
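The entropy-maximization argument above can also be checked numerically. Here is a minimal sketch assuming power-law densities of states $\Omega_i(E_i) \propto E_i^{\nu_i}$ (e.g. ideal-gas subsystems with $\nu_i = 3N_i/2$), so that (3) gives $T_i = E_i/\nu_i$; the variable names are mine:

```python
import math

# Two subsystems with Omega_i(E_i) ~ E_i**nu_i (an assumed toy model).
nu1, nu2 = 3.0, 9.0     # "sizes" of the two subsystems
E_tot = 4.0             # conserved total energy, E = E1 + E2

def S_total(E1):
    # S = S1 + S2 = nu1 ln E1 + nu2 ln(E - E1), eq. (4) with k_B = 1
    return nu1 * math.log(E1) + nu2 * math.log(E_tot - E1)

# Brute-force search for the entropy maximum over a fine grid of E1.
E1_star = max((E_tot * k / 10**5 for k in range(1, 10**5)), key=S_total)

# With Omega_i ~ E_i**nu_i, eq. (3) gives T_i = E_i / nu_i.
T1 = E1_star / nu1
T2 = (E_tot - E1_star) / nu2
print(T1, T2)   # both ~ 1/3: the temperatures agree at the entropy maximum
```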
Today I heard a thermodynamic argument about this: since little work is done by the system in the solid and liquid phases, the heat capacity must be (roughly) the same for the solid and the liquid phase.
This one does not convince me at all, especially because no work is done by the system at all if one considers the constant-volume heat capacity $c_V$. I think the argument you heard was more likely that $c_V \approx c_p$ for liquids and solids (as their volume expansion coefficients are small, and thus $W = p \Delta V$ is small in the constant-pressure case), while the two values differ significantly for gases (as they expand significantly).
So I googled and found that one does expect the heat capacity of a liquid to be larger (i.e. for a given heat input, a smaller temperature change) than that of the solid or the gas: in solids the main contribution comes from the 3 vibrational degrees of freedom, in gases from the 3 translational ones, while in a liquid both are significant, so the energy gets distributed over $3 + 3 = 2 \times 3$ degrees of freedom. Hence we expect a larger heat capacity for the liquid (at least roughly).
While I do not see how the math fits here, this argument is much more convincing. At high temperatures (depending on the system, this usually means above some $10\,\text{K}$), each vibrational degree of freedom takes energy $T$, while each translational and each rotational degree of freedom takes energy $T/2$.
In a solid-state system we have a number of phonon modes depending on the number of atoms $n$ in the unit cell (specifically $3n$). Thus the high-temperature limit of $c$ (per amount of substance, in these units) will be $3n$ when counted per unit cell (as with ionic solids); in the case of water it will be $3$ per molecule (although water ice has more than one molecule per unit cell).
As an atom cannot rotate (without being electronically excited, which takes a lot of energy), the number of rotational degrees of freedom depends on the shape of the molecule (a two-atom molecule having two rotations, a non-linear three-atom molecule having three).
This gives for gases: $c = 3/2$ for atomic gases, $c = 5/2$ for two-atom (dumbbell) molecules, and $c = 3$ for non-linear three-atom molecules (like water). (Vibrations of molecules usually have higher energies than our environment provides, thus those degrees of freedom are "frozen out".)
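The counting above can be sketched as a toy function (names and structure are my own; each active quadratic degree of freedom contributes $1/2$, and vibrations are assumed frozen out):

```python
from fractions import Fraction

def c_v(translational, rotational):
    """Heat capacity per molecule in units of k_B: each active
    quadratic degree of freedom contributes 1/2 (vibrations frozen)."""
    return Fraction(translational + rotational, 2)

print(c_v(3, 0))  # monatomic gas:               3/2
print(c_v(3, 2))  # linear (two-atom) molecule:  5/2
print(c_v(3, 3))  # non-linear three-atom:       3
```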
For a liquid it is more complicated (and I am not sure about the correctness of my statements here), but we could probably argue that we have three translational degrees of freedom plus a vibrational one (the longitudinal phonon); rotations are usually irrelevant in a liquid due to dense packing. Additionally, in liquid water energy can be absorbed by breaking hydrogen bonds (and exactly this effect causes the negative volume expansion coefficient of water near $0\,^{\circ}\mathrm{C}$).
As you can see, this seems to predict $c_s = c_g$ for water, but not $c_l = 2c_s$. In other words, it shows that the case of the liquid is much more complicated. Moreover, it shows that the factors are not random, but also that the rules are not general for all materials (they depend on the structure).
In particular, you cannot get away with simply counting degrees of freedom (as the energy in the high-temperature limit depends on the kind of degree of freedom).
Best Answer
The thermal energy for each (quadratic) degree of freedom is given by $$\frac{1}{2} k_{B} T.$$ If this thermal energy is lower than the gap between the ground state and the first excited state (supposing quantised energy levels), then a thermal excitation becomes very unlikely. I would guess that this is what the author means by "frozen" degrees of freedom.
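One concrete way to see this freeze-out is the heat capacity of a single quantised harmonic degree of freedom (the Einstein model), which interpolates between the classical equipartition value and zero as the gap starts to dominate over $k_B T$; the function name below is my own:

```python
import math

# Per-mode heat capacity of a quantised harmonic degree of freedom
# (Einstein model), in units of k_B, with x = gap / (k_B T).
# c -> 1 when k_B T >> gap (classical equipartition),
# c -> 0 when the gap dominates (the mode is "frozen out").
def c_mode(x):
    return x**2 * math.exp(x) / (math.exp(x) - 1.0)**2

print(c_mode(0.01))   # k_B T >> gap: ~1, the classical value
print(c_mode(10.0))   # gap >> k_B T: ~0.005, the mode is "frozen"
```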
For a more precise answer it would help if you would give the reference of your quote.