The definition of temperature through the Maxwell and Boltzmann distributions has certain problems in quantum mechanics.
In thermodynamics temperature is usually defined through the derivative of entropy, as you say:
$$
\frac{1}{T} = \frac{\partial S(E,\mathbf{V})}{\partial E}. \qquad (1)
$$
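As a quick sanity check of (1): for a classical monatomic ideal gas the entropy is, up to terms independent of $E$ and $V$ (and in units where $k_\mathrm{B}=1$), $S(E,V) = N\ln V + \frac{3N}{2}\ln E + \text{const}$, so
$$
\frac{1}{T} = \frac{\partial S}{\partial E} = \frac{3N}{2E}
\quad\Longrightarrow\quad
E = \frac{3}{2}NT,
$$
the familiar equipartition result.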
The division of the system into different parts (or different degrees of freedom) can be understood from the microcanonical distribution. Let the system have a Hamiltonian of the following form:
$$
H = H(\mathbf{q}, \mathbf{p}, \mathbf{V});
$$
where $\mathbf{q}$ and $\mathbf{p}$ are the vectors of microscopic generalized coordinates and momenta respectively, and $\mathbf{V}$ is the vector of macroscopic parameters that are constant (on average) in equilibrium.
The dimension of $\mathbf{q}$ and $\mathbf{p}$ is the number of degrees of freedom of the system. Note that degrees of freedom of the same type (e.g. translation along the $x$ axis) belonging to different particles count as distinct degrees of freedom. The set of $(\mathbf{q},\mathbf{p})$ pairs is the phase space of the system.
The distribution function for the system is
$$
f(\mathbf{q},\mathbf{p}) =
\frac{
\delta\bigl( E - H(\mathbf{q}, \mathbf{p}, \mathbf{V}) \bigr)
}{\Omega(E, \mathbf{V})};
$$
where $E$ is the internal energy and $\Omega(E, \mathbf{V})$ is the phase-space density of states, i.e. the number of accessible microscopic states at given $E$ and $\mathbf{V}$:
$$
\Omega(E, \mathbf{V}) =
\int \delta\bigl( E - H(\mathbf{q}, \mathbf{p}, \mathbf{V}) \bigr) d\mathbf{q} d\mathbf{p}.
$$
The entropy is (in units where $k_\mathrm{B}=1$)
$$
S(E, \mathbf{V}) = \ln \Omega(E, \mathbf{V}).
$$
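To make $\Omega$ concrete, here is a minimal numerical sketch; the system, box size `L`, and shell width `dE` are my own illustrative choices. For a single 1D harmonic oscillator with $H = (q^2+p^2)/2$ the constant-energy sets are circles, and one can show $\Omega(E) = 2\pi$ exactly; a Monte Carlo estimate that approximates the delta function by a thin energy shell reproduces this:

```python
import numpy as np

# Monte Carlo estimate of Omega(E) for a single 1D harmonic oscillator,
# H = (q^2 + p^2)/2 (unit mass and frequency). The delta function is
# approximated by counting samples in a thin shell |H - E| < dE/2.
# Exact answer: Omega(E) = 2*pi, independent of E (the level sets are
# circles of radius sqrt(2E)).

rng = np.random.default_rng(0)

def omega_estimate(E, dE=0.01, L=4.0, n_samples=2_000_000):
    # Sample (q, p) uniformly from the box [-L, L]^2; L must be large
    # enough that the box contains the whole energy shell.
    q = rng.uniform(-L, L, n_samples)
    p = rng.uniform(-L, L, n_samples)
    H = 0.5 * (q**2 + p**2)
    in_shell = np.abs(H - E) < dE / 2
    box_area = (2 * L) ** 2
    return in_shell.mean() * box_area / dE

for E in (1.0, 2.0, 4.0):
    print(f"E = {E}: Omega ~ {omega_estimate(E):.2f} (exact {2*np.pi:.2f})")
```

Note that for this single oscillator $\Omega$ does not depend on $E$; an $E$-dependent $\Omega$, and hence a meaningful temperature, only emerges for systems with many degrees of freedom.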
Temperature of a subsystem
Let the system consist of two independent (non-interacting) subsystems. Then
$$
\mathbf{q} = (\mathbf{q}_1, \mathbf{q}_2); \quad \mathbf{p} = (\mathbf{p}_1, \mathbf{p}_2);
$$
$$
H(\mathbf{q}, \mathbf{p}, \mathbf{V}) =
H_1(\mathbf{q}_1, \mathbf{p}_1, \mathbf{V}) +
H_2(\mathbf{q}_2, \mathbf{p}_2, \mathbf{V}). \qquad (2)
$$
NB:
The subsystems are not required to be spatially separated. They need not even consist of different particles. The only requirement is that the Hamiltonian have the form (2). For example, we can put all translational coordinates into $\mathbf{q}_1$, rotational ones into $\mathbf{q}_2$, vibrational ones into $\mathbf{q}_3$, and so on. If the energy transfer (interaction) between the subsystems is negligible during some period of time, then expression (2) is valid for that period.
We can introduce distribution functions for each subsystem:
$$
f_i(\mathbf{q}_i,\mathbf{p}_i) =
\frac{
\delta\bigl( E_i - H_i(\mathbf{q}_i, \mathbf{p}_i, \mathbf{V}) \bigr)
}{\Omega_i(E_i, \mathbf{V})};
$$
where $E_i$ is the internal energy of the subsystem.
The entropy of the subsystem then is
$$
S_i(E_i, \mathbf{V}) = \ln \Omega_i(E_i, \mathbf{V})
$$
and the temperature is
$$
T_i = \left( \frac{\partial S_i(E_i, \mathbf{V})}{\partial E_i} \right)^{-1} \qquad (3)
$$
This is the definition of the temperature of a subsystem (degree of freedom).
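As a direct illustration of (3), consider a subsystem of $N$ independent unit harmonic oscillators (my illustrative choice), whose density of states is the standard result $\Omega(E) = (2\pi)^N E^{N-1}/(N-1)!$, so that $T = E/(N-1)$. A finite-difference evaluation of $\partial S/\partial E$ confirms this:

```python
from math import lgamma, log, pi

# Temperature from Eq. (3), T_i = (dS_i/dE_i)^(-1), for a subsystem of
# N independent unit harmonic oscillators, whose density of states is
# Omega(E) = (2*pi)^N * E^(N-1) / (N-1)!  (a standard result).

def entropy(E, N):
    # S = ln Omega; lgamma(N) = ln((N-1)!)
    return N * log(2 * pi) + (N - 1) * log(E) - lgamma(N)

def temperature(E, N, h=1e-6):
    # Central finite difference for dS/dE.
    dS_dE = (entropy(E + h, N) - entropy(E - h, N)) / (2 * h)
    return 1.0 / dS_dE

N, E = 100, 50.0
print(temperature(E, N))   # ~ 0.50505...
print(E / (N - 1))         # exact: T = E/(N-1)
```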
Temperatures in the equilibrium
Since the subsystems are independent, the distribution function of the whole system is the product:
$$
f(\mathbf{q},\mathbf{p}) = f_1(\mathbf{q}_1,\mathbf{p}_1)f_2(\mathbf{q}_2,\mathbf{p}_2);
$$
and the total number of accessible states is:
$$
\Omega(E_1, E_2, \mathbf{V}) = \Omega_1(E_1, \mathbf{V})\Omega_2(E_2, \mathbf{V}).
$$
Hence the total entropy is
$$
S(E_1, E_2, \mathbf{V}) = S_1(E_1, \mathbf{V}) + S_2(E_2, \mathbf{V}) \qquad (4)
$$
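Concretely, for two subsystems of $N_1$ and $N_2$ unit harmonic oscillators (the illustrative choice used in the sketches above), $\Omega_i \propto E_i^{N_i-1}$, and (4) becomes
$$
S(E_1, E_2) = (N_1 - 1)\ln E_1 + (N_2 - 1)\ln E_2 + \text{const},
$$
which is exactly the function maximized numerically below.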
If there is an interaction between the subsystems, internal energy will be transferred from one subsystem to the other until equilibrium is reached. During this process the total energy is constant:
$$
E = E_1 + E_2 = \text{const}
$$
The energies of the subsystems change with time and take definite values in equilibrium. According to the second law of thermodynamics, the total entropy is maximal in this state. The condition for the extremum is
$$
\frac{\partial S(E_1, E_2(E, E_1), \mathbf{V})}{\partial E_1} = 0.
$$
From (4), noting that $E_2 = E - E_1$ and hence $\partial E_2/\partial E_1 = -1$, we get:
$$
\frac{\partial S(E_1, E_2(E, E_1), \mathbf{V})}{\partial E_1} =
\frac{\partial S_1(E_1, \mathbf{V})}{\partial E_1} +
\frac{\partial S_2(E_2, \mathbf{V})}{\partial E_2}\frac{\partial E_2}{\partial E_1} =
\frac{1}{T_1} - \frac{1}{T_2} = 0
$$
or
$$
T_1 = T_2.
$$
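A minimal numerical check of this conclusion, using the two oscillator subsystems introduced above (the values of $N_1$, $N_2$, and $E$ are arbitrary illustrative choices):

```python
import numpy as np

# Numerical check that S_1(E_1) + S_2(E - E_1) is maximal exactly where
# T_1 = T_2, for two oscillator subsystems with Omega_i ~ E_i^(N_i - 1)
# (constant prefactors do not affect the location of the maximum).

N1, N2, E = 40, 60, 100.0

E1 = np.linspace(0.5, E - 0.5, 200_000)
S_total = (N1 - 1) * np.log(E1) + (N2 - 1) * np.log(E - E1)

E1_star = E1[np.argmax(S_total)]
T1 = E1_star / (N1 - 1)            # T_i = E_i/(N_i - 1), as above
T2 = (E - E1_star) / (N2 - 1)
print(f"E1* = {E1_star:.3f}, T1 = {T1:.4f}, T2 = {T2:.4f}")
# Expected: E1* = E*(N1-1)/(N1+N2-2) ~ 39.80, with T1 = T2 ~ 1.020
```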
One can prove that these temperatures are equal to the $T$ defined by (1).
Best Answer
The "$2$" in the "phase rule" conventionally written as $F=C-P+2$, where $F$ is the degrees of freedom, $C$ is the number of components and $P$ is the number of phases, refers to the temperature and pressure that are the usual intensive parameters in chemical equilibrium of several phases and components. But this equation is only a special case of a more general one where there are other intensive parameters representing other than thermal and mechanical interactions, such as magnetic or electric fields. In fact, if the number of possible non-chemical interactions is denoted by $N$ then the phase rule is $F=C-P+N$ where now we can count the temperature for thermal and pressure for volumetric mechanical among the possible non-chemical interactions, if any. $C$ can be viewed as representing the number of chemical interactions, if you wish. (I think the connection between the formulas of Euler and Gibbs are as deep or shallow as that two assassinated Presidents Lincoln and Kennedy both had VPs named Johnson...)