[Physics] four-dimensional definition of entropy

entropy, special-relativity, statistical-mechanics

It seems odd that entropy is usually only defined for a system on a single 'slice' of time, i.e. on a spacelike region. Can one define the entropy of a system occupying a 4d region of spacetime, in such a way that it yields a codimension-one definition agreeing with the usual one when the codimension-one slice is spacelike?

Best Answer

You are thinking about Boltzmann's definition of entropy, I guess?

In Boltzmann's definition, entropy is just the logarithm of the number of possible states compatible with certain macroscopic variables. Stated in that generality, nothing seems to exclude counting states with different time coordinates, or, in your more general context, on different time-slices. The question is what this corresponds to: does it make sense to do that? You would have to specify the time development of the macroscopic variables and count the number of microscopic trajectories compatible with those macroscopic trajectories.
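As a toy illustration of the counting in Boltzmann's definition (my own sketch, not part of the original answer): take N coins, let the macroscopic variable be the number of heads, and count the microstates compatible with it.

```python
import math

def boltzmann_entropy(n_coins: int, n_heads: int) -> float:
    """Boltzmann entropy S = ln W (units with k_B = 1) for the macrostate
    'n_heads heads out of n_coins coins', where W = C(n_coins, n_heads)
    is the number of microstates compatible with that macrostate."""
    w = math.comb(n_coins, n_heads)
    return math.log(w)

# The most mixed macrostate is compatible with the most microstates,
# hence has the largest entropy:
print(boltzmann_entropy(100, 50))
print(boltzmann_entropy(100, 10))
```

The 4d version of the question would replace the single macrostate by a macroscopic *trajectory* and count the microscopic histories compatible with it.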

As a matter of fact, there exist so-called dynamical entropies. Heuristically, they count the density of phase-space trajectories of a system, whereas the Boltzmann entropy just counts the number of accessible states under certain macroscopic constraints.

http://en.wikipedia.org/wiki/Kolmogorov%E2%80%93Sinai_entropy#Measure-theoretic_entropy
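To make the trajectory-counting idea concrete (a hedged sketch of mine, not from the answer): for the logistic map at full chaos, f(x) = 4x(1-x), the Kolmogorov-Sinai entropy coincides with the Lyapunov exponent (Pesin's identity for this map), whose exact value is ln 2. One can estimate it numerically by averaging ln|f'(x)| along an orbit.

```python
import math

def lyapunov_logistic(x0: float = 0.3, n: int = 100_000, burn: int = 1_000) -> float:
    """Estimate the Lyapunov exponent of f(x) = 4x(1-x) by averaging
    ln|f'(x)| = ln|4 - 8x| along an orbit.  For this map the KS entropy
    equals the exponent, with exact value ln 2 ~ 0.693."""
    x = x0
    for _ in range(burn):                 # discard the transient
        x = 4.0 * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(4.0 - 8.0 * x))
        x = 4.0 * x * (1.0 - x)
    return total / n

print(lyapunov_logistic())  # approaches ln 2 ~ 0.693
```

The positive rate here measures how fast nearby trajectories separate, i.e. how fast information about the initial microstate is lost per unit time, which is the sense in which a dynamical entropy "counts trajectories" rather than states on a single slice.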
