I previously asked whether Planck's curve actually reaches zero, to which some of you answered that it only approaches it, which raises the question: doesn't this mean that a photon with extremely high frequency will have massive energy? I know that the intensity (number of photons) at such high frequencies is close to zero, but even if we have only one photon at a high frequency, the ultraviolet catastrophe would still be unresolved. So what is it I am missing?
[Physics] How does Planck’s curve solve the ultraviolet catastrophe?
electromagnetic-radiation, singularities, thermal-radiation
Related Solutions
Avoiding mathematical formulae as much as possible, and with a warning of furious hand-waving ahead, I would state it like this:
In the classical picture there is no concept of quanta, so you could have just a little bit of radiation energy at any frequency. Once quantization is introduced, however, the minimum amount of radiation energy that you can possibly have at a frequency $\nu$ is $h\nu$. Since the temperature regulates the distribution of energy through the Boltzmann factor ($\mathbb{P}(E)\propto e^{-E/kT}$), if you consider radiation at a frequency such that $h\nu\gg kT$, then having even one photon is "too much": it is highly unlikely.
Alternatively, we know from statistical mechanics that it is very unlikely to find excitations with energy much higher than $kT$, the precise statement being the Boltzmann factor above. But for radiation with $h\nu\gg kT$, Planck leaves us no choice: you either have at least one photon, costing at least $h\nu$, or none at all. Therefore you have almost none of this radiation (almost none of these photons) when the temperature is low, which "depopulates" the UV states.
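To put rough numbers on this, here is a minimal sketch of my own (the constants are standard SI values and the sample frequencies are purely illustrative) comparing the classical equipartition energy $kT$ per mode with the Planck mean energy $h\nu/(e^{h\nu/kT}-1)$ at room temperature:

```python
import numpy as np

# Sketch: mean thermal energy per cavity mode, classical vs quantized, at T = 300 K.
# Constants are standard SI values; the frequencies are illustrative.
h = 6.626e-34   # Planck constant, J s
k = 1.381e-23   # Boltzmann constant, J / K
T = 300.0       # room temperature, K

for nu in [1e12, 1e13, 1e14, 1e15]:                    # far infrared up to ultraviolet
    classical = k * T                                   # equipartition: kT, regardless of nu
    planck = h * nu / np.expm1(h * nu / (k * T))        # mean energy of a quantized mode
    print(f"nu = {nu:.0e} Hz   classical = {classical:.2e} J   Planck = {planck:.2e} J")
```

At $10^{12}\ \mathrm{Hz}$ the two values are comparable ($h\nu \ll kT$), while at $10^{15}\ \mathrm{Hz}$ the quantized value is tens of orders of magnitude below $kT$: the mode is effectively frozen out, which is the depopulation described above.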
Now I realize something: Planck could have "solved" the UV catastrophe by postulating not complete quantization, but only that the minimum amount of energy at a given frequency is $h\nu$. With this "floor" you do not need to go further and say that the energy has to come in integer multiples of this minimum quantity; it is enough that, if radiation at this frequency exists at all, its energy is at least $h\nu$.
The problem I think you are having is that once you assume a false statement, you can prove anything. So everything you said in the second paragraph is true if you treat the problem classically. You are right that each electromagnetic standing-wave mode in the cavity would have no energy, and so there would be no electromagnetic energy at all even at finite temperature.
However, this is not the exact line of reasoning the author intended. The author reasoned as follows:
We know from experience that it takes only a finite amount of energy $E$ to raise the temperature of a hollow metal box (a radiation cavity) by some amount $\Delta T$.
We know from equipartition that this energy $E$ must be split evenly among the modes of the cavity.
Since there are an infinite number of modes, each mode's energy must increase by $E/\infty$, but this is zero and so no mode will have any more energy after the temperature is raised.
These first three points pretty much agree with what he says. Then I think his next point goes something like this:
We know that if we pump energy into a low-frequency mode and wait, the system will thermalize, so that energy will be transferred to higher-frequency modes.
We know from experience that our cavity will still emit radiation after thermalization.
Because of equipartition, we expect much of the radiation to occur at higher frequencies. This contradicts experience, because we never see a room-temperature blackbody emitting X-rays.
Now the way I was taught the ultraviolet catastrophe was the following. We try to figure out the total energy $E$ of the system at temperature $T$. This will be the sum over all modes $\nu$ of the energy in that mode, $E_\nu$. Since there are modes with arbitrarily high $\nu$, this sum has infinitely many terms, so it can be written as a limit: $$E = \lim_{\nu^* \to \infty} \sum_{\nu=0}^{\nu^*} E_\nu.$$ Now classically each $E_\nu$ should just be $kT$, so that our equation becomes $$ E = \lim_{\nu^* \to \infty} \sum_{\nu=0}^{\nu^*} kT = \lim_{\nu^* \to \infty} kT\, N(\nu < \nu^*),$$ where $N(\nu < \nu^*)$ is the number of modes with frequency less than $\nu^*$. Now when $\nu^*$ is pushed higher and higher (into the ultraviolet), $N(\nu < \nu^*)$ keeps increasing without bound, so the estimate of the total energy $E$ keeps getting bigger. The fact that $E$ appears to become infinite as you push $\nu^*$ deeper and deeper into the ultraviolet is why it is called the ultraviolet catastrophe.
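Here is a quick numerical version of that comparison (a sketch of my own, not part of either argument above; it assumes the standard cavity mode density $g(\nu) = 8\pi\nu^2/c^3$ per unit volume and $T = 300\ \mathrm{K}$, with illustrative cutoffs $\nu^*$):

```python
import numpy as np

# Sketch: radiation energy density below a cutoff frequency nu*, classical (kT per mode)
# vs Planck.  Assumes the standard mode density g(nu) = 8*pi*nu^2/c^3 per unit volume
# and T = 300 K; the cutoffs nu* are illustrative.
h, k, c = 6.626e-34, 1.381e-23, 3.0e8
T = 300.0

def g(nu):                                    # modes per unit volume per unit frequency
    return 8 * np.pi * nu**2 / c**3

def planck_energy(nu):                        # h*nu / (exp(h*nu/kT) - 1), overflow-safe form
    x = h * nu / (k * T)
    return h * nu * np.exp(-x) / (-np.expm1(-x))

def integrate(f, nu_max, n=200_000):          # simple trapezoid rule on a uniform grid
    nu = np.linspace(1.0, nu_max, n)
    y = f(nu)
    return np.sum(0.5 * (y[1:] + y[:-1]) * (nu[1] - nu[0]))

for nu_star in [1e13, 1e14, 1e15, 1e16]:
    u_classical = integrate(lambda nu: g(nu) * k * T, nu_star)
    u_planck    = integrate(lambda nu: g(nu) * planck_energy(nu), nu_star)
    print(f"nu* = {nu_star:.0e} Hz   classical: {u_classical:.3e} J/m^3   Planck: {u_planck:.3e} J/m^3")
```

The classical column grows like $\nu^{*3}$, while the Planck column levels off near the known radiation energy density $aT^4 \approx 6\times 10^{-6}\ \mathrm{J/m^3}$ at $300\ \mathrm{K}$.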
So now you have seen the ultraviolet catastrophe explained two ways. The way the author explained it, he assumed a finite total energy and divided this energy among an infinite number of modes to get zero energy per mode; I would say "disappearing energy" catastrophe is a good name for that. The way I explained it was to assume a constant finite energy per mode and let the total energy diverge as higher and higher frequencies are included; it makes more sense to call this one the ultraviolet catastrophe. Either way, it is clear something is wrong.
Best Answer
The Planckian approaches zero fast enough: there can be some very high-energy photons in a thermal distribution, but the probability of their presence drops fast; crucially, it drops faster than the energy rises. In particular, the $\sim e^{-h \nu / k_B T}$ factor in the distribution can be multiplied by any power of $\nu$ and still yield a finite integral. Therefore, the energy carried by photons with frequencies higher than, say, $N$ times $k_B T / h$ drops to zero as $N$ grows large.
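To make "drops faster than the energy rises" concrete, here is a short sketch of my own, using the dimensionless variable $x = h\nu/k_B T$, the Planck spectral energy density $u_\nu \propto \nu^3/(e^{h\nu/k_B T}-1)$, and the standard integral $\int_0^\infty x^3/(e^x-1)\,dx = \pi^4/15$. The fraction of the total energy density carried by photons with $\nu > N k_B T/h$ is $$\frac{u(\nu > N k_B T/h)}{u_{\text{total}}} \;=\; \frac{15}{\pi^4}\int_{N}^{\infty}\frac{x^{3}}{e^{x}-1}\,dx \;\approx\; \frac{15}{\pi^4}\,N^{3}e^{-N} \qquad (N\gg 1),$$ so the cubic growth in front is overwhelmed by the exponential: already at $N = 40$ the fraction is below $10^{-13}$.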
You could find a $1\ \mathrm{J}$ photon in the Sun's blackbody radiation (where the average photon energy is of the order of $10^{-19}\ \mathrm{J}$), but it is astronomically unlikely that you will, so on average the energy contribution of such photons is utterly negligible.
I've tried to give an intuitive answer; if you want a more formal one try to integrate the Planckian yourself to find the energy carried in a specific frequency band!
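In that spirit, here is a minimal numerical sketch of the exercise (my own, not part of the answer; it assumes the Planck spectral energy density $u_\nu \propto \nu^3/(e^{h\nu/k_B T}-1)$, the Sun's effective temperature of roughly $5800\ \mathrm{K}$, and illustrative band edges):

```python
import numpy as np
from scipy.integrate import quad

# Sketch: fraction of a blackbody's energy in a frequency band, using the
# dimensionless variable x = h*nu/(k*T).  T is the Sun's effective temperature
# (assumed ~5772 K); the band edges are illustrative.
h, k = 6.626e-34, 1.381e-23
T = 5772.0

def spectrum(x):                       # x^3 / (e^x - 1), written to avoid overflow at large x
    return x**3 * np.exp(-x) / (-np.expm1(-x))

total = np.pi**4 / 15                  # exact value of the integral from 0 to infinity

def band_fraction(nu_lo, nu_hi):
    x_lo, x_hi = h * nu_lo / (k * T), h * nu_hi / (k * T)
    return quad(spectrum, x_lo, x_hi)[0] / total

print("visible band, 4e14-8e14 Hz:", band_fraction(4e14, 8e14))
print("tail above 10 k_B T / h:  ", quad(spectrum, 10, np.inf)[0] / total)
print("tail above 40 k_B T / h:  ", quad(spectrum, 40, np.inf)[0] / total)
```

The tail fractions fall roughly like $N^3 e^{-N}$, in line with the estimate above: the exponential suppression always wins over the growth of the photon energy.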