Nuclear Weapons – Are They Energy Negative?

energy, explosions, nuclear-engineering, nuclear-physics

Not sure if this belongs here, Chemistry, History, or some other Stack Exchange site, but I'll start it here and welcome migration to a different site if appropriate.

In doing some reading on the history of the Manhattan Project and subsequent nuclear weapons research, it's obvious that a huge amount of electrical power is typically required to refine, reduce, separate, and enrich the uranium and plutonium (and likely other materials) that go into the "boom" part of the weapon. Although a nuclear explosion releases a huge amount of energy in an extremely brief period of time, how does this compare to the amount of energy used to produce the fuel material in the first place, over months and years of processing and enriching raw materials? Obviously there are lots of variables, both in fuel refining and in explosive yield, so ballpark answers are perfectly acceptable to me.

Just wondering if nuclear weapons are "energy negative", i.e. whether the energy used to produce the weapon exceeds its explosive yield.

Best Answer

The Fat Man detonation that destroyed Nagasaki is said to have released 88 TJ, or about 24 GWh, of energy, which is pretty small by the standards of modern H-bombs: 1-50 PJ, or roughly 280 to 13,900 GWh.

On the other hand, according to this (http://www.ipsr.ku.edu/ksdata/ksah/energy/18ener7.pdf), the State of New Mexico used about 11,000 GWh in the whole of 2019, which is less than the yield of a single high-end modern H-bomb. And I imagine the state's energy usage is at least double what it was in 1945, based on overall US trends.
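The unit conversions behind these comparisons can be checked with a few lines of Python. The figures are the approximate values quoted in this answer (88 TJ for Fat Man, 1-50 PJ for modern H-bombs, ~11,000 GWh/yr state consumption), not authoritative data:

```python
# Energy unit conversions: 1 GWh = 3.6e12 J, 1 TJ = 1e12 J, 1 PJ = 1e15 J.
J_PER_GWH = 3.6e12


def joules_to_gwh(joules):
    """Convert an energy in joules to gigawatt-hours."""
    return joules / J_PER_GWH


fat_man_yield_j = 88e12    # Fat Man, ~88 TJ
h_bomb_low_j = 1e15        # low end of modern H-bomb range, 1 PJ
h_bomb_high_j = 50e15      # high end, 50 PJ
nm_2019_gwh = 11_000       # annual consumption figure quoted above

print(joules_to_gwh(fat_man_yield_j))   # ~24.4 GWh
print(joules_to_gwh(h_bomb_low_j))      # ~278 GWh
print(joules_to_gwh(h_bomb_high_j))     # ~13,889 GWh

# A year of the quoted state consumption vs one high-end H-bomb:
print(nm_2019_gwh / joules_to_gwh(h_bomb_high_j))  # ~0.79, i.e. less than one bomb
```

So by these numbers, a single 50 PJ device releases more energy than the quoted annual state consumption.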

So given all the production energy, e.g. all the centrifuges running for months on end to enrich the fuel, it's hard to say whether the original nukes were energy positive, but the modern ones definitely are, or would be if they were ever used.

Let's hope that return on investment never gets realized.
