[Physics] Ohmic Heating in Wires

electric-circuits, electrical-resistance, power

Please could someone tell me why Ohmic losses are always referred to as $I^2 R$ losses? Here is my problem. If the power coming from a power station is fixed, then you can either deliver this power as high voltage, low current or as high current, low voltage. But isn't $I^2R$ equal to $V^2 / R$? Therefore, if $R$ is constant, doesn't the power depend on the square of the voltage, so surely it doesn't matter whether it is high voltage or high current? The only way I can reconcile this is that a high current must cause a greater heating effect than a high voltage, but I can't figure out why. If this is the case, is there a reason why a high current causes more heating than a high voltage?

Best Answer

But isn't $I^2R$ equal to $V^2/R$, therefore if $R$ is constant doesn't the power depend on the square of the voltage so surely it doesn't matter whether it is high voltage or high current.

Consider the wires connecting the power plant to the appliance; let the effective amplitude of the oscillating current flowing through the wires be $I$, and let $V$ be the drop of electric potential across one wire carrying current $I$. The situation can be drawn like this:

              V
  <------------------------>
  o‒‒∧‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒o
     |  
     |U
  o‒‒∨‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒o


  o‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒‒o

power                     appliance
plant

The Joule loss of energy per unit time in the wires is $P_{loss}=VI$, and since Ohm's law says $V=RI$, we have the equivalent expressions $P_{loss}=V^2/R= I^2R$.
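To see that the three expressions really are the same number, here is a minimal numeric sketch; the values of $R$ and $I$ are illustrative assumptions, not taken from the answer:

```python
# Verify the equivalent forms of the Joule loss in one wire:
# P_loss = V*I = I^2*R = V^2/R, with V = R*I (Ohm's law).
R = 0.5      # wire resistance in ohms (assumed value)
I = 10.0     # current in amperes (assumed value)
V = R * I    # potential drop across the wire

P_vi = V * I         # P_loss = V I
P_i2r = I**2 * R     # P_loss = I^2 R
P_v2r = V**2 / R     # P_loss = V^2 / R

print(P_vi, P_i2r, P_v2r)  # all three agree
```

All three forms give the same loss; they only look different because Ohm's law ties $V$ and $I$ together through $R$.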

When people say higher voltage means lower losses of energy, by "voltage" they do not mean $V$; that would, as you realized, make no sense, since the energy losses are proportional both to $I^2$ and to $V^2$.

By "higher voltage", they mean a higher voltage $U$ between two separate wires at the same distance from the power plant. A higher $U$ than the generator produces can be achieved at the power plant using an appropriate step-up transformer.

Why is higher $U$ beneficial?

The power utilizable at the end of the power line is $P_{useful}=UI$. The power that is being lost is $RI^2$.

So by making $U$ higher, the same useful power can be transferred with a much lower current $I$ and thus much lower energy losses $RI^2$ in the power line. It is easy to see that if we double the voltage $U$, the current needed for the same useful power is halved, so the power loss decreases by a factor of four.
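The factor-of-four claim can be checked numerically. A short sketch, with an assumed useful power and line resistance chosen purely for illustration:

```python
# Same useful power P_useful = U * I delivered at two line voltages U.
# Doubling U halves the required current I, and the line loss R*I^2
# therefore drops by a factor of four.
P_useful = 1_000_000.0   # 1 MW to be delivered (assumed)
R = 2.0                  # total line resistance in ohms (assumed)

losses = {}
for U in (10_000.0, 20_000.0):   # 10 kV vs 20 kV
    I = P_useful / U             # current needed for the same useful power
    losses[U] = R * I**2         # Joule loss in the line
    print(f"U = {U:>8.0f} V  ->  I = {I:>5.0f} A, loss = {losses[U]:>7.0f} W")

print("loss ratio:", losses[10_000.0] / losses[20_000.0])  # factor of 4
```

At 10 kV the line carries 100 A and dissipates 20 kW; at 20 kV it carries 50 A and dissipates only 5 kW, exactly a quarter of the loss for the same delivered power.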