Voltage – Why Do Power Lines Use High Voltage?

Tags: dissipation, electric-current, electrical-resistance, power, voltage

I have just read that using high voltage results in low current, which limits the energy losses caused by the resistance of the wires.
What I don't understand is why it works this way. Does it have anything to do with electromagnetic induction in the wire, which resists the current?

Best Answer

Suppose the total resistance of the transmission line leading from the power station to you is $R$, and the city/town you're in demands an average power $P$. Since $P = I \times V$, the current drawn by the city/town is $I = \frac{P}{V}$, so the higher the transmission-line voltage, the smaller the current. The line loss is $P_{loss} = I^2 R$, or, substituting for $I$,

$$P_{loss} = \frac{P^2 R}{V^2}.$$

Since $P$ is fixed by demand, and $R$ is already as small as you can make it (with big cables), the line loss decreases strongly as the voltage $V$ in the denominator increases. Delivering the power with the smallest possible current therefore gives the least power loss. It may help to think of the current as causing 'friction' that is lost as heat along the transmission line.
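Here is a minimal Python sketch of that calculation. The demanded power, line resistance, and voltage levels are made-up illustrative values, not figures from the answer; the point is only to show how the $I^2 R$ loss collapses as the transmission voltage rises.

```python
# Illustrative sketch (assumed numbers): compare the I^2 * R line loss
# when delivering the same power P over the same line resistance R
# at several transmission voltages.

def line_loss(power_w, voltage_v, resistance_ohm):
    """P_loss = I^2 * R, with I = P / V."""
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm

P = 10e6   # 10 MW demanded by the town (assumed)
R = 5.0    # 5 ohm total line resistance (assumed)

for V in (10e3, 100e3, 400e3):  # 10 kV, 100 kV, 400 kV
    loss = line_loss(P, V, R)
    print(f"V = {V/1e3:5.0f} kV -> I = {P/V:7.1f} A, "
          f"loss = {loss/1e3:8.1f} kW ({100*loss/P:.2f}% of P)")
```

With these assumed numbers, raising the voltage from 10 kV to 400 kV cuts the current from 1000 A to 25 A and the loss from about 50% of the delivered power to a small fraction of a percent, which is exactly the $1/V^2$ behaviour in the formula above.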