[Physics] Why do high current conductors heat up a lot more than high voltage conductors

electric-current, power, thermodynamics, voltage

120 volts x 20 amps = 2,400 Watts

However, if I increase the voltage and lower the current, I can use a smaller (and therefore cheaper) wire, generate less heat, and still deliver the same power.

1,000 volts x 2.4 amps = 2,400 Watts

  1. Why doesn't the high-voltage conductor heat up the way the high-current one does?
  2. To me this approach seems more efficient and less costly because it uses less material, so why isn't it more common?

Best Answer

The term you're looking for is Joule heating.

The power dissipated as heat in a conductor is $P = I^2 R$, where $I$ is the current and $R$ is the resistance. The heating happens because the moving charges (electrons) collide inelastically with the atoms of the conductor; that is, they transfer some of their kinetic energy to them.
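To make the $I^2 R$ dependence concrete, here is a minimal sketch comparing the two scenarios from the question. The wire resistance of 0.5 Ω is an assumed, illustrative value, not a figure from the original post.

```python
# Joule heating comparison for the two scenarios in the question.
# The wire resistance is an assumed, illustrative value.

WIRE_RESISTANCE_OHMS = 0.5  # assumed resistance of the conductor

scenarios = {
    "120 V / 20 A": 20.0,   # current in amperes
    "1000 V / 2.4 A": 2.4,
}

for label, current in scenarios.items():
    # Joule heating: P_loss = I^2 * R
    loss_watts = current**2 * WIRE_RESISTANCE_OHMS
    print(f"{label}: dissipates {loss_watts:.1f} W in the wire")

# Output:
# 120 V / 20 A: dissipates 200.0 W in the wire
# 1000 V / 2.4 A: dissipates 2.9 W in the wire
```

Both scenarios deliver 2,400 W to the load, but the high-current wire wastes roughly 70 times more power as heat.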

Remember that current is defined as the amount of charge passing a given point per unit time. The more charge passing that point each second, the more collisions occur, and therefore the more heat is dissipated.
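As a rough illustration (using the elementary charge $e \approx 1.6\times10^{-19}\,\mathrm{C}$; these numbers are not from the original answer): a 20 A current is $20\,\mathrm{C/s}$, or about $1.2\times10^{20}$ electrons passing the point every second, while 2.4 A corresponds to roughly $1.5\times10^{19}$ per second.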

This is why power lines use high voltages for transmission: the same power can be delivered with much less current, and therefore with far smaller $I^2 R$ losses in the lines.
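A short derivation (not part of the original answer) makes the trade-off explicit. For a fixed delivered power $P$ and line resistance $R$, the current is $I = P/V$, so

$$P_\text{loss} = I^2 R = \left(\frac{P}{V}\right)^2 R,$$

and the resistive loss falls as $1/V^2$. For the numbers in the question, raising the voltage from 120 V to 1,000 V cuts the current from 20 A to 2.4 A and the wire loss by a factor of $(20/2.4)^2 \approx 69$.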