Electric Circuits – Current vs Voltage in High Voltage Transmission Lines

electric-circuits, electric-current, electrical-resistance, electricity, voltage

I know this question has been answered many times, but sadly I'm still not quite sure I get it.

Here's my interpretation, please correct me at any point:

We have some source. There are various ways to think about this source, but the easiest way for me is to simply interpret it as if it were a battery – it has some sort of potential difference associated with it, and we can assume this value is fixed. This might already be wrong, though, because people often talk about the "power" the source generates. I don't get why we can't just think of this as a battery.

From the source, the current travels across wires until it reaches some point where the transformation happens, and it seems to me like all we need to know is the resistance of this transformer.

All we really want is to maximize the resistance of the transformer relative to the resistance of the wires – we want as much of the work as possible to happen at the transformer, i.e. we want the voltage drop there to be as high as possible. Does that make sense?

So I'm a bit confused about why people so frequently talk about "high voltage" being efficient, because to me it seems like the voltage at the source is completely irrelevant. High voltage at the transformer is efficient, but that seems like a very convoluted way of saying: "you want the resistance of the wires to be low in relation to the transformer".

Best Answer

people often talk about the "power" the source generates. I don't get why we can't just think of this as a battery.

You can. But only a real battery, not an ideal voltage source. If you short-circuit a battery and check the terminals with a voltmeter, it won't register 1.5V. It will be lower. The same happens to an overloaded power station. It has a maximum power delivery. If the load is greater than that, the voltage will sag (and a real power generator can be damaged).
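As a rough numerical sketch of that sag, modelling the battery as an ideal source in series with an internal resistance (the EMF and resistance values below are illustrative assumptions, not from any particular battery):

```python
# Minimal sketch: the terminal voltage of a real battery (ideal EMF plus
# internal resistance) sags as the load draws more current.
emf = 1.5          # volts, assumed open-circuit voltage
r_internal = 0.5   # ohms, assumed internal resistance

for r_load in (10.0, 1.0, 0.1, 0.01):          # progressively heavier loads
    current = emf / (r_internal + r_load)       # series circuit
    v_terminal = emf - current * r_internal     # what a voltmeter would read
    print(f"R_load={r_load:6.2f} ohm  I={current:5.2f} A  V_terminal={v_terminal:4.2f} V")
```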

it seems to me like all we need to know is the resistance of this transformer.

A transformer isn't an ohmic device. It doesn't have a "resistance" the way a regular resistor does. If you tried to measure its resistance in a naive way (with AC current), you would find it has very low resistance if the secondary side could sink a load, and very high resistance if it could not (for instance, if the secondary side were an open circuit).

The behavior of the secondary side affects what the primary side does, which in turn affects the rest of the primary circuit.
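A minimal sketch of that idea, assuming an ideal transformer where the secondary load is reflected to the primary scaled by the turns ratio squared (the turns ratio and load values here are made up for illustration):

```python
# Minimal sketch, assuming an ideal transformer: the impedance "seen" at the
# primary is the secondary load scaled by the turns ratio squared, so the
# apparent "resistance" depends entirely on what the secondary is doing.
def apparent_primary_impedance(turns_ratio, z_load):
    """turns_ratio = N_primary / N_secondary; z_load = secondary load in ohms."""
    return turns_ratio**2 * z_load

ratio = 10.0                                        # illustrative step-down ratio
for z_load in (float("inf"), 1000.0, 10.0, 1.0):    # open circuit -> heavy load
    z_seen = apparent_primary_impedance(ratio, z_load)
    print(f"secondary load {z_load:>10} ohm -> primary sees {z_seen:>10} ohm")
```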

"you want the resistance on of the wires be low in relation to the transformer".

More like "you want the power loss on the wires to be low enough that you can live with it". Given a particular power delivery and a wire resistance, you can calculate what your losses will be on the wire for different voltages.

Your customers demand a particular amount of power. If you don't deliver that much power, the voltage on the line will sag. Given our target delivery, we can divvy that power up with any combination of voltage and current that makes sense. But the higher current will have higher losses.

Let's imagine we need to deliver a megawatt of power to a neighborhood, and the total resistance in the line is 5 ohms. Let's see how much power is lost in those lines based on our choice of voltage:

Power    Voltage    Current    Wire loss
1 MW     230 V      4348 A     94 MW
1 MW     2.3 kV     435 A      945 kW
1 MW     23 kV      43 A       9.4 kW
1 MW     230 kV     4.3 A      94 W

None of these depend on the characteristics of the transformer (other than assuming that we're using it in a reasonably efficient manner).
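For reference, here is a minimal sketch of the arithmetic behind that table, using the same 1 MW delivery and 5-ohm line resistance:

```python
# Minimal sketch of the table above: for a fixed power delivery the line
# current is I = P / V, and the wire loss is I**2 * R_line.
p_delivered = 1_000_000.0   # watts (1 MW), as in the example above
r_line = 5.0                # ohms of line resistance, as in the example above

for voltage in (230.0, 2_300.0, 23_000.0, 230_000.0):
    current = p_delivered / voltage          # amps flowing in the line
    wire_loss = current**2 * r_line          # watts dissipated in the wires
    print(f"{voltage:>9,.0f} V   {current:>8,.1f} A   wire loss {wire_loss:>13,.0f} W")
```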

Could it be said that the source voltage of a battery determines the power as well, at least in some idealized scenario?

It is a factor, but it doesn't determine it. And it's not an idealized scenario where that is true; it is a different scenario where that holds.

In a simple circuit with an ideal voltage source and an ohmic resistance, any given voltage produces a specific amount of power at the resistor. You can do your calculations and they hold.

The big difference here is that the resistor is consuming all the power. In the case of transmission lines we are trying to deliver power elsewhere (the end load). The goal is for the lines to consume less of it and for the end load to consume more of it.

In that scenario, our voltage does not determine the total power, because we will have devices between the wires and the load (the transformers) that change the circuit.

Think of your computer with a universal power supply. Plug it into a 120 V line and it consumes 150 W. Plug it into a 240 V line and it consumes 150 W. It's not a simple resistance, and you can't usefully predict the power it consumes from the voltage supplied.
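Here is a minimal sketch of that contrast, assuming a 150 W constant-power load (like the power supply above) next to a plain resistor sized to dissipate 150 W at 120 V:

```python
# Minimal sketch: a constant-power load (like a universal power supply) draws
# whatever current keeps its power fixed, while an ohmic resistor's power
# grows with the square of the voltage.  The 150 W figure is the example
# above; the resistor value is chosen so it also dissipates 150 W at 120 V.
p_load = 150.0                  # watts, constant-power load
r_fixed = 120.0**2 / p_load     # 96-ohm resistor: 150 W at 120 V

for voltage in (120.0, 240.0):
    i_constant_power = p_load / voltage      # current drawn by the supply
    p_resistor = voltage**2 / r_fixed        # power in the plain resistor
    print(f"{voltage:.0f} V: supply draws {i_constant_power:.2f} A ({p_load:.0f} W), "
          f"resistor dissipates {p_resistor:.0f} W")
```

Doubling the voltage quadruples the resistor's consumption, while the constant-power load simply draws half the current.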

It's confusing to me how power keeps being mentioned as what's being generated. I can imagine it's important as some kind of invariant, but at least at the start, before any transformers are involved, I'm under the impression that it's the source voltage difference that determines the power.

In a simple circuit where we have excess supply and simple resistance, yes the power consumed is a simple function of the supplied voltage. In those cases, go ahead.

But if you want to talk about why transmission lines benefit from high voltage, that view is not useful. The load is not ohmic, and the power delivered does not vary with the square of the voltage.