[Physics] Do transformers lose energy

dissipation · electromagnetism · energy-conservation · magnetic-fields

EDIT: The title should rather be how/why transformers lose energy

My idea of a transformer is that it is composed of two separate wire windings around some metal core. Its purpose is to step AC voltage up or down. The transfer of energy from the primary to the secondary winding is due to magnetic coupling, i.e. mutual inductance.

My question: how much energy is lost in this system due to the metal core AND why? What determines the efficiency of a transformer? For example, it must be based on some property of the metal core. I suppose this question is directed at electrical engineers in particular.

For instance, I've read in textbooks that transformers are fairly efficient, with an "efficiency rating" of around 98% or so.

The equation that governs this is

$$
\text{Efficiency}\,(\%) = 100\times\frac{P_{\text{out}}}{P_{\text{in}}}
$$

where $P_{\text{in}}$ is the primary power (i.e. $V_p\times I_p$) in watts and $P_{\text{out}}$ is the secondary power (i.e. $V_s\times I_s$) in watts.
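The definition above is easy to sanity-check numerically. A minimal sketch (the voltage and current values are made up for illustration, not taken from any real transformer):

```python
def efficiency_percent(v_p, i_p, v_s, i_s):
    """Transformer efficiency as a percentage.

    p_in  = primary power   V_p * I_p  (W)
    p_out = secondary power V_s * I_s  (W)
    """
    p_in = v_p * i_p
    p_out = v_s * i_s
    return 100.0 * p_out / p_in

# Hypothetical step-down transformer: 240 V / 2 A in, 12 V / 39.2 A out.
# The "missing" 9.6 W is dissipated in the core and windings.
print(efficiency_percent(240, 2.0, 12, 39.2))
```

This prints an efficiency of about 98%, in line with the textbook figure quoted above.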

Best Answer

A significant source of power loss in a transformer is the induced eddy currents in the core. Just as the varying magnetic field induces current in the secondary coil, it can also induce currents in the core itself. These currents do nothing but dissipate energy, and so are to be avoided.

To reduce eddy currents, you either build your transformer out of a non-conductive magnetic material (e.g. ferrite), or you split the core's conductive material into a stack of thin laminations separated by insulating layers. The insulation blocks the large-scale eddy currents while still passing the magnetic flux, thus reducing the power loss.
