[Physics] Why does a 60W bulb glow brighter than a 100W bulb in a series circuit?

Tags: electric-current, electrical-resistance, power, voltage

In my physics class I have this problem that shows two lightbulbs, one 60W and one 100W in series, connected to a 120V battery.
The problems are:

Which bulb is brighter? (A: 60W)

Calculate the power dissipated by the 60W bulb. (A: 23.4W)

Calculate the power dissipated by the 100W bulb. (A: 14.1W)

Why is the power dissipated not simply the wattages of the bulbs? I followed one worked solution online where you first find R for both bulbs using P = (V^2)/R, and then use I = V/(R1+R2) to get a current of 0.3125A. The power dissipated is then calculated using P = I^2R, which gives the answers above. However, doesn't that assume the voltage drop across each lightbulb is 120V, and isn't that wrong?
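For reference, here is a minimal sketch of that walkthrough's arithmetic in Python (the 120V supply and the 60W/100W ratings come from the problem; the variable names are my own):

```python
# Series circuit: a 60 W and a 100 W bulb on a 120 V supply.
# Each resistance is inferred from the nameplate rating, which assumes
# the bulb sees the full 120 V (i.e. normal parallel operation).
V_supply = 120.0            # supply voltage in volts
P60, P100 = 60.0, 100.0     # rated powers in watts

R60 = V_supply**2 / P60     # 240 ohms
R100 = V_supply**2 / P100   # 144 ohms

I = V_supply / (R60 + R100)   # series current: 0.3125 A

print(I**2 * R60)    # ~23.4 W dissipated by the 60 W bulb
print(I**2 * R100)   # ~14.1 W dissipated by the 100 W bulb
```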

I tried getting it another way: I said P1 = IV1, P2 = IV2, and V1 + V2 = 120V. I solved the voltage drop across the 60W lightbulb to be 45V and across the 100W one to be 75V. The current then comes out to 4/3 A, which gives resistances of 33.75 ohms and 56.25 ohms. Plugging those into P = V^2/R just returns the original wattages. Why is it right to assume 120V for both bulbs?
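For comparison, that alternative calculation can be written out the same way; it reproduces the numbers above, but the first step shows the assumption that makes it circular (each bulb is assumed to dissipate its rated power even in series, which is exactly what the problem asks you to find):

```python
# Assumption from the question: P1 = I*V1 = 60 W and P2 = I*V2 = 100 W
# with V1 + V2 = 120 V. Since the current is the same in both bulbs,
# the voltages split in proportion to the assumed powers.
V_total = 120.0
P1, P2 = 60.0, 100.0

V1 = V_total * P1 / (P1 + P2)   # 45 V across the "60 W" bulb
V2 = V_total - V1               # 75 V across the "100 W" bulb
I = P1 / V1                     # 4/3 A (equal to P2 / V2)
R1, R2 = V1 / I, V2 / I         # 33.75 ohms and 56.25 ohms

# Substituting back just returns the wattages that were assumed:
print(V1**2 / R1, V2**2 / R2)   # 60.0, 100.0
```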

[Diagram: the two bulbs connected in series across the 120V supply]

Best Answer

Why is the power dissipated not simply the wattages of the bulbs?

The power rating of a bulb is calculated assuming that the bulb will be used in a normal lighting circuit, that is, it will be in a parallel circuit, receiving the full domestic supply voltage, which I assume is 120V in your country.

A 60W bulb doesn't "know" it's supposed to pull 60W. It has a (fairly) constant resistance (once it has warmed up) which determines how much current it will draw when provided with a given voltage.

So for your series circuit you need to use 120V to calculate each bulb's resistance in ohms from its power rating. Then you can work out the total series resistance, hence the total current in the circuit, and finally the power each bulb actually dissipates, as you have done.
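To make the contrast concrete, here is a small sketch (using the same fixed-resistance idealization) comparing normal parallel operation with the series circuit in the problem:

```python
# Idealized bulbs with fixed resistance, inferred from their ratings at 120 V.
V = 120.0
R60, R100 = V**2 / 60.0, V**2 / 100.0   # 240 ohms and 144 ohms

# Parallel (normal use): each bulb sees the full 120 V and dissipates
# its rated power, so the 100 W bulb is the brighter one.
print(V**2 / R60, V**2 / R100)          # 60.0 W, 100.0 W

# Series: the same current flows through both, so the larger resistance
# (the 60 W bulb) dissipates more power and glows brighter.
I = V / (R60 + R100)                    # 0.3125 A
print(I**2 * R60, I**2 * R100)          # ~23.4 W, ~14.1 W
```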