# [Physics] Help me settle a roommate argument: leaving the AC on all week vs. turning on two units when needed


I live in a home with two other roommates. One is a neuroscientist from MIT who contends that opening the window "wastes a week's worth of cooling down the apartment". Her position is that even a 10 degree temperature difference between indoors and outdoors means all of the AC's utility is lost.

I agree there is a point where this is true, but not to the extreme that leaving the AC on for a week or two is what it takes to reach this happy temperature-saturation point.

My argument (not successfully explained) is that the apartment (2,000 sq ft) cools to a reasonable temperature quickly when two AC units are on, and that it would take a long time to reach maximum cooling anyway. Even if maximum cooling is desired, it's just not efficient past a certain point, because heat gain through the glass windows means the AC unit must constantly compensate for that loss.

Furthering my annoyance, the apartment is empty for the entire daytime (9am-7pm); by the time anyone is home the outdoor temperature is dropping, which makes the two-AC solution even more efficient in my mind.

The 2005-model AC doesn't list its BTU rating, but the sticker shows the following:

| Field | Value |
|---|---|
| Volts | 208 |
| Total Amps (Clg) | 7.4 |
| Total Amps (Htg) | 0.82 |
| RA Fan Amps | 0.8 |
| RA Fan HP | 0.08 |
| DA Fan Amps | 0.83 |
| DA Fan HP | 0.13 |
| Compr LRA | 38 |
| Compr RLA | 7.1 |
| R-22 charge | 35.1 oz |
| Design pressure | 350 PSI |

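Even without a BTU rating on the sticker, the nameplate voltage and cooling amps give a ballpark. The sketch below assumes an EER of about 9 BTU/h per watt, which is a guess typical of 2005-era R-22 units, not a figure from the sticker:

```python
# Rough capacity estimate from the nameplate. The EER is an assumption,
# not a measured value; treat the result as a ballpark only.
volts = 208.0
cooling_amps = 7.4                    # "Total Amps (Clg)" from the sticker
power_watts = volts * cooling_amps    # electrical draw while cooling

assumed_eer = 9.0                     # BTU/h per watt, assumed for a 2005 R-22 unit
capacity_btu_per_hour = power_watts * assumed_eer

print(round(power_watts), round(capacity_btu_per_hour))  # 1539 13853
```

So the unit likely draws about 1.5 kW while cooling and moves somewhere in the low-teens of thousands of BTU/h, which is in the normal range for a single-room packaged unit.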

Am I correct in saying the two-AC solution is more efficient? At what point would it be more efficient to leave the single AC on all week?

---

Think of an air conditioner as a device that pumps heat from inside a room to outside. To bring the temperature down to a given point, a certain amount of heat needs to be pumped out of the room. If the room were perfectly insulated, it would take a fixed amount of energy to pump that heat out, call it e, and a fixed amount of time, call it t.

Now assume the room is not perfectly insulated, so heat transfers into the room. The rate of heat transfer is always faster the greater the temperature difference between inside and outside. To bring the temperature down to the same point, the air conditioner now needs to pump out the usual amount of heat plus the heat that leaks in along the way, so it takes e + e_t (where e_t is the incremental energy to pump out the transferred-in heat; it depends on the rate of transfer over the cool-down time t).
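That "faster with a bigger difference" rate is just Newton's law of cooling, which can be sketched in one line. The conductance value here (UA, in watts per degree) is made up for illustration:

```python
# Newton's law of cooling: heat-flow rate into the room is proportional
# to the indoor/outdoor temperature difference. UA is an assumed value.
def leak_watts(t_out, t_in, ua=300.0):  # ua in W/degC, hypothetical
    return ua * (t_out - t_in)

# Same outdoor temp, two indoor temps: the cooler room leaks heat in faster.
print(leak_watts(35, 24), leak_watts(35, 30))  # 3300.0 1500.0
```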

Once the room is down to temperature, heat continues to transfer in, so it takes e_h energy to hold the temperature (e_h is the energy to pump heat out as fast as it transfers in, over the whole time the temperature is held down). This is the scenario where you leave the AC on all day.
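The hold energy e_h can be written out directly: at steady state the AC pumps heat out exactly as fast as it leaks in. A minimal sketch, assuming a constant envelope conductance and a constant COP (every number below is hypothetical):

```python
# Sketch of the hold energy e_h. All parameter values are assumptions.
def hold_energy_kwh(ua_watts_per_degc, delta_t_degc, hours, cop):
    """Electrical energy to hold a fixed indoor temperature.

    ua_watts_per_degc: envelope conductance (heat leak per degree of difference)
    delta_t_degc:      indoor/outdoor temperature difference being maintained
    cop:               coefficient of performance (heat moved per watt of electricity)
    """
    leak_watts = ua_watts_per_degc * delta_t_degc  # heat flowing in, W
    electrical_watts = leak_watts / cop            # power to pump it back out
    return electrical_watts * hours / 1000.0       # kWh

# Hypothetical apartment: UA = 300 W/degC, holding a 6 degC difference
# for 10 hours with a COP of 3.
print(hold_energy_kwh(300, 6, 10, 3))  # 6.0
```

Note that e_h grows linearly with both the temperature difference held and the number of hours it is held, which is why it dominates the comparison below.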

Now consider the other scenario: you leave the AC off all day and turn it on when you need it. Again, bringing the temperature down to the given point takes e_2 + e_t2.

So now let's compare them: is e + e_t + e_h less than or greater than e_2 + e_t2?

e and e_2 are the same: if you assume the outside air is at a constant temperature, the inside will have heated up to the same level before the cool-down in both scenarios.

e_t and e_t2 are equal too: under the same assumption, the starting temperature is the same and the outside temperature is the same, so the amount of heat that transfers in during the cool-down is the same.

So under these conditions the deciding factor is e_h, and it is better to leave the AC off all day and turn it on when you come home.
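The bookkeeping above can be checked with a toy simulation. Every parameter below is a made-up assumption (leak rate, AC capacity, temperatures); only the qualitative comparison between the two totals matters:

```python
# Toy model of both scenarios under Newton's law of cooling.
# All numbers are illustrative assumptions, not measured values.
def simulate(ac_on_all_day, hours=10.0, dt=0.01,
             T_out=35.0,       # constant outdoor temperature, degC (assumed)
             T_set=24.0,       # target indoor temperature, degC (assumed)
             k=0.5,            # leak rate constant, 1/hour (assumed)
             cool_rate=8.0):   # AC cooling capacity, degC/hour (assumed)
    """Return total heat removed (in degree-hours), which is proportional
    to electrical energy if the AC's COP is constant."""
    T = T_set
    heat_removed = 0.0
    for _ in range(int(hours / dt)):
        T += k * (T_out - T) * dt            # heat leaking in all day
        if ac_on_all_day and T > T_set:
            pump = min(cool_rate * dt, T - T_set)
            T -= pump
            heat_removed += pump
    # Evening: cool back down to T_set (a no-op if the AC held it there),
    # still paying for heat that leaks in during the cool-down.
    while T > T_set + 1e-9:
        T += k * (T_out - T) * dt
        pump = min(cool_rate * dt, T - T_set)
        T -= pump
        heat_removed += pump
    return heat_removed

left_on = simulate(ac_on_all_day=True)           # e + e_t + e_h analogue
turned_on_later = simulate(ac_on_all_day=False)  # e_2 + e_t2 analogue
print(left_on > turned_on_later)  # True: leaving it on removes more heat
```

With these particular (assumed) numbers, leaving the AC on all day pumps out several times more heat than letting the apartment warm up and cooling it in the evening, because the empty room never warms past the outdoor temperature, so the heat it can absorb is bounded, while the held-down room leaks heat in continuously.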

You might question the assumption of constant outside temperature. But even if the temperature rises during the day, leaving scenario 2 with a higher starting temperature for the cool-down (so e_2 > e), e_t and e_h are now larger than they were before. In fact (e_2 - e) is always less than e_h: in scenario 2 heat stops transferring in once the room temperature equalizes with the outside, while in scenario 1 heat transfers in constantly, because the AC maintains a temperature difference the whole time (and remember, the greater the temperature difference, the faster the heat transfer).