Ohm’s law holding true on temperature-dependent resistances

Tags: conductors, electric-current, electrical-resistance, temperature

As far as I know, Ohm's law (in macroscopic form) states that, in some devices/conductors/materials, the (instantaneous) current through the device is directly proportional to the (instantaneous) voltage across it. It can be shown that this holds only if the (static/DC) resistance of the device is constant; if the resistance is variable, then voltage and current are not directly proportional, so Ohm's law is not satisfied.

In the above, I haven't discussed temperature. As we know, the resistivity of a metallic conductor depends on its temperature, which in turn depends on the ambient temperature and on the current flowing through the conductor (via Joule heating). Since the resistance depends on the resistivity, it follows that the resistance of a metallic conductor depends on the current through it. Doesn't that mean metallic conductors don't obey Ohm's law (since the voltage is not directly proportional to the current)? Or am I wrong (and if so, how)?
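The self-heating effect described above can be sketched numerically. The model below (all parameter values are assumed for illustration) uses a linear resistivity law $R(T) = R_0(1 + \alpha(T - T_{amb}))$ and a simple thermal balance in which the dissipated power $I^2 R$ equals a heat loss proportional to the temperature rise:

```python
# Sketch with hypothetical parameters: steady-state self-heating of a
# metallic resistor with R(T) = R0 * (1 + alpha * (T - T_amb)).
# Thermal balance: dissipated power I^2 * R equals heat loss k * (T - T_amb).

R0 = 100.0      # resistance at ambient temperature, ohms (assumed)
alpha = 0.004   # temperature coefficient, per kelvin (roughly copper's)
k = 0.05        # heat-loss coefficient, watts per kelvin (assumed)

def steady_state_resistance(I):
    """Solve I^2 * R0 * (1 + alpha*dT) = k * dT for the temperature
    rise dT, then return the resulting resistance R0 * (1 + alpha*dT)."""
    dT = I**2 * R0 / (k - alpha * I**2 * R0)
    return R0 * (1 + alpha * dT)

for I in (0.001, 0.005, 0.010):
    R = steady_state_resistance(I)
    print(f"I = {I*1000:.0f} mA  ->  R = {R:.4f} ohm,  V = {I*R:.6f} V")
```

Because the steady-state resistance grows with the current, V/I is not constant, which is exactly the non-proportionality the question is about (though for a metal wire at small currents the effect is tiny).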

(This question was originally longer, but people suggested shortening it, so I did. The content of the original post is in this Quora answer.)


I read the following questions and their answers, but they don't address or answer my question:

Best Answer

There seem to be at least two valid ways in which physicists and engineers use the term Ohm's law, neither of which is merely a definition of resistance.

(a) If $I$ is proportional to $V$ then the conductor obeys Ohm's law, otherwise it doesn't.

(b) At constant temperature (and, strictly, at constant pressure) metallic conductors and most single-substance conductors exhibit $I$ proportional to $V$.

Clearly Ohm's law as defined in (a) is obeyed only by a narrow class of conductors. But (b) attempts to be a self-contained law of nature with few exceptions.

Your thermistor doesn't obey Ohm's law according to (a), for the reasons that you state.

On the other hand, (b) has nothing to say on whether or not a thermistor, self-heating or externally heated, obeys Ohm's law. This is because (b) specifically doesn't deal with conductors that aren't at constant temperature!
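Definition (a) can be phrased as a numerical check: a device obeys Ohm's law if V/I comes out the same at every operating point. The sample data below is illustrative, not measured:

```python
# Sketch of definition (a) as a numerical test: a device "obeys Ohm's law"
# if V/I is the same constant at every operating point (within tolerance).

def is_ohmic(pairs, rel_tol=0.01):
    """pairs: list of (V, I) operating points with I != 0.
    Returns True if all V/I ratios agree within rel_tol."""
    ratios = [V / I for V, I in pairs]
    r0 = ratios[0]
    return all(abs(r - r0) <= rel_tol * abs(r0) for r in ratios)

# Resistor held at constant temperature: V = 100 * I exactly -> ohmic.
ohmic = [(100 * I, I) for I in (0.001, 0.005, 0.010)]

# Self-heating conductor: resistance creeps up with current (made-up numbers).
self_heating = [(0.1000, 0.001), (0.5050, 0.005), (1.0400, 0.010)]

print(is_ohmic(ohmic))         # True
print(is_ohmic(self_heating))  # False
```

Definition (b), by contrast, is not a per-device test at all but a statement about a class of materials under stated conditions, so it cannot be falsified by a device whose temperature varies.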

My own preference is for (a), but as nasu implies (in his comment on the question) the matter is of no great moment: you can usually work out from the context what someone means by Ohm's law.

Let the down-votes descend!