[Physics] Does the temperature coefficient of a material depend on temperature?

electrical-resistance, electricity, temperature

In my textbook, the resistance of a material is plotted against temperature, and the temperature coefficient is defined as the slope of that graph divided by an arbitrary resistance $R_1$ taken from the graph. Does that mean the coefficient varies depending on which point on the graph I choose for $R_1$?

If so, how can it serve as a valid measure of how quickly the material's resistance grows with temperature?

If not, then where did I go wrong with this train of thought?

Best Answer

The temperature coefficient is often defined as $\alpha = \dfrac{R_{\rm T}-R_0}{R_0 \, T}$, where the temperature $T$ is in degrees Celsius and the reference resistance $R_0$ is measured at $0^\circ \rm C$.

Rearranging gives the equation $R_{\rm T} = R_0(1 + \alpha\, T)$, i.e. a linear relationship between resistance and temperature.
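As an illustration, here is a minimal Python sketch of this first-order model and of recovering $\alpha$ from two measurements. The function names and the value $\alpha \approx 3.9\times10^{-3}\;{}^\circ\mathrm{C}^{-1}$ (roughly that of copper or platinum) are assumptions for illustration, not from the answer itself.

```python
def resistance_linear(r0, alpha, t_celsius):
    """First-order model: R_T = R_0 * (1 + alpha * T), with R_0 measured at 0 C."""
    return r0 * (1 + alpha * t_celsius)

def alpha_from_measurements(r0, r_t, t_celsius):
    """Invert the definition: alpha = (R_T - R_0) / (R_0 * T)."""
    return (r_t - r0) / (r0 * t_celsius)

r0 = 100.0       # ohms at 0 C
alpha = 3.9e-3   # per degree C, illustrative value
r100 = resistance_linear(r0, alpha, 100.0)
print(f"R at 100 C: {r100:.2f} ohm")   # 139.00 ohm
print(f"recovered alpha: {alpha_from_measurements(r0, r100, 100.0):.4e}")
```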

As you have pointed out, the relationship is not exactly linear, and so for large temperature variations a better relationship such as $R_{\rm T} = R_0(1 + \alpha\, T + \beta \, T^2 + \dots)$ can be used.
This implies that other constants like $\beta$ must be given as well as the temperature coefficient of resistance $\alpha$, as is shown in this reference for a platinum resistance thermometer.
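As a concrete sketch of such a second-order model, the snippet below uses the standard IEC 60751 coefficients for platinum above $0^\circ \rm C$ (here $\alpha$ and $\beta$ play the role of the usual $A$ and $B$ constants); treat the exact values as something to check against a sensor's datasheet rather than as part of this answer.

```python
def resistance_pt(r0, t_celsius):
    """Second-order (Callendar-Van Dusen) model for platinum, valid for T >= 0 C:
    R_T = R_0 * (1 + A*T + B*T**2), with IEC 60751 coefficients."""
    A = 3.9083e-3   # per degree C
    B = -5.775e-7   # per degree C squared
    return r0 * (1 + A * t_celsius + B * t_celsius**2)

# Pt100 sensor: R_0 = 100 ohm at 0 C
for t in (0.0, 100.0, 200.0):
    print(f"T = {t:5.1f} C  ->  R = {resistance_pt(100.0, t):7.3f} ohm")
# Standard Pt100 tables give 138.506 ohm at 100 C, matching this model.
```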

So it does depend on the accuracy to which you are working and on the range of temperatures.
If you are given just $\alpha$ with a stated reference temperature, the implication is that as the temperature deviates further from the reference temperature, the accuracy of the resistance (or temperature) obtained from the first-order equation degrades correspondingly.
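A rough sketch of this loss of accuracy, reusing the assumed platinum coefficients from above: the gap between the first-order and second-order models grows as $R_0|\beta|T^2$, i.e. quadratically in the deviation from the $0^\circ \rm C$ reference.

```python
A = 3.9083e-3    # per degree C (assumed platinum value, as above)
B = -5.775e-7    # per degree C squared
R0 = 100.0       # ohms at the 0 C reference

for t in (10.0, 50.0, 100.0, 200.0):
    r_linear = R0 * (1 + A * t)              # first-order model
    r_quad = R0 * (1 + A * t + B * t**2)     # second-order model
    print(f"T = {t:5.1f} C: first-order error = {r_linear - r_quad:6.3f} ohm")
# The error is R0*|B|*T**2: about 0.006 ohm at 10 C but 2.31 ohm at 200 C.
```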
