[Math] Fitting a curve to Newton's cooling law data programmatically

regression

The data are for the model $T(t) = T_{s} - (T_{s}-T_{0})e^{-\alpha t}$,
where $T_0$ is the temperature measured at time $0$ and $T_{s}$ is the temperature at $t=\infty$, i.e. the ambient temperature. $T_{s}$ and $\alpha$ are the parameters to be determined.

How can I fit my data to this model? I am trying to solve for $T_{s}$ via $T_{s}=(T_{0}T_{2}-T_{1}^{2})/(T_{0}+T_{2}-2T_{1})$, where $T_{1}$ and $T_{2}$ are the measurements at times $\Delta t$ and $2\Delta t$, respectively.
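For illustration, here is a minimal sketch of that three-point estimator in Python; the constants below are made up. On noiseless model data the formula recovers $T_s$ exactly, so any scatter in the estimates comes from measurement noise or model misfit.

```python
import math

def ts_three_point(T0, T1, T2):
    """Closed-form T_s from three equally spaced readings T(0), T(dt), T(2*dt)."""
    return (T0 * T2 - T1**2) / (T0 + T2 - 2 * T1)

# Made-up example: true T_s = 20, T_0 = 90, alpha = 0.1, dt = 5.
Ts, T0, alpha, dt = 20.0, 90.0, 0.1, 5.0
T = [Ts - (Ts - T0) * math.exp(-alpha * k * dt) for k in range(3)]
print(ts_three_point(*T))  # ~20.0: exact on noiseless data
```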

However, the resulting estimates vary a lot across the data set.

Shall I try gradient descent for the parameters?

Best Answer

Gradient descent might be overkill.

For convenience, use a temperature scale translated so that $T_0=0$ and the model is

$$T(t)=T_s(1-e^{-\alpha t}).$$

You want to minimize

$$E=\sum_i(T_i-T_s(1-e^{-\alpha t_i}))^2.$$

For a fixed value of $\alpha$, $E$ is quadratic in $T_s$, so setting $\partial E/\partial T_s=0$ gives the least-squares estimate

$$\hat T_s(\alpha)=\frac{\sum_iT_i(1-e^{-\alpha t_i})}{\sum_i(1-e^{-\alpha t_i})^2},$$
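As a sketch, assuming the measurements live in NumPy arrays `t` and `T` (temperatures already shifted so the reading at $t=0$ is zero), this estimate is a one-liner:

```python
import numpy as np

def ts_hat(alpha, t, T):
    """Least-squares T_s for a fixed alpha; t, T are arrays of shifted data."""
    f = 1.0 - np.exp(-alpha * t)        # model shape values 1 - e^{-alpha t_i}
    return np.dot(T, f) / np.dot(f, f)  # sum_i T_i f_i / sum_i f_i^2
```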

from which you deduce

$$\hat E(\alpha)=\sum_i\left(T_i-\hat T_s(\alpha)\left(1-e^{-\alpha t_i}\right)\right)^2.$$

The optimal $\alpha$ is then found by one-dimensional optimization, e.g. golden-section search or Brent's method.
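For instance, a minimal sketch using SciPy's bounded scalar minimizer; the data below are synthetic, and the bounds on $\alpha$ are an assumption you would adapt to your time scale:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def e_hat(alpha, t, T):
    """Residual sum of squares with T_s profiled out (ts_hat as above)."""
    f = 1.0 - np.exp(-alpha * t)
    ts = np.dot(T, f) / np.dot(f, f)
    return np.sum((T - ts * f) ** 2)

# Synthetic shifted data: true T_s = 20, alpha = 0.1, small Gaussian noise.
rng = np.random.default_rng(0)
t = np.linspace(0.5, 60.0, 40)
T = 20.0 * (1.0 - np.exp(-0.1 * t)) + rng.normal(0.0, 0.2, t.size)

res = minimize_scalar(e_hat, args=(t, T), bounds=(1e-6, 2.0), method="bounded")
f = 1.0 - np.exp(-res.x * t)
print(res.x, np.dot(T, f) / np.dot(f, f))  # alpha near 0.1, T_s near 20
```

Once $\hat\alpha$ is found, undo the shift: the estimate of the ambient temperature on the original scale is $T_0+\hat T_s$.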
