[Math] Exponential extrapolation

Tags: extrapolation, optimization, regression

Given a set of points in the plane $(x_1,y_1),(x_2,y_2),\ldots,(x_n,y_n)$ and a function $f(x)=k+ab^x$, the task is to find the values of $k$, $a$, and $b$ that minimize the following sum:

$$\sum_{i=1}^n (k+ab^{x_i}-y_i)^2.$$
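For concreteness, this objective is straightforward to evaluate numerically. A minimal sketch in NumPy (the data set and parameter values below are hypothetical, chosen only for illustration):

```python
import numpy as np

def loss(k, a, b, x, y):
    """Sum of squared residuals for the model f(x) = k + a*b**x."""
    return np.sum((k + a * b**x - y) ** 2)

# Hypothetical data generated exactly from k=1, a=2, b=1.5:
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 1 + 2 * 1.5**x

print(loss(1, 2, 1.5, x, y))  # the generating parameters give zero loss
```

Minimizing this function over $(k,a,b)$ is exactly the task posed above.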

I tried taking derivatives of this sum (with respect to $k$, $a$, and $b$) and obtained a system of three equations, but I cannot solve it for the unknowns in closed form. A numerical answer would also be acceptable.

Does anyone know how to do that? Thanks for the help.

Best Answer

The Gauss–Newton algorithm deals directly with this type of problem. Given $m$ data points $(x_i,y_i)$ and a model with $n$ parameters $\vec\beta = (\beta_1,\ldots,\beta_n)$, it solves

$$\min_{\vec\beta}\ S(\vec\beta), \qquad S(\vec\beta)=\sum_{i=1}^m r_i(\vec\beta)^2, \qquad r_i(\vec\beta) = y_i - f(\vec\beta, x_i).$$

I skip the derivation of the algorithm, which you can find in every textbook (first use a Taylor approximation, then apply Newton's method). Each iteration updates

$$\Delta \vec\beta=\big(J^T J\big)^{-1} J^T \vec r, \qquad \vec\beta \leftarrow \vec\beta + \alpha\,\Delta\vec\beta,$$

where $\alpha$ is a damping coefficient and

$$J=\begin{pmatrix}\left(\dfrac{\partial f}{\partial \beta_1}\right)_{x=x_1}&\cdots&\left(\dfrac{\partial f}{\partial \beta_n}\right)_{x=x_1}\\ \vdots&\ddots&\vdots\\ \left(\dfrac{\partial f}{\partial \beta_1}\right)_{x=x_m}&\cdots&\left(\dfrac{\partial f}{\partial \beta_n}\right)_{x=x_m}\end{pmatrix},\qquad \vec r=\begin{pmatrix}y_1-f(\vec\beta,x_1)\\ \vdots \\ y_m-f(\vec\beta,x_m)\end{pmatrix}.$$

For your specific case, with $\vec\beta = (k,a,b)$,

$$\frac{\partial f}{\partial k}=1, \qquad \frac{\partial f}{\partial a}=b^{x_i}, \qquad \frac{\partial f}{\partial b}=a\,x_i\,b^{x_i-1}.$$
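The iteration above can be sketched in a few lines of NumPy. This is a minimal illustration, not a robust implementation: it uses a fixed iteration count, a constant damping coefficient $\alpha$, a hand-picked initial guess, and synthetic noise-free data (all of these are my own choices, not part of the answer), and it assumes $b>0$ so that $b^x$ is well defined.

```python
import numpy as np

def gauss_newton_exp(x, y, beta0, alpha=1.0, iters=50):
    """Fit f(x) = k + a*b**x by damped Gauss-Newton.

    beta0 = (k, a, b) is the initial guess; alpha is the damping
    coefficient from the update beta <- beta + alpha * delta.
    """
    k, a, b = beta0
    for _ in range(iters):
        r = y - (k + a * b**x)                     # residual vector
        # Jacobian of f w.r.t. (k, a, b), one row per data point
        J = np.column_stack([np.ones_like(x),      # df/dk = 1
                             b**x,                 # df/da = b^x
                             a * x * b**(x - 1)])  # df/db = a*x*b^(x-1)
        delta = np.linalg.solve(J.T @ J, J.T @ r)  # (J^T J)^{-1} J^T r
        k, a, b = np.array([k, a, b]) + alpha * delta
    return k, a, b

# Synthetic noise-free data from k=1, a=2, b=1.5, with an initial
# guess reasonably close to the true parameters:
x = np.linspace(0.0, 3.0, 8)
y = 1 + 2 * 1.5**x
k, a, b = gauss_newton_exp(x, y, beta0=(0.8, 1.8, 1.4))
```

In practice a Levenberg–Marquardt variant (e.g. `scipy.optimize.curve_fit`) is more robust when the initial guess is far from the solution, since plain Gauss–Newton can diverge from a poor starting point.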
