[Math] Least Squares Regression Matrix for Rational Functions

Tags: polynomials, regression

First off: no, this isn't a homework problem.
Second, I'm trying to understand how this works, not find a program that will do it for me.

Okay, so I've known for a while how to use Gauss-Jordan elimination to interpolate a polynomial. Recently I came across this article, which explains how to use multiple linear regression to fit a polynomial.

Similarly, I also learned how to interpolate rational functions. Now I'm trying to figure out how to fit a rational function to a set of data. Specifically, I'm trying to understand how to set up the matrix so that I can use Gauss-Jordan elimination to find the coefficients.

Because I know interpolation and regression are similar (and in case it wasn't clear from my question): when I refer to interpolation, I mean having only just enough points to solve for the variables. When I say regression, I mean you have more than enough points, and you're trying to find a function that adequately fits all of them.

Thanks for any help you can provide.

Best Answer

You can't use linear regression to try to minimize $\sum_j \left(\dfrac{p(x_j)}{q(x_j)} - y_j \right)^2$: the equations you get are nonlinear. What you can do is minimize $\sum_j (p(x_j) - y_j q(x_j))^2$, where to keep things nontrivial you require e.g. $q(0)= 1$. If you write $p(x) = \sum_{i=0}^n p_i x^i$ and $q(x) = \sum_{i=0}^m q_i x^i$ with $q_0 = 1$, then setting the partial derivatives of this sum with respect to each $p_i$ and each $q_i$ to zero gives the equations

$$\eqalign{\sum_j (p(x_j) - y_j q(x_j)) x_j^i &= 0, \ i = 0\ldots n \cr \sum_j (p(x_j) - y_j q(x_j)) y_j x_j^i &= 0, \ i = 1 \ldots m\cr}$$

Thus for equation $i$ in the first group of equations, the coefficient of $p_k$ is $\sum_j x_j^{k+i}$ and the coefficient of $q_k$ is $- \sum_j y_j x_j^{k+i}$. For equation $i$ in the second group, the coefficients are $\sum_j y_j x_j^{k+i}$ and $-\sum_j y_j^2 x_j^{k+i}$. Since $q_0 = 1$ is constant, its coefficients are moved over to the right side.

EDIT: For example, with $n = 3$ and $m=2$, the system is $$ \pmatrix{ \sum_j 1 & \sum_j x_j & \sum_j x_j^2 & \sum_j x_j^3 & - \sum_j y_j x_j & - \sum_j y_j x_j^2\cr \sum_j x_j & \sum_j x_j^2 & \sum_j x_j^3 & \sum_j x_j^4 & - \sum_j y_j x_j^2 & - \sum_j y_j x_j^3\cr \sum_j x_j^2 & \sum_j x_j^3 & \sum_j x_j^4 & \sum_j x_j^5 & - \sum_j y_j x_j^3 & - \sum_j y_j x_j^4\cr \sum_j x_j^3 & \sum_j x_j^4 & \sum_j x_j^5 & \sum_j x_j^6 & - \sum_j y_j x_j^4 & - \sum_j y_j x_j^5\cr \sum_j y_j x_j & \sum_j y_j x_j^2 & \sum_j y_j x_j^3 & \sum_j y_j x_j^4 & - \sum_j y_j^2 x_j^2 & - \sum_j y_j^2 x_j^3\cr \sum_j y_j x_j^2 & \sum_j y_j x_j^3 & \sum_j y_j x_j^4 & \sum_j y_j x_j^5 & - \sum_j y_j^2 x_j^3 & - \sum_j y_j^2 x_j^4\cr} \pmatrix{p_0\cr p_1\cr p_2\cr p_3\cr q_1\cr q_2\cr} = \pmatrix{\sum_j y_j\cr \sum_j y_j x_j \cr \sum_j y_j x_j^2 \cr \sum_j y_j x_j^3\cr \sum_j y_j^2 x_j\cr \sum_j y_j^2 x_j^2\cr}$$
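To make the construction concrete, here is a short NumPy sketch of the recipe above (the function name `fit_rational` is my own, not from the answer). It assembles the matrix entry by entry from the coefficient formulas, with the $q_0 = 1$ terms moved to the right-hand side, and solves the resulting square linear system; `np.linalg.solve` plays the role of Gauss-Jordan elimination here.

```python
import numpy as np

def fit_rational(x, y, n, m):
    """Fit p(x)/q(x) with deg p = n, deg q = m, and q_0 = 1 by
    minimizing the linearized criterion sum_j (p(x_j) - y_j q(x_j))^2.
    Returns (p coefficients p_0..p_n, q coefficients q_0..q_m)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    size = (n + 1) + m                    # unknowns: p_0..p_n, q_1..q_m
    A = np.zeros((size, size))
    b = np.zeros(size)
    # First group of equations, i = 0..n:
    for i in range(n + 1):
        for k in range(n + 1):            # coefficient of p_k
            A[i, k] = np.sum(x ** (k + i))
        for k in range(1, m + 1):         # coefficient of q_k
            A[i, n + k] = -np.sum(y * x ** (k + i))
        b[i] = np.sum(y * x ** i)         # q_0 = 1 term, moved right
    # Second group of equations, i = 1..m:
    for i in range(1, m + 1):
        r = n + i                         # row index for this equation
        for k in range(n + 1):
            A[r, k] = np.sum(y * x ** (k + i))
        for k in range(1, m + 1):
            A[r, n + k] = -np.sum(y ** 2 * x ** (k + i))
        b[r] = np.sum(y ** 2 * x ** i)
    coeffs = np.linalg.solve(A, b)
    return coeffs[: n + 1], np.concatenate(([1.0], coeffs[n + 1:]))

# Sanity check: data sampled exactly from (1 + 2x) / (1 + 0.5x)
# should be recovered, since the linearized residual can be made zero.
x = np.linspace(0.0, 2.0, 12)
y = (1.0 + 2.0 * x) / (1.0 + 0.5 * x)
p, q = fit_rational(x, y, n=1, m=1)
print(p, q)   # close to [1, 2] and [1, 0.5]
```

Note that because the criterion weights each residual by $q(x_j)$, the result is not the same as minimizing the true squared error of $p(x)/q(x)$; for noisy data it is often used as a starting point for an iterative refinement.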