Multivariate Quadratic Regression For 3+ input variables

linear-regression, multivariate-polynomial, regression

A similar question has been asked here: Multivariate Quadratic Regression. My question is: how do you take the same matrix,
$$
\pmatrix{N &\sum u_i &\sum v_i & \sum u_i^2 & \sum u_iv_i & \sum v_i^2 \\
\sum u_i & \sum u_i^2 & \sum u_i v_i & \sum u_i^3 & \sum u_i^2v_i & \sum u_i v_i^2 \\
\sum v_i & \sum u_iv_i & \sum v_i^2 & \sum u_i^2v_i & \sum u_iv_i^2 & \sum v_i^3 \\
\sum u_i^2 & \sum u_i^3 & \sum u_i^2 v_i & \sum u_i^4 & \sum u_i^3v_i & \sum u_i^2 v_i^2 \\
\sum u_iv_i & \sum u_i^2v_i & \sum u_i v_i^2 & \sum u_i^3v_i & \sum u_i^2v_i^2 & \sum u_i v_i^3 \\
\sum v_i^2 & \sum u_iv_i^2 & \sum v_i^3 & \sum u_i^2v_i^2 & \sum u_iv_i^3 & \sum v_i^4 }
\pmatrix{a\\b\\c\\d\\e\\f}
=\pmatrix{\sum y_i \\ \sum y_i u_i \\ \sum y_iv_i \\ \sum y_iu_i^2\\ \sum y_iu_iv_i \\
\sum y_iv_i^2}
$$

$$y = a + bu + cv + du^2 + e uv + fv^2$$

and apply it to a scenario with three or more independent variables, in matrix form, instead of the two independent variables used in that example?
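For concreteness, with three independent variables $u$, $v$, $w$ I assume the quadratic model becomes
$$y = a + bu + cv + dw + eu^2 + fv^2 + gw^2 + huv + iuw + jvw,$$
so the corresponding normal-equation matrix would be $10 \times 10$.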

Also, how would the resulting matrix scale to a higher-order polynomial, for example a cubic or quartic function?

The use case for this is regression in machine learning, where the data contain more than two features that directly affect the target variable.

Best Answer

Just create your design matrix with the variables you are interested in. For example, suppose you have $x_1$, $x_1^2$, and $x_3$. Denote $x_1^2 = x_2$ and construct the design matrix $X$. Assuming you have $n$ data points, then
\begin{align} X = \begin{pmatrix} 1 & x_{11} & x_{21} & x_{31}\\ 1 & x_{12} & x_{22} & x_{32}\\ 1 & x_{13} & x_{23} & x_{33} \\ \vdots & \vdots & \vdots & \vdots \\ 1 & x_{1n} & x_{2n} & x_{3n} \end{pmatrix}, \end{align} so your left-hand-side matrix is $$ X^TX, $$ and the system of normal equations is $$ X^TXb = X^Ty. $$
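To make this concrete, here is a minimal sketch in Python/NumPy (the data, variable names, and coefficients are made up purely for illustration) that builds the design matrix for three features with all quadratic terms and solves the normal equations:

```python
import numpy as np

# Hypothetical example data: n observations of three features u, v, w and a target y.
rng = np.random.default_rng(0)
n = 200
u, v, w = rng.normal(size=(3, n))
y = (1.0 + 2.0 * u - 0.5 * v + 0.3 * w + 0.7 * u**2 - 0.2 * u * v
     + rng.normal(scale=0.1, size=n))

# Design matrix: one column per term of the quadratic model
#   y = a + b*u + c*v + d*w + e*u^2 + f*v^2 + g*w^2 + h*u*v + i*u*w + j*v*w
X = np.column_stack([
    np.ones(n),           # intercept
    u, v, w,              # linear terms
    u**2, v**2, w**2,     # pure quadratic terms
    u * v, u * w, v * w,  # cross terms
])

# Solve the normal equations X^T X b = X^T y.  lstsq is numerically safer than
# forming and inverting X^T X explicitly, and gives the same solution.
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coeffs)
```

For a higher-order polynomial you simply add the corresponding columns ($u^3$, $u^2 v$, $uvw$, ... for a cubic); if you are working in Python, `sklearn.preprocessing.PolynomialFeatures(degree=3)` generates exactly these columns for you.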
