[Math] Fitting 2nd Order multivariate quadratic with matrices

matrices, regression

Hopefully you at least entertain this question, as it took forever to construct the matrix below in TeX. Anyway, I have a list of data points $(X_1, X_2, Y)$, with the $X$'s being independent variables and $Y$ being dependent. What I am attempting to find is the best-fit multivariate quadratic in the least-squares sense, i.e. minimizing the sum of squared residuals.

The general 2nd-order polynomial I am attempting to fit is of the form:
$$
y = B_0 + B_1 X_1 + B_2 X_2 + B_{11} X_1^2 + B_{22} X_2^2 + B_{12}X_1X_2
$$

1) Can anyone tell me if the matrix equation below is correct? For comparison, a traditional univariate quadratic fit can be seen on page 9.

$$
\begin{bmatrix}
n & \sum X_1 & \sum X_2 & \sum X_1 X_2 & \sum X_1^2 & \sum X_2^2 \\
\sum X_1 & \sum X_1^2 & \sum X_1 X_2 & \sum X_1^2 X_2 & \sum X_1^3 & \sum X_1 X_2^2 \\
\sum X_2 & \sum X_1 X_2 & \sum X_2^2 & \sum X_1 X_2^2 & \sum X_1^2 X_2 & \sum X_2^3 \\
\sum X_1 X_2 & \sum X_1^2 X_2 & \sum X_1 X_2^2 & \sum X_1^2 X_2^2 & \sum X_1^3 X_2 & \sum X_1 X_2^3 \\
\sum X_1^2 & \sum X_1^3 & \sum X_1^2 X_2 & \sum X_1^3 X_2 & \sum X_1^4 & \sum X_1^2 X_2^2 \\
\sum X_2^2 & \sum X_1 X_2^2 & \sum X_2^3 & \sum X_1 X_2^3 & \sum X_1^2 X_2^2 & \sum X_2^4 \\
\end{bmatrix}
\begin{bmatrix}
B_0\\
B_1\\
B_2\\
B_{12}\\
B_{11}\\
B_{22}\\
\end{bmatrix}
=
\begin{bmatrix}
\sum Y\\
\sum X_1 Y\\
\sum X_2 Y\\
\sum X_1 X_2 Y\\
\sum X_1^2 Y\\
\sum X_2^2 Y\\
\end{bmatrix}
$$
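If it helps to check numerically, here is a minimal sketch (assuming NumPy and made-up data, not part of my actual data set). The $6\times 6$ matrix of sums above is just $A^\mathsf{T}A$ for the design matrix $A$ whose rows are $[1,\ X_1,\ X_2,\ X_1X_2,\ X_1^2,\ X_2^2]$, and the right-hand side is $A^\mathsf{T}Y$, so solving the normal equations should agree with an ordinary least-squares fit:

```python
# Hypothetical sanity check: build the normal equations A^T A b = A^T Y
# for the basis [1, X1, X2, X1*X2, X1^2, X2^2] and compare the solution
# with a direct least-squares fit.
import numpy as np

rng = np.random.default_rng(0)
X1 = rng.uniform(-2, 2, 50)
X2 = rng.uniform(-2, 2, 50)

# Made-up "true" coefficients, ordered (B0, B1, B2, B12, B11, B22)
b_true = np.array([1.0, 2.0, -1.0, 0.5, 3.0, -2.0])

A = np.column_stack([np.ones_like(X1), X1, X2, X1 * X2, X1**2, X2**2])
Y = A @ b_true + rng.normal(0, 0.1, 50)

# Normal equations: A.T @ A is exactly the 6x6 matrix of sums in the question,
# and A.T @ Y is the right-hand-side vector.
b_normal = np.linalg.solve(A.T @ A, A.T @ Y)

# Direct least squares for comparison
b_lstsq, *_ = np.linalg.lstsq(A, Y, rcond=None)

print(b_normal)   # both should recover b_true up to noise
print(b_lstsq)
```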

2) When I solve for $B_{12}$, the coefficient of the $X_1 X_2$ term, if it is $\approx 0$ or very small, does that mean there is very little interaction between $X_1$ and $X_2$ in their effect on $Y$?

Best Answer

Yes, your derivation is correct.

$B_{12}$ being small (compared to $B_{11}$ and $B_{22}$) indeed indicates that the axes of the paraboloid are close to the coordinate axes.
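One way to see this (a standard quadratic-form argument, added here for illustration): the second-order part of the fitted surface is the quadratic form

$$
\begin{pmatrix} X_1 & X_2 \end{pmatrix}
\begin{pmatrix} B_{11} & B_{12}/2 \\ B_{12}/2 & B_{22} \end{pmatrix}
\begin{pmatrix} X_1 \\ X_2 \end{pmatrix},
\qquad
\tan 2\theta = \frac{B_{12}}{B_{11}-B_{22}},
$$

where $\theta$ is the angle by which its principal axes are rotated away from the coordinate axes (assuming $B_{11} \neq B_{22}$). So $B_{12} \approx 0$ gives $\theta \approx 0$, and the paraboloid's axes essentially coincide with the $X_1$ and $X_2$ axes, i.e. negligible interaction between $X_1$ and $X_2$.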
