Solved – The impact of rescaling a predictor on the standard error of the corresponding coefficient

multiple regression, polynomial, standard error

I am trying to fit a polynomial regression model using an SVD-based linear least-squares solver. Because a high-degree term such as $x^6$ can become very large, I scale that column down (when its mean exceeds a threshold) before putting it into the design matrix $X$. I can then easily recover the actual coefficient of $x^6$ by applying the same scale factor to the fitted coefficient. The problem is how to get the actual standard error of the coefficient of $x^6$.
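For concreteness, a minimal numpy sketch of this setup (the data, the choice of scale factor `m`, and all variable names below are made up purely for illustration):

```python
import numpy as np

# Illustrative data: a polynomial signal with a large x^6 column
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=200)
y = 1.0 + 0.5 * x + 2e-4 * x**6 + rng.normal(scale=5.0, size=200)

m = np.mean(x**6)                                   # scale the large x^6 column down
X = np.column_stack([np.ones_like(x), x, x**6 / m])

# np.linalg.lstsq solves the least-squares problem via SVD
coef_scaled, *_ = np.linalg.lstsq(X, y, rcond=None)

# Recover the actual coefficient of x^6 by applying the scale factor again
b6 = coef_scaled[2] / m
```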

From "Standard errors for multiple regression coefficients?": for simple regression $y = a + b'x'$, where $x' = x/m$ is the scaled-down predictor, ${\rm var}(b') = s^2/\sum(X_i' - \bar X')^2$ makes it easy to see that ${\rm var}(b) = {\rm var}(b')/m^2$ and hence ${\rm SE}(b) = {\rm SE}(b')/m$, and also that the standard error of the intercept is not affected. However, for multivariate regression with the general covariance matrix
$${\rm cov}(\hat\beta) = s^2 (X'X)^{-1},$$
I can't see how to get the actual standard error back.

Another way to look at the problem is through the SVD itself. Writing $X = UWV'$, we have $(X'X)^{-1} = VW^{-2}V'$, so if we knew how the scaling affects $V$ and $W$, we would also know its effect on the standard error. However, I also have difficulty working out the impact on $V$ and $W$.
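As a sanity check on that identity, here is a quick numerical verification of $(X'X)^{-1} = VW^{-2}V'$; the matrix below is arbitrary, and note that numpy's `svd` returns $V'$ rather than $V$:

```python
import numpy as np

# Numerical check of (X'X)^{-1} = V W^{-2} V' for an arbitrary full-column-rank X
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))

U, w, Vt = np.linalg.svd(X, full_matrices=False)   # X = U diag(w) V'
lhs = np.linalg.inv(X.T @ X)
rhs = Vt.T @ np.diag(w**-2) @ Vt                   # V W^{-2} V'

print(np.allclose(lhs, rhs))                        # True
```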

Any suggestion is appreciated. I have been struggling with the issue for quite a while.

Best Answer

The standard error comes from the diagonal element of the covariance matrix. If you multiply one of the $X$ variables by a constant, only one element on the diagonal of the covariance matrix changes. (Also, $s^2$ is unchanged because, as you mention, the estimated coefficient is simply adjusted up or down, so the residuals are identical whether you scale or not.)

Here's an example. Let's say $X$ has two variables:

Normal Case: $$ V(\hat{\beta}) = s^2 (X'X)^{-1} $$

$$ \implies se(\hat{\beta}_1) = s\sqrt{(X'X)^{-1}_{1,1}}, $$

which reduces to $\frac{s}{\sqrt{\sum{x_{1i}^2}}}$ when the two columns are orthogonal.

Scaled case: $\tilde{X}_1 = c\,X_1$, $\tilde{X}_2 = X_2$, i.e. $\tilde{X} = XD$ with $D = {\rm diag}(c, 1)$. Only the first coefficient is rescaled, $\tilde{\beta}_1 = \frac{1}{c}\hat{\beta}_1$, and

$$ V(\tilde{\beta}) = s^2 (\tilde{X}'\tilde{X})^{-1} = s^2 (DX'XD)^{-1} = s^2 D^{-1}(X'X)^{-1}D^{-1}. $$

The standard error of $\tilde{\beta}_1$ is the square root of the first element on the diagonal of this covariance matrix:

$$ se(\tilde{\beta}_1) = \sqrt{s^2 \left[D^{-1}(X'X)^{-1}D^{-1}\right]_{1,1}} = \sqrt{\frac{s^2\,(X'X)^{-1}_{1,1}}{c^2}} = \frac{1}{c}\, se(\hat{\beta}_1). $$

In short, the standard error rescales exactly like the coefficient itself: whatever factor you apply to recover the actual coefficient, apply that same factor to the reported standard error.
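A quick numerical check of this result (the data and names below are invented for illustration): scaling one column of $X$ by $c$ leaves the fit and $s^2$ untouched, divides that coefficient by $c$, and divides its standard error by $c$.

```python
import numpy as np

# Verify: multiplying column 1 of X by c divides beta_1 and its SE by c
rng = np.random.default_rng(2)
n = 100
X = rng.normal(size=(n, 2))
y = X @ np.array([1.5, -0.7]) + rng.normal(size=n)

def coef_and_se(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - X.shape[1])      # s^2 = RSS / (n - p)
    cov = s2 * np.linalg.inv(X.T @ X)               # s^2 (X'X)^{-1}
    return beta, np.sqrt(np.diag(cov))

c = 1000.0
X_scaled = X.copy()
X_scaled[:, 0] *= c                                  # scaled column: c * X_1

beta, se = coef_and_se(X, y)
beta_s, se_s = coef_and_se(X_scaled, y)

print(np.isclose(beta_s[0], beta[0] / c))            # True
print(np.isclose(se_s[0], se[0] / c))                # True
```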