Solved – Weighted least-squares negative fitted values

Tags: least-squares, predictive-models, regression, weighted-regression

I am running a weighted least-squares regression (all weights are strictly positive) in which the dependent variable is a cross-section of variance values.

Since variance is always non-negative ($\geq 0$), I would expect the fitted values to be non-negative as well for them to be meaningful:

$\sigma^2_i = \alpha + \beta x_i + \epsilon_i$

The problem is that I am getting some negative fitted values:

$\hat{\sigma}^2_i = \hat{\alpha} + \hat{\beta} x_i < 0$ for some $i$.
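
For reference, here is a minimal sketch of the kind of fit that produces this (assuming Python with `numpy` and `statsmodels`; `x`, `w`, and `sigma2` are simulated placeholders for my regressor, weights, and variance cross-section):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Placeholder data: one regressor, strictly positive weights, and a
# cross-section of non-negative "variance" observations.
x = rng.normal(size=200)
w = rng.uniform(0.5, 2.0, size=200)
sigma2 = np.exp(0.5 + 0.3 * x + rng.normal(scale=0.4, size=200))

# Weighted least squares of sigma^2 on an intercept and x.
X = sm.add_constant(x)
fit = sm.WLS(sigma2, X, weights=w).fit()

# Nothing in the linear model forces the fitted values to be >= 0.
print("negative fitted values:", (fit.fittedvalues < 0).sum())
```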

Is there any suggestion as to how to constrain the predicted values to be positive?

Thanks!

Best Answer

Transform your dependent variable $\sigma^2$ with a logarithm and fit the following model:

$$\log \sigma_i^2 = \alpha + \beta x_i + \epsilon_i$$

Get an estimate of the variance $\eta^2$ of the residuals as

$$\hat{\eta^2} = \frac{1}{N}\sum_{i=1}^N \left(\log \sigma_i^2 - \hat{\alpha} - \hat{\beta} x_i\right)^2$$

Finally, use the estimator

$$\hat{\sigma_i^2} = \exp\left(\hat{\alpha} + \hat{\beta} x_i + \frac{\hat{\eta^2}}{2}\right)$$

The reason for the $\frac{1}{2}\hat{\eta^2}$ term is that if $\epsilon$ is normally distributed with mean $0$ and variance $\eta^2$, then $e^\epsilon$ has expectation $e^{\eta^2/2}$.
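
Concretely, this is just the normal moment generating function evaluated at $t = 1$:

$$\mathrm{E}\left[e^{\epsilon}\right] = \int_{-\infty}^{\infty} e^{u}\,\frac{1}{\sqrt{2\pi}\,\eta}\,e^{-u^2/(2\eta^2)}\,du = e^{\eta^2/2},$$

so under the log-scale model $\mathrm{E}\left[\sigma_i^2\right] = e^{\alpha + \beta x_i}\,\mathrm{E}\left[e^{\epsilon_i}\right] = e^{\alpha + \beta x_i + \eta^2/2}$, which is what the corrected estimator targets.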

Since $\hat{\sigma_i^2 }$ is an exponential, it will always be positive.
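
A minimal end-to-end sketch of this recipe (assuming Python with `numpy` and `statsmodels`, and placeholder arrays `x`, `w`, and `sigma2` for the regressor, the positive weights, and the observed variances):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Placeholder data: strictly positive weights and variances.
x = rng.normal(size=200)
w = rng.uniform(0.5, 2.0, size=200)
sigma2 = np.exp(0.2 + 0.8 * x + rng.normal(scale=0.3, size=200))

# 1. Weighted least squares on the log scale.
X = sm.add_constant(x)
fit = sm.WLS(np.log(sigma2), X, weights=w).fit()
alpha_hat, beta_hat = fit.params

# 2. Residual variance estimate (plain 1/N average, as in the formula above).
resid = np.log(sigma2) - (alpha_hat + beta_hat * x)
eta2_hat = np.mean(resid**2)

# 3. Back-transform with the lognormal mean correction; the exponential
#    guarantees strictly positive fitted values.
sigma2_hat = np.exp(alpha_hat + beta_hat * x + eta2_hat / 2)
assert (sigma2_hat > 0).all()
```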