Solved – Proof of Normal Sampling Distributions of OLS Estimators

least-squares, normal-distribution, self-study

I'm trying to prove:

$\hat{\beta}_j \sim \mathrm{Normal}[\beta_j, \mathrm{Var}(\hat{\beta}_j)]$.

The starting equation is:

$\hat{\beta}_j=\beta_j+\sum w_{ij}u_i$, where $w_{ij}=\hat{r}_{ij}/SSR_j$.

$\hat{r}_{ij}$ is the $i^{th}$ residual from the regression of $x_j$ on all other independent variables, and $SSR_j$ is the sum of squared residuals from this regression.
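For reference, a sketch of where this starting equation comes from, assuming the usual setup $y_i=\beta_0+\beta_1 x_{i1}+\cdots+\beta_k x_{ik}+u_i$ (the question takes the equation as given): the partialling-out representation is

$$\hat{\beta}_j=\frac{\sum \hat{r}_{ij}\,y_i}{\sum \hat{r}_{ij}^2}.$$

Substituting for $y_i$ and using that the residuals $\hat{r}_{ij}$ sum to zero, are orthogonal to every other regressor, and satisfy $\sum \hat{r}_{ij}x_{ij}=\sum \hat{r}_{ij}^2$, all terms except the $j$-th drop out, leaving

$$\hat{\beta}_j=\beta_j+\frac{\sum \hat{r}_{ij}\,u_i}{\sum \hat{r}_{ij}^2}.$$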

The second equation above comes from:

$\hat{\beta}_j=\beta_j+\frac{\sum \hat{r}_{ij} u_i}{\sum \hat{r}_{ij}^2}$

$SSR_j=\hat{r}_{ij}^2$

With this, the second equation becomes:

$\hat{\beta}_j=\beta_j+\sum \frac{\hat{r}_{ij}}{\hat{r}_{ij}^2}u_i$

How can $\sum \frac{\hat{r}_{ij}}{\hat{r}_{ij}^2}u_i = \frac{\sum \hat{r}_{ij} u_i}{\sum \hat{r}_{ij}^2}$?

According to the summation property $\sum(x_i/y_i) \neq (\sum x_i)/(\sum y_i)$, this can't be right.

I encountered the same problem when trying to prove the Gauss–Markov theorem.
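A side note on the step that causes the trouble, using only the definitions already given: $SSR_j=\sum \hat{r}_{ij}^2$ carries its own sum over $i$, so it is a single number rather than a quantity indexed by $i$, and it factors out of the outer sum:

$$\sum_i \frac{\hat{r}_{ij}}{SSR_j}\,u_i=\frac{1}{SSR_j}\sum_i \hat{r}_{ij}\,u_i=\frac{\sum \hat{r}_{ij}\,u_i}{\sum \hat{r}_{ij}^2}.$$

The property $\sum(x_i/y_i)\neq(\sum x_i)/(\sum y_i)$ only applies when the denominator varies with $i$, which $SSR_j$ does not; the same observation resolves the Gauss–Markov derivation.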

Best Answer

Let the model be $y=X\beta+e$, where $X$ contains $r$ covariates plus a column of ones, so the coefficient vector $\beta$ has $r+1$ elements, one of them being the intercept. Assume the other standard assumptions hold. Then the OLS estimator is

$$ \hat\beta=(X'X)^{-1}X'y $$ and

\begin{align} & E(\hat\beta)=(X'X)^{-1}X'\,E(X\beta+e)=\beta\\ & \mathrm{Var}(\hat\beta)=(X'X)^{-1}X'\,\mathrm{Var}(e)\,X(X'X)^{-1}=\sigma^2(X'X)^{-1} \end{align} where the first line uses $E(e)=0$ and the second uses $\mathrm{Var}(e)=\sigma^2 I$. Everything now depends on the distribution of $e$, which in the classical regression model is Gaussian. Since $\hat\beta=\beta+(X'X)^{-1}X'e$ is a linear transformation of the Gaussian vector $e$, it follows that $\hat\beta\sim N(\beta,\sigma^2(X'X)^{-1})$.
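Not part of the original answer, but a minimal NumPy sketch of a sanity check for this result: hold the design $X$ fixed, draw Gaussian errors repeatedly, and compare the simulated mean and variances of $\hat\beta$ with $\beta$ and the diagonal of $\sigma^2(X'X)^{-1}$. All sample sizes and coefficient values below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative choices: n observations, noise scale, true coefficients.
n, sigma = 200, 1.5
beta = np.array([1.0, 2.0, -0.5])                # intercept + two slopes

# Fixed design: a column of ones plus two random covariates, drawn once.
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])

reps = 5000
estimates = np.empty((reps, beta.size))
for r in range(reps):
    e = rng.normal(scale=sigma, size=n)          # Gaussian errors
    y = X @ beta + e
    estimates[r] = np.linalg.solve(X.T @ X, X.T @ y)  # OLS: (X'X)^{-1} X'y

print("simulated mean of beta_hat:", estimates.mean(axis=0))  # should be near beta
print("theoretical variances:", sigma**2 * np.diag(np.linalg.inv(X.T @ X)))
print("simulated variances:  ", estimates.var(axis=0, ddof=1))
```

The check only compares the first two moments; with Gaussian errors each component of $\hat\beta$ is exactly normal, so a histogram of any column of `estimates` should match the corresponding normal density.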
