$E[(\hat \beta_1 - \beta_1)X_1]$ where $\beta_1$ is a linear regression coefficient and $\hat \beta_1$ is its least squares estimate

expected-value, linear-regression, random-variables, regression

Define the linear regression model:
$$
Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i, \quad \quad i = 1,\dots,n.
$$

Let $\hat \beta_0$ and $\hat \beta_1$ be the least squares estimates of $\beta_0$ and $\beta_1$. Since $\hat \beta_0$ and $\hat \beta_1$ are computed from the $X_i$'s and $Y_i$'s, they are random variables that depend on them.

I am wondering what the expected value of $\hat \beta_1 - \beta_1$ multiplied by one of the $X_i$'s is, say $X_1$ for example. That is, what is $E[(\hat \beta_1 - \beta_1)X_1]$?

Best Answer

The least squares slope can be written as
$$
\hat{\beta}_1 = \beta_1 + \frac{\sum_{i=1}^n(X_i-\bar{X})\varepsilon_i}{\sum_{i=1}^n (X_i-\bar{X})^2},
$$
so for any fixed $j$ (in your case $j=1$),
$$
\mathsf{E}\!\left[(\hat{\beta}_1-\beta_1)X_j\right]=\mathsf{E}\!\left[\frac{\sum_{i=1}^n(X_i-\bar{X})\varepsilon_iX_j}{\sum_{i=1}^n (X_i-\bar{X})^2}\right].
$$
If $\mathsf{E}[\varepsilon_i\mid X_1,\ldots,X_n]=0$, then conditioning on $X_1,\ldots,X_n$ and applying the law of iterated expectations makes every term in the numerator vanish, so
$$
\mathsf{E}\!\left[(\hat{\beta}_1-\beta_1)X_j\right]=0.
$$
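As a sanity check, here is a small Monte Carlo sketch (with hypothetical parameter values $\beta_0 = 2$, $\beta_1 = 3$, $n = 50$) that simulates the model with $\mathsf{E}[\varepsilon_i \mid X_1,\ldots,X_n] = 0$ and averages $(\hat\beta_1 - \beta_1)X_1$ over many replications; the average should be close to zero:

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, n, reps = 2.0, 3.0, 50, 20000  # hypothetical choices

vals = []
for _ in range(reps):
    X = rng.normal(size=n)
    eps = rng.normal(size=n)            # E[eps | X] = 0 holds by construction
    Y = beta0 + beta1 * X + eps
    Xc = X - X.mean()
    beta1_hat = (Xc @ Y) / (Xc @ Xc)    # least squares slope estimate
    vals.append((beta1_hat - beta1) * X[0])

print(np.mean(vals))                    # close to 0
```

The empirical mean hovers around zero (up to Monte Carlo noise of order $1/\sqrt{\text{reps}}$), consistent with the result above.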