[Math] Linear Regression: Expectation Proof

regression, statistics

I found the following proof in my notes:

$E(Y_i) = E[\beta_0 + \beta_1 X_i + \varepsilon_i] = \cdots = \beta_0 + \beta_1 X_i$. This does not seem right to me, however. Why would $E(\beta_1 X_i) = \beta_1 X_i$? I wonder if I might have written it down incorrectly, and the proof was actually meant for the estimated value $\hat{Y}_i$. Does anyone recall this property of linear regression?

Best Answer

If $Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i$, where $\beta_0$ and $\beta_1$ are constants and $\varepsilon_i$ is an "error" random variable with mean $0$, then by linearity of expectation $E(Y_i) = \beta_0 + \beta_1 E(X_i)$. In the standard regression model the $X_i$ are treated as fixed (non-random) design values, so $E(X_i) = X_i$ and the expectation reduces to $E(Y_i) = \beta_0 + \beta_1 X_i$, which is the line in your notes. No hat is needed; the statement is about $Y_i$ itself.
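
For reference, here is the full chain of equalities that your notes abbreviated with $\cdots$ (a minimal sketch, assuming the fixed-design model, i.e. $X_i$ non-random and $E(\varepsilon_i) = 0$):

$$E(Y_i) = E[\beta_0 + \beta_1 X_i + \varepsilon_i] = E(\beta_0) + E(\beta_1 X_i) + E(\varepsilon_i) = \beta_0 + \beta_1 X_i + 0 = \beta_0 + \beta_1 X_i.$$

The middle step uses linearity of expectation, and $E(\beta_1 X_i) = \beta_1 X_i$ holds precisely because $\beta_1 X_i$ is a constant under the fixed-design assumption.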