How to prove $\hat\beta_0$ and $\hat\beta_1$ are linear in $\hat Y_i$

linear-algebra, linear-regression, proof-writing, regression

A fitted regression line of a linear model is given by: $$ \hat Y = \hat\beta_0 + \hat\beta_1 X $$

How can I prove $\hat\beta_0$ and $\hat\beta_1$ are linear in $\hat Y_i$?

I'm unsure where to begin, other than knowing that if $\hat Y$ is linear, then $\hat\beta_0$ and $\hat\beta_1$ must also be linear.

If I can prove the beta parameters have constant slope, does this imply they are linear parameters?

Best Answer

If you are looking for the OLS solution as the best fit for your model, then the vector $\beta$ is given by

\begin{equation} \beta =(X^TX)^{-1} X^T Y \end{equation} provided the square matrix $X^TX$ (here $X^T$ denotes the transpose of $X$) is not singular. This formula comes from imposing the first-order conditions on the sum of squared residuals $L = \sum_{i=1}^N \big(y_i-(\beta_0 + \beta_1 x_i)\big)^2$: \begin{align} \frac{\partial L}{\partial \beta_0} &= -2\sum_{i=1}^N (y_i-(\beta_0 + \beta_1 x_i))=0,\\ \frac{\partial L}{\partial \beta_1} &= -2\sum_{i=1}^N x_i(y_i-(\beta_0 + \beta_1 x_i))=0. \end{align}
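As a minimal sketch for the simple-regression case (writing $\bar x$ and $\bar y$ for the sample means, and introducing weights $k_i$ that are not in the original answer), solving these two equations gives the estimators explicitly as linear combinations of the observations:

\begin{align} \hat\beta_1 &= \frac{\sum_{i=1}^N (x_i-\bar x)(y_i-\bar y)}{\sum_{i=1}^N (x_i-\bar x)^2} = \sum_{i=1}^N k_i\, y_i, \qquad k_i = \frac{x_i-\bar x}{\sum_{j=1}^N (x_j-\bar x)^2},\\ \hat\beta_0 &= \bar y - \hat\beta_1 \bar x = \sum_{i=1}^N \left(\frac{1}{N} - \bar x\, k_i\right) y_i, \end{align}

so both estimators are linear functions of the $y_i$, with weights that depend only on the $x_i$.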

The previous formula shows that the relation between the vectors $\beta$ and $Y$ is linear: $\beta$ is obtained from $Y$ by multiplication with the fixed matrix $(X^TX)^{-1}X^T$, which depends only on the design matrix $X$.
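As an illustration (not part of the argument above), here is a short NumPy sketch that checks this linearity numerically: it verifies that the OLS estimate computed from $a Y^{(1)} + b Y^{(2)}$ equals $a\,\beta(Y^{(1)}) + b\,\beta(Y^{(2)})$ for a fixed design matrix. The data and scalars here are made up purely for the check.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed design matrix: a column of ones (intercept) and one predictor x.
n = 20
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

def ols(X, y):
    """OLS estimate (beta0, beta1) = (X^T X)^{-1} X^T y."""
    return np.linalg.solve(X.T @ X, X.T @ y)

# Two arbitrary response vectors and arbitrary scalars.
y1 = rng.normal(size=n)
y2 = rng.normal(size=n)
a, b = 2.5, -1.3

# Linearity in Y: beta(a*y1 + b*y2) == a*beta(y1) + b*beta(y2).
lhs = ols(X, a * y1 + b * y2)
rhs = a * ols(X, y1) + b * ols(X, y2)
print(np.allclose(lhs, rhs))  # True
```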