From the least squares estimation method, we know that $$\hat{\beta}=(X'X)^{-1}X'Y$$ and that $\hat{\beta}$ is an unbiased estimator of $\beta$, i.e., $E[\hat{\beta}]=\beta$. Moreover, the linear model $$\begin{equation} Y=X\beta +u \end{equation}$$ carries the assumption that $$Y\sim N(\beta_0+\beta_1x,\ \sigma^2)$$ or, equivalently, that $u \sim N(0,\sigma^2)$. Based on the above, we can prove all three results simultaneously by calculating the variance-covariance matrix of $\hat{\beta}$, which is equal to: $$Var(\hat{\beta}):=\sigma^2(\hat{\beta})=\left( \begin{array}{cc}
Var(\hat{\beta_0}) & Cov(\hat{\beta_0},\hat{\beta_1}) \\
Cov(\hat{\beta_0},\hat{\beta_1}) & Var(\hat{\beta_1}) \end{array} \right)$$ By the properties of variance we have that
$$Var(\hat{\beta})=E\left[(\hat{\beta}-\beta)(\hat{\beta}-\beta)'\right].$$ Substituting $Y=X\beta+u$ into $\hat{\beta}=(X'X)^{-1}X'Y$ gives $$\begin{align*}\hat{\beta}&=(X'X)^{-1}X'(X\beta +u)=(X'X)^{-1}X'X\beta +(X'X)^{-1}X'u=\beta+(X'X)^{-1}X'u,\end{align*}$$ so $\hat{\beta}-\beta=(X'X)^{-1}X'u$ and therefore $$\begin{align*}Var(\hat{\beta})&=E\left[(X'X)^{-1}X'u\,u'X(X'X)^{-1}\right]=(X'X)^{-1}X'\,E[uu']\,X(X'X)^{-1}.\end{align*}$$ But, since $E[u]=0$, we have that $E[uu']=Var(u)=\sigma^2 I$, and by substituting into the above equation we find that $$\begin{align*}Var(\hat{\beta})&=(X'X)^{-1}X'\cdot\sigma^2 I\cdot X(X'X)^{-1}=\sigma^2(X'X)^{-1}X'X(X'X)^{-1}=\sigma^2(X'X)^{-1}.\end{align*}$$
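As a quick sanity check, here is a minimal Monte Carlo sketch (Python with NumPy; the design, sample size, and seed are arbitrary choices for illustration) comparing the empirical covariance matrix of $\hat\beta$ across repeated error draws with the theoretical $\sigma^2(X'X)^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 50, 2.0
beta = np.array([1.0, 0.5])            # true (beta_0, beta_1)
x = rng.uniform(0, 10, size=n)         # fixed regressors
X = np.column_stack([np.ones(n), x])   # design matrix with intercept

# Theoretical covariance matrix: sigma^2 (X'X)^{-1}
theory = sigma**2 * np.linalg.inv(X.T @ X)

# Monte Carlo: re-draw errors, re-fit, take the sample covariance of the fits
B = np.empty((20000, 2))
for k in range(B.shape[0]):
    y = X @ beta + rng.normal(0, sigma, size=n)
    B[k] = np.linalg.solve(X.T @ X, X.T @ y)   # OLS estimate
empirical = np.cov(B.T)

print(np.round(theory, 4))
print(np.round(empirical, 4))          # should be close to `theory`
```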
Now, since $$(X'X)^{-1}=\left( \begin{array}{cc}
\frac{\sum x_i^2}{n\sum (x_i-\bar{x})^2} & \frac{-\sum x_i}{n\sum (x_i-\bar{x})^2} \\
\frac{-\sum x_i}{n\sum (x_i-\bar{x})^2} & \frac{1}{\sum (x_i-\bar{x})^2} \end{array} \right)$$ (a standard result that can easily be derived algebraically for the simple-regression design matrix), you have the result that: $$\begin{align*} Var(\hat{\beta})&=\left( \begin{array}{cc}
Var(\hat{\beta_0}) & Cov(\hat{\beta_0},\hat{\beta_1}) \\
Cov(\hat{\beta_0},\hat{\beta_1}) & Var(\hat{\beta_1}) \end{array} \right)=\sigma^2\left(X'X\right)^{-1}\\[2ex]
&=\left( \begin{array}{cc}
\frac{\sigma^2 \sum x_i^2}{n\sum (x_i-\bar{x})^2} & \frac{-\sigma^2 \sum x_i}{n\sum (x_i-\bar{x})^2} \\
\frac{-\sigma^2 \sum x_i}{n\sum (x_i-\bar{x})^2} & \frac{\sigma^2}{\sum (x_i-\bar{x})^2} \end{array} \right) \end{align*}$$
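If you would rather not derive $(X'X)^{-1}$ by hand, a short numerical check (again Python/NumPy, with made-up data) confirms the closed-form entries for the simple-regression design matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
x = rng.normal(5, 2, size=n)
X = np.column_stack([np.ones(n), x])

Sxx = np.sum((x - x.mean())**2)

# Closed-form (X'X)^{-1} for the design matrix [1, x]
closed_form = np.array([
    [np.sum(x**2) / (n * Sxx), -np.sum(x) / (n * Sxx)],
    [-np.sum(x) / (n * Sxx),    1 / Sxx],
])

print(np.allclose(closed_form, np.linalg.inv(X.T @ X)))  # True
```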
This assumes that the $x_{i}$ are fixed; the model is quite different if they are not. However, this assumption is reasonable given the stated expected value and variance of the $Y_{i}$. Usually, for any linear relationship, we have the model $$Y_i=\beta_0+\beta_1 x_i+\epsilon_i,$$ where $\beta_{0},\beta_1 \in \mathbb{R}$, the $x_i$ are fixed, and $\mathbb{E}(\epsilon_i)=0$, $\text{Var}(\epsilon_i)=\sigma^2$ for a constant $\sigma^{2}$. This matches your context, so I will assume it is the correct one. Denote the random variables $Y_i$ by lower case $y_i$, for consistency with typical regression notation. We have $$\begin{align*}\text{Cov}(\hat{\beta}_0,\hat{\beta}_1)&=\text{Cov}(\bar{y}-\hat{\beta}_{1}\bar{x},\hat{\beta}_1) \\&=\text{Cov}(\bar{y},\hat{\beta}_1)-\text{Cov}(\hat{\beta}_1\bar{x},\hat{\beta}_1)\\&=\text{Cov}(\bar{y},\hat{\beta}_1)-\bar{x}\,\text{Cov}(\hat{\beta}_1,\hat{\beta}_1)\\&=\text{Cov}(\bar{y},\hat{\beta}_1)-\bar{x}\,\text{Var}(\hat{\beta}_1),\end{align*}$$
where the third equality uses the fact that the $x_i$ are fixed, and the fourth uses the definitions of variance and covariance. From here, substitute the given expression for $\hat{\beta}_1$ (this expression, and the one given for $\hat{\beta}_0$, can be derived either by maximum likelihood or by least squares; the two are equivalent here), and use your knowledge of covariances and variances to finish the calculation (some of the information you need comes from the proposition, "the following", in your box). Important hint: write $\hat{\beta}_1$ as a linear combination of the $y_{i}$; this is the key to completing the computation. Note that covariance is symmetric and bilinear (indeed, it is an inner product on the space of centered, square-integrable random variables), so $\text{Cov}(\cdot,\cdot)$ is linear in each argument, and thus
$$\text{Cov}\left(\sum_{i=1}^n c_i Z_i,X\right)=\sum_{i=1}^n c_i \text{Cov}(Z_i,X)$$
(this will be very useful as well) for constants $c_1,c_2,\ldots,c_n$ and any random variables $Z_1,Z_2,\ldots,Z_n,X$. Here are two more useful identities: $$\sum_{i} x_i(x_i - \bar{x})=\sum_i x_i(x_i - \bar{x}) - \bar{x} \sum_i(x_i - \bar{x})\\=\sum_i(x_i - \bar{x})(x_i-\bar{x})=\sum_i (x_i - \bar{x})^2= S_{xx},$$
and similarly, $$\sum_i x_i(y_i - \bar{y})=\sum_i x_i(y_i - \bar{y}) - \bar{x} \sum_i(y_i - \bar{y})\\=\sum_{i}(x_i - \bar{x})(y_i-\bar{y})= S_{xy},$$
since we know that $\sum_i(x_i-\bar{x})=\sum_i(y_i-\bar{y})=0$.
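To see the hint and the two identities in action, here is a small numerical sketch (Python/NumPy; the data are simulated placeholders): it checks both identities and verifies that $\hat\beta_1=S_{xy}/S_{xx}$ is indeed the linear combination $\sum_i c_i y_i$ with $c_i=(x_i-\bar x)/S_{xx}$:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=25)
y = 1.0 + 0.5 * x + rng.normal(0, 1, size=25)

Sxx = np.sum((x - x.mean())**2)
Sxy = np.sum((x - x.mean()) * (y - y.mean()))

# The two algebraic identities above
print(np.isclose(np.sum(x * (x - x.mean())), Sxx))   # True
print(np.isclose(np.sum(x * (y - y.mean())), Sxy))   # True

# The hint: beta_1_hat = Sxy / Sxx is a linear combination of the y_i
c = (x - x.mean()) / Sxx               # weights; note sum(c) == 0
print(np.isclose(np.sum(c * y), Sxy / Sxx))          # True
```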
Best Answer
You have not defined your symbols, so presumably $$S_{xx}=\sum (x_i-\bar x)^2 \quad\text{ and }\quad S_{xy}=\sum (x_i-\bar x)(y_i-\bar y)$$
Use the bilinearity of covariance.
We have
\begin{align} \operatorname{Cov}(b_0,b_1)&=\operatorname{Cov}(\bar y-b_1\bar x,b_1) \\&=\operatorname{Cov}(\bar y,b_1)-\bar x\operatorname{Var}(b_1) \end{align}
Recall that $\bar y$ and $b_1$ are uncorrelated (under the normal errors assumed here, they are in fact independent), so their covariance vanishes: writing $b_1=\sum_i c_i y_i$ with $c_i=(x_i-\bar x)/S_{xx}$ and $\sum_i c_i=0$, we get $\operatorname{Cov}(\bar y,b_1)=\frac{\sigma^2}{n}\sum_i c_i=0$.
Moreover, the exact distribution of $b_1$ is $$b_1\sim\mathcal N\left(\beta_1,\frac{\sigma^2}{S_{xx}}\right)$$
Hence you finally obtain the covariance between the least-squares estimates: $$\operatorname{Cov}(b_0,b_1)=-\frac{\bar x\sigma^2}{S_{xx}}$$
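For a final empirical confirmation, a simulation sketch (Python/NumPy; all parameter values are arbitrary assumptions) re-fits the regression over many error draws and compares the empirical $\operatorname{Cov}(b_0,b_1)$ with $-\bar x\sigma^2/S_{xx}$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, sigma = 40, 1.5
beta0, beta1 = 2.0, -0.7
x = rng.uniform(0, 5, size=n)          # fixed across replications
Sxx = np.sum((x - x.mean())**2)

# Re-fit over many replications and estimate Cov(b0, b1) empirically
b = np.empty((30000, 2))
for k in range(b.shape[0]):
    y = beta0 + beta1 * x + rng.normal(0, sigma, size=n)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
    b[k] = (y.mean() - b1 * x.mean(), b1)

print(np.cov(b.T)[0, 1])               # empirical Cov(b0, b1)
print(-x.mean() * sigma**2 / Sxx)      # theoretical value
```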