This is a self-study question, so I provide hints that will hopefully help you find the solution, and I'll edit the answer based on your feedback/progress.
The parameter estimates that minimize the sum of squares are
\begin{align}
\hat{\beta}_0 &= \bar{y} - \hat{\beta}_1 \bar{x} , \\
\hat{\beta}_1 &= \frac{ \sum_{i = 1}^n(x_i - \bar{x})y_i }{ \sum_{i = 1}^n(x_i - \bar{x})^2 } .
\end{align}
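As a quick numerical sanity check (a minimal sketch with made-up data, assuming NumPy is available), these closed-form estimates agree with a standard least-squares fit:

```python
import numpy as np

# Made-up illustrative data (not from the original question)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

xbar, ybar = x.mean(), y.mean()

# beta1_hat = sum_i (x_i - xbar) * y_i / sum_i (x_i - xbar)^2
beta1_hat = np.sum((x - xbar) * y) / np.sum((x - xbar) ** 2)
# beta0_hat = ybar - beta1_hat * xbar
beta0_hat = ybar - beta1_hat * xbar

# Cross-check against NumPy's least-squares polynomial fit
slope, intercept = np.polyfit(x, y, deg=1)
```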
To get the variance of $\hat{\beta}_0$, start from its expression, substitute the expression for $\hat{\beta}_1$, and do the algebra:
$$
{\rm Var}(\hat{\beta}_0) = {\rm Var} (\bar{Y} - \hat{\beta}_1 \bar{x}) = \ldots
$$
Edit:
We have
\begin{align}
{\rm Var}(\hat{\beta}_0)
&= {\rm Var} (\bar{Y} - \hat{\beta}_1 \bar{x}) \\
&= {\rm Var} (\bar{Y}) + (\bar{x})^2 {\rm Var} (\hat{\beta}_1)
- 2 \bar{x} {\rm Cov} (\bar{Y}, \hat{\beta}_1).
\end{align}
The two variance terms are
$$
{\rm Var} (\bar{Y})
= {\rm Var} \left(\frac{1}{n} \sum_{i = 1}^n Y_i \right)
= \frac{1}{n^2} \sum_{i = 1}^n {\rm Var} (Y_i)
= \frac{\sigma^2}{n},
$$
and
\begin{align}
{\rm Var} (\hat{\beta}_1)
&= \frac{ 1 }{ \left[\sum_{i = 1}^n(x_i - \bar{x})^2 \right]^2 }
\sum_{i = 1}^n(x_i - \bar{x})^2 {\rm Var} (Y_i) \\
&= \frac{ \sigma^2 }{ \sum_{i = 1}^n(x_i - \bar{x})^2 } ,
\end{align}
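Both variance formulas can be checked by Monte Carlo simulation. Here is a sketch with an arbitrary fixed design and assumed true parameter values (the specific numbers are illustrative choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary fixed design and assumed true parameters (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
beta0, beta1, sigma = 1.0, 2.0, 0.5
n = len(x)
Sxx = np.sum((x - x.mean()) ** 2)

# Draw many independent samples Y = beta0 + beta1 * x + eps
reps = 200_000
Y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=(reps, n))

# Empirical variances of Ybar and beta1_hat across replications
var_Ybar = Y.mean(axis=1).var()
var_beta1 = (((x - x.mean()) * Y).sum(axis=1) / Sxx).var()
```

With 200,000 replications, both empirical variances land within a few percent of $\sigma^2/n$ and $\sigma^2/S_{xx}$ respectively.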
and the covariance term is
\begin{align}
{\rm Cov} (\bar{Y}, \hat{\beta}_1)
&= {\rm Cov} \left\{
\frac{1}{n} \sum_{i = 1}^n Y_i,
\frac{ \sum_{j = 1}^n(x_j - \bar{x})Y_j }{ \sum_{i = 1}^n(x_i - \bar{x})^2 }
\right \} \\
&= \frac{1}{n} \frac{ 1 }{ \sum_{i = 1}^n(x_i - \bar{x})^2 }
{\rm Cov} \left\{ \sum_{i = 1}^n Y_i, \sum_{j = 1}^n(x_j - \bar{x})Y_j \right\} \\
&= \frac{ 1 }{ n \sum_{i = 1}^n(x_i - \bar{x})^2 }
\sum_{i = 1}^n \sum_{j = 1}^n (x_j - \bar{x}) {\rm Cov}(Y_i, Y_j) \\
&= \frac{ 1 }{ n \sum_{i = 1}^n(x_i - \bar{x})^2 }
\sum_{i = 1}^n (x_i - \bar{x}) \sigma^2 \\
&= 0
\end{align}
since ${\rm Cov}(Y_i, Y_j) = \sigma^2$ if $i = j$ and $0$ otherwise (by the independence of the $Y_i$), and $\sum_{i = 1}^n (x_i - \bar{x}) = 0$.
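The vanishing covariance can also be seen in simulation (same kind of sketch as before; the design points and parameters are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary fixed design and assumed true parameters (illustrative only)
x = np.array([0.5, 1.5, 2.0, 3.0, 4.0])
beta0, beta1, sigma = 1.0, 2.0, 1.0
Sxx = np.sum((x - x.mean()) ** 2)

# Simulate many samples and compute (Ybar, beta1_hat) for each
reps = 100_000
Y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=(reps, len(x)))
Ybar = Y.mean(axis=1)
beta1_hats = ((x - x.mean()) * Y).sum(axis=1) / Sxx

# Empirical covariance across replications; should be close to zero
cov = np.cov(Ybar, beta1_hats)[0, 1]
```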
And since
$$\sum_{i = 1}^n(x_i - \bar{x})^2
= \sum_{i = 1}^n x_i^2 - 2 \bar{x} \sum_{i = 1}^n x_i + n \bar{x}^2
= \sum_{i = 1}^n x_i^2 - 2 n \bar{x}^2 + n \bar{x}^2
= \sum_{i = 1}^n x_i^2 - n \bar{x}^2,
$$
we have
\begin{align}
{\rm Var}(\hat{\beta}_0)
&= \frac{\sigma^2}{n} + \frac{ \sigma^2 \bar{x}^2}{ \sum_{i = 1}^n(x_i - \bar{x})^2 } \\
&= \frac{\sigma^2 }{ n \sum_{i = 1}^n(x_i - \bar{x})^2 }
\left\{ \sum_{i = 1}^n(x_i - \bar{x})^2 + n \bar{x}^2 \right\} \\
&= \frac{\sigma^2 \sum_{i = 1}^n x_i^2}{ n \sum_{i = 1}^n(x_i - \bar{x})^2 }.
\end{align}
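The final formula can also be verified by Monte Carlo (again a sketch with an arbitrary design and assumed true parameters):

```python
import numpy as np

rng = np.random.default_rng(2)

# Arbitrary fixed design and assumed true parameters (illustrative only)
x = np.array([1.0, 2.0, 4.0, 6.0, 7.0])
beta0, beta1, sigma = -0.5, 1.5, 0.8
n = len(x)
Sxx = np.sum((x - x.mean()) ** 2)

# Simulate many samples and compute beta0_hat for each
reps = 200_000
Y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=(reps, n))
beta1_hats = ((x - x.mean()) * Y).sum(axis=1) / Sxx
beta0_hats = Y.mean(axis=1) - beta1_hats * x.mean()

# Empirical variance vs. the closed-form sigma^2 * sum(x^2) / (n * Sxx)
empirical = beta0_hats.var()
theoretical = sigma ** 2 * np.sum(x ** 2) / (n * Sxx)
```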
Edit 2:
Why do we have
${\rm Var} \left( \sum_{i = 1}^n Y_i \right) = \sum_{i = 1}^n {\rm Var} (Y_i)$?
The assumed model is $Y_i = \beta_0 + \beta_1 X_i + \epsilon_i$, where the $\epsilon_i$ are independent and identically distributed random variables with ${\rm E}(\epsilon_i) = 0$ and ${\rm Var}(\epsilon_i) = \sigma^2$.
Once we have a sample, the $X_i$ are known; the only random terms are the $\epsilon_i$. Recall that for a random variable $Z$ and a constant $a$, we have ${\rm Var}(a + Z) = {\rm Var}(Z)$. Thus,
\begin{align}
{\rm Var} \left( \sum_{i = 1}^n Y_i \right)
&= {\rm Var} \left( \sum_{i = 1}^n (\beta_0 + \beta_1 X_i + \epsilon_i) \right)\\
&= {\rm Var} \left( \sum_{i = 1}^n \epsilon_i \right)
= \sum_{i = 1}^n \sum_{j = 1}^n {\rm Cov} (\epsilon_i, \epsilon_j)\\
&= \sum_{i = 1}^n {\rm Cov} (\epsilon_i, \epsilon_i)
= \sum_{i = 1}^n {\rm Var} (\epsilon_i)\\
&= \sum_{i = 1}^n {\rm Var} (\beta_0 + \beta_1 X_i + \epsilon_i)
= \sum_{i = 1}^n {\rm Var} (Y_i).
\end{align}
The fourth equality holds because ${\rm Cov} (\epsilon_i, \epsilon_j) = 0$ for $i \neq j$, by the independence of the $\epsilon_i$.
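This "variance of a sum equals sum of variances" property for i.i.d. errors is easy to confirm numerically (a sketch with an assumed $\sigma$ and sample size, both illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# i.i.d. errors with an assumed sigma (illustrative choice)
n, sigma = 4, 1.5
reps = 300_000

eps = rng.normal(0.0, sigma, size=(reps, n))

# Variance of the sum vs. sum of the individual variances
var_of_sum = eps.sum(axis=1).var()
sum_of_vars = n * sigma ** 2
```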
Best Answer
Well, the $y_i$'s do enter, through $\sigma^2$: the larger the spread of the $y_i$'s around $\theta_0 + \theta_1 x_i$, the larger the variances of these estimators.
The key point is that these equations are only correct under the model assumption of constant variance $\sigma^2$. Once you fix the $x_i$'s and assume constant variance, $Y_i \sim N({\rm E}(Y_i \mid x_i), \sigma^2)$, and the observed $y_i$'s no longer "affect" the theoretical variances of the estimators.