This is a self-study question, so I provide hints that will hopefully help you find the solution, and I'll edit the answer based on your feedback/progress.
The parameter estimates that minimize the sum of squares are
\begin{align}
\hat{\beta}_0 &= \bar{y} - \hat{\beta}_1 \bar{x} , \\
\hat{\beta}_1 &= \frac{ \sum_{i = 1}^n(x_i - \bar{x})y_i }{ \sum_{i = 1}^n(x_i - \bar{x})^2 } .
\end{align}
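As a quick sanity check (not part of the derivation), here is a small Python sketch that plugs simulated data into these formulas and compares the result with `np.polyfit`. The design, true coefficients and noise level are arbitrary choices of mine, not from the question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data (x-values, true coefficients and noise level are
# arbitrary illustrative choices, not part of the question).
x = np.linspace(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=x.size)

# Closed-form least-squares estimates from the formulas above.
beta1_hat = np.sum((x - x.mean()) * y) / np.sum((x - x.mean()) ** 2)
beta0_hat = y.mean() - beta1_hat * x.mean()

# Cross-check against numpy's polynomial fit; the two should agree.
b1_ref, b0_ref = np.polyfit(x, y, deg=1)
print(beta0_hat, beta1_hat)   # roughly 2.0 and 0.5
print(b0_ref, b1_ref)         # same values up to rounding
```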
To get the variance of $\hat{\beta}_0$, start from its expression, substitute the expression for $\hat{\beta}_1$, and do the algebra:
$$
{\rm Var}(\hat{\beta}_0) = {\rm Var} (\bar{Y} - \hat{\beta}_1 \bar{x}) = \ldots
$$
Edit:
We have
\begin{align}
{\rm Var}(\hat{\beta}_0)
&= {\rm Var} (\bar{Y} - \hat{\beta}_1 \bar{x}) \\
&= {\rm Var} (\bar{Y}) + (\bar{x})^2 {\rm Var} (\hat{\beta}_1)
- 2 \bar{x} {\rm Cov} (\bar{Y}, \hat{\beta}_1).
\end{align}
The two variance terms are
$$
{\rm Var} (\bar{Y})
= {\rm Var} \left(\frac{1}{n} \sum_{i = 1}^n Y_i \right)
= \frac{1}{n^2} \sum_{i = 1}^n {\rm Var} (Y_i)
= \frac{\sigma^2}{n},
$$
and
\begin{align}
{\rm Var} (\hat{\beta}_1)
&= \frac{ 1 }{ \left[\sum_{i = 1}^n(x_i - \bar{x})^2 \right]^2 }
\sum_{i = 1}^n(x_i - \bar{x})^2 {\rm Var} (Y_i) \\
&= \frac{ \sigma^2 }{ \sum_{i = 1}^n(x_i - \bar{x})^2 } ,
\end{align}
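If it helps, you can check this value by simulation. The sketch below uses an arbitrary fixed design and arbitrary $\beta_0$, $\beta_1$, $\sigma$ of my own choosing; it repeats the experiment many times and compares the empirical variance of $\hat{\beta}_1$ with $\sigma^2 / \sum_{i}(x_i - \bar{x})^2$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed design and arbitrary true parameters (illustrative assumptions).
x = np.linspace(0, 10, 30)
beta0, beta1, sigma = 2.0, 0.5, 1.0
sxx = np.sum((x - x.mean()) ** 2)

# Many replications of the same experiment with fresh noise each time.
n_rep = 20_000
y = beta0 + beta1 * x + rng.normal(scale=sigma, size=(n_rep, x.size))
beta1_hats = (y @ (x - x.mean())) / sxx

print(beta1_hats.var())   # Monte Carlo estimate of Var(beta1_hat)
print(sigma**2 / sxx)     # theoretical value sigma^2 / Sxx
```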
and the covariance term is
\begin{align}
{\rm Cov} (\bar{Y}, \hat{\beta}_1)
&= {\rm Cov} \left\{
\frac{1}{n} \sum_{i = 1}^n Y_i,
\frac{ \sum_{j = 1}^n(x_j - \bar{x})Y_j }{ \sum_{i = 1}^n(x_i - \bar{x})^2 }
\right \} \\
&= \frac{1}{n} \frac{ 1 }{ \sum_{i = 1}^n(x_i - \bar{x})^2 }
{\rm Cov} \left\{ \sum_{i = 1}^n Y_i, \sum_{j = 1}^n(x_j - \bar{x})Y_j \right\} \\
&= \frac{ 1 }{ n \sum_{i = 1}^n(x_i - \bar{x})^2 }
\sum_{i = 1}^n \sum_{j = 1}^n (x_j - \bar{x}) {\rm Cov}(Y_i, Y_j) \\
&= \frac{ 1 }{ n \sum_{i = 1}^n(x_i - \bar{x})^2 }
\sum_{i = 1}^n (x_i - \bar{x}) \sigma^2 \\
&= 0
\end{align}
since ${\rm Cov}(Y_i, Y_j) = 0$ for $i \neq j$, ${\rm Cov}(Y_i, Y_i) = \sigma^2$, and $\sum_{i = 1}^n (x_i - \bar{x}) = 0$.
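A simulation makes the zero covariance tangible as well. The setup below is the same illustrative one as above (my own arbitrary values); the sample covariance of $\bar{Y}$ and $\hat{\beta}_1$ across replications should hover around zero.

```python
import numpy as np

rng = np.random.default_rng(2)

# Same kind of simulated setup as above (values are arbitrary).
x = np.linspace(0, 10, 30)
beta0, beta1, sigma = 2.0, 0.5, 1.0
sxx = np.sum((x - x.mean()) ** 2)

n_rep = 20_000
y = beta0 + beta1 * x + rng.normal(scale=sigma, size=(n_rep, x.size))
y_bar = y.mean(axis=1)
beta1_hats = (y @ (x - x.mean())) / sxx

# Sample covariance between Y-bar and beta1_hat across replications;
# it should be close to zero, matching the derivation.
print(np.cov(y_bar, beta1_hats)[0, 1])
```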
And since
$$\sum_{i = 1}^n(x_i - \bar{x})^2
= \sum_{i = 1}^n x_i^2 - 2 \bar{x} \sum_{i = 1}^n x_i
+ \sum_{i = 1}^n \bar{x}^2
= \sum_{i = 1}^n x_i^2 - n \bar{x}^2,
$$
we have
\begin{align}
{\rm Var}(\hat{\beta}_0)
&= \frac{\sigma^2}{n} + \frac{ \sigma^2 \bar{x}^2}{ \sum_{i = 1}^n(x_i - \bar{x})^2 } \\
&= \frac{\sigma^2 }{ n \sum_{i = 1}^n(x_i - \bar{x})^2 }
\left\{ \sum_{i = 1}^n(x_i - \bar{x})^2 + n \bar{x}^2 \right\} \\
&= \frac{\sigma^2 \sum_{i = 1}^n x_i^2}{ n \sum_{i = 1}^n(x_i - \bar{x})^2 }.
\end{align}
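Finally, a Monte Carlo sketch (same illustrative setup as above, with arbitrary values of my own) to check the end result $\sigma^2 \sum_i x_i^2 / \{ n \sum_i (x_i - \bar{x})^2 \}$:

```python
import numpy as np

rng = np.random.default_rng(3)

# Arbitrary fixed design and parameters, as in the sketches above.
x = np.linspace(0, 10, 30)
beta0, beta1, sigma = 2.0, 0.5, 1.0
n = x.size
sxx = np.sum((x - x.mean()) ** 2)

n_rep = 20_000
y = beta0 + beta1 * x + rng.normal(scale=sigma, size=(n_rep, n))
beta1_hats = (y @ (x - x.mean())) / sxx
beta0_hats = y.mean(axis=1) - beta1_hats * x.mean()

print(beta0_hats.var())                      # Monte Carlo Var(beta0_hat)
print(sigma**2 * np.sum(x**2) / (n * sxx))   # theoretical value
```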
Edit 2:
Why do we have
${\rm var} ( \sum_{i = 1}^n Y_i) = \sum_{i = 1}^n {\rm Var} (Y_i) $?
The assumed model is $ Y_i = \beta_0 + \beta_1 X_i + \epsilon_i$, where the $\epsilon_i$ are independent and identically distributed random variables with ${\rm E}(\epsilon_i) = 0$ and ${\rm var}(\epsilon_i) = \sigma^2$.
Once we have a sample, the $X_i$ are known and the only random terms are the $\epsilon_i$. Recall that for a random variable $Z$ and a constant $a$, we have ${\rm var}(a+Z) = {\rm var}(Z)$. Thus,
\begin{align}
{\rm var} \left( \sum_{i = 1}^n Y_i \right)
&= {\rm var} \left( \sum_{i = 1}^n (\beta_0 + \beta_1 X_i + \epsilon_i) \right)\\
&= {\rm var} \left( \sum_{i = 1}^n \epsilon_i \right)
= \sum_{i = 1}^n \sum_{j = 1}^n {\rm cov} (\epsilon_i, \epsilon_j)\\
&= \sum_{i = 1}^n {\rm cov} (\epsilon_i, \epsilon_i)
= \sum_{i = 1}^n {\rm var} (\epsilon_i)\\
&= \sum_{i = 1}^n {\rm var} (\beta_0 + \beta_1 X_i + \epsilon_i)
= \sum_{i = 1}^n {\rm var} (Y_i).\\
\end{align}
The 4th equality holds as ${\rm cov} (\epsilon_i, \epsilon_j) = 0$ for $i \neq j$ by the independence of the $\epsilon_i$.
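If you want to convince yourself numerically, the small sketch below (with illustrative parameter values of my own) compares the empirical variance of $\sum_i Y_i$ across replications with $n\sigma^2 = \sum_i {\rm var}(Y_i)$.

```python
import numpy as np

rng = np.random.default_rng(4)

# Independent errors with common variance sigma^2 (illustrative values).
x = np.linspace(0, 10, 30)
beta0, beta1, sigma = 2.0, 0.5, 1.0

n_rep = 50_000
y = beta0 + beta1 * x + rng.normal(scale=sigma, size=(n_rep, x.size))

# Var(sum of Y_i) across replications vs. n * sigma^2 = sum of Var(Y_i).
print(y.sum(axis=1).var())
print(x.size * sigma**2)
```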
Best Answer
This appears to be simple linear regression. If the $x_i$'s are treated as deterministic, then no variance is associated with them, and the expression holds under the additional assumptions that the error term (and hence $y$ also) is identically distributed for all $i$, and that the error terms (and hence $y$ also) are independent for all $j\neq i$.
For compactness, denote $$z_i = \frac{x_i-\bar x}{\sum_j (x_j- \bar x)^2}$$
Then
$$\text{Var}(\hat\beta_1) = \text{Var}\left(\sum z_iy_i\right)$$
The assumption of deterministic $x$'s permits us to treat them as constants. The assumption of independence permits us to set the covariances between $y_i$ and $y_j$ equal to zero. These two give
$$\text{Var}(\hat\beta_1) = \sum z_i^2\text{Var}(y_i)$$
Finally, the assumption of identically distributed $y$'s implies that $\text{Var}(y_i)= \text{Var}(y_j) \;\; \forall i,j$ and so permits us to write
$$\text{Var}(\hat\beta_1) = \text{Var}(y_i)\sum z_i^2$$
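For what it's worth, this can also be verified numerically. The sketch below uses an arbitrary design and error variance (my own choices, not from the question) and compares the empirical variance of $\hat\beta_1$ with $\text{Var}(y_i)\sum z_i^2$.

```python
import numpy as np

rng = np.random.default_rng(5)

# Deterministic design and iid errors (illustrative values, as above).
x = np.linspace(0, 10, 30)
beta0, beta1, sigma = 2.0, 0.5, 1.0
z = (x - x.mean()) / np.sum((x - x.mean()) ** 2)   # the weights z_i

n_rep = 20_000
y = beta0 + beta1 * x + rng.normal(scale=sigma, size=(n_rep, x.size))
beta1_hats = y @ z   # beta1_hat = sum_i z_i * y_i for each replication

print(beta1_hats.var())          # Monte Carlo Var(beta1_hat)
print(sigma**2 * np.sum(z**2))   # Var(y_i) * sum of z_i^2
```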