Even though this question is old, here is the answer:
The autocovariance function for a causal time series is
$$\gamma(h) = \sigma^2 \sum_{j=0}^{\infty}\psi_j \psi_{j+|h|}$$
The MA($\infty$) representation of $X_t$ is
$$X_t = Z_t + \sum_{j=1}^{\infty} (\phi+\theta)\phi^{j-1} Z_{t-j},$$
where $\psi_0 = 1$ and $\psi_j = (\phi+\theta)\phi^{j-1}$ for $j \ge 1$.
Note that I will keep writing the white noise terms $Z_{t-j}$ rather than substituting $\sigma^2$ right away, to hopefully give a better understanding of what is going on.
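As a sanity check (not part of the original derivation), the closed form for the $\psi_j$ can be compared against the standard ARMA(1,1) recursion $\psi_0 = 1$, $\psi_1 = \phi+\theta$, $\psi_j = \phi\,\psi_{j-1}$ for $j \ge 2$. The values of $\phi$ and $\theta$ below are arbitrary examples, not from the question:

```python
# Check the claimed psi-weights for the ARMA(1,1) model
# X_t = phi*X_{t-1} + Z_t + theta*Z_{t-1}.
phi, theta = 0.6, 0.3  # example values with |phi| < 1 (causality)

# Recursion satisfied by the MA(infinity) weights:
# psi_0 = 1, psi_1 = phi + theta, psi_j = phi * psi_{j-1} for j >= 2.
def psi_recursive(n):
    psi = [1.0, phi + theta]
    while len(psi) < n:
        psi.append(phi * psi[-1])
    return psi[:n]

# Closed form used in the answer: psi_j = (phi + theta) * phi**(j-1), j >= 1.
def psi_closed(j):
    return 1.0 if j == 0 else (phi + theta) * phi ** (j - 1)

weights = psi_recursive(30)
max_err = max(abs(weights[j] - psi_closed(j)) for j in range(30))
assert max_err < 1e-12
```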
Case $h = 0$:
$$\operatorname{Cov}(X_t,X_t) = E(X_t X_t) = \gamma(0) = \sigma^2 \sum_{j=0}^{\infty}\psi_j^2$$
$$E(X_t X_t) = E\Big(\big( Z_t + (\phi+\theta)\sum_{j=1}^{\infty} \phi^{j-1} Z_{t-j}\big)^2\Big) = E(Z_t^2) + E\Big(\big((\phi+\theta)\sum_{j=1}^{\infty} \phi^{j-1} Z_{t-j}\big)^2\Big) + 2(\phi+\theta) \sum_{j=1}^{\infty} \phi^{j-1}E( Z_{t-j}Z_t)$$
Since $Z_t$ is white noise, $E(Z_{t-j}Z_t) = 0$ for every $j \ge 1$, so the last term vanishes; for the same reason, all cross terms $E(Z_{t-i}Z_{t-j})$ with $i \ne j$ inside the squared sum vanish as well. Hence:
$$E(X_t X_t) = E(Z_t^2) + (\phi+\theta)^2 \sum_{j=1}^{\infty} \phi^{2j-2} E(Z_{t-j}^2) = \sigma^2 + (\phi+\theta)^2 \sigma^2 \sum_{j=1}^{\infty} \phi^{2j-2} = \sigma^2 \Big( 1 + \frac{(\phi+\theta)^2}{1-\phi^2}\Big),$$
where $\sum_{j=1}^{\infty} \phi^{2j-2} = \sum_{j=0}^{\infty} \phi^{2j} = \sum_{j=0}^{\infty} (\phi^{2})^{j}$ is a geometric series converging to $\frac{1}{1-\phi^2}$, since $|\phi| < 1$ by causality.
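This closed form for $\gamma(0)$ can be verified numerically by truncating $\sigma^2\sum_j \psi_j^2$; a minimal sketch, with example values of $\phi$, $\theta$, $\sigma^2$ that I chose myself:

```python
# Check gamma(0) = sigma^2 * (1 + (phi+theta)^2 / (1 - phi^2))
# against a truncated version of sigma^2 * sum_j psi_j^2.
phi, theta, sigma2 = 0.6, 0.3, 2.0  # arbitrary example parameters

# psi_0 = 1, psi_j = (phi + theta) * phi**(j-1) for j >= 1
psi = [1.0] + [(phi + theta) * phi ** (j - 1) for j in range(1, 500)]

gamma0_truncated = sigma2 * sum(p * p for p in psi)
gamma0_closed = sigma2 * (1 + (phi + theta) ** 2 / (1 - phi ** 2))
assert abs(gamma0_truncated - gamma0_closed) < 1e-10
```

The truncation at 500 terms is harmless here because $\phi^{2j}$ decays geometrically.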
Case $h = 1$:
$$\operatorname{Cov}(X_t,X_{t-1}) = E(X_t X_{t-1}) = \gamma(1) = \sigma^2 \sum_{j=0}^{\infty}\psi_j \psi_{j+1}$$
$$E(X_t X_{t-1}) = E\Big( \big(Z_t + (\phi+\theta)\sum_{j=1}^{\infty} \phi^{j-1} Z_{t-j}\big) \cdot \big( Z_{t-1} + (\phi+\theta)\sum_{j=2}^{\infty} \phi^{j-2} Z_{t-j}\big) \Big)$$
Pulling the $j = 1$ term out of the first sum and writing $\phi^{j-1} = \phi \cdot \phi^{j-2}$ in what remains:
$$= E\Big( \big(Z_t + (\phi+\theta)Z_{t-1} + (\phi+\theta)\phi\sum_{j=2}^{\infty} \phi^{j-2} Z_{t-j}\big) \cdot \big( Z_{t-1} + (\phi+\theta)\sum_{j=2}^{\infty} \phi^{j-2} Z_{t-j}\big) \Big)$$
$$= (\phi+\theta)E(Z_{t-1}^2) + (\phi+\theta)^2\phi \sum_{j=2}^{\infty} \phi^{2j-4} E(Z_{t-j}^2) = (\phi+\theta)\sigma^2 + (\phi+\theta)^2\phi\, \sigma^2 \sum_{j=2}^{\infty} \phi^{2j-4} = \sigma^2\Big(\phi+\theta + \frac{(\phi+\theta)^2\phi}{1-\phi^2} \Big)$$
Case $h = 2$:
$$\operatorname{Cov}(X_t,X_{t-2}) = E(X_t X_{t-2}) = \gamma(2) = \sigma^2 \sum_{j=0}^{\infty}\psi_j \psi_{j+2}$$
$$E(X_t X_{t-2}) = E\Big( \big(Z_t + (\phi+\theta)\sum_{j=1}^{\infty} \phi^{j-1} Z_{t-j}\big) \cdot \big( Z_{t-2} + (\phi+\theta)\sum_{j=3}^{\infty} \phi^{j-3} Z_{t-j}\big) \Big)$$
$$= E\Big( \big(Z_t + (\phi+\theta)Z_{t-1} + (\phi+\theta)\phi Z_{t-2} + (\phi+\theta)\phi^2\sum_{j=3}^{\infty} \phi^{j-3} Z_{t-j}\big) \cdot \big( Z_{t-2} + (\phi+\theta)\sum_{j=3}^{\infty} \phi^{j-3} Z_{t-j}\big) \Big)$$
$$= (\phi+\theta)\phi\, E(Z_{t-2}^2) + (\phi+\theta)^2\phi^2\sum_{j=3}^{\infty} \phi^{2j-6} E(Z_{t-j}^2) = \sigma^2\Big( (\phi+\theta)\phi + \frac{(\phi+\theta)^2\phi^2}{1-\phi^2}\Big)$$
We see the pattern and conclude:
$$\gamma(h) = \begin{cases} \sigma^2\Big( 1 + \dfrac{(\phi+\theta)^2}{1-\phi^2}\Big) & \text{if } h=0,\\[2ex] \sigma^2\Big( (\phi+\theta)\phi^{h-1} + \dfrac{(\phi+\theta)^2\phi^{h}}{1-\phi^2}\Big) & \text{if } h>0. \end{cases}$$
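As a final numerical check of my own (the parameter values below are arbitrary examples), the formula for every lag can be compared against truncated versions of $\gamma(h)=\sigma^2\sum_{j}\psi_j\psi_{j+h}$:

```python
# Compare the closed-form ARMA(1,1) autocovariance with truncated psi-sums.
phi, theta, sigma2 = 0.6, 0.3, 1.5  # arbitrary example parameters

N = 600
psi = [1.0] + [(phi + theta) * phi ** (j - 1) for j in range(1, N)]

def gamma_truncated(h):
    # Truncation of sigma^2 * sum_{j>=0} psi_j * psi_{j+h}
    return sigma2 * sum(psi[j] * psi[j + h] for j in range(N - h))

def gamma_closed(h):
    if h == 0:
        return sigma2 * (1 + (phi + theta) ** 2 / (1 - phi ** 2))
    return sigma2 * ((phi + theta) * phi ** (h - 1)
                     + (phi + theta) ** 2 * phi ** h / (1 - phi ** 2))

for h in range(6):
    assert abs(gamma_truncated(h) - gamma_closed(h)) < 1e-10
```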
Clearly, for odd $t$:
$$E(x_t)=E(w_t)=0$$
and for even $t$:
$$E(x_t)=\dfrac{1}{\sqrt 2}E(w_{t-1}^2-1)=\dfrac{1}{\sqrt 2}(\sigma_{w_{t-1}}^2-1)=\dfrac{1}{\sqrt 2}(1-1)=0.$$
Also, by definition:
$$C(t_1,t_2)=E\big((x_{t_1}-E(x_{t_1}))(x_{t_2}-E(x_{t_2}))\big)=E(x_{t_1}x_{t_2}).$$
For distinct $t_1$ and $t_2$, $x_{t_1}$ and $x_{t_2}$ are dependent only if $t_1$ is odd and $t_2=t_1+1$, or $t_2$ is odd and $t_1=t_2+1$. Assume the former case. Then $t_1=t$ is odd and $t_2=t+1$ is even, so:
$$C(t,t+1)=E(x_tx_{t+1})=\dfrac{1}{\sqrt 2}E(w_t^3-w_t)=0.$$
So the covariance function is zero for distinct $t_1$ and $t_2$, and for $t_1=t_2=t$ we have:
$$C(t,t)=E(x^2_t).$$
For odd $t$:
$$C(t,t)=E(w_t^2)=1$$
and for even $t$:
$$C(t,t)=\dfrac{1}{2}E(w^4_{t-1}-2w^2_{t-1}+1)=\dfrac{1}{2}(E(w^4_{t-1})-1)=\dfrac{1}{2}(3-1)=1.$$
We finally obtain:
$$C(t_1,t_2)=\delta[t_1-t_2],$$
where $\delta[n]=1$ for $n=0$ and zero elsewhere.
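These moments can also be estimated by Monte Carlo; the sketch below assumes, as in the usual statement of this exercise, that $w_t$ is i.i.d. standard Gaussian (which is what the $E(w^4)=3$ step already relies on):

```python
# Monte Carlo check of the two branches of x_t (w_t assumed standard normal).
import numpy as np

rng = np.random.default_rng(0)
n = 10 ** 6
w = rng.standard_normal(n)

x_odd = w                            # x_t = w_t for odd t
x_even = (w ** 2 - 1) / np.sqrt(2)   # x_t = (w_{t-1}^2 - 1)/sqrt(2) for even t

# Both branches should have mean 0 and variance 1 ...
assert abs(x_odd.mean()) < 0.01 and abs(x_even.mean()) < 0.01
assert abs(x_odd.var() - 1) < 0.02 and abs(x_even.var() - 1) < 0.02

# ... and the only potentially dependent pair, x_t = w_t (t odd) and
# x_{t+1} = (w_t^2 - 1)/sqrt(2), is uncorrelated: E[(w_t^3 - w_t)/sqrt(2)] = 0.
assert abs(np.mean(x_odd * x_even)) < 0.01
```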
For the second question, we have
$$x_1=w_1 \quad\text{and}\quad x_2=\dfrac{1}{\sqrt 2}(w_1^2-1),$$
so $x_1$ ranges over $(-\infty,\infty)$ while $x_2$ ranges over $[-\tfrac{1}{\sqrt 2},\infty)$; hence they cannot be identically distributed.
Best Answer
Because $w_i$ is white noise, $\operatorname{cov}[w_i,w_j]=0$ when $i\ne j$.
It follows that $E[w_iw_j]=0$ when $i\ne j$.
When you expand formula (1), you can get rid of all the terms $w_iw_j$ where $i\ne j$.
The terms you are left with all have an offset of $h$ in the exponent of $\phi$, i.e. $(1)=E[(\phi^hw_t)(w_t)+(\phi^{h+1}w_{t-1})(\phi w_{t-1})+...].$
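Since formula (1) itself is not shown above, here is an illustrative check under the assumption that it refers to the causal AR(1) process $X_t=\phi X_{t-1}+w_t$ with MA representation $X_t=\sum_{j\ge 0}\phi^j w_{t-j}$; the parameter values are my own examples:

```python
# Assuming formula (1) is E[X_{t+h} X_t] for a causal AR(1):
# after dropping all E[w_i w_j] with i != j, each surviving term pairs
# phi^(h+j) with phi^j, so E[X_{t+h} X_t] = sigma2 * sum_j phi^(h + 2j).
phi, sigma2, h = 0.7, 1.0, 3  # arbitrary example values, |phi| < 1

N = 400
truncated = sigma2 * sum(phi ** (h + 2 * j) for j in range(N))
closed = sigma2 * phi ** h / (1 - phi ** 2)  # geometric-series limit
assert abs(truncated - closed) < 1e-12
```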