Solved – Variance Estimation of MA(1) with known autocovariance function

estimation, probability, time series, variance

I haven't worked with time series in a while and recently stumbled upon them in a different setting.
Given $X_t\sim\mathcal{N}(0,\sigma^2)$ for $t=1,\ldots,n$ and the process $Y_t$ for $t=1,\ldots,n-1$ defined by:
$$ Y_t = X_{t+1} - X_t. $$
Imagine I only observe $Y_t$ and now want to estimate $\sigma^2$ from these observations. It seems that $\operatorname{Var}(Y_t)=2\sigma^2$, but I have trouble deriving this theoretically. I want to be able to show that the properties of a consistent estimator for the variance of $X_t$ also carry over to $Y_t$, especially for the mean absolute deviation (MAD).
$Y_t$ can be interpreted as an MA(1) process with known autocovariance function, but I have trouble deriving this result as well.

Any help or hint on how to approach this would be greatly appreciated.

Best Answer

I assume that the $X_t$ are i.i.d. normally distributed with mean $0$ and variance $\sigma^2$. Then, since $X_{t+1}$ and $X_t$ are independent,
$$\operatorname{Var}(Y_t) = \operatorname{Var}(X_{t+1}-X_t) = \operatorname{Var}(X_{t+1}) + \operatorname{Var}(X_t) - 2\operatorname{Cov}(X_{t+1},X_t) = \sigma^2 + \sigma^2 - 0 = 2\sigma^2.$$
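The same computation gives the rest of the autocovariance function: $\operatorname{Cov}(Y_t, Y_{t+1}) = \operatorname{Cov}(X_{t+1}-X_t,\, X_{t+2}-X_{t+1}) = -\operatorname{Var}(X_{t+1}) = -\sigma^2$ (all cross terms vanish by independence), and $\operatorname{Cov}(Y_t, Y_{t+h}) = 0$ for $|h|\ge 2$, which is exactly the autocovariance pattern of an MA(1) process.

Below is a minimal simulation sketch, not part of the original answer; the sample size, seed, and value of $\sigma$ are arbitrary illustration choices. It checks $\operatorname{Var}(Y_t)=2\sigma^2$ numerically and recovers $\sigma^2$ both from the sample variance of $Y$ and from a MAD-based estimator, using the fact that $E|Y_t| = 2\sigma/\sqrt{\pi}$ when $Y_t \sim \mathcal{N}(0, 2\sigma^2)$.

```python
import numpy as np

# Illustrative choices (not from the original post): sigma, n, and the seed.
rng = np.random.default_rng(42)
sigma = 1.5
n = 100_000

X = rng.normal(0.0, sigma, size=n)   # i.i.d. N(0, sigma^2)
Y = X[1:] - X[:-1]                   # Y_t = X_{t+1} - X_t, an MA(1) process

# Sample variance of Y should be close to 2 * sigma^2
print("Var(Y) estimate:", Y.var(ddof=1), " target:", 2 * sigma**2)

# Variance-based estimator of sigma^2
sigma2_hat_var = Y.var(ddof=1) / 2
print("sigma^2 via variance:", sigma2_hat_var, " target:", sigma**2)

# MAD-based estimator: for Y ~ N(0, 2*sigma^2), E|Y| = 2*sigma/sqrt(pi),
# so sigma_hat = (sqrt(pi)/2) * mean(|Y|)
sigma_hat_mad = np.sqrt(np.pi) / 2 * np.abs(Y).mean()
print("sigma^2 via MAD:", sigma_hat_mad**2, " target:", sigma**2)
```

Note that the $Y_t$ are not independent (adjacent terms have covariance $-\sigma^2$), but the process is stationary and ergodic, so both sample averages above still converge to their population targets; that is what makes these estimators consistent despite the dependence.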
