Could someone please help with the formula for the Newey-West standard error of $\beta_1$ (without matrix notation) for the following regression:
$Y_t=\beta_0+\beta_1X_t+\epsilon_t$
where $\epsilon_t$ is believed to be serially correlated and/or heteroskedastic.
Best Answer
Take your model of $$y_t=\beta_0+\beta_1x_t+u_t,$$ where $t=1,...,T$. We will assume there are no other regressors and that the serial correlation only lasts up to one period (so shocks do not persist for very long).
To get the Newey-West/HAC standard error of $\beta_1$ that is robust to heteroskedasticity and autocorrelation up to 1 lag, you should:
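A sketch of the recipe, following the Wooldridge (1989) reference cited at the end of this answer (the notation $\hat r_t$, $\hat a_t$, $\hat v$ is mine, not necessarily the paper's):

1. Run OLS of $y_t$ on $x_t$ and save the residuals $\hat u_t$.
2. Demean the regressor, $\hat r_t = x_t - \bar x$, and form the products $\hat a_t = \hat r_t \hat u_t$.
3. With lag length $g=1$ and Bartlett weight $w_1 = 1 - \frac{1}{g+1} = \frac{1}{2}$, compute
$$\hat v = \sum_{t=1}^{T}\hat a_t^2 + 2 w_1 \sum_{t=2}^{T}\hat a_t \hat a_{t-1},$$
4. and then
$$\widehat{\operatorname{se}}(\hat\beta_1) = \frac{\sqrt{\hat v}}{\sum_{t=1}^{T}\hat r_t^2}.$$

The Bartlett weight guarantees the variance estimate is nonnegative; setting $g=0$ recovers the usual heteroskedasticity-robust (White) standard error.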
Here's an example with Stata (with the full data shown by the `list` command in case you want to use other software). There is a slight wrinkle: Stata applies a finite sample correction of $\frac{T}{T-k}$ by default, which is not the default in other statistics packages and is usually not shown in textbook formulas, though it is a sensible thing to do:
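Since the example is meant to be reproducible in other software, here is a sketch of the same by-hand computation in Python. The data below are illustrative simulated values, not the answer's dataset, and all variable names are mine:

```python
import numpy as np

# Illustrative data (NOT the original example's dataset): AR(1) errors.
rng = np.random.default_rng(0)
T = 50
x = rng.normal(size=T)
u = np.empty(T)
u[0] = rng.normal()
for t in range(1, T):
    u[t] = 0.5 * u[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + u

# OLS by hand: beta1 = sum(r_t * y_t) / sum(r_t^2), with r_t = x_t - xbar.
r = x - x.mean()
b1 = (r @ y) / (r @ r)
b0 = y.mean() - b1 * x.mean()
uhat = y - b0 - b1 * x

# Newey-West variance of beta1 with g = 1 lag; Bartlett weight 1 - 1/(g+1) = 1/2.
a = r * uhat
g = 1
v = (a @ a) + 2 * (1 - 1 / (g + 1)) * (a[1:] @ a[:-1])
var_b1 = v / (r @ r) ** 2

# Stata's default finite sample correction scales the variance by T/(T-k), k = 2 here.
k = 2
se_plain = np.sqrt(var_b1)
se_stata = np.sqrt(var_b1 * T / (T - k))
print(se_plain, se_stata)
```

The corrected figure `se_stata` is what Stata's `newey` would report; `se_plain` is the textbook version without the correction.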
As you can see, with the finite sample correction, `newey` matches what we did by hand at 0.07774301. I am not sure this example contains a whole lot of intuition, but YMMV.

This is based on Wooldridge, Jeffrey M. "A computationally simple heteroskedasticity and serial correlation robust standard error for the linear regression model." Economics Letters 31.3 (1989): 239-243.
Stata Code:
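A minimal sketch of the commands involved, assuming the variables are named `y` and `x` (hypothetical names) and the data are already in memory:

```stata
* Newey-West SE, robust to heteroskedasticity and autocorrelation up to 1 lag.
* Stata applies the T/(T-k) finite sample correction by default.
newey y x, lag(1)

* OLS residuals, for replicating the standard error by hand.
regress y x
predict uhat, residuals
```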