Solved – Standard deviation of the sum of regression coefficients

least squares · regression coefficients · standard deviation

I'm doing OLS estimation with an independent variable lagged at t-1, t-2, t-3, and t-4 (four beta coefficients). I would like the sum of these coefficients in order to interpret the net impact of the variable on the dependent one (for example, the net influence of US GDP growth lagged at t-1 through t-4 on Thai GDP growth).

How can I compute the standard error of this new coefficient in order to run a t-test?

I have found this formula, but I'm not sure about it:

$$ SE_{b_{new}} = \sqrt{SE_1^2 + SE_2^2+2Cov(b_1,b_2)} $$

(see this link Adding coefficients to obtain interaction effects – what to do with SEs?)

But how do I calculate the covariance of two (four in my case) coefficients? Does that make any sense?

Thank you very much.


OK,

I need the covariance matrix of the coefficients to retrieve $\mathrm{Cov}(b_1,b_2)$, $\mathrm{Cov}(b_1,b_3)$, etc.

After that, I should be able to compute the standard error of the sum of the coefficients with the formula above.
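As a minimal sketch of that computation (synthetic data; all variable names and numbers here are illustrative, not from the post): the coefficient covariance matrix is $\hat{\sigma}^2 (X^T X)^{-1}$, and the variance of the sum $w^T \hat{\beta}$ is $w^T V w$, which expands into the sum of the variances plus twice all the pairwise covariances.

```python
import numpy as np

# Synthetic example: intercept + 4 "lag" regressors (illustrative only).
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 4))])
beta_true = np.array([0.5, 0.3, 0.2, 0.1, 0.05])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# OLS fit by hand
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - X.shape[1])  # residual variance estimate
V = sigma2_hat * XtX_inv                       # covariance matrix of coefficients

# Sum of the four lag coefficients and its standard error:
# Var(w' b) = w' V w = sum of variances + 2 * (all pairwise covariances)
w = np.array([0.0, 1.0, 1.0, 1.0, 1.0])        # selects the lag coefficients
b_sum = w @ beta_hat
se_sum = np.sqrt(w @ V @ w)
t_stat = b_sum / se_sum
print(b_sum, se_sum, t_stat)
```

With four coefficients the two-coefficient formula in the question simply grows to include all six pairwise covariance terms, which is exactly what the quadratic form $w^T V w$ computes.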

Best Answer

I think what you might be looking for is this formula: for a vector $A$ and a constant $c$, to test the hypothesis $H_0: A\beta=c$, use the statistic $$F = \frac{(A\hat{\beta}-c)^T \left(A(X^T X)^{-1}A^T\right)^{-1}(A\hat{\beta}-c)}{\sum_i \varepsilon_i^2/(n-p)}$$ which has an $F_{(1,\,n-p)}$ distribution under the null hypothesis when $A$ imposes a single restriction. You can invert the same statistic to obtain a confidence interval for $A\beta$.

In your case the vector $A$ would be a $(1\times p)$ row vector with 1's in the positions of the four lag coefficients (and 0's elsewhere, e.g. for the intercept).
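A hedged sketch of that test with the same kind of synthetic setup as above (names and the null value $c=0$ are illustrative assumptions, not from the original post). With a single restriction the quadratic form reduces to a scalar, and the F statistic is the square of the t statistic for the summed coefficient, so the two tests agree.

```python
import numpy as np

# Synthetic data: intercept + 4 "lag" regressors (illustrative only).
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 4))])
y = X @ np.array([0.5, 0.3, 0.2, 0.1, 0.05]) + rng.normal(scale=0.5, size=n)

p = X.shape[1]
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - p)           # sum of squared residuals / (n - p)

# A: 1's on the four lag coefficients, 0 for the intercept.
A = np.array([0.0, 1.0, 1.0, 1.0, 1.0])
c = 0.0                                        # H0: the lag coefficients sum to zero

# Single restriction, so the general formula collapses to a scalar ratio.
F = (A @ beta_hat - c) ** 2 / (sigma2_hat * (A @ XtX_inv @ A))
print(F)  # compare against the F(1, n - p) distribution
```

Because $F = t^2$ here, testing $H_0$ with this F statistic is equivalent to the t-test built from the standard error of the summed coefficient.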
