I have no idea what a Garch regression is, since my knowledge of econometrics is essentially nonexistent.
For a simple linear regression, the calculations would be:
% Simulated data: y is approximately x + 10, plus noise
x = 1:10;
y = (11:20) + randn(1, 10)*0.5;
N = numel(x);
X = [x(:) ones(size(x(:)))];                % design matrix: [slope column, intercept column]
B = X \ y(:)                                % least-squares coefficient estimates
rsd = y(:) - X*B;                           % residuals
Var_rsd = var(rsd);                         % residual variance (note: var normalizes by N-1)
CovB = Var_rsd*inv(X'*X);                   % covariance matrix of the coefficient estimates
tstat = tinv(0.975, N-2);                   % two-sided 95% t critical value, N-2 d.f.
CI95 = B + sqrt(diag(CovB))*[-tstat tstat]  % 95% confidence intervals, one row per coefficient
CI95_Slope = CI95(1,:)
CI95_Intercept = CI95(2,:)
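For readers outside MATLAB, the same computation can be sketched in Python with NumPy and SciPy. This is my own cross-check of the calculation above, not part of the original answer; the variable names mirror the MATLAB code, and `ddof=1` is used so the residual variance matches MATLAB's `var` (N-1 normalization).

```python
import numpy as np
from scipy import stats

# Simulated data: y is approximately x + 10, plus noise
rng = np.random.default_rng(0)
x = np.arange(1, 11)
y = np.arange(11, 21) + rng.normal(size=10) * 0.5
N = x.size

X = np.column_stack([x, np.ones(N)])        # design matrix: [slope column, intercept column]
B, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares coefficient estimates
rsd = y - X @ B                             # residuals
var_rsd = rsd.var(ddof=1)                   # residual variance (N-1 normalization, as in MATLAB var)
CovB = var_rsd * np.linalg.inv(X.T @ X)     # covariance matrix of the coefficient estimates
tstat = stats.t.ppf(0.975, N - 2)           # two-sided 95% t critical value, N-2 d.f.
CI95 = B[:, None] + np.sqrt(np.diag(CovB))[:, None] * np.array([-tstat, tstat])

CI95_slope, CI95_intercept = CI95[0], CI95[1]
```

Each row of `CI95` brackets the corresponding entry of `B`, just as in the MATLAB version.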
The result is close to what the Statistics and Machine Learning Toolbox function regress produces. They are not identical because regress ‘Studentizes’ the residuals. (I will let you pursue that at your leisure.) The ‘Eqn. 18’ reference is to On the Covariance of Regression Coefficients.
Long day here, with much travel. My apologies for the delay.
EDIT — ‘CovB’ clarification.