Regression – Is Square Root of Variance of Regression Coefficient the Standard Error?

Tags: multiple-regression, r-squared, regression, regression-coefficients, sums-of-squares

Quick question: in the textbook "Introductory Econometrics", the variance of a regression coefficient is given as

$var(\hat\beta_j) = \frac{\sigma^2}{SST_j(1-R_j^2)}$

where

$SST_j = \sum_{i=1}^n (x_{ij}-\bar x_j)^2$ is the total sample variation in the $j$th predictor,

$\sigma^2$ is the variance of the error term (in practice estimated by $\hat\sigma^2$, computed from the residuals of $y$ against its fitted values), and

$R_j^2$ is the $R^2$ from regressing the $j$th predictor on all the other predictors.

So am I right in assuming that the standard error of the regression coefficient is the square root of this $var(\hat\beta_j)$ value?

Best Answer

Yes, that is correct: the standard error of $\hat\beta_j$ is the square root of $var(\hat\beta_j)$, with $\sigma^2$ replaced by its estimate $\hat\sigma^2$ since the true error variance is unknown. The terminology may be confusing because, while we use the distinct terms "standard deviation" (for a random variable) and "standard error" (for an estimator), the square of each is simply called a variance.
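As a numerical check, the following sketch (with simulated data and NumPy only; all variable names are illustrative) computes the standard error of one slope coefficient two ways: via the textbook formula $\hat\sigma^2 / (SST_j(1-R_j^2))$, and via the usual covariance matrix $\hat\sigma^2 (X'X)^{-1}$. The two agree exactly, which is a consequence of the Frisch–Waugh–Lovell theorem.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)          # correlated with x1, so R_1^2 > 0
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

# OLS fit of y on an intercept, x1, and x2
X = np.column_stack([np.ones(n), x1, x2])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
k = X.shape[1]
sigma2_hat = resid @ resid / (n - k)        # unbiased estimate of sigma^2

# --- Route 1: the textbook formula for var(beta_1) ---
SST_1 = np.sum((x1 - x1.mean()) ** 2)
# R_1^2: regress x1 on the other regressors (intercept and x2)
Z = np.column_stack([np.ones(n), x2])
gamma_hat, *_ = np.linalg.lstsq(Z, x1, rcond=None)
aux_resid = x1 - Z @ gamma_hat
R2_1 = 1.0 - (aux_resid @ aux_resid) / SST_1
se_formula = np.sqrt(sigma2_hat / (SST_1 * (1.0 - R2_1)))

# --- Route 2: square root of the diagonal of sigma2_hat * (X'X)^{-1} ---
cov = sigma2_hat * np.linalg.inv(X.T @ X)
se_matrix = np.sqrt(cov[1, 1])              # entry for the x1 coefficient

print(np.isclose(se_formula, se_matrix))    # the two routes coincide
```

The formula also makes the variance-inflation intuition visible: as $R_j^2 \to 1$ (the predictor is nearly a linear combination of the others), the denominator shrinks and the standard error blows up.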