Can both model parameters in linear regression be called regression coefficients?

linear regression, regression, statistics

I was going through a review and I found that in the equation

$$
f(x_{i}) = w_{0} + w_{1}x_{i}
$$

both $w_{1}$ and $w_{0}$ are being called regression coefficients.

I have found references on the internet with examples both supporting and opposing this usage, such as:

Against

"The slope b of a line obtained using linear least squares fitting is called the regression coefficient."

" In linear regression, coefficients are the values that multiply the predictor values. Suppose you have the following regression equation: $y = 3X + 5$ In this equation, +3 is the coefficient, X is the predictor, and +5 is the constant."

References that call them regression coefficients

  • Link of a paper here

The regression coefficients ($a$ and $b$) are estimated by minimizing the sum of squares of deviations of $y_i$ from the regression line. For a point $x_i$, the corresponding value given by the regression equation will be: $$y_{i} = a + bx_{i}$$

As per my understanding, $a$ is the constant and $b$ is the coefficient here. Can someone please shed some light on this?

Best Answer

Both are valid and used.

Yes, $a$ or $w_0$ is often called the constant (or intercept). However, the constant is also commonly counted among the coefficients.

The constant can be viewed as the "slope" of the constant predictor $x_0=1$, so that $w_0x_0=w_0$. In this case, even the constant would be "a value that multiplies the predictor values".
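To make this concrete, here is a minimal sketch (the data and the values $w_0 = 5$, $w_1 = 3$ are made up for illustration) of the design-matrix view: once a column of ones is appended as the constant predictor $x_0 = 1$, the intercept and the slope both come out of ordinary least squares as entries of one and the same coefficient vector.

```python
import numpy as np

# Simulate data from f(x) = w0 + w1 * x with w0 = 5, w1 = 3 (hypothetical values).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 5 + 3 * x + rng.normal(scale=0.5, size=50)

# Treat the intercept as the coefficient of a constant predictor x0 = 1:
# the design matrix gets a column of ones, and least squares then returns
# w0 and w1 together in a single coefficient vector.
X = np.column_stack([np.ones_like(x), x])
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print("coefficient vector [w0, w1]:", w)
```

Software reflects both conventions as well: scikit-learn reports the intercept separately in `intercept_` with the slopes in `coef_`, whereas statsmodels' OLS (fit on a design matrix with an added constant) returns the constant as just another entry of `params`.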
