Hypothesis Testing – How to Test If the Slopes in the Linear Model Are Equal to a Fixed Value

hypothesis-testing, regression

Suppose we have a simple linear regression model $Z = aX + bY$ and would like to test the null hypothesis $H_0: a=b=\frac{1}{2}$ against the general alternative.

I think one can use the estimate $\hat{a}$ and its standard error $SE(\hat{a})$ to carry out a $Z$-test, i.e. check whether a confidence interval for $a$ covers $\frac{1}{2}$. Is this OK?
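For concreteness, here is a rough sketch of what I have in mind, in Python with statsmodels; the data are made up, and the no-intercept fit matches the model above:

```python
import numpy as np
import statsmodels.api as sm

# Made-up data standing in for the observed sample (x_i, y_i, z_i).
rng = np.random.default_rng(0)
n = 200
x = rng.normal(10, 2, n)
y = rng.normal(5, 1, n)
z = 0.5 * x + 0.5 * y + rng.normal(0, 1, n)

# Fit Z = a*X + b*Y without an intercept, as in the model above.
fit = sm.OLS(z, np.column_stack([x, y])).fit()
a_hat, se_a = fit.params[0], fit.bse[0]

# Z-type statistic for H0: a = 1/2, and a 95% confidence interval for a.
z_stat = (a_hat - 0.5) / se_a
ci = (a_hat - 1.96 * se_a, a_hat + 1.96 * se_a)
print(z_stat, ci)
```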

My other question is closely related to this one. Suppose we have a sample $\{(x_1,y_1,z_1),\ldots,(x_n,y_n,z_n)\}$ and compute the $\chi^2$ statistic

\begin{equation}
\sum_{i=1}^n \frac{(z_i-\frac{x_i+y_i}{2})^2}{\frac{x_i+y_i}{2}}.
\end{equation}
Can this statistic be used to test the same null hypothesis?

Best Answer

In linear regression the assumption is that $X$ and $Y$ are not random variables. Therefore, the model

$$Z = a X + b Y + \epsilon$$

is algebraically the same as

$$Z - \frac{1}{2} X - \frac{1}{2} Y = (a - \frac{1}{2})X + (b - \frac{1}{2})Y + \epsilon = \alpha X + \beta Y + \epsilon.$$

Here, $\alpha = a - \frac{1}{2}$ and $\beta = b - \frac{1}{2}$. The error term $\epsilon$ is unaffected. Fit this model, estimating the coefficients as $\hat{\alpha}$ and $\hat{\beta}$, and test the joint hypothesis $\alpha = \beta = 0$ in the usual way (for example, with an $F$-test of both restrictions at once).
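To make this concrete, here is a minimal sketch in Python with statsmodels; the data are hypothetical, and the joint $F$-test plays the role of the "usual way":

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data standing in for the observed sample (x_i, y_i, z_i).
rng = np.random.default_rng(1)
n = 200
x = rng.normal(10, 2, n)
y = rng.normal(5, 1, n)
z = 0.5 * x + 0.5 * y + rng.normal(0, 1, n)

# Transformed response: W = Z - X/2 - Y/2 = alpha*X + beta*Y + eps.
w = z - 0.5 * x - 0.5 * y
fit = sm.OLS(w, np.column_stack([x, y])).fit()

# Joint F-test of alpha = beta = 0, which is the original H0: a = b = 1/2.
print(fit.f_test(np.eye(2)))

# Equivalent check: test a = b = 1/2 directly in the untransformed model.
fit0 = sm.OLS(z, np.column_stack([x, y])).fit()
print(fit0.f_test((np.eye(2), [0.5, 0.5])))
```

Both $F$-tests give the same result, because the second model is just a reparameterization of the first.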


The statistic written at the end of the question is not a chi-squared statistic, despite its formal similarity to one. A chi-squared statistic involves counts, not data values, and must have expected values in its denominator, not covariates. It's possible for one or more of the denominators $\frac{x_i+y_i}{2}$ to be zero (or close to it), showing that something is seriously wrong with this formulation. If even that isn't convincing, consider that the units of measurement of $Z$, $X$, and $Y$ could be anything (such as drams, parsecs, and pecks), so that a linear combination like $z_i - (x_i+y_i)/2$ is (in general) meaningless. It doesn't test anything.
