Comparing coefficients of two variables: one is significant, the other is not significant

hypothesis-testing, regression-coefficients, stata

I'm trying to test whether the coefficient for one independent variable ($X_1$) is larger than the coefficient for another variable ($X_2$) in predicting the dependent variable ($Y$). For example, my hypothesis states: the effect of $X_1$ on $Y$ is larger than the effect of $X_2$ on $Y$. In my regression analysis, it turns out that $X_1$ is significant while $X_2$ is not.

In this case, can I conclude that the effect of $X_1$ on $Y$ is larger than that of $X_2$? When I use the test command in Stata, the two effects are not statistically different. Is the test still meaningful when one of the variables is not significant?
I look forward to your input.

Best Answer

You can conclude from your multiple regression that $Y$ relates more strongly or consistently to $X_1$ than to $X_2$ in your sample. Your slope coefficients are only parameter estimates, however, so if you want to draw the same conclusion about the population, the test for a difference between the coefficients is the appropriate one to interpret. And yes, that test is still meaningful when the test of either individual coefficient is insignificant.
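As a concrete sketch of how that comparison might be run in Stata (y, x1, and x2 are placeholder variable names standing in for your data), fit the regression and then test the coefficients against each other:

    * fit the multiple regression (y, x1, x2 are placeholder names)
    regress y x1 x2

    * Wald test of H0: the coefficients on x1 and x2 are equal
    test x1 = x2

    * estimate, standard error, and confidence interval for the difference
    lincom x1 - x2

The test command reports a two-sided p-value for equality of the two coefficients; lincom shows the sign and confidence interval of the difference, which is what a directional hypothesis about $X_1$ versus $X_2$ is really about.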

The default null hypothesis significance test for a regression coefficient compares it to zero, which is presumably not the value of your other parameter estimate $\hat\beta_{X_2}$. Thus if, e.g., $\hat\beta_{X_1} = 0.3,\ p < .05$, but $\hat\beta_{X_2} = 0.1,\ p > .05$, you might expect $\hat\beta_{X_1}$ to differ more significantly from zero than from $0.1$.
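To make that concrete, suppose (purely for illustration) that the standard error of $\hat\beta_{X_1}$ is $0.12$. Then
$$ t_{\text{vs } 0} = \frac{0.3 - 0}{0.12} = 2.5, \qquad t_{\text{vs } 0.1} = \frac{0.3 - 0.1}{0.12} \approx 1.67, $$
and the second statistic shrinks further once the uncertainty in $\hat\beta_{X_2}$ is also accounted for, as discussed next.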

Furthermore, the uncertainty in $\hat\beta_{X_2}$ should be taken into account when comparing the two coefficients, whereas there is no standard error attached to the null value of zero. To extend the previous example, even an estimate of $\hat\beta_{X_2} = -0.1$ would admit values of $\beta_{X_2} > 0$ within its confidence interval. Thus your test of the difference between the coefficients may have two reasons to be more conservative than your tests of the individual coefficients.
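Formally, the standard error used by the difference test reflects the sampling variability of both estimates (and their covariance), which is where that extra conservatism comes from:
$$ \operatorname{SE}\!\left(\hat\beta_{X_1} - \hat\beta_{X_2}\right) = \sqrt{\operatorname{Var}\!\left(\hat\beta_{X_1}\right) + \operatorname{Var}\!\left(\hat\beta_{X_2}\right) - 2\,\operatorname{Cov}\!\left(\hat\beta_{X_1}, \hat\beta_{X_2}\right)}. $$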