Regression Analysis – How to Compare Coefficients of Two Independent Variables for Statistical Difference

Tags: multiple-regression, regression, self-study, stata, statistical-significance

I have two independent variables that are dummy variables, along with other independent variables, and I run a linear probability model. I want to test whether the coefficients of the two dummy variables are statistically different from each other, but I do not know how to do that. Could someone help me with this?

Best Answer

I want to compare whether the coefficients of two dummy variables are statistically different from each other

I understand that you want to test the hypotheses $$H_0: A=B$$ $$H_1: A\not=B$$.

So we are performing inference on the quantity $A-B$ and evaluating whether it is sufficiently far from $0$ relative to its standard error. In the single-parameter case we would just compare, e.g., $A$ to its standard error; here the difference involves two (generally correlated) estimates, so its variance picks up a covariance term. The test statistic is $$z = \frac{A-B}{\sqrt{\text{var}(A)+\text{var}(B)-2\,\text{cov}(A,B)}},$$ where $\text{cov}(A,B)$ is the covariance of the estimates of $A$ and $B$, taken from the fitted model's coefficient covariance matrix. We then compare $z$ to a standard normal distribution, rejecting $H_0$ when $|z|$ is large.

Your last comment:

there are two nulls in mind, let's say they are A and B... one scenario for the null is A + B = 0; the second scenario is A*B = 0... Are these the right approach?

confuses me, because it suggests that your null hypothesis is not $H_0:A=B$, but instead $H_0:A=-B$ (the first scenario) or $H_0:A\cdot B=0$, i.e., at least one of $A$ and $B$ equals zero (the second scenario). While not typical, these certainly might make sense for your particular research topic.