Solved – Not-significant F but a significant coefficient in multiple linear regression

f-test, multiple-regression

I have a regression with two continuous predictors and one dichotomous predictor in Model 1 and two interactions of each of the continuous predictors with the dichotomous predictor in Model 2. The coefficient for one of the interaction terms is significant. However, the F test is not significant (neither for Model 1 nor for Model 2). Should I still interpret that significant interaction or should I just assume that neither the predictors, nor the interaction terms are useful in predicting the outcome variable?

Best Answer

It sounds like your F-tests are not the ones you want: the default F-tests returned by most software compare the fitted model to a null (intercept-only) model. What you want instead is an F-test comparing Model 2 to Model 1.
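As a minimal sketch of that nested-model comparison (with simulated data standing in for yours, and hypothetical variable names `x1`, `x2` for the continuous predictors and `d` for the dichotomous one), the partial F-statistic is built from the residual sums of squares of the two fitted models:

```python
import numpy as np

# Simulated stand-in data; replace with your own variables.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
d = rng.integers(0, 2, size=n).astype(float)
y = 1.0 + 0.5 * x1 - 0.3 * x2 + 0.2 * d + rng.normal(size=n)

def rss(X, y):
    """Residual sum of squares from an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

ones = np.ones(n)
X1 = np.column_stack([ones, x1, x2, d])        # Model 1
X2 = np.column_stack([X1, x1 * d, x2 * d])     # Model 2: adds the two interactions

rss1, rss2 = rss(X1, y), rss(X2, y)
q = X2.shape[1] - X1.shape[1]    # number of added terms (here 2)
df2 = n - X2.shape[1]            # residual degrees of freedom of the larger model
F = ((rss1 - rss2) / q) / (rss2 / df2)
print(f"partial F({q}, {df2}) = {F:.3f}")
```

Most regression software will do this for you directly (e.g. an ANOVA comparison of two fitted models); the point of the formula is that the test pools the contribution of *both* added interaction terms.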

If you do this, you could still find that the F-test comparing these two models is insignificant, while the t-test on one of the interactions is significant. It is important to keep in mind that these are different tests, answering slightly different questions. The t-test on the interaction term is a test of how strongly the data reject the hypothesis that the interaction term is zero, holding fixed all the other coefficients in the extended model, including the other added interaction term. The F-test is a test of how strongly the data reject the hypothesis that all the added terms are zero.

If the two terms you added in the extended model have one t-stat that is narrowly significant and another that is insignificant, the F-test can be insignificant. One way to think about why this can happen is that adding multiple terms to a model creates a multiple testing problem. The chances of spuriously finding a large coefficient on one of the interaction terms increases as you add more terms. The F-test takes this into account, while the individual t-tests for each term do not.
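A back-of-the-envelope illustration of how this happens: when the two added terms are (approximately) orthogonal to each other and to the existing predictors, the partial F-statistic is roughly the average of their squared t-statistics. The t-values below are made-up numbers for illustration, not from your data:

```python
# One narrowly significant t-stat plus one null t-stat can average out
# to an insignificant F (assuming roughly orthogonal added terms).
t_interaction1 = 2.10   # narrowly significant at the 5% level (|t| > 1.96)
t_interaction2 = 0.20   # clearly insignificant
F = (t_interaction1**2 + t_interaction2**2) / 2
print(F)  # 2.225 — below ~3.0, the 5% critical value of F(2, df) for large df
```

So one coefficient just clearing the t-test threshold is easily diluted once the F-test averages it with a coefficient near zero.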
