Solved – Why there are two different results

Tags: regression, spss

I found that my four independent variables each have a high positive association with my dependent variable. However, except for one independent variable, the p-values of all the other independent variables are not significant in my regression analysis. My hypothesis is that each of my independent variables separately has a positive relationship with my dependent variable, so I have four hypotheses. I was wondering if I should reject three of my hypotheses in this situation?
Sorry, I forgot to say: there is no collinearity problem with my independent variables. Thank you.

More information: please see the attached picture of my research model. After bivariate correlation analysis and partial correlation, all independent variables are positively associated with the dependent variable. A hierarchical regression was used with a control variable: I first put the control variable in the first block in SPSS, and then added the four other independent variables in the second block. The results are in the attached document. Three independent variables are not significant. Correlation table:
[correlation table image not shown]

Best Answer

The results of your model make perfect sense. Independent Variable 3 has the highest correlation with your dependent variable at 0.649. Additionally, Variable 3 has pretty high correlations with all the other independent variables. As a result, the majority of the information in your multiple regression model is derived from Variable 3.

Even though you do not have an explicit multicollinearity problem among your independent variables, they are correlated enough that the information they contribute to the model overlaps substantially. As a result, you have a winner-take-all situation, and the winner is Variable 3. Thus, it is the only one that is statistically significant.

If you removed Variable 3 from the model, it is likely that Variable 4 would become statistically significant. If you removed Variables 3 & 4, it is possible that Variable 2 would be statistically significant. If you kept only Variable 1, it is quite possible it would be statistically significant.

The above is quite likely because a sample of 119 is large enough that, if you tested the statistical significance of each independent variable's correlation with the dependent variable on its own, I expect all of those correlations would be statistically significant.
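To see why, recall that the t statistic for testing a single correlation r against zero with sample size n is t = r·sqrt(n − 2)/sqrt(1 − r²). A quick sketch in Python (the r = 0.3 value is illustrative, not taken from your table):

```python
import math

def corr_t_test(r: float, n: int):
    """t statistic and two-sided p-value for H0: correlation = 0.
    Uses the normal approximation to the t distribution, which is
    adequate for samples this large."""
    t = r * math.sqrt(n - 2) / math.sqrt(1 - r**2)
    p = math.erfc(abs(t) / math.sqrt(2))  # two-sided: 2 * P(Z > |t|)
    return t, p

# Even a modest r = 0.3 is clearly significant with n = 119:
t, p = corr_t_test(0.3, 119)
print(f"t = {t:.2f}, p = {p:.4f}")
```

With your correlations, which are considerably larger than 0.3, the individual tests would be significant by an even wider margin.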

I don't think there is anything weird about any of the above. A variable can have a reasonably strong correlation with a dependent variable; but if you find another independent variable with an even stronger correlation with the dependent variable, and you include this second one in your regression model, it will kick out the first one. This happens all the time.
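Here is a hypothetical simulation of that winner-take-all pattern (made-up data, not yours): four predictors all share a common factor, with x3 tracking it most closely, as in your correlation table. In the full model x3 dominates; drop x3 and x4 picks up the significance. I use n = 200 for a stable illustration; the same logic applies at n = 119.

```python
import math
import numpy as np

def ols_p_values(X, y):
    """OLS with an intercept; returns slope coefficients and two-sided
    p-values (normal approximation to the t distribution)."""
    n = len(y)
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    sigma2 = resid @ resid / (n - A.shape[1])   # residual variance
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(A.T @ A)))
    t = beta / se
    p = np.array([math.erfc(abs(ti) / math.sqrt(2)) for ti in t])
    return beta[1:], p[1:]                       # drop the intercept

rng = np.random.default_rng(0)
n = 200
f = rng.standard_normal(n)             # shared latent factor
x1 = f + 1.2 * rng.standard_normal(n)
x2 = f + 1.0 * rng.standard_normal(n)
x3 = f + 0.3 * rng.standard_normal(n)  # tracks the factor most closely
x4 = f + 0.7 * rng.standard_normal(n)
y  = f + 1.0 * rng.standard_normal(n)

X = np.column_stack([x1, x2, x3, x4])
_, p_full = ols_p_values(X, y)                        # all four predictors
_, p_no3 = ols_p_values(np.column_stack([x1, x2, x4]), y)  # x3 removed

print("p-values, full model (x1..x4):", np.round(p_full, 4))
print("p-values, x3 removed (x1,x2,x4):", np.round(p_no3, 4))
```

Every predictor correlates strongly with y on its own, yet in the full model the predictor closest to the common factor soaks up most of the shared information; once it is removed, the next-best proxy becomes significant.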
