Solved – decreasing trend in residual plot for linear regression

regression, residuals

I am fitting a linear model with one dependent variable and 5 independent variables. All variables are continuous. The residual plot (plot of residuals vs. predicted values) appears to have a decreasing trend. See the plot attached. Someone told me this means there is a non-linear relationship between the DV and the IVs. But I thought that only a curvilinear pattern in the plot of residuals vs. predicted values implies a non-linear relationship between the DV and the IVs.

Anyway, can someone tell me if the residual plot indicates any violation of the assumptions of linear regression (independence, linearity, normality, and homogeneous variance)? And, if it does, how can I fix the problems? Thanks.
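For context, here is a minimal sketch of how a residuals-vs-fitted plot and a formal check of the constant-variance assumption can be produced in Python with statsmodels. The data and the column names (y, x1 through x5) are made up for illustration and are not the actual dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan
import matplotlib.pyplot as plt

# Hypothetical data standing in for the real DV (y) and five IVs (x1..x5)
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(200, 6)),
                  columns=["y", "x1", "x2", "x3", "x4", "x5"])

X = sm.add_constant(df[["x1", "x2", "x3", "x4", "x5"]])
fit = sm.OLS(df["y"], X).fit()

# Residuals vs. fitted values: a trend suggests non-linearity or an omitted
# variable, a funnel shape suggests non-constant variance
plt.scatter(fit.fittedvalues, fit.resid)
plt.axhline(0, color="grey", linestyle="--")
plt.xlabel("Fitted values")
plt.ylabel("Residuals")
plt.show()

# Breusch-Pagan test for heteroscedasticity (small p-value = non-constant variance)
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(fit.resid, X)
print(f"Breusch-Pagan p-value: {lm_pvalue:.3f}")
```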

Best Answer

I count nine separate groupings of residuals, each denser in the center. Because the residuals do not look like draws from a normal distribution centred on zero but instead show a definite pattern, the linear regression on this data alone does not adequately capture the relationship between the independent and dependent variables; that is, something is certainly missing from the existing linear model. To find what is missing, I advise isolating the raw data that produce the uppermost and lowermost groupings, and then:

A) fit the model to only that subset to verify that the pattern in the residuals remains. Since this is a linear regression, it should.

B) analyse the differences between those two isolated groups of raw data (a rough sketch of both steps follows).
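As a hedged illustration of steps A and B, continuing from the hypothetical `fit` and `df` objects in the earlier sketch (the 10% cutoff used to define the "uppermost" and "lowermost" groups is an arbitrary choice, not something from the question):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

resid = fit.resid

# Define the uppermost and lowermost residual groups by an arbitrary 10% cutoff
upper_cut = np.quantile(resid, 0.90)
lower_cut = np.quantile(resid, 0.10)
upper_group = df[resid >= upper_cut]
lower_group = df[resid <= lower_cut]
subset = pd.concat([upper_group, lower_group])

# A) refit on the isolated subset and check whether the residual pattern persists
X_sub = sm.add_constant(subset[["x1", "x2", "x3", "x4", "x5"]])
fit_sub = sm.OLS(subset["y"], X_sub).fit()
print(fit_sub.summary())

# B) compare the raw data of the two groups, e.g. via their column means,
#    to see which predictors separate the top group from the bottom group
print(upper_group.mean())
print(lower_group.mean())
```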