Hey!
So, I'm a little confused about the reported performance of my algorithm, which uses the MATLAB Neural Network Toolbox.
After training/testing on my dataset, I get a good mean squared error (MSE) and a reasonably high R value on the regression plot (R ≈ 0.88).
However, when I look at the actual mapping between target and predicted values, it's not quite right. See this plot:
The diagonal dotted line is obviously the ideal output, and the black circles (with the black line of best fit) are my actual output. As you can see, all my outputs are negative and fall well off the diagonal. There does, however, seem to be a decent correlation between the target and actual output scores, hence the decent R value.
Am I just not mapping/scaling the output values correctly? Any tips or insight into this?
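To illustrate why this can happen, here is a minimal sketch with made-up numbers (not your actual data): R measures linear correlation, which is unchanged by scaling or shifting the outputs, so predictions stuck on the wrong scale (e.g. still normalized) can score a high R while the values themselves are far from the targets.

```python
import numpy as np

rng = np.random.default_rng(0)
targets = rng.uniform(0, 10, 200)

# Simulated network output: an affine transform of the targets (scaled,
# shifted negative) plus noise -- as if the outputs were never mapped
# back from a normalized range.
preds = 0.4 * targets - 6 + rng.normal(0, 0.4, 200)

r = np.corrcoef(targets, preds)[0, 1]    # high despite the offset
mse = np.mean((targets - preds) ** 2)    # large: values are wrong

# Undoing the affine mapping (least squares here, standing in for
# reversing whatever normalization was applied to the targets)
# restores agreement; R is unaffected by this rescaling.
a, b = np.polyfit(preds, targets, 1)
mse_fixed = np.mean((targets - (a * preds + b)) ** 2)
```

If the targets were normalized with `mapminmax` before training (the toolbox does this by default via the network's output processing), the fix on the MATLAB side is typically to apply `mapminmax('reverse', ...)` with the stored settings to the raw outputs, or to let the network object apply its own reverse mapping rather than reading the normalized values directly.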