MATLAB: Neural Network Regression Score

Tags: dataset scaling, Deep Learning Toolbox, machine learning, neural network

Hey!
So, I'm a little confused about the reported performance of my algorithm, which uses the MATLAB Neural Network Toolbox.
After training/testing with my dataset, I get a great mean squared error (MSE) performance value and a reasonably high R value on the regression plot (R ≈ 0.88).
However, when I look at the actual mapping from target to predicted values, it's not quite right. See this plot:
[plot: network outputs vs. targets, showing the diagonal ideal line and a line of best fit]
The diagonal dotted line is obviously the ideal output, and the black circles (and the black line showing the line of best fit) are my actual outputs. As you can see, all my outputs are negative and do not lie on the diagonal. However, there does seem to be a decent correlation between the target and actual output scores, hence the decent R value.
Am I just not mapping/scaling the output values correctly? Any tips or insight into this?

Best Answer

In general,
Linear regression model (t = target, y = network output)
y = m*t + b + e                % e = random error, uncorrelated with t
y0 = m*t0 + b                  % averages: y0 = avg(y), t0 = avg(t), avg(e) = 0
y - y0 = m*(t - t0) + e        % avg((t - t0)*e) = 0
var(y) = m^2*var(t) + var(e)
cov(y,t) = m*var(t)
Rsquare = cov(y,t)^2/(var(y)*var(t))
        = m^2*var(t)/(m^2*var(t) + var(e))
        ~ 1                    % var(e) << m^2*var(t)
So R is close to 1 whenever the random error is small compared with the spread of m*t, regardless of the values of m and b. A high R only says the outputs are strongly linearly related to the targets; it does not say that m = 1 and b = 0, which is what lying on the diagonal requires.
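
To see this numerically, here is a minimal synthetic sketch (base MATLAB only; the slope m = 0.5, bias b = -1, and noise level are made-up values, chosen only to mimic negative, off-diagonal outputs like yours):

rng(0)                            % reproducible example
t = rand(1, 200);                 % targets in [0, 1]
m = 0.5;  b = -1;                 % slope and bias far from the ideal m = 1, b = 0
e = 0.02*randn(1, 200);           % small error, var(e) << m^2*var(t)
y = m*t + b + e;                  % "outputs": all negative, off the diagonal

Rmat = corrcoef(t, y);
R = Rmat(1, 2)                    % ~0.99, despite y being nowhere near t
MSEdiag = mean((t - y).^2)        % large, because of the slope and bias mismatch

If the toolbox reports a small MSE while the regression plot looks like this, it is worth checking at what scale that MSE is computed and whether the outputs are being un-normalized the same way as the targets.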
Remember, this is just one way to view the fit
y = net(x);
obtained from
[net, tr, y, err] = train(net, x, t);   % err = t - y
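
If the linear relationship is real and only the scale and offset are wrong, one possible sketch of a post-hoc fix is an affine recalibration fitted from outputs back to targets (polyfit, polyval, and plotregression are standard functions; ycal is just an illustrative name, and this treats the symptom, not the cause):

p = polyfit(y, t, 1);             % fit t ~ p(1)*y + p(2)
ycal = polyval(p, y);             % recalibrated outputs, back on the diagonal
plotregression(t, y, 'Raw', t, ycal, 'Recalibrated')

The root cause is more likely in the output processing settings, e.g. whether net.outputs{end}.processFcns includes mapminmax and whether the targets were scaled consistently before training, so that is worth checking first.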
Hope this helps.
Thank you for formally accepting my answer
Greg