MATLAB: Problems with neural network training

Tags: MATLAB, neural network, Neural Networks Toolbox

I'm trying to train a neural network using nprtool and also manually, by calling the newpr and train functions. My samples are oriented as rows instead of the default columns. Using nprtool there is no problem, but when I call the automatically generated M-file, the output is:
??? Error using ==> network.train at 145
Targets are incorrectly sized for network.
Matrix must have 24 columns.
Error in ==> create_pr_net at 29
[net,tr] = train(net,inputs,targets);
My inputs are 140×24, and my targets are 140×3.
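For context, newpr and train treat each column as one sample, so a 140x24 input matrix is read as 24 samples of 140 elements each, which is why train then demands targets with 24 columns. A minimal sketch of transposing to the expected RxQ / SxQ orientation:
inputs  = inputs';    % 140x24 -> 24x140 (R = 24 features, Q = 140 samples)
targets = targets';   % 140x3  -> 3x140  (S = 3 classes, one 1 per column)
net = create_pr_net(inputs,targets);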
The code generated by MATLAB is:
function net = create_pr_net(inputs,targets)
%CREATE_PR_NET Creates and trains a pattern recognition neural network.
%
% NET = CREATE_PR_NET(INPUTS,TARGETS) takes these arguments:
% INPUTS - RxQ matrix of Q R-element input samples
% TARGETS - SxQ matrix of Q S-element associated target samples, where
% each column contains a single 1, with all other elements set to 0.
% and returns these results:
% NET - The trained neural network
%
% For example, to solve the Iris dataset problem with this function:
%
% load iris_dataset
% net = create_pr_net(irisInputs,irisTargets);
% irisOutputs = sim(net,irisInputs);
%
% To reproduce the results you obtained in NPRTOOL:
%
% net = create_pr_net(inputs,targets);

% Create Network
numHiddenNeurons = 2000;  % Adjust as desired
net = newpr(inputs,targets,numHiddenNeurons);
net.divideParam.trainRatio = 90/100; % Adjust as desired
net.divideParam.valRatio = 5/100; % Adjust as desired
net.divideParam.testRatio = 5/100; % Adjust as desired
% Train and Apply Network
[net,tr] = train(net,inputs,targets);
outputs = sim(net,inputs);
% Plot
plotperf(tr)
plotconfusion(targets,outputs)
If I transpose inputs and targets before calling the create_pr_net function (inputs = inputs'; targets = targets';), the training results are not the same as with nprtool (the performance is far worse).
I am using MATLAB R2010a.
Thanks.

Best Answer

A difference in performance can be caused by different random initial weights.
Be sure you have initialized the RNG to the same state before each run.
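On R2010a, which predates the rng function (introduced in R2011a), one way to do that is through RandStream; a minimal sketch, with an arbitrary seed of 0:
% Reset the default random stream so that the initial weights (and the
% random train/validation/test split done by dividerand) come out the
% same on every run. R2010a uses setDefaultStream; rng() is R2011a+.
stream = RandStream('mt19937ar','Seed',0);  % seed value 0 is arbitrary
RandStream.setDefaultStream(stream);

net = create_pr_net(inputs',targets');      % column-oriented samples, as above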
Hope this helps.
Thank you for formally accepting my answer.
Greg