You should scale and center the data. Zero-mean/unit-variance inputs with tansig hidden node activation units are recommended.
help prestd
doc prestd
Similarly for trastd and poststd.
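A minimal sketch of those calls (assuming input matrix p, target matrix t, and new/test inputs a, with one column per case):

% Standardize training inputs and targets to zero-mean/unit-variance
[pn,meanp,stdp,tn,meant,stdt] = prestd(p,t);
% Standardize new data with the TRAINING statistics
an = trastd(a,meanp,stdp);
% Convert normalized network outputs back to original target units
y = poststd(sim(net,an),meant,stdt);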
Initialize the uniform random number generator RAND so that
you can duplicate the results if necessary.
> net=newff(minmax(x),[3,12,1],{'logsig','logsig','purelin'},'trainlm');
Do not include the number of input nodes. That info is obtained
from minmax(x).
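For example, a two-layer net with 12 tansig hidden nodes and a purelin output (the sizes and transfer functions here are illustrative, not prescriptions; the input layer size is inferred from minmax(x)):

net = newff(minmax(x),[12 1],{'tansig','purelin'},'trainlm');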
>net.performFcn = 'mse';
Delete. It is a default value.
>net.trainParam.goal = 1e-4;
Use a better-informed choice that takes the data scaling into account.
>net.trainParam.min_grad = 1e-20;
>net.trainParam.epochs = 5000;
Where did you get these values?
Use as many defaults of newff as possible.
help newff
doc newff
>net=init(net);
Delete. NEWFF is self-initializing.
>net=train(net,x,y);
>z=sim(net,x);
Use the multiple output option.
>test=sim(net,a);
MSEtrn00 = mse(y-mean(y))
MSEgoal = MSEtrn00/100
state0 = 0
rand('state',state0)
net = newff(minmax(x),[12 3]);
net.trainParam.goal = MSEgoal;
[ net tr Y E] = train(net,x,y);
MSEtrn = tr.perf(end)
R2trn = 1-MSEtrn/MSEtrn00
MSEtst00 = mse(b-mean(b))
tst = sim(net,a);
MSEtst = mse(b-tst);
R2tst = 1-MSEtst/MSEtst00
-----SNIP
> I tried with 10-30 neurons but the result is not good.
> Can you tell where it's going wrong?
Implement a double loop over choice of H and creation of a
number of nets for each value of H.
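A sketch of that double loop, assuming a single output and reusing MSEgoal and MSEtrn00 from above (Hmin, Hmax, and Ntrials are illustrative values):

Hmin = 1; Hmax = 30; Ntrials = 10;
R2trn = zeros(Ntrials,Hmax);
for H = Hmin:Hmax
    for i = 1:Ntrials
        rand('state',i)              % new random initial weights each trial
        net = newff(minmax(x),[H 1]);
        net.trainParam.goal = MSEgoal;
        [net tr] = train(net,x,y);
        MSEtrn = tr.perf(end);
        R2trn(i,H) = 1 - MSEtrn/MSEtrn00;
    end
end

Then choose the smallest H whose best trial meets your R2 requirement.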
Hope this helps.
Greg