UGH.
1. That doesn't make any sense since the range of tansig is (-1,1).
2. The advice in help patternnet, doc patternnet, and type patternnet is conflicting.
3. Use {'tansig','logsig'} with mapstd or mapminmax (the default) input normalization.
4. Use 'trainscg' for unipolar binary targets {0,1}.
5. class = 1 + round(net(x)). This can be modified if you have unequal priors and/or
unequal misclassification costs.
6. Softmax can be used for more than 2 classes; MATLAB now has the
derivative for softmax. (A setup sketch for points 3-6 follows the list.)
7. Use a validation set with round(0.15*N) <= Nval = Ntst <= round(0.2*N).
8. Use MSEgoal = max( 0, 0.01*Ndof*MSE00a/Ntrneq ), where
a. MSE00a = mean(var(t'))
b. Ntrneq = round(0.7*prod(size(t)))
c. Ndof = Ntrneq - Nw
d. Nw = (I+1)*H + (H+1)*O
(A goal-setting sketch follows the list.)
9. Stopping on any misclassification rate cannot be done unless
a. either the training is broken up into a loop of a few epochs at a time,
with breaks to check the classification rate,
b. or patternnet is modified.
c. It's not worth the time (a) or the effort (b).
d. If you disagree, please send me a copy of your code.
10. 'dividetrain' is only useful if Ntrn >> Nw and the generalization error
is estimated using the DOF-adjusted value Ntrneq*MSE/Ndof.
Unfortunately, MATLAB does not allow Nval = 0 while Ntst > 0. The closest
fudge that I can think of is Nval = 1 (ratio = 1/N) with max_fail = inf.
(A sketch of this fudge follows the list.)
11. I find a good value for the number of hidden nodes, H, by using an
outer loop over j = Hmin:dH:Hmax and an inner loop over random weight
initializations i = 1:Ntrials, with Ntrials ~ 10 and
Hmax <= Hub = -1 + ceil( (Ntrneq-O)/(I+O+1) ).
(A search-loop sketch follows the list.)
12. Nw > Ntrneq and Ndof < 0 when H > Hub, which is why Hmax should not exceed Hub.
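
A minimal setup sketch for points 3-6, assuming inputs x (I x N), unipolar {0,1} targets t (1 x N), and a placeholder H = 10 (tune H via point 11):

H = 10;                                  % placeholder; choose via point 11
net = patternnet(H, 'trainscg');         % point 4: scaled conjugate gradient
net.layers{1}.transferFcn = 'tansig';    % point 3: bipolar hidden layer
net.layers{2}.transferFcn = 'logsig';    % point 3: (0,1) outputs match {0,1} targets
% mapminmax is the default input processing; to use mapstd instead:
% net.inputs{1}.processFcns = {'removeconstantrows','mapstd'};
[net, tr] = train(net, x, t);
class = 1 + round(net(x));               % point 5: labels 1 and 2
% Point 6: for more than 2 classes, use one-hot rows in t,
% 'softmax' as the output transfer function, and class = vec2ind(net(x)).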
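
A goal-setting sketch for point 8, assuming x (I x N), t (O x N), and H already chosen; it just transcribes 8a-8d:

[I, N]  = size(x);
[O, ~]  = size(t);
MSE00a  = mean(var(t'));                 % 8a: average target variance
Ntrneq  = round(0.7*prod(size(t)));      % 8b: training equations (0.70 split)
Nw      = (I+1)*H + (H+1)*O;             % 8d: number of weights
Ndof    = Ntrneq - Nw;                   % 8c: estimation degrees of freedom
MSEgoal = max(0, 0.01*Ndof*MSE00a/Ntrneq);
net.trainParam.goal = MSEgoal;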
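
A sketch of the point-10 fudge, assuming N data columns and a 0.15 test fraction (my choice, pick your own): one token validation point, and max_fail = inf so validation stopping never triggers:

N = size(x, 2);
net.divideFcn = 'dividerand';
net.divideParam.valRatio   = 1/N;        % Nval = 1
net.divideParam.testRatio  = 0.15;
net.divideParam.trainRatio = 1 - 1/N - 0.15;
net.trainParam.max_fail    = Inf;        % disable validation stopping (point 10)
[net, tr] = train(net, x, t);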
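
A search-loop sketch for point 11, assuming x (I x N), t (O x N), and the point-8 quantities; Hmin, dH, and Ntrials = 10 are user choices, and the net with the best validation performance is kept:

[I, N] = size(x);
[O, ~] = size(t);
Ntrneq = round(0.7*prod(size(t)));
Hub    = -1 + ceil((Ntrneq - O)/(I + O + 1));   % point 11 upper bound
Hmin = 1;  dH = 1;  Hmax = Hub;  Ntrials = 10;
rng(0)                                          % reproducible random inits
bestvperf = Inf;
for H = Hmin:dH:Hmax                            % outer loop over candidate H
    for i = 1:Ntrials                           % inner loop over random inits
        net = patternnet(H, 'trainscg');        % train() gives fresh random weights
        [net, tr] = train(net, x, t);
        if tr.best_vperf < bestvperf            % validation MSE at stopping epoch
            bestvperf = tr.best_vperf;
            bestnet = net;  bestH = H;
        end
    end
end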