[ I N ] = size(x)
[ O N ] = size(t)
Ntrn = N - 2*round(0.15*N)
Ntrneq = Ntrn*O
Nw = (I+1)*H+(H+1)*O = (I+O+1)*H + O
Nw <= Ntrneq <=> H <= Hub
Hub = floor( ( Ntrneq-O) / ( I + O + 1) )
h = Hmin:dH:Hmax
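The sizing arithmetic above can be sketched as follows (a Python stand-in for the MATLAB formulas; the example dimensions I, O, N are assumed for illustration, not taken from the post):

```python
# Sketch of the hidden-node upper-bound calculation described above.
import math

I, O, N = 4, 1, 200                 # assumed example dimensions
Ntrn   = N - 2 * round(0.15 * N)    # default 0.70/0.15/0.15 data division
Ntrneq = Ntrn * O                   # number of training equations
# Nw = (I+1)*H + (H+1)*O = (I+O+1)*H + O  unknown weights for H hidden nodes
Hub = math.floor((Ntrneq - O) / (I + O + 1))  # largest H with Nw <= Ntrneq
print(Ntrn, Ntrneq, Hub)            # 140 140 23
```

With these numbers, any H above 23 gives more unknown weights than training equations, so the candidate range Hmin:dH:Hmax should stay at or below Hub.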
For each candidate h, design multiple nets. To keep the task manageable, I usually search 10 h candidates at a time with Ntrials = 10 designs per candidate, for a total of 100 designs. Sometimes it is necessary to start with a wide search followed by one or more narrower searches.
Don't forget to choose a repeatable initial random number seed so that you can reproduce any individual design.
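The double loop over candidates and trials might look like this (a Python sketch of the loop structure only; train_and_score is a hypothetical placeholder for designing one net, e.g. fitnet/train in MATLAB, and returning its adjusted training R-square):

```python
# Sketch of the candidate/trial search loop with a repeatable seed.
import numpy as np

def train_and_score(h, rng):
    # Hypothetical placeholder: a real version would train a net with h
    # hidden nodes from random initial weights and return R2trna.
    return rng.random()

Hmin, dH, Hmax = 1, 2, 19           # 10 candidates: 1, 3, ..., 19
Ntrials = 10                        # designs per candidate
rng = np.random.default_rng(0)      # repeatable seed -> reproducible designs
results = {}
for h in range(Hmin, Hmax + 1, dH):
    results[h] = [train_and_score(h, rng) for _ in range(Ntrials)]
print(len(results))                 # 10 candidates x 10 trials = 100 designs
```

Because the generator is seeded once before the loop, rerunning the script reproduces every individual design, and any single net can be regenerated by replaying the trials up to it.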
Choose the smallest H that satisfies your training goal. For example: if the degree-of-freedom adjusted training Rsquare, R2trna, is greater than 0.995, then at least 99.5% of the training target variance is modeled by the net.
I have posted zillions of examples in the NEWSGROUP and ANSWERS. Search with one or more of
greg neural h = Hmin:dH:Hmax Ntrials R2trna
Hope this helps.
Thank you for formally accepting my answer
Greg