MATLAB: Trial-and-error approach to find the optimal number of hidden neurons

Deep Learning Toolbox, hidden nodes number, neural network, trial and error

I have split the data into three parts: 70% (training), 10% (validation) and 20% (testing). Using the trial-and-error approach, I found the smallest training MSE (0.53088525) with 15 hidden nodes, but looking at the validation MSE, the smallest value (0.27098756) was achieved with only one node! Does that make sense?
We started with 1 hidden node and added one at a time up to 20, with 10 trials per value.
Is 15 the optimal number of hidden neurons?
Thanks in advance

Best Answer

What are the sizes of the input [I N] and target [O N] matrices?
What is the upper bound Hub obtained from Ntrneq = Ntrn*O = 0.7*N*O >= Nw = (I+1)*H + (H+1)*O? (See the sketch after these questions.)
You designed 200 nets, i.e., 10 for each of 20 values of H << Hub? (Typically, I only look at about 10 values of H.)
Were the random initial weights and data divisions different for each net?
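For example, a minimal sketch of how Hub follows from that inequality (the names input and target are just placeholders for your matrices):

[I, N] = size(input);                  % I-by-N input matrix
[O, ~] = size(target);                 % O-by-N target matrix
Ntrn   = floor(0.7*N);                 % 70% training split
Ntrneq = Ntrn*O;                       % number of training equations
% Nw = (I+1)*H + (H+1)*O = H*(I+O+1) + O, so Ntrneq >= Nw gives
Hub = floor((Ntrneq - O)/(I + O + 1));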
If you use the degree-of-freedom adjustment for the training set performance
MSEtrna = SSEtrn/(Ntrneq-Nw)
MSEtrn00a = mean(var(target',0))
R2trna = 1 - MSEtrna/MSEtrn00a
you can plot the R-squared summary statistics, e.g., min, median, mean and max, vs. H for the trna, val and tst sets; i.e., four plots with 3 curves each.
I typically use the median and mean plots to determine the smallest acceptable value for H.
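A rough sketch of that loop (fitnet and train are Toolbox functions; Hmax, Ntrials and the R2 arrays are just illustrative names, and x, t are your input and target matrices):

[I, N] = size(x);  [O, ~] = size(t);
Hmax = 20;  Ntrials = 10;
MSE00a = mean(var(t',0));                      % adjusted reference MSE
MSE00  = mean(var(t',1));                      % unadjusted reference MSE
R2trna = zeros(Ntrials,Hmax);  R2val = R2trna;  R2tst = R2trna;
for h = 1:Hmax
    Nw = (I+1)*h + (h+1)*O;                    % number of weights
    for k = 1:Ntrials
        net = fitnet(h);                       % new random initial weights
        net.divideParam.trainRatio = 0.70;     % new random data division
        net.divideParam.valRatio   = 0.10;
        net.divideParam.testRatio  = 0.20;
        [net, tr] = train(net, x, t);
        e = t - net(x);
        Ntrneq  = numel(tr.trainInd)*O;        % training equations
        MSEtrna = sum(sum(e(:,tr.trainInd).^2))/(Ntrneq - Nw);
        R2trna(k,h) = 1 - MSEtrna/MSE00a;      % adjusted training R^2
        R2val(k,h)  = 1 - mean(mean(e(:,tr.valInd ).^2))/MSE00;
        R2tst(k,h)  = 1 - mean(mean(e(:,tr.testInd).^2))/MSE00;
    end
end
plot(1:Hmax, median(R2trna), 1:Hmax, median(R2val), 1:Hmax, median(R2tst))
legend('trna','val','tst'), xlabel('H'), ylabel('median R^2')

The min, mean and max plots are obtained the same way by replacing median.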
Of course, if N isn't large enough to ensure relatively stable trn/val/tst data division estimates, you might want to use Bayesian regularization via TRAINBR. I am not that familiar with regularization; however, it tends to make the results much less sensitive to the value of H. Then the question of an "optimal value" tends to become moot.
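For instance, the switch is just the training-function argument (a minimal sketch; x, t and H as above):

net = fitnet(H, 'trainbr');      % Bayesian-regularized training
[net, tr] = train(net, x, t);    % H still matters, but much less so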
Greg
PS: The normalized NMSE = MSE/mean(var(target',1)) is scale-independent and, therefore, easier to use.