MATLAB: Is it possible to get the same training MSE on the same training data set with different weights at the end of the training phase?

back propagation, local or global minima, neural network, nn toolbox, training mses, training nn

The MATLAB NN toolbox initializes its weights and biases randomly. Therefore, I trained the network 200 times on the same training data set, each time with different random initial weights and biases. I planned to take the minimum of the 200 training MSEs and use the corresponding weights and biases in the testing phase. However, I found that most of the runs end with the same training MSE even though the final weight values are different.
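For reference, here is a minimal sketch of the multi-restart procedure I am using (variable names x and t, and the choice of fitnet with the default 10 hidden nodes, are just placeholders for my actual setup):

    % Train the same data 200 times from different random initial weights,
    % keep the net with the lowest training MSE.
    rng('shuffle');
    numRuns  = 200;
    bestPerf = Inf;

    for k = 1:numRuns
        net = fitnet(10);               % 10 hidden nodes (toolbox default)
        net = configure(net, x, t);     % size the net for the data
        net = init(net);                % new random weights and biases
        [net, tr] = train(net, x, t);   % train on the same data set
        perf = tr.best_perf;            % training MSE at the stopping point
        if perf < bestPerf
            bestPerf = perf;
            bestNet  = net;             % remember the best-performing net
        end
    end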
Is that possible? Can training stop at a local or global minimum with different values for the weights and biases?
I would really appreciate it if anyone could clarify this issue.
Regards, Dara

Best Answer

If you have H hidden nodes, you can reorder them in H! ways.
If the hidden transfer function is odd (e.g., tansig), you can change the sign of a node's input weights and bias and counter that by changing the sign of its output weight.
Consequently, there are M = H! * 2^H different nets that have identical input/output performance.
The default value of H = 10 yields M = 3.7159e+09.
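As a rough illustration of this symmetry (assuming a trained fitnet/feedforwardnet called net with a tansig hidden layer and linear output, plus inputs x and targets t from your own data), you can permute the hidden nodes and flip signs and verify that the MSE does not change:

    % Number of functionally identical nets for the default H = 10:
    H = 10;
    M = factorial(H) * 2^H    % = 3.7159e+09

    % Build an equivalent net with permuted, sign-flipped hidden nodes.
    netEquiv = net;                        % copy of the trained net
    H = net.layers{1}.size;                % number of hidden nodes
    p = randperm(H);                       % random reordering of hidden nodes
    s = 2*(rand(H,1) > 0.5) - 1;           % random +/-1 sign flips (tansig is odd)

    netEquiv.IW{1,1} = diag(s) * net.IW{1,1}(p,:);   % permute/flip input weights
    netEquiv.b{1}    = s .* net.b{1}(p);             % permute/flip hidden biases
    netEquiv.LW{2,1} = net.LW{2,1}(:,p) * diag(s);   % compensate on output weights

    perform(net,      t, net(x))       % original training MSE
    perform(netEquiv, t, netEquiv(x))  % same MSE, different weight values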
Hope this helps.
Thank you for formally accepting my answer
Greg