Hi, I'm a new user of this community, so please excuse my English. I'm using nntool to develop a neural network that can predict solar irradiance, and I'm still learning how to use the tool.

My network is a feedforward backpropagation network with 2 input neurons, 1 output neuron, and 1 hidden layer containing a single neuron. I use the trainlm algorithm with the mse error function, and the transfer function is tansig for both the hidden and output layers. The training parameters are: epochs 100, goal 0, max_fail 5, mem_reduc 1, min_grad 1e-010, mu 0.001, mu_dec 0.1, mu_inc 10, mu_max 10000000000, show 25, time inf.

I initialized the weights and trained the network, but after 11 epochs training stopped with an error value of about 1e-012. So the results are fine, but I don't understand why training stops: the slope of the error function still looks good at the point where it stops. The same thing happens with an XOR network.

About the XOR network I have another question: after training, the output is [-1 1 1 -1], but it should be [0 1 1 0], even though the simulation itself works well. In this case I used a feedforward backpropagation network with trainlm and the same training parameters; the network was 2-2-1 with logsig as the transfer function for the hidden and output layers.

I hope you can explain this. Thanks.
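On the second question, an output of [-1 1 1 -1] is what you would expect from tansig rather than logsig, since the two transfer functions have different output ranges: tansig (equivalent to tanh) saturates at -1 and +1, while logsig (the logistic sigmoid) saturates at 0 and 1. A minimal Python sketch of this, where the pre-activation values are made up purely for illustration:

```python
import math

def tansig(n):
    # MATLAB's tansig is mathematically equivalent to tanh: range (-1, 1)
    return math.tanh(n)

def logsig(n):
    # MATLAB's logsig is the logistic sigmoid: range (0, 1)
    return 1.0 / (1.0 + math.exp(-n))

# Hypothetical saturated net inputs for the four XOR patterns
# (large magnitudes, as after successful training).
pre_activations = [-10.0, 10.0, 10.0, -10.0]

tansig_out = [round(tansig(n)) for n in pre_activations]
logsig_out = [round(logsig(n)) for n in pre_activations]

print(tansig_out)  # [-1, 1, 1, -1] -- the outputs observed in the question
print(logsig_out)  # [0, 1, 1, 0]   -- the intended XOR targets
```

So with tansig at the output layer the targets 0 can never be reached exactly; either the output layer should use logsig, or the targets should be rescaled to [-1 1 1 -1].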
MATLAB: Why is the number of epochs so low when using trainlm?
[neural network] [nntool] [trainlm] [xor] Deep Learning Toolbox