Solved – How to initialize the weights of a neural network so that the probabilities at the output layer sum to 1

neural networks

I have been randomly initializing the weights of my neuron layers. I did some calculations on paper and realized that, at least for the first few iterations, the outputs of the output layer might not sum to 1 when interpreted as probabilities. Is that acceptable, or should the weights be chosen from the very first iteration so that the probabilities add up to one? If so, how should the weights of the network be initialized?

Best Answer

You don't need to hack the initial weights. You may need to revisit the architecture of the neural network instead. For example, add a softmax layer as the final layer so that the outputs can be interpreted as class probabilities: softmax exponentiates each raw score and divides by the sum of the exponentials, so the outputs are non-negative and sum to 1 no matter how the weights were initialized.
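
As a minimal NumPy sketch of this idea (the layer sizes, scale, and variable names are illustrative, not a prescription), note how the raw outputs of a randomly initialized linear layer do not sum to 1, while the softmax of those same outputs always does:

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability before exponentiating
    z = z - np.max(z, axis=-1, keepdims=True)
    exp_z = np.exp(z)
    return exp_z / np.sum(exp_z, axis=-1, keepdims=True)

rng = np.random.default_rng(0)

# Randomly initialized weights for a single linear output layer:
# 4 input features, 3 output classes (shapes chosen arbitrarily)
W = rng.normal(scale=0.1, size=(4, 3))
b = np.zeros(3)

x = rng.normal(size=(1, 4))      # one input sample
logits = x @ W + b               # raw scores; these do NOT sum to 1
probs = softmax(logits)          # softmax output; sums to 1 by construction

print(logits.sum())              # some arbitrary value
print(probs.sum())               # 1.0 (up to floating point error)
```

In a framework like PyTorch or TensorFlow this is usually done by applying a softmax activation to the final layer (or, equivalently, by using a cross-entropy loss that applies softmax internally), so no special weight initialization is required to get valid probabilities.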