Solved – What happens when many neurons have the same weights

neural networks

Does it make any sense to have two (or more) neurons in a neural network with the same weights? Intuitively it seems not, since those neurons would behave identically.

Please consider both the input and hidden layers. What if the weights are equal to 0?

Best Answer

The weights are updated continually during training and can take many different values, so there is nothing wrong with two or more weights sharing the same value.

If a weight is equal to zero, it just means that the corresponding input has no impact on the neuron in the next layer: its contribution to the neuron's weighted sum vanishes, as the small example below shows.
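A minimal sketch of this point (the numbers are hypothetical, chosen only for illustration): a neuron's pre-activation is the dot product of its weights and inputs, so any input sitting behind a zero weight cannot affect the result no matter how much it changes.

```python
import numpy as np

# Hypothetical values just to illustrate: w[1] == 0, so x[1] has no influence.
w = np.array([0.7, 0.0, -1.2])
x = np.array([2.0, 5.0, 1.0])
print(w @ x)   # 0.2 -- the pre-activation

x[1] = 999.0   # perturb the input behind the zero weight
print(w @ x)   # still 0.2 -- that input never reaches the neuron
```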

One more thing worth knowing: you shouldn't initialize all your weights to the same value. If every neuron in a layer starts out identical, backpropagation gives them all identical gradients, so they stay identical after every update and the symmetry is never broken; the sketch below demonstrates this. See my previous answer here: stats.stackexchange.com/questions/45087/backpropagation/45092
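Here is a minimal sketch of the symmetry problem. The setup is assumed for illustration (a 2-input, 3-hidden-unit, 1-output network with a tanh hidden layer, squared-error loss, and learning rate 0.1, none of which come from the answer above): every weight starts at the same value, and after many gradient steps the hidden units' weight rows are still identical.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = np.full((3, 2), 0.5)   # hidden-layer weights: all three rows identical
W2 = np.full((1, 3), 0.5)   # output-layer weights: all entries identical

x = rng.normal(size=(2, 1))  # one random input
y = np.array([[1.0]])        # arbitrary target

for step in range(100):
    # forward pass: tanh hidden layer, linear output
    h = np.tanh(W1 @ x)
    y_hat = W2 @ h
    # backward pass for squared-error loss, by hand
    d_out = y_hat - y                   # dL/dy_hat
    dW2 = d_out @ h.T
    d_h = (W2.T @ d_out) * (1 - h**2)   # chain rule through tanh
    dW1 = d_h @ x.T
    W1 -= 0.1 * dW1
    W2 -= 0.1 * dW2

# Every hidden unit received the same gradient at every step, so the
# rows of W1 are still identical: the network effectively has one hidden unit.
print(W1)
```

Random (or otherwise asymmetric) initialization avoids this: once the rows of W1 differ, the hidden units receive different gradients and can learn different features.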
