Solved – Difference between dropout and neurons with 0 weights

dropout, neural networks, regularization

In the dropout method of regularization, we randomly delete half of the hidden neurons, leaving the input and output layers the same.

In a theoretical sense, wouldn't the same effect occur if we just randomly assign half of the hidden neurons a weight of 0, since this would effectively null out that neuron?

Best Answer

If you set all the outgoing weights of certain neurons to 0, then yes, the effect is the same. To clarify, though: dropout doesn't actually delete neurons, it just deactivates them temporarily, with a new random selection for each training example. And if you instead deactivate a fraction of the individual weights without regard to which neuron they belong to (so you might cut only some of a neuron's weights), you end up with DropConnect.
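
To make the equivalence concrete, here is a minimal NumPy sketch (the variable names such as `W_out` and `neuron_mask` are illustrative, not from any particular library). It shows that masking whole hidden units is the same as zeroing all of their outgoing weights, while masking individual weights independently corresponds to DropConnect. The usual 1/(1-p) rescaling of inverted dropout is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hidden layer: 4 inputs -> 6 hidden units -> 3 outputs.
x = rng.normal(size=(1, 4))
W_hidden = rng.normal(size=(4, 6))
W_out = rng.normal(size=(6, 3))
h = x @ W_hidden                        # hidden activations, shape (1, 6)

p = 0.5                                 # drop probability

# Dropout: one Bernoulli draw per hidden neuron, zeroing whole units.
neuron_mask = rng.binomial(1, 1 - p, size=(1, 6))
h_dropout = h * neuron_mask

# Equivalent view: zeroing all outgoing weights of the dropped neurons
# (entire rows of W_out) gives the same downstream result.
W_out_zeroed = W_out * neuron_mask.T
assert np.allclose(h_dropout @ W_out, h @ W_out_zeroed)

# DropConnect: one Bernoulli draw per individual weight instead of per neuron.
weight_mask = rng.binomial(1, 1 - p, size=W_out.shape)
out_dropconnect = h @ (W_out * weight_mask)
```

The key difference is the granularity of the mask: dropout draws one mask entry per neuron (so a dropped unit's entire row of outgoing weights is zeroed together), whereas DropConnect draws one entry per weight, which is why partially zeroing a neuron's weights is no longer plain dropout.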