Dropout – How to Adjust Dropout Value Across Neural Network Layers

conv-neural-network, dropout, hyperparameter, neural-networks

I'm confused about the dropout values people set.
Sometimes it's the same value everywhere, say 0.4.
Sometimes the values increase gradually from 0.2 to 0.5, for example after each max-pooling layer in a CNN.
Sometimes I see the numbers go up, then down, then up again.
Is there a rule for these kinds of decisions?

Best Answer

You mostly just have to figure this out through trial and error, guided by metrics on your validation data. But in, say, a classification CNN, the majority of the parameters are typically concentrated in the last few layers, so it makes sense to use more dropout there: the need for regularization is greatest where there are more parameters to overfit with.
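To make the "increase gradually" pattern from the question concrete, here is a minimal sketch, assuming TensorFlow/Keras and 32x32 RGB inputs (e.g. CIFAR-10). The specific rates 0.2/0.3/0.5 and layer sizes are illustrative starting points, not prescriptions:

```python
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(32, 32, 3)),

    # Early conv block: few parameters, low-level features -> light dropout
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Dropout(0.2),

    # Deeper conv block: more filters, more parameters -> ramp the rate up
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Dropout(0.3),

    # Dense head: most of the parameters live here -> heaviest dropout
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Treat those rates as a first guess and tune them against your validation metrics; the same trial-and-error caveat above applies to the schedule as much as to any single value.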
