Solved – Where to include Dropout in stacked autoencoder

autoencoders, deep-learning, dropout, keras

I'm using Keras to implement a stacked autoencoder, and I think it may be overfitting. I want to include dropout, and I keep reading about its use in autoencoders, but I cannot find any examples of dropout being practically implemented in a stacked autoencoder.

Where would the dropout layer(s) go: between every layer, or only after the input layer? Can anyone let me know, or point me to a resource implementing this?

Best Answer

We have tried adding it in a few different ways:

  1. Add dropout only after the input layer. This zeroes out some of the inputs.
  2. Add dropout after the input layer and after every encoder layer. This zeroes out some inputs and some encoded outputs. We did not add it to the decoder layers, since we did not want them to lose information while trying to reconstruct the input.
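As a rough illustration of the two placements above, here is a minimal Keras sketch (my own example, not the answerer's actual code; the layer sizes, dropout rate, and 784-dimensional input are assumptions for illustration):

```python
# Sketch: stacked autoencoder with Dropout placed per option 2
# (after the input and after each encoder layer).
# Sizes (784 -> 256 -> 64) and rate 0.2 are illustrative assumptions.
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Dense, Dropout

inputs = Input(shape=(784,))
x = Dropout(0.2)(inputs)                 # option 1 would stop after this layer
x = Dense(256, activation="relu")(x)     # encoder layer 1
x = Dropout(0.2)(x)                      # option 2: dropout after each encoder layer
x = Dense(64, activation="relu")(x)      # bottleneck
x = Dropout(0.2)(x)
# Decoder: no dropout here, so it keeps full information while reconstructing.
x = Dense(256, activation="relu")(x)
outputs = Dense(784, activation="sigmoid")(x)

autoencoder = Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")
```

For option 1, you would keep only the first `Dropout` after the input and drop the rest.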

However, we could not eliminate the overfitting completely. We also tried adding noise to the input data (denoising AE) and adding regularization to the encoding layers (sparse AE). But our hand-crafted features still performed better than the AE-created features.
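The two other variants mentioned can be sketched in Keras as follows (again my own hedged example, not the answerer's code; the noise level, regularization strength, and layer sizes are assumptions):

```python
# Sketch: denoising AE (noise on the inputs via GaussianNoise) combined
# with a sparse AE (L1 activity regularization on the encoding).
# All hyperparameters here are illustrative assumptions.
from tensorflow.keras import Input, Model, regularizers
from tensorflow.keras.layers import Dense, GaussianNoise

inputs = Input(shape=(784,))
noisy = GaussianNoise(0.1)(inputs)       # denoising: corrupt only the inputs
encoded = Dense(
    64,
    activation="relu",
    activity_regularizer=regularizers.l1(1e-5),  # sparsity penalty on codes
)(noisy)
decoded = Dense(784, activation="sigmoid")(encoded)

dae = Model(inputs, decoded)
dae.compile(optimizer="adam", loss="mse")
```

`GaussianNoise`, like `Dropout`, is only active at training time, so the corruption does not affect inference.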
