I am training a CNN classifier on a balanced binary dataset of 4,500 tweets, each labeled with its class. For training I use 300-dimensional GloVe embeddings and the Adam solver, and I run the model for 33 epochs. The sequence length I have chosen is 31.
I have applied 200 filters, and the network includes convolution2dLayer, batchNormalizationLayer, reluLayer, dropoutLayer, and max-pooling layers. The dropout probability I have chosen is 0.2, and the max-pooling layer has pool size [1 sequenceLength].
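Roughly, the layer array looks like the sketch below (a simplified illustration, not my exact code; the filter size [1 3] and variable names here are placeholders):

```matlab
sequenceLength = 31;   % tokens per tweet
embeddingDim   = 300;  % GloVe dimension
numFilters     = 200;

layers = [
    sequenceInputLayer(embeddingDim, 'Name', 'input')  % GloVe vectors fed in
    convolution2dLayer([1 3], numFilters, 'Name', 'conv')
    batchNormalizationLayer('Name', 'bn')
    reluLayer('Name', 'relu')
    dropoutLayer(0.2, 'Name', 'drop')
    maxPooling2dLayer([1 sequenceLength], 'Name', 'pool')  % pool over the sequence
    fullyConnectedLayer(2, 'Name', 'fc')  % binary classification
    softmaxLayer('Name', 'sm')
    classificationLayer('Name', 'out')];
```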
The training curve progresses smoothly until the final epochs, where it falls sharply. I have attached the training plot below:
Could you please explain why this sudden fall occurs, and how I could get rid of it?
Thanks,
Best Answer