Solved – What does it mean when, during neural network training, validation loss AND validation accuracy both drop after an epoch?

accuracy, conv-neural-network, loss-functions, neural-networks

I have a simple question to which I cannot find a straight answer. I am training a neural network to classify some medical images. I initially focused on validation accuracy after each epoch (to determine how the network was generalising), and then after that, test accuracy on an unseen dataset.

But I see that validation loss is also important – and sometimes my validation loss drops even though, for that epoch, the validation accuracy also goes down slightly. Ultimately test accuracy is going to be the gold standard, I guess, but that's no use for guiding training and adjusting parameters. Obviously if the validation loss starts to go up and validation accuracy starts to drop, it indicates overfitting. But what about when both validation loss and validation accuracy drop after an epoch?
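To make the scenario concrete, here is a toy numeric sketch (not the asker's actual model) showing how mean cross-entropy loss can fall while accuracy also falls: the model grows more confident on most samples while one borderline sample flips to the wrong side of the 0.5 threshold. The probabilities below are made up for illustration.

```python
import numpy as np

# Hypothetical predicted probabilities for the positive class on the
# same four validation samples, all of which have true label 1.
y_true = np.array([1, 1, 1, 1])
p_epoch_a = np.array([0.55, 0.55, 0.55, 0.55])  # all barely correct
p_epoch_b = np.array([0.95, 0.95, 0.95, 0.45])  # three confident, one flipped

def cross_entropy(p, y):
    # Binary cross-entropy: -[y*log(p) + (1-y)*log(1-p)], averaged.
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def accuracy(p, y):
    # Threshold probabilities at 0.5 to get hard class predictions.
    return np.mean((p >= 0.5) == y)

for name, p in [("epoch A", p_epoch_a), ("epoch B", p_epoch_b)]:
    print(f"{name}: loss={cross_entropy(p, y_true):.3f}, "
          f"accuracy={accuracy(p, y_true):.2f}")
# epoch A: loss=0.598, accuracy=1.00
# epoch B: loss=0.238, accuracy=0.75
```

Loss drops from 0.598 to 0.238 while accuracy drops from 1.00 to 0.75, because loss is computed from the continuous probabilities while accuracy only sees the thresholded predictions.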

Best Answer

Expanding on my comment: the main goal of your training algorithm is to decrease the loss. So as long as your validation loss is decreasing, the optimizer is doing its job and the model is generalizing well, at least as measured by that loss. But if validation accuracy is not improving, or is getting worse, consider taking another look at the loss function you are optimizing: the mismatch suggests that the loss function is not well suited to the task, rather than that the training algorithm or model is defective. Choose some other loss function that better reflects what you care about, and then select a model/algorithm that is suitable for that loss function.
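To catch this kind of divergence early, it helps to track both metrics every epoch rather than accuracy alone. A minimal sketch of a per-epoch evaluation helper, assuming PyTorch, an ordinary classifier `model`, and a `DataLoader` of `(inputs, labels)` batches (all hypothetical names):

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def evaluate(model, loader, device="cpu"):
    """Return (mean cross-entropy loss, accuracy) over a validation set."""
    model.eval()
    total_loss, correct, n = 0.0, 0, 0
    for inputs, labels in loader:
        inputs, labels = inputs.to(device), labels.to(device)
        logits = model(inputs)
        # Sum (not mean) per batch so the final average is exact
        # even when the last batch is smaller than the rest.
        total_loss += F.cross_entropy(logits, labels, reduction="sum").item()
        correct += (logits.argmax(dim=1) == labels).sum().item()
        n += labels.size(0)
    return total_loss / n, correct / n
```

Logging both numbers side by side each epoch makes it obvious when the loss you optimize and the accuracy you care about start to disagree.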
