I trained a neural network on DNA sequence data, and my training set has exactly the same number of examples in both classes. When I use a softmax function at the output, my accuracy stays at 47% and the loss for both training and validation remains around 7.6, regardless of how many batches and epochs I choose. But once I change the softmax to a sigmoid, the validation accuracy starts at 50% in the first epoch and exceeds 98% by the end. This is odd, because I think my network should achieve at best around 80% accuracy, since I know some of my data is mislabeled. Why is this happening?
Solved – Sigmoid vs Softmax Accuracy Difference
accuracy, keras, neural-networks, sigmoid-curve, softmax
Related Questions
- Solved – NNs: Multiple Sigmoid + Binary Cross Entropy giving better results than Softmax + Categorical Cross Entropy
- Solved – What does it mean when during neural network training validation loss AND validation accuracy drop after an epoch
- Solved – Neural Networks – Epochs with 10-fold Cross Validation – doing something wrong
- Solved – Why the accuracy of the neural network is falling when epoch increases
Best Answer
Using a sigmoid with a dummy-encoded output (a single binary column) and using a softmax with two one-hot encoded columns (one column equal to one, the other zero) are mathematically equivalent and should give the same results. You likely have a bug in your code.
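To see the equivalence numerically: a two-class softmax over logits (z0, z1) assigns class 1 the probability sigmoid(z1 - z0), so a one-output sigmoid model is just a reparameterization of a two-output softmax model. A quick sketch in plain NumPy (the logit values here are arbitrary, chosen only for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Shift by the max for numerical stability; does not change the result.
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Arbitrary example logits for the two classes.
z0, z1 = 0.3, 1.7

# Probability of class 1 under the two-output softmax formulation...
p_softmax = softmax(np.array([z0, z1]))[1]
# ...equals the probability from a single sigmoid on the logit difference.
p_sigmoid = sigmoid(z1 - z0)

print(np.isclose(p_softmax, p_sigmoid))  # True
```

So if the two setups behave very differently in training, the discrepancy is in the code (e.g. mismatched loss function, wrong label encoding for the chosen activation, or the wrong number of output units), not in the math.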