Neural Networks – Can Self-Supervised Pretraining Work with Only Labeled Data

conv-neural-network, neural-networks, self-supervised-learning

I am working on an image classification problem with only a few samples (10 images). As part of the challenge, we aren't allowed to use any external data or pretrained models. I was wondering whether the self-supervised learning techniques that are normally applied to unlabeled data would still work if I used them on only the 10 labeled images.
Best Answer
Self-supervised learning is effective because it lets researchers train on far more data than they would have the time or money to label. Pretraining on only the same 10 labeled images gives the network essentially no information it wouldn't already see during supervised training. If you can't expand your training dataset using self-supervised learning, you may as well stick with supervised learning.
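To make the comparison concrete, here is a minimal sketch (not from the answer itself) of what self-supervised pretraining on only the 10 labeled images would look like, using rotation prediction as an example pretext task in PyTorch. The image tensors, the 5-class label setup, and all hyperparameters below are hypothetical placeholders, not part of the original question.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# --- Hypothetical data: 10 RGB images (32x32) with class labels (5 classes) ---
images = torch.randn(10, 3, 32, 32)   # stand-in for the 10 real images
labels = torch.randint(0, 5, (10,))   # stand-in for their class labels

# --- Small CNN backbone shared by the pretext task and the classifier ---
class Backbone(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )

    def forward(self, x):
        return self.conv(x).flatten(1)  # (N, 32) feature vector

backbone = Backbone()
rotation_head = nn.Linear(32, 4)   # predicts rotation: 0/90/180/270 degrees
class_head = nn.Linear(32, 5)      # predicts the actual class

# --- Stage 1: self-supervised pretraining (rotation prediction, labels unused) ---
opt = torch.optim.Adam(
    list(backbone.parameters()) + list(rotation_head.parameters()), lr=1e-3
)
for epoch in range(50):
    # Make 4 rotated copies of every image; the rotation index is the "free" label.
    rotated = torch.cat([torch.rot90(images, k, dims=(2, 3)) for k in range(4)])
    rot_targets = torch.arange(4).repeat_interleave(len(images))
    loss = F.cross_entropy(rotation_head(backbone(rotated)), rot_targets)
    opt.zero_grad()
    loss.backward()
    opt.step()

# --- Stage 2: supervised fine-tuning on the same 10 images, now with labels ---
opt = torch.optim.Adam(
    list(backbone.parameters()) + list(class_head.parameters()), lr=1e-3
)
for epoch in range(50):
    loss = F.cross_entropy(class_head(backbone(images)), labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Note that both stages see exactly the same 10 images, which is the point of the answer: the pretext task adds little beyond what supervised training with the same augmentations would provide, and with so few images it is likely to overfit rather than learn transferable features.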