Solved – Can you brief the training procedure of SSD in TF Object Detection API

computer vision, deep learning, object detection, tensorflow

The paper explains how the classification and localization losses are computed, but it is not clear how the TF Object Detection API does this for the SSD model. Can anyone briefly explain?

Best Answer

First of all, it is necessary to understand the loss function.

1. Localization loss

$$L_{loc}(x, l, g) = \sum_{i \in Pos}^{N} \sum_{m \in \{cx, cy, w, h\}} x_{ij}^{k} \, \mathrm{smooth}_{L1}\!\left(l_i^{m} - \hat{g}_j^{m}\right)$$

(the smooth L1 loss between the predicted box l and the encoded ground-truth box ĝ, summed over the N positive, i.e. matched, default boxes)

2. Classification Loss

$$L_{conf}(x, c) = -\sum_{i \in Pos}^{N} x_{ij}^{p} \log\left(\hat{c}_i^{p}\right) - \sum_{i \in Neg} \log\left(\hat{c}_i^{0}\right), \qquad \hat{c}_i^{p} = \frac{\exp(c_i^{p})}{\sum_{p} \exp(c_i^{p})}$$

(the softmax loss over class confidences c, taken over the positive matches and the selected negatives)

You can read the SSD paper in order to explore the loss in more detail.

The important thing to understand from these functions is that only the positive examples (default boxes matched to an object) contribute to the localization loss, while a selected set of positive and negative examples contributes to the classification loss.

Here x_ij indicates whether the i-th default box is matched to the j-th ground-truth box; the matching is based on the similarity (overlap) between the default box and the ground truth.

We can call this the localization weight: it is 1 or 0, determined by the similarity matching.
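
To make this concrete, here is a minimal TensorFlow sketch of the per-anchor SSD losses for one image. All of the names (per_anchor_ssd_losses, cls_logits, loc_preds, match_mask and so on) are illustrative, not the API's own symbols; the real implementation also handles batching, box encoding and the matching itself.

```python
import tensorflow as tf

def per_anchor_ssd_losses(cls_logits, loc_preds, cls_targets, loc_targets, match_mask):
    """Per-anchor SSD losses for one image (a sketch, not the API's code).

    cls_logits:  [num_anchors, num_classes] raw class scores (class 0 = background)
    loc_preds:   [num_anchors, 4] predicted box encodings
    cls_targets: [num_anchors] int class labels, 0 for unmatched (negative) anchors
    loc_targets: [num_anchors, 4] encoded ground-truth boxes for matched anchors
    match_mask:  [num_anchors] float, 1.0 where x_ij = 1 (positive), 0.0 otherwise
    """
    # Localization loss: smooth L1 (Huber), masked so only positives contribute.
    huber = tf.keras.losses.Huber(delta=1.0, reduction=tf.keras.losses.Reduction.NONE)
    loc_loss = huber(loc_targets, loc_preds) * match_mask          # [num_anchors]

    # Classification loss: softmax cross-entropy for every anchor; hard example
    # mining (described below) later decides which negatives actually contribute.
    num_classes = tf.shape(cls_logits)[-1]
    cls_loss = tf.nn.softmax_cross_entropy_with_logits(
        labels=tf.one_hot(cls_targets, num_classes), logits=cls_logits)

    return cls_loss, loc_loss
```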

The TensorFlow Object Detection API does the same, but it uses a training technique called online hard example mining.

You can read more about it in the hard example mining script of the Object Detection API.

Here I will point out what actually happens (a rough sketch of the selection logic follows the list):

  1. First we calculate the classification and localization losses for all the default boxes.
  2. Then we extract the boxes with the highest losses (the hard examples). In the API, only the classification loss is used to pick the hardest examples, because the localization loss is zero for negatives, as in the SSD loss function.

  3. Then we select a number of positives and negatives from those hard examples (this is configurable: if the number of positives is low, we can increase the number of hard examples kept). There will be lots of negative examples.

  4. When selecting the positive and negative examples from the list, a certain ratio is kept. Normally the number of negatives is taken to be 3 times the number of positives.

  5. Then, for the selected indices, we calculate the classification and localization losses and back-propagate them.
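
Putting the list together, a rough sketch of the selection logic could look like the following. The helper select_hard_examples and its arguments are made up for illustration; the API's own hard example miner is more elaborate, but the idea (rank by classification loss, keep roughly three negatives per positive) is the same.

```python
import tensorflow as tf

def select_hard_examples(cls_loss, match_mask, neg_pos_ratio=3.0):
    """Pick all positives plus the hardest negatives at roughly a 3:1 ratio.

    cls_loss:   [num_anchors] per-anchor classification loss (used for ranking)
    match_mask: [num_anchors] 1.0 for positive anchors, 0.0 for negatives
    Returns a [num_anchors] float mask over the selected anchors.
    """
    # Steps 1-2: rank by classification loss only; the localization loss is
    # zero for negatives, so it cannot be used to rank them.
    neg_loss = cls_loss * (1.0 - match_mask)          # hide positives from the ranking

    # Step 4: keep about three negatives per positive.
    num_pos = tf.reduce_sum(match_mask)
    num_neg = tf.cast(neg_pos_ratio * tf.maximum(num_pos, 1.0), tf.int32)

    top_vals, top_neg_idx = tf.nn.top_k(neg_loss, k=num_neg)
    neg_mask = tf.scatter_nd(tf.expand_dims(top_neg_idx, 1),
                             tf.ones_like(top_vals),
                             tf.shape(cls_loss))

    # Step 3: the selected set is all positives plus the hardest negatives.
    return tf.clip_by_value(match_mask + neg_mask, 0.0, 1.0)


# Step 5: only the selected anchors contribute to the loss that is back-propagated.
# cls_loss, loc_loss = per_anchor_ssd_losses(...)
# mask = select_hard_examples(cls_loss, match_mask)
# total = tf.reduce_sum((cls_loss + loc_loss) * mask) / tf.maximum(tf.reduce_sum(match_mask), 1.0)
```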