Classification – Rescaling Neural Network Sigmoid Output for Binary Classification Probability

classification, neural-networks, precision-recall

I have set up a neural network with a single sigmoid output, which I understand is used by default as a binary classifier: values over 0.5 are assigned to class 1, otherwise class 0. After looking at the training results, my task would get a better precision/recall balance if I set the classification threshold at a lower number, say 0.25.

Is there a proper way to rescale the output around this new threshold so that it still reads as a class probability? That is, values close to 0.25 should map to roughly a 50% probability of belonging to class 1.

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
# ... hidden layers (sizes omitted in the original) ...
model.add(Dense(1, activation='sigmoid'))  # single sigmoid output
# precision, recall and f1 are custom metric functions (not shown here)
model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy', precision, recall, f1])
hist = model.fit(X_train.values, y_train.values, epochs=50, batch_size=64,
                 verbose=1, validation_data=(X_val.values, y_val.values),
                 callbacks=callbacks_list, shuffle=True)

model.predict(X_test) # ... want to use 0.25 as the cutoff threshold
# but also want the probability of belonging to class 1

Best Answer

model.predict will return an array with one row per input, each containing the probability that the input belongs to class 1.

If you print it, it should look like this:

[[ 0.7310586 ]
 [ 0.26896983]]

You just need to loop through those values.

predictions = model.predict(X_test)
for i, predicted in enumerate(predictions):
    if predicted[0] > 0.25:
        print("bigger than 0.25")
        # assign example i to class 1
    else:
        print("smaller than 0.25")
        # assign example i to class 0
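As for the rescaling part of the question: one monotone way to do it is to shift the outputs in logit space so that the chosen threshold lands at 0.5. This is only a sketch of that idea, not a proper calibration method (for that, look at Platt scaling or isotonic regression):

```python
import numpy as np

def rescale(p, threshold=0.25):
    """Shift sigmoid outputs in logit space so p == threshold maps to 0.5."""
    logit = np.log(p / (1 - p)) - np.log(threshold / (1 - threshold))
    return 1.0 / (1.0 + np.exp(-logit))

print(rescale(0.25))  # 0.5
```

The mapping keeps 0 and 1 fixed, preserves the ordering of the predictions, and sends exactly the threshold to 0.5, so `rescale(p) > 0.5` is equivalent to `p > 0.25`.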

EDIT: It might be worth playing with the class weights. If you weight class 1 three times more heavily during training, you might get something close to what you want in a more elegant way.

Here is an example.
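A minimal sketch of how Keras's `class_weight` argument works (the 3x weight is illustrative, not tuned; the commented line shows how it would plug into the `fit` call from the question):

```python
import numpy as np

# Illustrative weights: each positive example counts 3x in the loss.
class_weight = {0: 1.0, 1: 3.0}

# Keras applies these per sample during training, equivalent to:
y = np.array([0, 1, 1, 0])
per_sample = np.array([class_weight[int(label)] for label in y])
print(per_sample)  # [1. 3. 3. 1.]

# model.fit(X_train.values, y_train.values, epochs=50, batch_size=64,
#           class_weight=class_weight)
```

Upweighting the positive class pushes the raw sigmoid outputs higher for positives, which can move the natural 0.5 threshold toward the behavior you were trying to get at 0.25.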
