Solved – Area under Precision-Recall Curve (AUC of PR-curve) and Average Precision (AP)

auc, average-precision, precision-recall, scikit-learn

Is Average Precision (AP) the same as the Area under the Precision-Recall Curve (AUC of the PR curve)?

EDIT:

Here is a comment about the difference between the PR AUC and AP:

The AUC is obtained by trapezoidal interpolation of the precision. An
alternative and usually almost equivalent metric is the Average
Precision (AP), returned as info.ap. This is the average of the
precision obtained every time a new positive sample is recalled. It is
the same as the AUC if precision is interpolated by constant segments
and is the definition used by TREC most often.

http://www.vlfeat.org/overview/plots-rank.html
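
To make the quoted distinction concrete, here is a minimal sketch (the precision/recall points below are made up for illustration, not taken from any real classifier) comparing the trapezoidal area with the constant-segment average described above:

import numpy as np

# A tiny, made-up precision-recall curve (recall increasing).
recall    = np.array([0.0, 0.25, 0.50, 0.75, 1.00])
precision = np.array([1.0, 0.80, 0.70, 0.60, 0.55])

# Trapezoidal interpolation: linear segments between consecutive points.
auc_trapezoid = np.trapz(precision, recall)

# Constant-segment ("step") interpolation: each recall increment is
# weighted by the precision reached at that recall level.
ap_step = np.sum(np.diff(recall) * precision[1:])

print("trapezoidal AUC: %.4f" % auc_trapezoid)  # 0.7188
print("step-wise AP:    %.4f" % ap_step)        # 0.6625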

Moreover, the auc and average_precision_score results are not the same in scikit-learn. This is strange, because the documentation says:

Compute average precision (AP) from prediction scores. This score corresponds to the area under the precision-recall curve.

Here is the code:

# Compute the precision-recall curve and its trapezoidal area
from sklearn.metrics import precision_recall_curve, auc, average_precision_score

precision, recall, thresholds = precision_recall_curve(y_test, clf.predict_proba(X_test)[:, 1])
area = auc(recall, precision)
print("Area Under PR Curve (AP): %0.2f" % area)  # should be the same as AP?

print('AP', average_precision_score(y_test, y_pred, average='weighted'))
print('AP', average_precision_score(y_test, y_pred, average='macro'))
print('AP', average_precision_score(y_test, y_pred, average='micro'))
print('AP', average_precision_score(y_test, y_pred, average='samples'))

For my classifier I get something like:

Area Under PR Curve(AP): 0.65
AP 0.676101781304
AP 0.676101781304
AP 0.676101781304
AP 0.676101781304
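
One possible source of the remaining gap: in the snippet above, average_precision_score is given y_pred, which looks like hard class labels, whereas the PR curve is built from predict_proba scores, and average_precision_score expects prediction scores. Below is a self-contained sketch (the synthetic dataset and LogisticRegression are assumptions for illustration, not the question's actual setup) where both metrics are computed from the same score vector, so only the interpolation scheme differs:

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_recall_curve, auc, average_precision_score

# Synthetic, imbalanced binary problem (purely illustrative).
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
y_scores = clf.predict_proba(X_test)[:, 1]  # scores, not hard labels

precision, recall, _ = precision_recall_curve(y_test, y_scores)
print("trapezoidal PR AUC: %.4f" % auc(recall, precision))
print("average precision:  %.4f" % average_precision_score(y_test, y_scores))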

Best Answer

The short answer is: yes. Average Precision (AP) is a single number used to summarise a Precision-Recall curve:

$\text{AP} = \int_0^1 p(r)\,dr$

You can approximate the integral (area under the curve) with:

$\text{AP} \approx \sum_{k=1}^{N} P(k)\,\Delta r(k)$
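
For illustration, here is a small sketch of that finite sum on made-up labels and scores; the step-wise sum over recall increments reproduces scikit-learn's average_precision_score, which uses this uninterpolated definition rather than the trapezoidal rule:

import numpy as np
from sklearn.metrics import precision_recall_curve, average_precision_score

# Tiny made-up labels and scores, just to illustrate the formula.
y_true  = np.array([0, 0, 1, 1, 0, 1])
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.7])

precision, recall, _ = precision_recall_curve(y_true, y_score)
# recall is returned in decreasing order, so the increments Δr(k) are -diff
ap_manual = np.sum(-np.diff(recall) * precision[:-1])

print("manual step sum: %.6f" % ap_manual)
print("sklearn AP:      %.6f" % average_precision_score(y_true, y_score))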

Please take a look at this link for a good explanation.