Classification – Bayes Optimal Classifier vs Likelihood Ratio Comparison

Tags: bayes-optimal-classifier, classification, likelihood-ratio

I am getting slightly confused by all the probabilistic classifiers.

  1. The Bayes optimal classifier is given as $ \max_C \, p(x \mid C)\, p(C) $, and if all classes have an equal prior it reduces to $ \max_C \, p(x \mid C) $

  2. The likelihood ratio is given as $ \frac{p(x \mid C_1)}{p(x \mid C_2)} $

If I only have 2 classes with equal priors, what is the difference between the Bayes optimal classifier and the likelihood ratio? Won't they both return the same class as the output?

Best Answer

They are not the same, but in your case they can be used for the same purpose.

The Bayes optimal classifier is

$$ \DeclareMathOperator*{\argmax}{arg\,max} \argmax_{c \in C} p(c|X) $$

i.e., among all hypotheses, take the $c$ that maximizes the posterior probability. By Bayes' theorem,

$$ \underbrace{p(c|X)}_{\text{posterior}} \propto \underbrace{p(X|c)}_{\text{likelihood}} \underbrace{p(c)}_{\text{prior}} $$

but since the prior is uniform (all $c$ are equally likely, so $p(c) \propto 1$), this reduces to the likelihood function

$$ p(c|X) \propto p(X|c) $$

The difference between maximizing the likelihood function and comparing likelihood ratios is that a likelihood ratio compares only two likelihoods, while maximizing the likelihood can consider any number of hypotheses. So if you have only two hypotheses, they do essentially the same thing. With many classes, however, comparing every pair of classes one ratio at a time would be a very inefficient way to go.
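To make the equivalence in the two-class case concrete, here is a small sketch. The Gaussian class-conditional models, class names, and parameter values are hypothetical, chosen only to illustrate that with equal priors the argmax rule and the ratio-vs-1 rule pick the same class:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Likelihood p(x | C) under a univariate Gaussian class model."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_optimal(x, models, priors):
    """argmax over classes c of p(x | c) * p(c)."""
    return max(models, key=lambda c: gaussian_pdf(x, *models[c]) * priors[c])

def likelihood_ratio_decision(x, models, c1, c2):
    """Pick c1 if p(x | c1) / p(x | c2) > 1, else c2 (equal priors assumed)."""
    ratio = gaussian_pdf(x, *models[c1]) / gaussian_pdf(x, *models[c2])
    return c1 if ratio > 1 else c2

# Hypothetical (mean, std) per class, with equal priors.
models = {"C1": (0.0, 1.0), "C2": (2.0, 1.0)}
priors = {"C1": 0.5, "C2": 0.5}

x = 0.4
print(bayes_optimal(x, models, priors))                  # prints "C1"
print(likelihood_ratio_decision(x, models, "C1", "C2"))  # prints "C1"
```

With more than two classes, `bayes_optimal` still works unchanged, whereas the pairwise ratio rule would need a comparison for every pair, which is the inefficiency mentioned above.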

Notice that the likelihood ratio also serves a purpose other than finding which of the two models has the greater likelihood: it can be used for hypothesis testing, since it tells you how much more (or less) likely one model makes the data compared to the other. Moreover, you can do the same when comparing posterior distributions by using the Bayes factor in similar fashion.
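As a small numeric illustration of the ratio as a measure of evidence (the Gaussian models and all numbers here are made up for the example), the magnitude of the ratio, not just whether it exceeds 1, is informative; and with unequal priors you can convert it to posterior odds via Bayes' rule in odds form:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Likelihood p(x | C) under a univariate Gaussian class model."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

x = 0.4
lik1 = gaussian_pdf(x, 0.0, 1.0)  # p(x | C1), hypothetical model
lik2 = gaussian_pdf(x, 2.0, 1.0)  # p(x | C2), hypothetical model

ratio = lik1 / lik2               # how many times more likely C1 makes x
print(round(ratio, 2))            # prints 3.32

# With unequal (hypothetical) priors, the decision uses posterior odds:
prior_odds = 0.3 / 0.7            # p(C1) / p(C2)
posterior_odds = ratio * prior_odds
print(posterior_odds > 1)         # C1 wins only if the ratio outweighs the prior
```

For simple point hypotheses like these, the Bayes factor coincides with the likelihood ratio, which is why the two comparisons look so similar.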
