[Math] Conditional probability, Bayes’ rule and chain rule

bayes-theorem, conditional-probability, probability

I'm reading the paper *A Rational Account of the Perceptual Magnet Effect* and I'm puzzled by equation (3) on page 2:

$$p(T|S,c) \propto p(S|T)p(T|c)$$

where $T$, $S$ and $c$ are random variables. Apparently it should be trivial, but I tried to play with conditional probability, Bayes' rule, and the chain rule without getting anything like this. Am I missing something very obvious?

Best Answer

It doesn't seem to be written out explicitly, but they appear to assume that $S$ and $c$ are conditionally independent given $T$, so that $p(S|T, c) = p(S|T)$. Then
$$
p(T|S, c) = \frac{p(T, S, c)}{p(S, c)} = \frac{p(S|T, c)\,p(T, c)}{p(S, c)} = \frac{p(S|T, c)\,p(T|c)\,p(c)}{p(S, c)} \propto p(S|T, c)\,p(T|c) = p(S|T)\,p(T|c).
$$
If $S$ and $c$ are not conditionally independent given $T$, the last equality does not hold.
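
As a quick sanity check of that derivation (not from the paper; the variable sizes, seed, and observed values below are arbitrary choices), here is a small sketch that builds a discrete joint distribution in which $S$ and $c$ are independent given $T$, computes $p(T|S,c)$ directly from the joint, and compares it to the normalized product $p(S|T)\,p(T|c)$:

```python
# Numerical check of p(T|S,c) ∝ p(S|T) p(T|c) under S ⟂ c | T.
# All sizes and the observed pair (s, c) are arbitrary for illustration.
import numpy as np

rng = np.random.default_rng(0)
nT, nS, nC = 3, 4, 2

# Build the joint so that p(S, c | T) = p(S | T) p(c | T),
# i.e. S and c are conditionally independent given T.
p_T = rng.dirichlet(np.ones(nT))                    # p(T)
p_S_given_T = rng.dirichlet(np.ones(nS), size=nT)   # row t: p(S | T=t)
p_c_given_T = rng.dirichlet(np.ones(nC), size=nT)   # row t: p(c | T=t)

# joint[t, s, c] = p(T=t) p(S=s | T=t) p(c=c | T=t)
joint = p_T[:, None, None] * p_S_given_T[:, :, None] * p_c_given_T[:, None, :]

# Pick an arbitrary observed pair (s, c).
s, c = 1, 0

# Direct computation: p(T | S=s, c=c) = p(T, S=s, c=c) / p(S=s, c=c)
posterior_direct = joint[:, s, c] / joint[:, s, c].sum()

# Claimed proportionality: p(T | S=s, c=c) ∝ p(S=s | T) p(T | c=c)
p_T_given_c = joint[:, :, c].sum(axis=1) / joint[:, :, c].sum()
unnormalised = p_S_given_T[:, s] * p_T_given_c
posterior_claimed = unnormalised / unnormalised.sum()

print(np.allclose(posterior_direct, posterior_claimed))  # True
```

If you instead build the joint without the conditional-independence structure (e.g. sample `joint` as an arbitrary normalized array), the two posteriors will generally differ, which is the point of the caveat above.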