Solved – Can the mutual information of a “cell” be negative

information theory · mutual information · probability

Please forgive me if this is not the right Stack Exchange (and for inventing terms).

For discrete random variables X and Y, the mutual information of X and Y can be defined as follows:
$I(X;Y) = \sum_{y \in Y} \sum_{x \in X} p(x,y) \log \left( \frac{p(x,y)}{p_1(x)\,p_2(y)} \right)$

I will define the mutual information of a "cell" $x_0$ to be:
$CI(x_0,Y) = \sum_{y \in Y} p(x_0,y) \log \left( \frac{p(x_0,y)}{p_1(x_0)\,p_2(y)} \right)$

I'm not sure if this quantity goes by another name. Essentially I'm restricting focus to a single state of variable X (and then the full MI can be calculated by summing all the cell MIs).

My question: is it guaranteed that $CI(x_0,Y) \ge 0$? We know $I(X;Y)\ge0$ and we know that the pointwise mutual information can be negative. I feel like CI should be nonnegative and that I might be missing some obvious proof.
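For concreteness, here is a small numerical sketch (the 2×3 joint table is arbitrary and the variable names are mine) that tabulates the pointwise MI, the per-cell quantity $CI(x_0,Y)$, and the full $I(X;Y)$:

```python
import numpy as np

# Arbitrary joint distribution p(x, y); rows index x, columns index y.
P = np.array([[0.30, 0.10, 0.05],
              [0.05, 0.20, 0.30]])

px = P.sum(axis=1)   # marginal p_1(x)
py = P.sum(axis=0)   # marginal p_2(y)

# Pointwise mutual information for every cell (some entries are negative).
pmi = np.log2(P / np.outer(px, py))

# "Cell" MI: for each fixed x_0, sum the weighted pointwise terms over y.
ci = (P * pmi).sum(axis=1)

print("PMI table:\n", pmi)
print("CI per x0:", ci)        # each entry comes out >= 0
print("I(X;Y):", ci.sum())     # full mutual information
```

In this example some pointwise entries are negative, yet every $CI(x_0,Y)$ comes out nonnegative, which is what makes me suspect the result holds in general.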

Best Answer

Yes: $CI(x_0,Y) \ge 0$ always holds. The reason is not that every term in the sum is nonnegative (the weighted pointwise terms can individually be negative, as you note), but that the sum over $y$ is a scaled Kullback–Leibler divergence.

Starting from the definition

$CI(x_0,Y)= \sum_{y \in Y} p(x_0,y) \log_2 \left(\frac{p(x_0,y)}{p_1(x_0)\,p_2(y)}\right),$

write $p(x_0,y) = p_1(x_0)\, p(y \mid x_0)$, so that

$CI(x_0,Y) = p_1(x_0) \sum_{y \in Y} p(y \mid x_0) \log_2 \left(\frac{p(y \mid x_0)}{p_2(y)}\right) = p_1(x_0)\, D_{\mathrm{KL}}\!\left(p(Y \mid X = x_0) \,\|\, p(Y)\right).$

A Kullback–Leibler divergence is always nonnegative (Gibbs' inequality), and $p_1(x_0) \ge 0$, so $CI(x_0,Y) \ge 0$. Summing over $x_0 \in X$ then recovers $I(X;Y) = \sum_{x_0} CI(x_0,Y) \ge 0$, the usual decomposition of mutual information into per-$x_0$ divergences.
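As a sanity check, here is a short numerical sketch (re-using an arbitrary joint table of the same shape as in the question) verifying that $CI(x_0,Y)$ coincides with $p_1(x_0)\, D_{\mathrm{KL}}\!\left(p(Y \mid x_0) \,\|\, p(Y)\right)$ and is nonnegative for every $x_0$:

```python
import numpy as np

# Arbitrary joint table (rows = x, columns = y), same form as in the question.
P = np.array([[0.30, 0.10, 0.05],
              [0.05, 0.20, 0.30]])
px = P.sum(axis=1)
py = P.sum(axis=0)

for x0 in range(P.shape[0]):
    cond = P[x0] / px[x0]                            # p(y | x_0)
    kl = np.sum(cond * np.log2(cond / py))           # D_KL(p(Y|x_0) || p(Y)) >= 0
    ci = np.sum(P[x0] * np.log2(P[x0] / (px[x0] * py)))
    print(x0, ci, px[x0] * kl)                       # the two values agree, both >= 0
```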
