[Math] Kullback–Leibler divergence of product distributions

Tags: entropy, probability-distributions, st.statistics

Say the KL divergence between two distributions $A$ and $B$ is $\varepsilon$. Can we give bounds on, or a precise computation of, the KL divergence between $A^k$ and $B^k$ (the $k$-fold product distributions)?

Best Answer

Transforming usul's comment into a proper answer: if the KL divergence between $A$ and $B$ is $\varepsilon$, then the KL divergence between $A^k$ and $B^k$ is exactly $k\varepsilon$. This follows directly from the chain rule for KL divergence (Theorem 5.3 of this PDF), applied to the case where the coordinates are independent: since the density of a product distribution factorizes, the logarithm splits into a sum, giving $D(A^k \,\|\, B^k) = \sum_{i=1}^{k} D(A \,\|\, B) = k\varepsilon$.
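As a quick sanity check of the identity $D(A^k \,\|\, B^k) = k\,D(A \,\|\, B)$, here is a small numeric sketch for discrete distributions. The two-point distributions `A` and `B` below are arbitrary illustrative choices, not from the question:

```python
import math
from itertools import product

# Two arbitrary example distributions on a 2-element alphabet
A = [0.6, 0.4]
B = [0.5, 0.5]

def kl(p, q):
    """KL divergence D(p || q) for discrete distributions (natural log)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

k = 3
# Enumerate the k-fold product distributions over the product alphabet
Ak = [math.prod(t) for t in product(A, repeat=k)]
Bk = [math.prod(t) for t in product(B, repeat=k)]

eps = kl(A, B)
# The divergence of the products should equal k * eps
print(kl(Ak, Bk), k * eps)
```

Running this prints two equal values (up to floating-point error), matching the chain-rule computation above.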
