Can I normalize KL-divergence to be \leq 1?

The Kullback-Leibler divergence has a strong relationship with mutual information, and mutual information has a number of normalized variants. Is there some similar, entropy-like value that I can use to normalize KL-divergence such that the normalized KL-divergence is bounded above by 1 (and below by 0)?

Answer

For general distributions, a multiplicative normalization of this kind is not possible: one can trivially choose the comparison density to be zero on an interval where the first density is positive, which makes the divergence infinite. Your approach therefore only makes sense for strictly positive densities, or when both densities share the same support (perhaps densities of the same class?).
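To make the support issue concrete, here is a minimal sketch (not from the original answer; the distributions p and q are made up for illustration) showing that the KL divergence becomes infinite as soon as the comparison distribution assigns zero probability to an outcome the first one considers possible:

```python
import numpy as np
from scipy.special import rel_entr

# Two discrete distributions over three outcomes.
# q assigns zero probability to an outcome that p gives mass to,
# so D_KL(p || q) is infinite and no constant can rescale it into [0, 1].
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.7, 0.3, 0.0])

d_kl = rel_entr(p, q).sum()  # elementwise p * log(p / q), summed
print(d_kl)  # inf
```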

Instead you might consider normalizing through a nonlinear transformation such as 1-\exp(-D_{KL}), which maps the divergence monotonically from [0, \infty] into [0, 1]: identical distributions give 0, and an infinite divergence gives 1.
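A short sketch of how that transformation behaves in practice (the helper normalized_kl and the example distributions are illustrative, not part of the original answer; the divergence is computed in nats):

```python
import numpy as np
from scipy.special import rel_entr

def normalized_kl(p, q):
    """Map D_KL(p || q) into [0, 1] via the transformation 1 - exp(-D_KL)."""
    d_kl = rel_entr(np.asarray(p, float), np.asarray(q, float)).sum()
    return 1.0 - np.exp(-d_kl)

print(normalized_kl([0.5, 0.3, 0.2], [0.5, 0.3, 0.2]))  # 0.0   (identical distributions)
print(normalized_kl([0.5, 0.3, 0.2], [0.2, 0.3, 0.5]))  # ~0.24 (finite divergence)
print(normalized_kl([0.5, 0.3, 0.2], [0.7, 0.3, 0.0]))  # 1.0   (infinite divergence)
```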

Attribution
Source: Link, Question Author: Michael McGowan, Answer Author: Emre
