Can I normalize KL-divergence to be $\leq 1$?

The Kullback-Leibler divergence has a strong relationship with mutual information, and mutual information has a number of normalized variants. Is there some similar, entropy-like value that I can use to normalize KL-divergence such that the normalized KL-divergence is bounded above by 1 (and below by 0)?

Since KL-divergence is unbounded above, there is no entropy-like quantity that normalizes it in general. Instead you might consider normalization through a nonlinear transformation such as $1-\exp(-D_{KL})$, which maps $[0,\infty)$ monotonically onto $[0,1)$: it equals 0 exactly when the distributions coincide and approaches 1 as the divergence grows.
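
A minimal sketch of this transformation for discrete distributions (the function names `kl_divergence` and `normalized_kl` are illustrative, not from any particular library):

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) for discrete distributions given as probability lists.

    Assumes p and q share the same support and q[i] > 0 wherever p[i] > 0.
    Terms with p[i] == 0 contribute nothing, by the convention 0*log(0) = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def normalized_kl(p, q):
    """Map the unbounded divergence into [0, 1) via 1 - exp(-D_KL)."""
    return 1.0 - math.exp(-kl_divergence(p, q))

p = [0.5, 0.5]
q = [0.9, 0.1]
print(normalized_kl(p, p))  # 0.0: identical distributions
print(normalized_kl(p, q))  # strictly between 0 and 1
```

Because $1-\exp(-x)$ is strictly increasing, the transformation preserves the ordering of divergences while compressing them into a bounded range.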