KL Divergence
KL divergence, also called relative entropy, is a measure of how one probability distribution differs from a reference probability distribution. For discrete distributions P and Q, it is defined as D_KL(P || Q) = Σₓ P(x) log(P(x) / Q(x)); it is non-negative, equals zero only when P and Q are identical, and is not symmetric in its arguments.
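
As a minimal sketch of this definition, the snippet below computes the divergence between two discrete distributions given as arrays of probabilities; the function name `kl_divergence` and the coin-flip example are illustrative, not taken from any particular library.

```python
import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(P || Q) for discrete distributions given as probability arrays."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Terms with P(x) = 0 contribute 0 by convention, so only sum where p > 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example: a heavily biased coin P versus a fair coin Q
p = [0.9, 0.1]
q = [0.5, 0.5]
print(kl_divergence(p, q))  # ~0.368 nats
print(kl_divergence(q, p))  # a different value, since KL divergence is asymmetric
```

Using the natural logarithm gives the result in nats; switching to `np.log2` would express the same quantity in bits.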