
 The Kullback-Leibler distance (KL-distance, KL-divergence)
is a natural distance function
from a "true" probability distribution, p,
to a "target" probability distribution, q.
It can be interpreted as the expected extra message length per datum
due to using a code based on the wrong (target) distribution compared to
using a code based on the true distribution.

 For discrete (not necessarily finite) probability distributions,
p={p_{1}, ..., p_{n}} and
q={q_{1}, ..., q_{n}},
the KL-distance is defined to be

 KL(p, q) = Σ_{i} p_{i} log_{2}( p_{i} / q_{i} )

 For continuous probability densities,
the sum is replaced by an integral.
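The discrete definition above can be sketched in a few lines of Python; the function name and example distributions are illustrative choices, not part of the original text. Terms with p_{i} = 0 contribute nothing to the sum (by the convention 0 log 0 = 0), so they are skipped.

```python
import math

def kl_divergence(p, q):
    """KL(p, q) in bits, for discrete distributions given as
    equal-length sequences of probabilities."""
    # Skip terms where p_i == 0, using the convention 0 * log(0) = 0.
    return sum(pi * math.log2(pi / qi)
               for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
q = [0.25, 0.25, 0.5]
print(kl_divergence(p, p))  # 0.0 -- a distribution is at distance 0 from itself
print(kl_divergence(p, q))  # 0.25 bits of extra message length per datum
```

The base-2 logarithm gives the answer in bits; natural logarithm (nats) is equally common in the literature.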

 Note that

 KL(p, p) = 0
 KL(p, q) ≥ 0

 and that the KL-distance is not, in general, symmetric.
 However, a symmetric distance can be made, e.g.,
 KL(p, q) + KL(q, p)
 (sometimes divided by two).
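The asymmetry, and the symmetrisation just described, can be checked numerically; the helper names below are illustrative, not standard.

```python
import math

def kl_divergence(p, q):
    """KL(p, q) in bits for discrete distributions."""
    return sum(pi * math.log2(pi / qi)
               for pi, qi in zip(p, q) if pi > 0)

def symmetric_kl(p, q):
    """One common symmetrisation: (KL(p, q) + KL(q, p)) / 2."""
    return (kl_divergence(p, q) + kl_divergence(q, p)) / 2

p = [0.9, 0.1]
q = [0.5, 0.5]
# KL(p, q) and KL(q, p) differ, but the symmetrised form does not
# depend on the order of its arguments.
print(kl_divergence(p, q), kl_divergence(q, p))
print(symmetric_kl(p, q) == symmetric_kl(q, p))  # True
```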

