Poster Distances
Euclidean geometry

- Hamming distance: $|\{i : p_i \neq q_i\}|$
- Euclidean distance (Pythagoras' theorem, circa 500 BC): $d_2(p, q) = \sqrt{\sum_i (p_i - q_i)^2}$
- Manhattan distance (city block / taxi cab): $d_1(p, q) = \sum_i |p_i - q_i|$
- Quadratic distance: $d_Q = \sqrt{(p - q)^T Q (p - q)}$
- Non-Euclidean geometries: Riemannian metric tensor (B. Riemann, 1826-1866): $\int \sqrt{g_{ij} \frac{dx_i}{ds} \frac{dx_j}{ds}}\, ds$; exponential families
- Fisher information, local entropy (R. A. Fisher, 1890-1962): $I(\theta) = E\!\left[\left(\frac{\partial}{\partial \theta} \ln p(X|\theta)\right)^2\right]$

Statistical geometry

- Physics entropy, additive entropy (in $\mathrm{J\,K^{-1}}$; Boltzmann-Gibbs, 1878): $-k \int p \log p$; cross-entropy, conditional entropy, mutual information (chain rules)
- Kullback-Leibler divergence (relative entropy, 1951): $\mathrm{KL}(p \| q) = \int p \log \frac{p}{q} = E_p\!\left[\log \frac{P}{Q}\right]$
- Jeffrey divergence (Jensen-Shannon)
- Bhattacharyya distance (1967): $d(p, q) = -\ln \int \sqrt{p}\,\sqrt{q}$
- Kolmogorov distance: $K(p \| q) = \int |q - p|$ ($\alpha = -1$; Kolmogorov-Smirnov: $\max |p - q|$)
- Matsushita distance (1956): $M_\alpha(p, q) = \sqrt[\alpha]{\int \left|q^{1/\alpha} - p^{1/\alpha}\right|^\alpha}$
- Csiszár $f$-divergence: $D_f(p \| q) = \int p\, f\!\left(\frac{q}{p}\right)$ (special cases include Kullback-Leibler and Neyman $\chi^2$)
- Non-additive entropy, Tsallis entropy (1998): $T_\alpha(p) = \frac{1}{1-\alpha}\left(\sum_i p_i^\alpha - 1\right)$; Tsallis divergence: $T_\alpha(p \| q) = \frac{1}{1-\alpha}\left(1 - \int \frac{p^\alpha}{q^{\alpha-1}}\right)$
- Log Det divergence: $D(P \| Q) = \langle P, Q^{-1} \rangle - \log \det (P Q^{-1}) - \dim P$
- Von Neumann divergence: $D(P \| Q) = \mathrm{Tr}\!\left(P (\log P - \log Q) - P + Q\right)$
- Earth mover distance (EMD, 1998)
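The vector-space distances on the Euclidean side of the poster are simple to compute. A minimal sketch in plain Python (the function names and test vectors here are my own, not part of the poster):

```python
import math

def hamming(p, q):
    """Number of coordinates where p and q differ: |{i : p_i != q_i}|."""
    return sum(1 for pi, qi in zip(p, q) if pi != qi)

def euclidean(p, q):
    """d2(p, q): square root of the sum of squared coordinate differences."""
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

def manhattan(p, q):
    """d1(p, q): sum of absolute coordinate differences (city block / taxi cab)."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

def quadratic(p, q, Q):
    """dQ: sqrt((p - q)^T Q (p - q)) for a matrix Q given as a list of rows."""
    d = [pi - qi for pi, qi in zip(p, q)]
    n = len(d)
    return math.sqrt(sum(d[i] * Q[i][j] * d[j] for i in range(n) for j in range(n)))

p, q = [1.0, 2.0, 3.0], [1.0, 4.0, 0.0]
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(hamming(p, q), euclidean(p, q), manhattan(p, q))
# With Q = identity, the quadratic distance reduces to the Euclidean one.
print(quadratic(p, q, identity))
```

Choosing a non-identity positive-definite $Q$ turns $d_Q$ into a Mahalanobis-style distance that weights and correlates the coordinates.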
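On the statistical side, entropy, cross-entropy, and the Kullback-Leibler divergence are tightly linked. A small sketch for discrete distributions (helper names are mine), using natural logarithms as in the poster's formulas:

```python
import math

def entropy(p):
    """Shannon entropy -sum_i p_i log p_i (natural log); 0 log 0 taken as 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross-entropy -sum_i p_i log q_i."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """Relative entropy KL(p||q) = sum_i p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
# KL decomposes as cross-entropy minus entropy, and vanishes when p == q.
print(entropy(p), cross_entropy(p, q), kl_divergence(p, q))
print(kl_divergence(p, p))
```

Note the asymmetry: in general $\mathrm{KL}(p \| q) \neq \mathrm{KL}(q \| p)$, which is why symmetrized variants such as the Jeffrey / Jensen-Shannon divergence exist.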
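The Bhattacharyya and Kolmogorov distances from the list can likewise be sketched for discrete distributions (a minimal illustration; function names are mine, and `ks_gap` implements the poster's abbreviated $\max |p - q|$ form directly on the mass functions):

```python
import math

def bhattacharyya(p, q):
    """Bhattacharyya distance: -ln of the coefficient sum_i sqrt(p_i q_i)."""
    return -math.log(sum(math.sqrt(pi * qi) for pi, qi in zip(p, q)))

def kolmogorov(p, q):
    """Kolmogorov distance: sum_i |q_i - p_i| (discrete form of int |q - p|)."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

def ks_gap(p, q):
    """The max |p - q| form abbreviated as Kolmogorov-Smirnov in the list."""
    return max(abs(pi - qi) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(bhattacharyya(p, q), kolmogorov(p, q), ks_gap(p, q))
# All of these vanish when the two distributions coincide.
print(abs(bhattacharyya(p, p)), kolmogorov(p, p))
```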
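The Csiszár $f$-divergence is the unifying construction here: different convex generators $f$ (with $f(1) = 0$) recover several of the named divergences. A sketch, assuming the $D_f(p \| q) = \int p\, f(q/p)$ convention used above:

```python
import math

def f_divergence(p, q, f):
    """Csiszar f-divergence D_f(p||q) = sum_i p_i f(q_i / p_i), f convex, f(1) = 0."""
    return sum(pi * f(qi / pi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# f(t) = -log t recovers the Kullback-Leibler divergence KL(p||q).
kl = f_divergence(p, q, lambda t: -math.log(t))
# f(t) = |t - 1| recovers the total-variation style distance sum_i |q_i - p_i|.
tv = f_divergence(p, q, lambda t: abs(t - 1.0))

print(kl, tv)
```

Other choices of $f$, e.g. $f(t) = (t - 1)^2$ for the Neyman-style $\chi^2$ divergence, slot into the same template.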
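Finally, the non-additive Tsallis entropy can be checked numerically against its additive limit: as $\alpha \to 1$ it recovers the ordinary Shannon entropy. A short sketch (helper names mine):

```python
import math

def tsallis_entropy(p, alpha):
    """Non-additive Tsallis entropy T_alpha(p) = (sum_i p_i^alpha - 1) / (1 - alpha)."""
    return (sum(pi ** alpha for pi in p) - 1.0) / (1.0 - alpha)

def shannon_entropy(p):
    """Additive limit of T_alpha as alpha -> 1."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]
# For alpha = 2 this reduces to 1 - sum_i p_i^2 (the Gini-Simpson index);
# for alpha near 1 it approaches the Shannon entropy.
print(tsallis_entropy(p, 2.0))
print(tsallis_entropy(p, 1.0001), shannon_entropy(p))
```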