Hellinger Distance

The Hellinger distance measures the similarity or dissimilarity between two probability distributions. It is often used in statistics and information theory to compare how two probability distributions differ or overlap.

It appears in many applications, including statistical hypothesis testing, image processing, machine learning, and ecology, wherever comparing and quantifying the similarity or difference between probability distributions is important.

For two discrete probability distributions P = (p₁, …, pₙ) and Q = (q₁, …, qₙ), the Hellinger distance is defined as:

H(P, Q) = (1/√2) · √( Σᵢ (√pᵢ − √qᵢ)² )

Where:

  - pᵢ and qᵢ are the probabilities assigned to the i-th outcome by P and Q, respectively.
  - The sum runs over all outcomes i = 1, …, n.
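For discrete distributions, the definition translates directly into code. A minimal sketch assuming NumPy (the function name `hellinger` is illustrative, not part of any Bayesia API):

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions.

    H(P, Q) = (1 / sqrt(2)) * sqrt( sum_i (sqrt(p_i) - sqrt(q_i))^2 )
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)) / np.sqrt(2))

# Two largely overlapping distributions over three outcomes
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

d = hellinger(p, q)
print(f"H(P, Q) = {d:.4f}")  # a small value, since P and Q mostly overlap
```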

The Hellinger distance has several useful properties:

  1. It is a metric: It satisfies the properties of a metric, which means it is non-negative, symmetric, and obeys the triangle inequality. In other words, it measures the distance between two distributions in a mathematically consistent way.

  2. Interpretability: The Hellinger distance has a meaningful interpretation in terms of probability distributions. It quantifies the Euclidean distance between the square roots of the two distributions' probability (density) functions.

  3. Boundedness: The square-root transformation keeps the Hellinger distance bounded between 0 (identical distributions) and 1 (distributions with disjoint supports). Unlike the Kullback-Leibler (KL) divergence, which can become infinite when one distribution assigns zero probability to an event the other considers possible, the Hellinger distance always remains finite. It is also closely related to the Bhattacharyya coefficient BC(P, Q), via H²(P, Q) = 1 − BC(P, Q).
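The properties above can be checked numerically. A minimal sketch assuming NumPy, with random distributions drawn from a Dirichlet distribution (the `hellinger` helper is illustrative, not a Bayesia API):

```python
import numpy as np

def hellinger(p, q):
    # Same formula as above: H(P, Q) = (1/sqrt(2)) * ||sqrt(p) - sqrt(q)||_2
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)) / np.sqrt(2))

# Boundedness: disjoint supports give the maximum distance of 1,
# whereas the KL divergence would be infinite in this case.
print(hellinger([1.0, 0.0], [0.0, 1.0]))  # 1.0

# Metric properties checked on random 5-outcome distributions.
rng = np.random.default_rng(0)
p, q, r = rng.dirichlet(np.ones(5), size=3)

assert hellinger(p, q) >= 0.0                                # non-negativity
assert abs(hellinger(p, q) - hellinger(q, p)) < 1e-12        # symmetry
assert hellinger(p, r) <= hellinger(p, q) + hellinger(q, r)  # triangle inequality
```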


Copyright © 2024 Bayesia S.A.S., Bayesia USA, LLC, and Bayesia Singapore Pte. Ltd. All Rights Reserved.