Relative Entropy

Meaning

Relative Entropy, also known as Kullback-Leibler (KL) divergence, is a measure quantifying how one probability distribution differs from a second, reference probability distribution. For discrete distributions P (observed) and Q (reference), it is defined as D_KL(P ‖ Q) = Σ_x P(x) log(P(x) / Q(x)). Because it is asymmetric (D_KL(P ‖ Q) generally differs from D_KL(Q ‖ P)), it is a divergence rather than a true distance metric. In the context of crypto, it can quantify how far observed market behavior (e.g., price movements, transaction volumes) has drifted from an expected or historical distribution. This makes it a robust indicator for detecting anomalies, shifts in market regimes, or significant divergences in asset price dynamics from a baseline.
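As a concrete illustration, here is a minimal sketch of how this comparison might be computed in practice: it estimates the KL divergence between a recent window of returns and a historical baseline by binning both samples into a shared histogram. The function names, bin count, smoothing constant, and synthetic data are illustrative assumptions, not part of any particular library or trading system.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Compute D_KL(P || Q) = sum_x P(x) * log(P(x) / Q(x)), in nats.

    p, q: arrays of (unnormalized) probabilities over the same bins.
    eps:  small smoothing constant to avoid log(0) and division by zero.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()  # renormalize after smoothing
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def binned_kl(observed, baseline, bins=50):
    """Estimate KL divergence between two return samples using shared histogram bins."""
    lo = min(observed.min(), baseline.min())
    hi = max(observed.max(), baseline.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(observed, bins=edges)  # observed distribution P
    q, _ = np.histogram(baseline, bins=edges)  # reference distribution Q
    return kl_divergence(p, q)

# Hypothetical example: a calm historical regime vs. a higher-volatility
# recent window. A larger KL value signals a bigger departure from baseline.
rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 0.01, 10_000)
observed = rng.normal(0.0, 0.03, 1_000)
print(f"KL(observed || baseline) = {binned_kl(observed, baseline):.4f}")
```

In a monitoring setting, one plausible use is to recompute this statistic over a rolling window and flag periods where it exceeds a threshold calibrated on historical data, indicating a potential regime shift.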