Kybernetika 45 no. 6, 885-900, 2009

On Metric Divergences of Probability Measures

Igor Vajda


Standard properties of $\phi$-divergences of probability measures are widely applied in various areas of information processing. Among the desirable supplementary properties facilitating the employment of mathematical methods is the metricity of $\phi$-divergences, or the metricity of their powers. This paper extends the previously known family of $\phi$-divergences with these properties. The extension consists of a continuum of $\phi$-divergences which are squared metric distances and which are mostly new, but which also include some classical cases such as the Le Cam squared distance. The paper also establishes basic properties of the $\phi$-divergences from the extended class, including the range of values and the upper and lower bounds attained under fixed total variation.
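As an illustration of the metricity property discussed above, the following minimal sketch (not part of the original abstract; the function name `le_cam` and the sample distributions are ours) computes the Le Cam divergence of two finite discrete distributions and checks numerically that its square root behaves like a metric. The Le Cam divergence is the $\phi$-divergence generated by $\phi(t) = (t-1)^2/(2(t+1))$, which for discrete $P, Q$ gives $\mathrm{LC}(P,Q) = \sum_i (p_i - q_i)^2 / (2(p_i + q_i))$.

```python
import math

def le_cam(p, q):
    """Le Cam divergence LC(P, Q) = sum_i (p_i - q_i)^2 / (2 (p_i + q_i)),
    the phi-divergence with phi(t) = (t - 1)^2 / (2 (t + 1)).
    Terms with p_i + q_i = 0 contribute nothing and are skipped."""
    return sum((pi - qi) ** 2 / (2 * (pi + qi))
               for pi, qi in zip(p, q) if pi + qi > 0)

# sqrt(LC) is a squared-metric distance per the paper; check the
# triangle inequality numerically on three sample distributions.
P = [0.5, 0.3, 0.2]
Q = [0.2, 0.5, 0.3]
R = [0.1, 0.1, 0.8]

d = lambda a, b: math.sqrt(le_cam(a, b))
assert d(P, R) <= d(P, Q) + d(Q, R)   # triangle inequality
assert d(P, Q) == d(Q, P)             # symmetry
```

Note that each term satisfies $(p_i - q_i)^2/(2(p_i + q_i)) \le (p_i + q_i)/2$, so $\mathrm{LC}(P,Q) \le 1$ for probability distributions, consistent with the bounded range of values discussed in the paper.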


metric divergences, total variation, Hellinger divergence, Le Cam divergence, information divergence, Jensen-Shannon divergence


94A17, 62B10, 62H30, 68T10