Kybernetika 43 no. 5, 675-696, 2007

On generalized entropies, Bayesian decisions and statistical diversity

Igor Vajda and Jana Zvárová

Abstract:

The paper summarizes and extends the theory of generalized $\phi $-entropies $H_{\phi }(X)$ of random variables $X$ obtained as $\phi $-informations $I_{\phi }(X;Y)$ about $X$ maximized over random variables $Y$. Among the new results is a proof that these entropies need not be concave functions of the distribution $p_{X}$. An extended class of power entropies $H_{\alpha }(X)$ is introduced, parametrized by $\alpha \in {\mathbb{R}}$, where $H_{\alpha }(X)$ are concave in $p_{X}$ for $\alpha \geq 0$ and convex for $\alpha <0$. It is proved that all power entropies with $\alpha \leq 2$ are maximal $\phi $-informations $I_{\phi }(X;X)$ for appropriate $\phi $ depending on $\alpha $. Prominent members of this subclass of power entropies are the Shannon entropy $H_{1}(X)$ and the quadratic entropy $H_{2}(X)$. The paper also investigates the tightness of practically important previously established relations between these two entropies and the errors $e(X)$ of Bayesian decisions about possible realizations of $X$. The quadratic entropy is shown to provide estimates which are on average more than 100 % tighter than those based on the Shannon entropy, and this tightness is shown to increase even further when $\alpha $ increases beyond $\alpha =2$. Finally, the paper studies various measures of statistical diversity and introduces a general measure of anisotony between them. This measure is numerically evaluated for the entropic measures of diversity $H_1(X)$ and $H_2(X)$.
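For orientation, the following is a minimal sketch of the quantities named above, assuming the common Havrda-Charvát/Tsallis normalization of the power entropies; the paper's own normalization and the exact constants in its entropy-error relations may differ by scale factors, and the two-sided bound shown is a standard inequality of the type investigated, not a claim about the paper's sharpest results.

% Hedged sketch: one common (Havrda-Charvat/Tsallis) normalization of the
% power entropies; the paper's own scaling may differ by a constant factor.
\[
  H_{\alpha}(X) \;=\; \frac{1}{\alpha - 1}\Bigl(1 - \sum_{x} p_X(x)^{\alpha}\Bigr),
  \qquad \alpha \neq 1 .
\]
% The Shannon entropy arises as the limit \alpha \to 1:
\[
  H_{1}(X) \;=\; \lim_{\alpha \to 1} H_{\alpha}(X)
            \;=\; -\sum_{x} p_X(x)\,\ln p_X(x),
\]
% and \alpha = 2 gives the quadratic (Gini-Simpson) entropy:
\[
  H_{2}(X) \;=\; 1 - \sum_{x} p_X(x)^{2}.
\]
% A standard two-sided bound of the kind studied, relating H_2 to the
% Bayes error e(X) = 1 - max_x p_X(x) of guessing the realization of X
% (its constants depend on the chosen normalization of H_2):
\[
  \tfrac{1}{2}\,H_{2}(X) \;\le\; e(X) \;\le\; H_{2}(X).
\]

Both inequalities follow from $\sum_x p_X(x)^2 \le \max_x p_X(x)$ and $1 - \sum_x p_X(x)^2 \le (1-\max_x p_X(x))(1+\max_x p_X(x))$, which illustrates why quadratic-entropy bounds on $e(X)$ can be tight in both directions.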