Kybernetika 48 no. 4, 714-749, 2012

Generalized information criteria for Bayes decisions

Domingo Morales and Igor Vajda

Abstract:

This paper deals with Bayesian models given by statistical experiments and standard loss functions. The Bayes probability of error and the Bayes risk are estimated by means of classical and generalized information criteria applicable to the experiment. The accuracy of these estimates is studied. Among the information criteria considered is the class of posterior power entropies, which includes the Shannon entropy as the special case of power $\alpha = 1$. It is shown that within this class the most accurate estimate is achieved by the quadratic posterior entropy of power $\alpha = 2$. The paper also introduces and studies a new class of alternative power entropies which, in general, estimate the Bayes error and risk more tightly than the classical power entropies. Concrete examples, tables and figures illustrate the results obtained.
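The relation between power entropies and the Bayes error can be sketched numerically. The snippet below is an illustrative assumption, not the paper's estimators: it uses the Havrda-Charvát normalization $H_\alpha(p) = (1 - \sum_i p_i^\alpha)/(\alpha - 1)$ (one of several normalizations in the literature), recovers the Shannon entropy in nats as $\alpha \to 1$, and checks the elementary bound $1 - \max_i p_i \le H_2(p)$, which holds because $\sum_i p_i^2 \le \max_i p_i$. The function names `power_entropy` and `bayes_error` are hypothetical.

```python
import math

def power_entropy(p, alpha):
    """Havrda-Charvat power entropy H_alpha(p) = (1 - sum p_i^alpha)/(alpha - 1).

    For alpha == 1 the Shannon entropy (in nats) is returned; it is the
    limit of the power entropies as alpha -> 1.
    """
    if alpha == 1:
        return -sum(x * math.log(x) for x in p if x > 0)
    return (1.0 - sum(x ** alpha for x in p)) / (alpha - 1.0)

def bayes_error(p):
    """Bayes probability of error of a posterior p: 1 - max_i p_i."""
    return 1.0 - max(p)

posterior = [0.6, 0.3, 0.1]
e = bayes_error(posterior)            # 1 - 0.6 = 0.4
h2 = power_entropy(posterior, 2)      # quadratic entropy 1 - 0.46 = 0.54
h1 = power_entropy(posterior, 1)      # Shannon entropy in nats

# Since sum_i p_i^2 <= max_i p_i, the quadratic entropy always
# upper-bounds the Bayes error.
assert e <= h2
```

The quadratic case $\alpha = 2$ is singled out in the abstract as the most accurate estimator within the class; the sketch only verifies that it bounds the Bayes error, not the paper's accuracy results.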

Keywords:

power entropies, Shannon entropy, Bayes error, Bayes risk, alternative Shannon entropy, alternative power entropies, sub-Bayes risk

Classification:

62C10, 62B10

References:

  1. M. Ben Bassat: $f$-entropies, probability of error, and feature selection. Inform. Control 39 (1978), 227-242.
  2. M. Ben Bassat and J. Raviv: Rényi's entropy and probability of error. IEEE Trans. Inform. Theory 24 (1978), 324-331.
  3. J. O. Berger: Statistical Decision Theory and Bayesian Analysis. Second edition. Springer, Berlin 1986.
  4. T. M. Cover and P. E. Hart: Nearest neighbor pattern classification. IEEE Trans. Inform. Theory 13 (1967), 21-27.
  5. P. Devijver and J. Kittler: Pattern Recognition. A Statistical Approach. Prentice Hall, Englewood Cliffs, New Jersey 1982.
  6. L. Devroye, L. Györfi and G. Lugosi: A Probabilistic Theory of Pattern Recognition. Springer, Berlin 1996.
  7. D. K. Faddeev: Zum Begriff der Entropie eines endlichen Wahrscheinlichkeitsschemas. Vol. I. Deutscher Verlag der Wissenschaften, Berlin 1957.
  8. M. Feder and N. Merhav: Relations between entropy and error probability. IEEE Trans. Inform. Theory 40 (1994), 259-266.
  9. P. Harremoës and F. Topsøe: Inequalities between entropy and index of coincidence derived from information diagrams. IEEE Trans. Inform. Theory 47 (2001), 2944-2960.
  10. J. Havrda and F. Charvát: Concept of structural $a$-entropy. Kybernetika 3 (1967), 30-35.
  11. L. Kanal: Patterns in pattern recognition. IEEE Trans. Inform. Theory 20 (1974), 697-707.
  12. V. A. Kovalevsky: The problem of character recognition from the point of view of mathematical statistics. In: Reading Automata and Pattern Recognition (in Russian), Naukova Dumka, Kyjev 1965. English translation in: Character Readers and Pattern Recognition, Spartan Books, New York 1968, pp. 3-30.
  13. D. Morales, L. Pardo and I. Vajda: Uncertainty of discrete stochastic systems: general theory and statistical inference. IEEE Trans. Systems, Man and Cybernetics, Part A 26 (1996), 1-17.
  14. A. Rényi: On measures of entropy and information. In: Proc. Fourth Berkeley Symp. on Math. Statist. and Probability, Vol. I. University of California Press, Berkeley, California 1961, pp. 547-561.
  15. N. P. Salikhov: Confirmation of a hypothesis of I. Vajda (in Russian). Problemy Peredachi Informatsii 10 (1974), 114-115.
  16. D. L. Tebbe and S. J. Dwyer III: Uncertainty and probability of error. IEEE Trans. Inform. Theory 14 (1968), 516-518.
  17. G. T. Toussaint: A generalization of Shannon's equivocation and the Fano bound. IEEE Trans. Systems, Man and Cybernetics 7 (1977), 300-302.
  18. I. Vajda: Bounds on the minimal error probability and checking a finite or countable number of hypotheses. Inform. Transmission Problems 4 (1968), 9-17.
  19. I. Vajda: A contribution to informational analysis of patterns. In: Methodologies of Pattern Recognition (M. S. Watanabe, ed.), Academic Press, New York 1969.
  20. I. Vajda and K. Vašek: Majorization, concave entropies and comparison of experiments. Problems Control Inform. Theory 14 (1985), 105-115.
  21. I. Vajda and J. Zvárová: On generalized entropies, Bayesian decisions and statistical diversity. Kybernetika 43 (2007), 675-696.