Kybernetika 48 no. 2, 242-253, 2012

Convexity inequalities for estimating generalized conditional entropies from below

Alexey E. Rastegin

Abstract:

Generalized entropic functionals are an active area of research, so lower and upper bounds on these functionals are of interest. Lower bounds on the Rényi conditional $\alpha$-entropy and on two kinds of non-extensive conditional $\alpha$-entropy are obtained. These bounds are expressed in terms of the error probability of the standard decision and extend the inequalities known for the regular conditional entropy. The presented inequalities rest mainly on the convexity of certain functions. In a certain sense, they are complementary to generalized inequalities of Fano type.
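For orientation, the standard definitions behind these quantities are as follows (a sketch in generic notation; the paper's precise conditional forms, and its two non-extensive variants, may differ in detail). For a probability distribution $p=(p_1,\ldots,p_n)$ and $\alpha>0$, $\alpha\neq 1$,
$$
R_{\alpha}(p)=\frac{1}{1-\alpha}\,\ln\sum_{i=1}^{n}p_i^{\alpha},
\qquad
H_{\alpha}(p)=\frac{1}{1-\alpha}\left(\sum_{i=1}^{n}p_i^{\alpha}-1\right),
$$
where $R_{\alpha}$ is the Rényi $\alpha$-entropy and $H_{\alpha}$ is the non-extensive (Tsallis) entropy of degree $\alpha$; both recover the Shannon entropy in the limit $\alpha\to 1$. For a decision about $X$ from an observation $Y$, the standard (Bayesian) decision picks a most probable value, with error probability $P_e=1-\sum_{y}p(y)\max_{x}p(x|y)$. Fano-type inequalities bound conditional entropies from above in terms of $P_e$; the bounds of this paper run in the complementary direction, from below.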

Keywords:

Rényi $\alpha$-entropy, non-extensive entropy of degree $\alpha$, error probability, Bayesian problems, functional convexity

Classification:

94A17, 60E15, 62C10, 39B62
