Kybernetika 51 no. 5, 725-738, 2015

On limiting towards the boundaries of exponential families

František Matúš

DOI: 10.14736/kyb-2015-5-0725

Abstract:

This work studies the standard exponential families of probability measures on Euclidean spaces that have finite supports. When such a family is parametrized by means, the mean is assumed to move along a segment inside the convex support towards an endpoint on the boundary of the support. The limit behavior of several quantities related to the exponential family is described explicitly. In particular, the variance functions and information divergences are studied around the boundary.
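
For orientation, the standard setup can be summarized as follows (cf. Barndorff-Nielsen [2] and Brown [3]); the notation here is only illustrative and may differ from that used in the paper. Let $\mu$ be a finitely supported measure on $\mathbb{R}^d$. The standard exponential family based on $\mu$ consists of the probability measures $Q_\vartheta$, $\vartheta \in \mathbb{R}^d$, with
$$
\frac{\mathrm{d}Q_\vartheta}{\mathrm{d}\mu}(x) = e^{\langle \vartheta, x\rangle - \Lambda(\vartheta)},
\qquad
\Lambda(\vartheta) = \ln \int e^{\langle \vartheta, x\rangle}\, \mu(\mathrm{d}x).
$$
The mean mapping $\vartheta \mapsto \nabla\Lambda(\vartheta)$ sends $\mathbb{R}^d$ onto the relative interior of the convex support $\mathrm{cs}(\mu)$, the convex hull of the support of $\mu$; this gives the parametrization by means. The variance function assigns to a mean $m$ in this relative interior the covariance matrix of the corresponding $Q_\vartheta$, and the information (Kullback-Leibler) divergence between two members of the family equals
$$
D(Q_\vartheta \,\|\, Q_{\vartheta'})
= \langle \vartheta - \vartheta', \nabla\Lambda(\vartheta)\rangle - \Lambda(\vartheta) + \Lambda(\vartheta').
$$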

Keywords:

exponential family, relative entropy, information divergence, variance function, Kullback-Leibler divergence, mean parametrization, convex support

Classification:

94A17, 62B10, 60A10

References:

  1. N. Ay: An information-geometric approach to a theory of pragmatic structuring. The Annals of Probability 30 (2002), 416-436.   DOI:10.1214/aop/1020107773
  2. O. Barndorff-Nielsen: Information and Exponential Families in Statistical Theory. Wiley, New York 1978.   CrossRef
  3. L. D. Brown: Fundamentals of Statistical Exponential Families. Inst. of Math. Statist. Lecture Notes - Monograph Series 9 (1986).   CrossRef
  4. N. N. Chentsov: Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs, Amer. Math. Soc., Providence - Rhode Island 1982 (Russian original: Nauka, Moscow, 1972).   CrossRef
  5. I. Csiszár and F. Matúš: Closures of exponential families. The Annals of Probability 33 (2005), 582-600.   DOI:10.1214/009117904000000766
  6. I. Csiszár and F. Matúš: Generalized maximum likelihood estimates for exponential families. Probability Theory and Related Fields 141 (2008), 213-246.   DOI:10.1007/s00440-007-0084-z
  7. R. Graham, D. Knuth and O. Patashnik: Concrete Mathematics. Second edition. Addison-Wesley, Reading, Massachusetts 1994, p. 446.   CrossRef
  8. G. Letac: Lectures on Natural Exponential Families and their Variance Functions. Monografias de Matemática 50, Instituto de Matemática Pura e Aplicada, Rio de Janeiro 1992.   CrossRef
  9. F. Matúš and N. Ay: On maximization of the information divergence from an exponential family. In: Proc. WUPES'03 (J. Vejnarová, ed.), University of Economics, Prague 2003, pp. 99-204.   CrossRef
  10. F. Matúš: Optimality conditions for maximizers of the divergence from an EF. Kybernetika 43 (2007), 731-746.   CrossRef
  11. F. Matúš: Divergence from factorizable distributions and matroid representations by partitions. IEEE Trans. Inform. Theory 55 (2009), 5375-5381.   DOI:10.1109/tit.2009.2032806
  12. F. Matúš and J. Rauh: Maximization of the information divergence from an exponential family and criticality. In: Proc. IEEE ISIT 2011, St. Petersburg 2011, pp. 809-813.   DOI:10.1109/isit.2011.6034269
  13. G. Montúfar, J. Rauh and N. Ay: Maximal information divergence from statistical models defined by neural networks. In: Proc. GSI 2013, Paris 2013, Lecture Notes in Computer Science 8085 (2013), 759-766.   DOI:10.1007/978-3-642-40020-9_85
  14. J. Rauh: Finding the maximizers of the information divergence from an exponential family. IEEE Trans. Inform. Theory 57 (2011), 3236-3247.   DOI:10.1109/tit.2011.2136230
  15. R. T. Rockafellar: Convex Analysis. Princeton University Press, 1970.   DOI:10.1017/s0013091500010142