# Abstract:

In this paper we present a result relating the merging of closed convex sets of discrete probability functions by the squared Euclidean distance to their merging by the Kullback-Leibler divergence, drawing inspiration from the Rényi entropy. While selecting the probability function with the highest Shannon entropy is a well-justified way of representing a single closed convex set of probability functions, the discussion of how to represent several closed convex sets of probability functions is still ongoing. The result presented here offers a new perspective on this discussion. Furthermore, for those who prefer the standard minimisation based on the squared Euclidean distance, it provides a connection to a probabilistic merging operator based on the Kullback-Leibler divergence, which is closely related to the Shannon entropy.
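To give a concrete feel for the two merging criteria compared in the paper, the following sketch treats the degenerate case where each expert's closed convex set is a single probability function (the function names and the two-expert example are illustrative, not from the paper). It is a standard fact that minimising the total squared Euclidean distance to the experts' functions yields their arithmetic mean, while minimising the total Kullback-Leibler divergence $\sum_i \mathrm{KL}(p \,\|\, q_i)$ yields their normalised geometric mean.

```python
import numpy as np

def merge_euclidean(dists):
    """argmin_p sum_i ||p - q_i||^2 over the probability simplex
    is the arithmetic mean of the q_i."""
    return np.mean(dists, axis=0)

def merge_kl(dists):
    """argmin_p sum_i KL(p || q_i) over the probability simplex
    is the normalised geometric mean of the q_i
    (assuming all q_i are strictly positive)."""
    g = np.exp(np.mean(np.log(dists), axis=0))
    return g / g.sum()

# Two experts' probability functions on a three-element domain.
q1 = np.array([0.5, 0.3, 0.2])
q2 = np.array([0.2, 0.5, 0.3])

print(merge_euclidean([q1, q2]))  # arithmetic mean: [0.35, 0.4, 0.25]
print(merge_kl([q1, q2]))         # normalised geometric mean
```

The general setting of the paper replaces each single function $q_i$ with a closed convex set of probability functions, where the two criteria no longer reduce to these closed forms and their relationship becomes the subject of the presented result.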

# Keywords:

Kullback-Leibler divergence, probabilistic merging, information geometry, Rényi entropy

# Classification:

52A99, 52C99

# References:

1. M. Adamčík: The information geometry of Bregman divergences and some applications in multi-expert reasoning. Entropy 16 (2014), 6338-6381.   DOI:10.3390/e16126338
2. M. Adamčík: Collective Reasoning under Uncertainty and Inconsistency.
3. M. Adamčík: On the applicability of the `number of possible states' argument in multi-expert reasoning. J. Appl. Logic 19 (2016), 20-49.   DOI:10.1016/j.jal.2016.10.001
4. M. Adamčík: A logician's approach to meta-analysis with unexplained heterogeneity. J. Biomed. Inform. 71 (2017), 110-129.   DOI:10.1016/j.jbi.2017.05.017
5. M. Adamčík and G. M. Wilmers: Probabilistic merging operators. Logique et Analyse 228 (2014), 563-590.   DOI:10.2143/LEA.228.0.3078175
6. S. Amari and A. Cichocki: Families of Alpha-, Beta- and Gamma-divergences: Flexible and robust measures of similarities. Entropy 12 (2010), 1532-1568.   DOI:10.3390/e12061532
7. A. Basu, I. R. Harris, N. Hjort and M. Jones: Robust and efficient estimation by minimising a density power divergence. Biometrika 85 (1998), 549-559.   DOI:10.1093/biomet/85.3.549
8. L. M. Bregman: The relaxation method of finding the common points of convex sets and its application to the solution of problems in convex programming. USSR Comput. Math. Math. Phys. 7 (1967), 200-217.   DOI:10.1016/0041-5553(67)90040-7
9. P. Hawes: Investigation of Properties of Some Inference Processes.
10. E. T. Jaynes: Where do we stand on maximum entropy? In: The Maximum Entropy Formalism (R. D. Levine, M. Tribus, eds.), M.I.T. Press, 1979, pp. 15-118.
11. G. Kern-Isberner and W. Rödder: Belief revision and information fusion on optimum entropy. Int. J. Intell. Systems 19 (2004), 837-857.   DOI:10.1002/int.20027
12. D. Osherson and M. Vardi: Aggregating disparate estimates of chance. Games Econom. Behavior 56 (2006), 148-173.   DOI:10.1016/j.geb.2006.04.001
13. J. B. Paris: The Uncertain Reasoner's Companion. Cambridge University Press, Cambridge 1994.
14. J. B. Paris and A. Vencovská: On the applicability of maximum entropy to inexact reasoning. Int. J. Approx. Reason. 3 (1989), 1-34.   DOI:10.1016/0888-613x(89)90012-1
15. J. B. Paris and A. Vencovská: A note on the inevitability of maximum entropy. Int. J. Approx. Reason. 4 (1990), 183-224.   DOI:10.1016/0888-613x(90)90020-3
16. J. B. Predd, D. N. Osherson, S. R. Kulkarni and H. V. Poor: Aggregating probabilistic forecasts from incoherent and abstaining experts. Decision Analysis 5 (2008), 177-189.   DOI:10.1287/deca.1080.0119
17. A. Rényi: On measures of entropy and information. In: Proc. Fourth Berkeley Symposium on Mathematical Statistics and Probability 1 (1961), 547-561.
18. C. E. Shannon: A mathematical theory of communication. Bell System Techn. J. 27 (1948), 379-423, 623-656.   DOI:10.1002/j.1538-7305.1948.tb00917.x
19. G. M. Wilmers: A foundational approach to generalising the maximum entropy inference process to the multi-agent context. Entropy 17 (2015), 594-645.   DOI:10.3390/e17020594