Kybernetika 50 no. 2, 175-188, 2014

The irrelevant information principle for collective probabilistic reasoning

Martin Adamčík and George Wilmers

DOI: 10.14736/kyb-2014-2-0175

Abstract:

Within the framework of discrete probabilistic uncertain reasoning a large literature exists justifying the maximum entropy inference process, $\ME$, as being optimal in the context of a single agent whose subjective probabilistic knowledge base is consistent. In particular Paris and Vencovská completely characterised the $\ME$ inference process by means of an attractive set of axioms which an inference process should satisfy. More recently the second author extended the Paris-Vencovská axiomatic approach to inference processes in the context of several agents whose subjective probabilistic knowledge bases, while individually consistent, may be collectively inconsistent. In particular he defined a natural multi-agent extension of the inference process $\ME$ called the social entropy process, $\SEP$. However, while $\SEP$ has been shown to possess many attractive properties, those which are known are almost certainly insufficient to uniquely characterise it. It is therefore of particular interest to study those Paris-Vencovská principles valid for $\ME$ whose immediate generalisations to the multi-agent case are not satisfied by $\SEP$. One of these principles is the Irrelevant Information Principle, a powerful and appealing principle which very few inference processes satisfy even in the single agent context. In this paper we will investigate whether $\SEP$ can satisfy an interesting modified generalisation of this principle.
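
As background for readers less familiar with the notation (stated here in generic form, which may differ in detail from the conventions adopted in the paper), the single-agent inference process $\ME$ and the Kullback-Leibler divergence mentioned in the keywords are standardly defined over a finite set of $J$ atomic states by
\[
  \ME(K) \;=\; \mathop{\mathrm{arg\,max}}_{w \in V(K)} \Bigl( -\sum_{j=1}^{J} w_j \log w_j \Bigr),
  \qquad
  \mathrm{KL}(w \,\Vert\, v) \;=\; \sum_{j=1}^{J} w_j \log \frac{w_j}{v_j},
\]
where $V(K)$ denotes the set of probability functions satisfying a consistent knowledge base $K$. In this single-agent setting the Irrelevant Information Principle asserts, roughly, that if $K_1$ and $K_2$ constrain disjoint sets of propositional variables, then adjoining $K_2$ to $K_1$ leaves unchanged the values which the inference process assigns to sentences of the language of $K_1$.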

Keywords:

maximum entropy, uncertain reasoning, discrete probability function, social inference process, Kullback-Leibler, irrelevant information principle

Classification:

03B42, 03B48, 68T37

References:

  1. M. Adamčík and G. M. Wilmers: Probabilistic merging operators. Logique et Analyse (2013), to appear.
  2. R. Carnap: On the application of inductive logic. Philosophy and Phenomenological Research 8 (1947), 133-148.
  3. S. French: Group consensus probability distributions: A critical survey. In: Bayesian Statistics (J. M. Bernardo, M. H. De Groot, D. V. Lindley, and A. F. M. Smith, eds.), Elsevier, North Holland 1985, pp. 183-201.
  4. G. H. Hardy, J. E. Littlewood and G. Pólya: Inequalities. Cambridge University Press, Cambridge 1934.
  5. P. Hawes: An Investigation of Properties of Some Inference Processes. Ph.D. Thesis, The University of Manchester, Manchester 2007.
  6. E. T. Jaynes: Where do we stand on maximum entropy? In: The Maximum Entropy Formalism (R. D. Levine and M. Tribus, eds.), M.I.T. Press, Cambridge 1979.
  7. G. Kern-Isberner and W. Rödder: Belief revision and information fusion on optimum entropy. Internat. J. of Intelligent Systems 19 (2004), 837-857.
  8. J. Kracík: Cooperation Methods in Bayesian Decision Making with Multiple Participants. Ph.D. Thesis, Czech Technical University, Prague 2009.
  9. F. Matúš: On Iterated Averages of $I$-projections. Universität Bielefeld, Germany 2007.
  10. D. Osherson and M. Vardi: Aggregating disparate estimates of chance. Games and Economic Behavior 56 (2006), 1, 148-173.
  11. J. B. Paris: The Uncertain Reasoner's Companion. Cambridge University Press, Cambridge 1994.
  12. J. B. Paris and A. Vencovská: On the applicability of maximum entropy to inexact reasoning. Internat. J. of Approximate Reasoning 3 (1989), 1-34.
  13. J. B. Paris and A. Vencovská: A note on the inevitability of maximum entropy. Internat. J. of Approximate Reasoning 4 (1990), 183-224.
  14. J. B. Predd, D. N. Osherson, S. R. Kulkarni and H. V. Poor: Aggregating probabilistic forecasts from incoherent and abstaining experts. Decision Analysis 5 (2008), 4, 177-189.
  15. J. E. Shore and R. W. Johnson: Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Trans. Inform. Theory 26 (1980), 1, 26-37.
  16. J. Vomlel: Methods of Probabilistic Knowledge Integration. Ph.D. Thesis, Czech Technical University, Prague 1999.
  17. G. M. Wilmers: The social entropy process: Axiomatising the aggregation of probabilistic beliefs. In: Probability, Uncertainty and Rationality (H. Hosni and F. Montagna, eds.), 10 CRM series, Scuola Normale Superiore, Pisa 2010, pp. 87-104.
  18. G. M. Wilmers: Generalising the Maximum Entropy Inference Process to the Aggregation of Probabilistic Beliefs. Available from \url{http://manchester.academia.edu/GeorgeWilmers/Papers}.