Kybernetika 56 no. 5, 979-1014, 2020

Information decomposition based on cooperative game theory

Nihat Ay, Daniel Polani and Nathaniel Virgo

DOI: 10.14736/kyb-2020-5-0979

Abstract:

We offer a new approach to the \emph{information decomposition} problem in information theory: given a `target' random variable co-distributed with multiple `source' variables, how can we decompose the mutual information into a sum of non-negative terms that quantify the contributions of each random variable, not only individually but also in combination? We define a new way to decompose the mutual information, which we call the \emph{Information Attribution} (IA), and derive a solution using cooperative game theory. It can be seen as assigning a ``fair share'' of the mutual information to each combination of the source variables. Our decomposition is based on a different lattice from the usual `partial information decomposition' (PID) approach, and as a consequence the IA has a smaller number of terms than PID: it has analogs of the synergy and unique information terms, but lacks separate terms corresponding to redundancy, instead sharing redundant information between the unique information terms. Because of this, it is able to obey equivalents of the axioms known as `local positivity' and `identity', which cannot be simultaneously satisfied by a PID measure.
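The decomposition itself is derived in the paper; as a rough illustration of the underlying ``fair share'' idea from cooperative game theory (cf. references 16 and 33 below), the following minimal Python sketch computes the classical Shapley value of each source when the coalition value of a subset S of sources is taken to be the mutual information I(T; X_S). This choice of coalition value and the toy XOR distribution are assumptions made only for illustration: the Information Attribution of the paper is defined on a different lattice and is not reproduced here.

# Illustrative sketch only: classical Shapley attribution of I(T; X_S) over
# subsets of sources. This is NOT the paper's Information Attribution, which
# is defined on a different lattice; it only illustrates "fair share" credit.
from itertools import combinations
from math import factorial, log2

# Toy joint distribution: T = X1 XOR X2, with X1, X2 uniform and independent.
# Keys are outcomes (x1, x2, t); values are probabilities.
joint = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

def mutual_information(joint, source_idxs):
    """I(T; X_S) in bits, where S = source_idxs and T is the last coordinate."""
    p_joint, p_src, p_tgt = {}, {}, {}
    for outcome, p in joint.items():
        s = tuple(outcome[i] for i in source_idxs)
        t = outcome[-1]
        p_joint[(s, t)] = p_joint.get((s, t), 0.0) + p
        p_src[s] = p_src.get(s, 0.0) + p
        p_tgt[t] = p_tgt.get(t, 0.0) + p
    return sum(p * log2(p / (p_src[s] * p_tgt[t]))
               for (s, t), p in p_joint.items() if p > 0)

def shapley_values(joint, n_sources):
    """Classical Shapley value of each source under v(S) = I(T; X_S)."""
    players = range(n_sources)
    v = {S: mutual_information(joint, S)
         for k in range(n_sources + 1) for S in combinations(players, k)}
    phi = []
    for i in players:
        total = 0.0
        for S in v:
            if i in S:
                continue
            weight = (factorial(len(S)) * factorial(n_sources - len(S) - 1)
                      / factorial(n_sources))
            total += weight * (v[tuple(sorted(S + (i,)))] - v[S])
        phi.append(total)
    return phi

print(shapley_values(joint, 2))  # [0.5, 0.5]

In this XOR example neither source alone carries information about the target, yet the full bit of joint mutual information is split evenly between them; the paper's construction refines this kind of attribution on a lattice that also distinguishes unique and synergistic contributions.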

Keywords:

information geometry, partial information decomposition, cooperative game theory

Classification:

94A15, 94A17, 91A12, 91A80, 06B05, 53B12, 62B11

References:

  1. S. Amari: Information geometry on hierarchical decomposition of stochastic interactions. IEEE Trans. Inform. Theory 47 (2001), 1701-1711.   DOI:10.1109/18.930911
  2. S. Amari and H. Nagaoka: Methods of Information Geometry. American Mathematical Soc. 191 (2007).   DOI:10.1090/mmono/191
  3. S. Amari, N. Tsuchiya and M. Oizumi: Geometry of information integration. In: Information Geometry and its Applications IV, Springer 2016, pp. 3-17.   DOI:10.1007/978-3-319-97798-0_1
  4. N. Ay: Information geometry on complexity and stochastic interaction. Entropy 17 (2015), 4, 2432-2458.   DOI:10.3390/e17042432
  5. N. Ay, J. Jost, H. Vân Lê and L. Schwachhöfer: Information Geometry. Springer 2017.   DOI:10.1007/978-3-319-56478-4
  6. A. B. Barrett: Exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems. Phys. Rev. E 91 (2015), 052802.   DOI:10.1103/physreve.91.052802
  7. N. Bertschinger, J. Rauh, E. Olbrich and J. Jost: Shared information: New insights and problems in decomposing information in complex systems. In: Proc. European Conference on Complex Systems, Springer 2012, pp. 251-269.   DOI:10.1007/978-3-319-00395-5_35
  8. N. Bertschinger, J. Rauh, E. Olbrich, J. Jost and N. Ay: Quantifying unique information. Entropy 16 (2014), 4, 2161-2183.   DOI:10.3390/e16042161
  9. J. M. Bilbao: Axioms for the Shapley value on convex geometries. Europ. J. Oper. Res. 110 (1998), 2, 368-376.   DOI:10.1016/s0377-2217(97)00263-4
  10. J. M. Bilbao and P. H. Edelman: The Shapley value on convex geometries. Discrete Appl. Math. 103 (2000), 1-3, 33-40.   DOI:10.1016/s0166-218x(99)00218-8
  11. I. Csiszár and F. Matúš: Information projections revisited. IEEE Trans. Inform. Theory 49 (2003), 6, 1474-1490.   DOI:10.1109/tit.2003.810633
  12. I. Csiszár and F. Matúš: On information closures of exponential families: A counterexample. IEEE Trans. Inform. Theory 50 (2004), 5, 922-924.   DOI:10.1109/tit.2004.826661
  13. I. Csiszár and P. C. Shields: Information theory and statistics: A tutorial. Found. Trends Commun. Inform. Theory 1 (2004), 4, 417-528.   DOI:10.1561/0100000004
  14. U. Faigle and M. Grabisch: Values for Markovian coalition processes. Econom. Theory 51 (2012), 3, 505-538.   DOI:10.1007/s00199-011-0617-7
  15. U. Faigle and M. Grabisch: A concise axiomatization of a Shapley-type value for stochastic coalition processes. Econom. Theory Bull. (2013), 189-199.   DOI:10.1007/s40505-013-0020-6
  16. U. Faigle and W. Kern: The Shapley value for cooperative games under precedence constraints. Int. J. Game Theory 21 (1992), 3, 249-266.   DOI:10.1007/bf01258278
  17. C. Finn and J. T. Lizier: Pointwise partial information decomposition using the specificity and ambiguity lattices. Entropy 20 (2018), 4, 297.   DOI:10.3390/e20040297
  18. C. Finn and J. T. Lizier: Generalised measures of multivariate information content. Entropy 22 (2020), 2, 216.   DOI:10.3390/e22020216
  19. M. Grabisch: Set Functions, Capacities and Games. Springer 2016.   DOI:10.1007/978-3-319-30690-2_2
  20. V. Griffith and C. Koch: Quantifying synergistic mutual information. In: Guided Self-Organization: Inception, Springer 2014, pp. 159-190.   DOI:10.1007/978-3-642-53734-9_6
  21. M. Harder, C. Salge and D. Polani: Bivariate measure of redundant information. Phys. Rev. E 87 (2013), 012130.   DOI:10.1103/physreve.87.012130
  22. R. A. Ince: The partial entropy decomposition: Decomposing multivariate entropy and mutual information via pointwise common surprisal. arXiv:1702.01591.
  23. R. G. James and J. P. Crutchfield: Multivariate dependence beyond Shannon information. Entropy 19 (2017), 10, 531.   DOI:10.3390/e19100531
  24. R. G. James, J. Emenheiser and J. P. Crutchfield: Unique information via dependency constraints. J. Physics A: Math. Theoret. 52 (2019), 1, 014002.   DOI:10.1088/1751-8121/aaed53
  25. A. Kolchinsky: A novel approach to multivariate redundancy and synergy. arXiv:1908.08642.
  26. F. Lange and M. Grabisch: Values on regular games under Kirchhoff's laws. Math. Soc. Sci. 58 (2009), 322-340.   DOI:10.1016/j.mathsocsci.2009.07.003
  27. S. L. Lauritzen: Graphical Models. Oxford Science Publications 1996.
  28. M. Oizumi, N. Tsuchiya and S. Amari: Unified framework for information integration based on information geometry. Proc. National Academy of Sciences 113 (2016), 51, 14817-14822.   DOI:10.1073/pnas.1603583113
  29. E. Olbrich, N. Bertschinger and J. Rauh: Information decomposition and synergy. Entropy 17 (2015), 5, 3501-3517.   DOI:10.3390/e17053501
  30. P. Perrone and N. Ay: Hierarchical quantification of synergy in channels. Frontiers in Robotics and AI 2 (2016), 35.   DOI:10.3389/frobt.2015.00035
  31. J. Rauh, N. Bertschinger, E. Olbrich and J. Jost: Reconsidering unique information: Towards a multivariate information decomposition. In: 2014 IEEE International Symposium on Information Theory, IEEE, pp. 2232-2236.   DOI:10.1109/isit.2014.6875230
  32. F. Rosas, V. Ntranos, C. Ellison, S. Pollin and M. Verhelst: Understanding interdependency through complex information sharing. Entropy 18 (2016), 2, 38.   DOI:10.3390/e18020038
  33. L. S. Shapley: A Value for $n$-person Games. In: Annals of Mathematics Studies 28 (H. Kuhn and A. Tucker, eds.), Princeton University Press 1953, pp. 307-317.   DOI:10.1515/9781400881970-018
  34. D. A. Simovici: On submodular and supermodular functions on lattices and related structures. In: Proc. International Symposium on Multiple-Valued Logic, 2014, pp. 202-207.   DOI:10.1109/ismvl.2014.43
  35. R. P. Stanley: Enumerative Combinatorics. Cambridge University Press 2011.   DOI:10.1017/cbo9781139058520
  36. P. L. Williams and R. D. Beer: Nonnegative decomposition of multivariate information. arXiv:1004.2515.
  37. M. Zwick: An overview of reconstructability analysis. Kybernetes 33 (2004), 5/6, 877-905.   DOI:10.1108/03684920410533958