Kybernetika 55 no. 5, 782-801, 2019

Graphical model selection for a particular class of continuous-time processes

Mattia Zorzi

DOI: 10.14736/kyb-2019-5-0782

Abstract:

Graphical models provide an undirected graph representation of the relations between the components of a random vector. In the Gaussian case, such an undirected graph describes the conditional independence relations among those components. In this paper, we consider a continuous-time Gaussian model which can be observed only at time $T$. We introduce the concept of infinitesimal conditional independence for such a model. Then, we address the corresponding graphical model selection problem, i.e. the problem of estimating the graphical model from data. Finally, simulation studies are presented to test the effectiveness of the proposed graphical model selection procedure.
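In the static Gaussian setting recalled above, conditional independence between two components given all the others is equivalent to a zero entry in the concentration (inverse covariance) matrix: $X_i \perp X_j \mid \{X_k\}_{k \neq i,j}$ if and only if $(\Sigma^{-1})_{ij} = 0$, so a missing edge of the graph corresponds to a zero in $\Sigma^{-1}$. The snippet below is only a minimal illustrative sketch of graphical model selection in this static case, using the $\ell_1$-regularized maximum likelihood (graphical lasso) estimator of [17] as implemented in scikit-learn; it is not the estimator developed in this paper for the continuous-time model, and the chain-graph ground truth, the sample size, and the threshold $10^{-4}$ are assumptions made for the example.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

# Minimal sketch: recover the sparsity pattern (graph) of the inverse
# covariance from i.i.d. Gaussian samples via the graphical lasso [17].

rng = np.random.default_rng(0)

# Ground-truth concentration matrix K: tridiagonal, i.e. a chain graph.
p = 6
K = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma = np.linalg.inv(K)

# Draw samples from N(0, Sigma); the sample size is an assumption of the example.
X = rng.multivariate_normal(np.zeros(p), Sigma, size=2000)

# l1-regularized maximum likelihood estimate of the concentration matrix;
# the regularization parameter is chosen by cross-validation.
K_hat = GraphicalLassoCV().fit(X).precision_

# Estimated edges: off-diagonal entries whose magnitude exceeds a small
# threshold (1e-4 is an assumption, not a prescription of the paper).
edges = [(i, j) for i in range(p) for j in range(i + 1, p)
         if abs(K_hat[i, j]) > 1e-4]
print(edges)  # with enough samples, typically the chain edges (0,1), ..., (4,5)
```

The edge set is read off from the support of the estimated concentration matrix, mirroring the zero-pattern characterization stated above.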

Keywords:

entropy, optimization, sparse inverse covariance selection, regularization, graphical models

Classification:

93B30, 65K10

References:

  1. D. Alpago, M. Zorzi and A. Ferrante: Identification of sparse reciprocal graphical models. IEEE Control Systems Lett. 2 (2018), 4, 659-664.   DOI:10.1109/lcsys.2018.2845943
  2. E. Avventi, A. Lindquist and B. Wahlberg: ARMA identification of graphical models. IEEE Trans. Automat. Control 58 (2013), 1167-1178.   DOI:10.1109/tac.2012.2231551
  3. G. Baggio: Further results on the convergence of the Pavon-Ferrante algorithm for spectral estimation. IEEE Trans. Automat. Control 63 (2018), 10, 3510-3515.   DOI:10.1109/tac.2018.2794407
  4. O. Banerjee, L. El Ghaoui and A. d'Aspremont: Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data. J. Machine Learning Res. 9 (2008), 485-516.
  5. S. Boyd and L. Vandenberghe: Convex Optimization. Cambridge Univ. Press, Cambridge 2004.   DOI:10.1017/cbo9780511804441
  6. C. Byrnes, S. Gusev and A. Lindquist: A convex optimization approach to the rational covariance extension problem. SIAM J. Control Optim. 37 (1998), 211-229.   DOI:10.1137/s0363012997321553
  7. C. I. Byrnes, T. T. Georgiou and A. Lindquist: A new approach to spectral estimation: A tunable high-resolution spectral estimator. IEEE Trans. Signal Process. 48 (2000), 3189-3205.   DOI:10.1109/78.875475
  8. E. Candes and Y. Plan: Matrix completion with noise. Proc. IEEE 98 (2010), 925-936.   DOI:10.1109/jproc.2009.2035722
  9. E. Candes and B. Recht: Exact matrix completion via convex optimization. Comm. ACM 55 (2012), 111-119.   DOI:10.1145/2184319.2184343
  10. V. Chandrasekaran, P. Parrilo and A. Willsky: Latent variable graphical model selection via convex optimization. Ann. Statist. 40 (2012), 4, 1935-1967.   DOI:10.1214/12-aos1020
  11. V. Chandrasekaran and P. Shah: Relative entropy optimization and its applications. Math. Program. 161 (2017), 1-2, 1-32.   DOI:10.1007/s10107-016-0998-2
  12. T. Cover and J. Thomas: Information Theory. Wiley, New York 1991.   DOI:10.1002/0471200611
  13. A. d'Aspremont, O. Banerjee and L. El Ghaoui: First-order methods for sparse covariance selection. SIAM J. Matrix Analysis Appl. 30 (2008), 56-66.   DOI:10.1137/060670985
  14. A. Dempster: Covariance selection. Biometrics 28 (1972), 157-175.   DOI:10.2307/2528966
  15. A. Ferrante and M. Pavon: Matrix completion à la Dempster by the principle of parsimony. IEEE Trans. Inform. Theory 57 (2011), 3925-3931.   DOI:10.1109/tit.2011.2143970
  16. A. Ferrante, M. Pavon and F. Ramponi: Hellinger versus Kullback-Leibler multivariable spectrum approximation. IEEE Trans. Automat. Control 53 (2008), 954-967.   DOI:10.1109/tac.2008.920238
  17. J. Friedman, T. Hastie and R. Tibshirani: Sparse inverse covariance estimation with the graphical lasso. Biostatistics 9 (2008), 432-441.   DOI:10.1093/biostatistics/kxm045
  18. M. Grant and S. Boyd: CVX: Matlab software for disciplined convex programming, version 2.1. 2014.
  19. S. Gu, R. Betzel, M. Mattar, M. Cieslak, P. Delio, S. Grafton, F. Pasqualetti and D. Bassett: Optimal trajectories of brain state transitions. NeuroImage 148 (2017), 305-317.   DOI:10.1016/j.neuroimage.2017.01.003
  20. J. Huang, N. Liu, M. Pourahmadi and L. Liu: Covariance matrix selection and estimation via penalised normal likelihood. Biometrika 93 (2006), 85-98.   DOI:10.1093/biomet/93.1.85
  21. N. Huotari, L. Raitamaa, H. Helakari, J. Kananen, V. Raatikainen, A. Rasila, T. Tuovinen, J. Kantola, V. Borchardt, V. Kiviniemi and V. Korhonen: Sampling rate effects on resting state fMRI metrics. Frontiers Neurosci. 13 (2019), 279.   DOI:10.3389/fnins.2019.00279
  22. A. Jalali and S. Sanghavi: Learning the dependence graph of time series with latent factors. In: International Conference on Machine Learning, Edinburgh 2012.
  23. D. Koller and N. Friedman: Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009.
  24. S. Lauritzen: Graphical Models. Oxford University Press, Oxford 1996.
  25. N. Meinshausen and P. Bühlmann: High-dimensional graphs and variable selection with the lasso. Ann. Statist. 34 (2006), 1436-1462.   DOI:10.1214/009053606000000281
  26. J. Pearl: Graphical models for probabilistic and causal reasoning. In: Quantified Representation of Uncertainty and Imprecision, Springer 1998, pp. 367-389.   DOI:10.1007/978-94-017-1735-9_12
  27. A. Ringh, J. Karlsson and A. Lindquist: Multidimensional rational covariance extension with approximate covariance matching. SIAM J. Control Optim. 56 (2018), 2, 913-944.   DOI:10.1137/17m1127922
  28. J. Songsiri, J. Dahl and L. Vandenberghe: Graphical models of autoregressive processes. In: Convex Optimization in Signal Processing and Communications (D. Palomar and Y. Eldar, eds.), Cambridge Univ. Press, Cambridge 2010, pp. 1-29.
  29. J. Songsiri and L. Vandenberghe: Topology selection in graphical models of autoregressive processes. J. Machine Learning Res. 11 (2010), 2671-2705.
  30. Z. Yue, J. Thunberg, L. Ljung and J. Gonçalves: Identification of sparse continuous-time linear systems with low sampling rate: Exploring matrix logarithms. arXiv preprint arXiv:1605.08590, 2016.
  31. B. Zhu and G. Baggio: On the existence of a solution to a spectral estimation problem à la Byrnes-Georgiou-Lindquist. IEEE Trans. Automat. Control 64 (2019), 2, 820-825.   DOI:10.1109/tac.2018.2836984
  32. M. Zorzi: A new family of high-resolution multivariate spectral estimators. IEEE Trans. Automat. Control 59 (2014), 892-904.   DOI:10.1109/tac.2013.2293218
  33. M. Zorzi: Rational approximations of spectral densities based on the Alpha divergence. Math. Control Signals Systems 26 (2014), 259-278.   DOI:10.1007/s00498-013-0118-2
  34. M. Zorzi: An interpretation of the dual problem of the THREE-like approaches. Automatica 62 (2015), 87-92.   DOI:10.1016/j.automatica.2015.09.023
  35. M. Zorzi: Multivariate Spectral Estimation based on the concept of Optimal Prediction. IEEE Trans. Automat. Control 60 (2015), 1647-1652.   DOI:10.1109/tac.2014.2359713
  36. M. Zorzi: Empirical Bayesian learning in AR graphical models. Automatica 109 (2019), 108516.   DOI:10.1016/j.automatica.2019.108516
  37. M. Zorzi and R. Sepulchre: AR identification of latent-variable graphical models. IEEE Trans. Automat. Control 61 (2016), 2327-2340.   DOI:10.1109/tac.2015.2491678