Kybernetika 54 no. 2, 336-350, 2018

Existence, consistency and computer simulation for selected variants of minimum distance estimators

Václav Kůs, Domingo Morales, Jitka Hrabáková and Iva Frýdlová

DOI: 10.14736/kyb-2018-2-0336

Abstract:

The paper deals with sufficient conditions for the existence of a general approximate minimum distance estimator (AMDE) of a probability density function $f_0$ on the real line. It shows that the AMDE always exists when a bounded $\phi$-divergence, Kolmogorov, Lévy, Cramér, or discrepancy distance is used. Consequently, the $n^{-1/2}$ consistency rate in any bounded $\phi$-divergence is established for the Kolmogorov, Lévy, and discrepancy estimators under the condition that the degree of variations of the corresponding family of densities is finite. A simulation experiment empirically studies the performance of the approximate minimum Kolmogorov estimator (AMKE) and of some histogram-based variants of approximate minimum divergence estimators, such as the power-type and Le Cam estimators, under six distributions (Uniform, Normal, Logistic, Laplace, Cauchy, Weibull). A comparison with the standard estimators (moment, maximum likelihood, median) is provided for sample sizes $n=10,20,50,120,250$. The simulation analyzes the behaviour of the estimators across the different families of distributions. It is shown that the performance of the AMKE differs from that of the other estimators with respect to family type, and that the AMKE copes more easily with the Cauchy distribution than the standard or divergence-based estimators do, especially for small sample sizes.
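To make the AMKE concrete, the following is a minimal sketch, not the authors' implementation, of an approximate minimum Kolmogorov distance estimator for a normal location-scale model. It assumes NumPy and SciPy; the names kolmogorov_distance and amke, the normal family, and the Nelder-Mead optimizer are illustrative choices.

    # Minimal sketch of an approximate minimum Kolmogorov distance
    # estimator (AMKE) for a normal location-scale family.
    # Assumptions: the normal model and the Nelder-Mead search are
    # illustrative, not the paper's exact setup.
    import numpy as np
    from scipy import stats
    from scipy.optimize import minimize

    def kolmogorov_distance(theta, sample):
        # sup-norm distance between the empirical CDF and F(.; mu, sigma)
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)          # reparametrize to keep sigma > 0
        x = np.sort(sample)
        n = x.size
        F = stats.norm.cdf(x, loc=mu, scale=sigma)
        upper = np.arange(1, n + 1) / n    # ECDF value just after each jump
        lower = np.arange(0, n) / n        # ECDF value just before each jump
        return max(np.abs(F - upper).max(), np.abs(F - lower).max())

    def amke(sample):
        # approximate minimizer of the Kolmogorov distance over (mu, sigma)
        start = np.array([np.median(sample), np.log(sample.std(ddof=1))])
        res = minimize(kolmogorov_distance, start, args=(sample,),
                       method="Nelder-Mead")  # derivative-free search
        mu_hat, log_sigma_hat = res.x
        return mu_hat, np.exp(log_sigma_hat)

    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=1.5, size=120)
    print(amke(data))  # roughly (2.0, 1.5) up to sampling error

A derivative-free simplex search is used because the sup-norm objective is not differentiable in the parameters; the histogram-based divergence variants compared in the paper would instead minimize a discretized $\phi$-divergence between a histogram density estimate and the model density.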

Keywords:

$\phi$-divergence, Kolmogorov distance, minimum distance estimator, consistency rate, computer simulation
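As background on the first two keywords (a reminder of the standard definitions, not results of the paper): the Csiszár $\phi$-divergence of densities $p$ and $q$ and the Kolmogorov distance of the corresponding distribution functions are

$$ D_\phi(P,Q) = \int_{\mathbb{R}} \phi\!\left(\frac{p(x)}{q(x)}\right) q(x)\,\mathrm{d}x, \qquad K(P,Q) = \sup_{x\in\mathbb{R}} \big|F_P(x) - F_Q(x)\big|, $$

where $\phi\colon(0,\infty)\to\mathbb{R}$ is convex with $\phi(1)=0$; see Csiszár [7] and Liese and Vajda [20].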

Classification:

62B05, 62H30

References:

  1. D. Al Mohamad: Towards a better understanding of the dual representation of phi divergences. Statist. Papers (published online 2016). DOI:10.1007/s00362-016-0812-5
  2. A. R. Barron: The convergence in information of probability density estimators. In: IEEE Int. Symp. Information Theory, Kobe 1988.
  3. R. Beran: Minimum Hellinger distance estimates for parametric models. Ann. Statist. 5 (1977), 445-463. DOI:10.1214/aos/1176343842
  4. A. Berger: Remark on separable spaces of probability measures. Ann. Math. Statist. 22 (1951), 119-120. DOI:10.1214/aoms/1177729701
  5. M. Broniatowski, A. Toma and I. Vajda: Decomposable pseudodistances and applications in statistical estimation. J. Statist. Plann. Inference 142 (2012), 9, 2574-2585. DOI:10.1016/j.jspi.2012.03.019
  6. I. Csiszár: Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten. Publ. Math. Inst. Hungar. Acad. Sci., Ser. A 8 (1963), 84-108.
  7. I. Csiszár: Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299-318.
  8. I. Frýdlová, I. Vajda and V. Kůs: Modified power divergence estimators in normal model - simulation and comparative study. Kybernetika 48 (2012), 4, 795-808.
  9. A. L. Gibbs and F. E. Su: On choosing and bounding probability metrics. Int. Statist. Rev. 70 (2002), 419-435. DOI:10.1111/j.1751-5823.2002.tb00178.x
  10. L. Győrfi, I. Vajda and E. C. van der Meulen: Family of point estimates yielded by $L_1$-consistent density estimate. In: $L_1$-Statistical Analysis and Related Methods (Y. Dodge, ed.), Elsevier, Amsterdam 1992, pp. 415-430.
  11. L. Győrfi, I. Vajda and E. C. van der Meulen: Minimum Hellinger distance point estimates consistent under weak family regularity. Math. Methods Statist. 3 (1994), 25-45.
  12. L. Győrfi, I. Vajda and E. C. van der Meulen: Minimum Kolmogorov distance estimates of parameters and parametrized distributions. Metrika 43 (1996), 237-255. DOI:10.1007/bf02613911
  13. J. Hrabáková and V. Kůs: The consistency and robustness of modified Cramér-von Mises and Kolmogorov-Cramér estimators. Comm. Statist. - Theory and Methods 42 (2013), 20, 3665-3677. DOI:10.1080/03610926.2013.802806
  14. J. Hrabáková and V. Kůs: Notes on consistency of some minimum distance estimators with simulation results. Metrika 80 (2017), 243-257. DOI:10.1007/s00184-016-0601-0
  15. P. Kafka, F. Österreicher and I. Vincze: On powers of $f$-divergences defining a distance. Studia Sci. Math. Hungar. 26 (1991), 415-422.
  16. V. Kůs: Blended $\phi$-divergences with examples. Kybernetika 39 (2003), 43-54.
  17. V. Kůs: Nonparametric density estimates consistent of the order of $n^{-1/2}$ in the $L_1$-norm. Metrika 60 (2004), 1-14. DOI:10.1007/s001840300286
  18. V. Kůs, D. Morales and I. Vajda: Extensions of the parametric families of divergences used in statistical inference. Kybernetika 44 (2008), 1, 95-112.
  19. L. Le Cam: Asymptotic Methods in Statistical Decision Theory. Springer, New York 1986. DOI:10.1007/978-1-4612-4946-7
  20. F. Liese and I. Vajda: Convex Statistical Distances. Teubner, Leipzig 1987.
  21. F. Liese and I. Vajda: On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52 (2006), 4394-4412. DOI:10.1109/tit.2006.881731
  22. K. Matusita: Distance and decision rules. Ann. Inst. Statist. Math. 16 (1964), 305-315. DOI:10.1007/bf02868578
  23. F. Österreicher: On a class of perimeter-type distances of probability distributions. Kybernetika 32 (1996), 4, 389-393.
  24. L. Pardo: Statistical Inference Based on Divergence Measures. Chapman and Hall, Boston 2006. DOI:10.1201/9781420034813
  25. J. Pfanzagl: Parametric Statistical Theory. W. de Gruyter, Berlin 1994. DOI:10.1515/9783110889765
  26. I. Vajda: Theory of Statistical Inference and Information. Kluwer, Boston 1989.