# Abstract:

Point estimators based on minimizing an information-theoretic divergence between the empirical and a hypothetical distribution run into a problem when the hypothetical family is continuous and hence measure-theoretically orthogonal to the family of empirical distributions. In this case the $\phi$-divergence always attains its upper bound, and the minimum $\phi$-divergence estimates are trivial. Broniatowski and Vajda \cite{IV09} proposed several modifications of the minimum divergence rule that resolve this problem. We examine these new estimation methods with respect to consistency, robustness and efficiency through an extended simulation study. We focus on the well-known family of power divergences parametrized by $\alpha \in \mathbb{R}$ in the Gaussian model, and we perform a comparative computer simulation for several randomly selected contaminated and uncontaminated data sets, different sample sizes and different $\phi$-divergence parameters.
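As a rough illustration of the setting (not of the modified rules of Broniatowski and Vajda), one classical way around the orthogonality problem is to smooth the empirical distribution before minimizing the divergence. The sketch below is a minimal example under assumed choices: a kernel-smoothed contaminated Gaussian sample, the Hellinger case $\alpha = -1/2$ of the power divergence, and illustrative contamination levels; all names and parameters are ours, not the paper's.

```python
import numpy as np
from scipy import stats, optimize
from scipy.integrate import trapezoid

rng = np.random.default_rng(0)

# Contaminated Gaussian sample: 90% from N(0,1) plus 10% outliers from N(10,1).
x = np.concatenate([rng.normal(0.0, 1.0, 180), rng.normal(10.0, 1.0, 20)])

# Kernel-smoothed empirical density: one standard (Beran-type) device to avoid
# the orthogonality between the empirical and the continuous model family.
kde = stats.gaussian_kde(x)
grid = np.linspace(-8.0, 18.0, 2000)
p = kde(grid)

def power_divergence(theta, alpha=-0.5):
    """Numerical power divergence D_alpha(p, q_theta) for alpha not in {0, -1}."""
    mu, sigma = theta
    if sigma <= 0.0:
        return np.inf
    q = stats.norm.pdf(grid, mu, sigma)
    ratio = np.clip(p / np.maximum(q, 1e-300), 1e-12, 1e12)
    integrand = p * (ratio ** alpha - 1.0) / (alpha * (alpha + 1.0))
    return trapezoid(integrand, grid)

# Minimize over (mu, sigma); alpha = -1/2 gives twice the squared Hellinger
# distance, a classically robust choice.
res = optimize.minimize(power_divergence, x0=[np.median(x), 1.0],
                        method="Nelder-Mead")
mu_hat, sigma_hat = res.x
print(mu_hat, sigma_hat, x.mean())  # the sample mean is pulled toward the outliers
```

Under this contamination the sample mean is shifted toward the outlier component, while the minimum-divergence fit stays near the main Gaussian bulk; comparing such behaviour across $\alpha$, sample sizes and contamination levels is the kind of experiment the simulation study performs.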

# Keywords:

robustness, minimum $\phi$-divergence estimation, subdivergence, superdivergence, PC simulation, relative efficiency

# Classification:

62B05, 62H30

# References:

1. M. Broniatowski and A. Keziou: Minimization of $\phi$-divergences on sets of signed measures. Studia Sci. Math. Hungar. 43 (2006), 403-442.
2. M. Broniatowski and A. Keziou: Parametric estimation and tests through divergences and the duality technique. J. Multivariate Anal. 100 (2009), 16-36.
3. M. Broniatowski and I. Vajda: Several Applications of Divergence Criteria in Continuous Families. Research Report No. 2257, Institute of Information Theory and Automation, Prague 2009.
4. I. Frýdlová: Minimum Kolmogorov Distance Estimators. Diploma Thesis, Czech Technical University, Prague 2004.
5. I. Frýdlová: Modified power divergence estimators and their performances in normal models. In: Proc. FernStat2010, Faculty of Social and Economic Studies UJEP, Ústí n. L. 2010, 28-33.
6. F. Liese and I. Vajda: On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52 (2006), 4394-4412.
7. A. Toma and S. Leoni-Aubin: Robust tests based on dual divergence estimators and saddlepoint approximations. J. Multivariate Anal. 101 (2010), 1143-1155.
8. A. Toma and M. Broniatowski: Dual divergence estimators and tests: robustness results. J. Multivariate Anal. 102 (2011), 20-36.
9. I. Vajda: Theory of Statistical Inference and Information. Kluwer, Boston 1989.