Kybernetika 50 no. 3, 310-321, 2014

Admissible invariant estimators in a linear model

Czesław Stępniak

DOI: 10.14736/kyb-2014-3-0310

Abstract:

Let $\mathbf{y}$ be the observation vector in the usual linear model with expectation $\mathbf{A\beta}$ and a covariance matrix, possibly singular, known up to a multiplicative scalar. A linear statistic $\mathbf{a}^{T}\mathbf{y}$ is called an invariant estimator of a parametric function $\phi = \mathbf{c}^{T}\mathbf{\beta}$ if its MSE depends on $\mathbf{\beta}$ only through $\phi$. It is shown that $\mathbf{a}^{T}\mathbf{y}$ is admissible invariant for $\phi$ if and only if it is a BLUE of $\phi$, in the case when $\phi$ is estimable with zero variance, and it is of the form $k\widehat{\phi}$, where $k\in[0,1]$ and $\widehat{\phi}$ is an arbitrary BLUE of $\phi$, otherwise. This result is applied to the one- and two-way ANOVA models. The paper is self-contained and accessible also to non-specialists.
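
For a concrete illustration in the one-way ANOVA setting (a standard textbook sketch under the usual assumptions, not taken from the paper itself), consider $y_{ij}=\mu_{i}+e_{ij}$, $j=1,\ldots,n_{i}$, with uncorrelated zero-mean errors of common variance $\sigma^{2}$, and the contrast $\phi=\mu_{1}-\mu_{2}$ with BLUE $\widehat{\phi}=\bar{y}_{1\cdot}-\bar{y}_{2\cdot}$. Then
$$\operatorname{MSE}\bigl(k\widehat{\phi}\bigr)=k^{2}\sigma^{2}\Bigl(\frac{1}{n_{1}}+\frac{1}{n_{2}}\Bigr)+(1-k)^{2}\phi^{2},$$
which depends on $\mathbf{\beta}$ only through $\phi$, so each $k\widehat{\phi}$ is invariant. Within this family no $k\in[0,1]$ dominates another (a smaller $k$ is better near $\phi=0$, a $k$ closer to $1$ is better for large $|\phi|$), whereas for $k>1$ the estimator is uniformly beaten by $\widehat{\phi}$ itself, and for $k<0$ by the trivial estimator $0$, in accordance with the characterization above.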

Keywords:

admissibility, linear estimator, invariant estimator, one-way/two-way ANOVA

Classification:

62J05, 62C05

References:

  1. J. K. Baksalary and A. Markiewicz: Admissible linear estimators in the general Gauss-Markov model. J. Statist. Plann. Inference 19 (1988), 349-359.   CrossRef
  2. J. K. Baksalary and A. Markiewicz: A matrix inequality and admissibility of linear estimators with respect to the mean squared error criterion. Linear Algebra Appl. 112 (1989), 9-18.   CrossRef
  3. J. K. Baksalary and A. Markiewicz: Admissible linear estimators of an arbitrary vector parametric function in the general Gauss-Markov model. J. Statist. Plann. Inference 26 (1990), 161-173.   CrossRef
  4. A. Cohen: Estimates of linear combinations of the parameters of the mean vector of a multivariate distribution. Ann. Math. Statist. 36 (1965), 78-87.   CrossRef
  5. A. Cohen: All admissible estimates of the mean vector. Ann. Math. Statist. 37 (1966), 458-463.   CrossRef
  6. J. Groß: More on concavity of a matrix function. SIAM J. Matrix Anal. Appl. 19 (1998), 365-368.   CrossRef
  7. J. Groß: Löwner partial ordering and space preordering of Hermitian non-negative definite matrices. Linear Algebra Appl. 326 (2001), 215-223.   CrossRef
  8. J. Groß and A. Markiewicz: Characterization of admissible linear estimators in the linear model. Linear Algebra Appl. 388 (2004), 239-248.   CrossRef
  9. P. R. Halmos: Finite-Dimensional Vector Spaces. Second edition. Springer-Verlag, New York 1993.   CrossRef
  10. W. Ip, H. Wong and J. Liu: Sufficient and admissible estimators in general multivariate linear model. J. Statist. Plann. Inference 135 (2005), 371-383.   CrossRef
  11. W. Klonecki: Linear estimators of mean vector in linear models: Problem of admissibility. Probab. Math. Statist. 2 (1982), 167-178.   CrossRef
  12. W. Klonecki and S. Zontek: On the structure of admissible linear estimators. J. Multivariate Anal. 24 (1988), 11-30.   CrossRef
  13. W. Kruskal: When are Gauss-Markov and least squares estimators identical? A coordinate-free approach. Ann. Math. Statist. 39 (1968), 70-75.   CrossRef
  14. L. R. LaMotte: Admissibility in linear models. Ann. Statist. 10 (1982), 245-255.   CrossRef
  15. L. R. LaMotte: On limits of uniquely best linear estimators. Metrika 45 (1997), 197-211.   CrossRef
  16. E. L. Lehmann and H. Scheffé: Completeness, similar regions, and unbiased estimation - Part 1. Sankhyā A 10 (1950), 305-340.   CrossRef
  17. A. Olsen, J. Seely and D. Birkes: Invariant quadratic unbiased estimation for two variance components. Ann. Statist. 4 (1976), 823-1051.   CrossRef
  18. C. R. Rao: Linear Statistical Inference. Wiley, New York 1973.   CrossRef
  19. C. R. Rao: Estimation of parameters in a linear model. Ann. Statist. 4 (1976), 1023-1037. Correction: Ann. Statist. 7 (1979), 696.   CrossRef
  20. H. Scheffé: The Analysis of Variance. Wiley, New York 1959.   CrossRef
  21. C. Stępniak: On admissible estimators in a linear model. Biom. J. 26 (1984), 815-816.   CrossRef
  22. C. Stępniak: Ordering of nonnegative definite matrices with application to comparison of linear models. Linear Algebra Appl. 70 (1985), 67-71.   CrossRef
  23. C. Stępniak: A complete class for linear estimation in a general linear model. Ann. Inst. Statist. Math. A 39 (1987), 563-573.   CrossRef
  24. C. Stępniak: Admissible linear estimators in mixed linear models. J. Multivariate Anal. 31 (1989), 90-106.   CrossRef
  25. C. Stępniak: Perfect linear models and perfect parametric functions. J. Statist. Plann. Inference 139 (2009), 151-163.   CrossRef
  26. C. Stępniak: From equivalent linear equations to Gauss-Markov theorem. J. Inequal. Appl. (2010), ID 259672, 5 pages.   CrossRef
  27. C. Stępniak: On a problem with singularity in comparison of linear experiments. J. Statist. Plann. Inference 141 (2011), 2489-2493.   CrossRef
  28. E. Synówka-Bejenka and S. Zontek: A characterization of admissible linear estimators of fixed and random effects in linear models. Metrika 68 (2008), 157-172.   CrossRef
  29. S. Zontek: Admissibility of limits of the unique locally best estimators with application to variance components models. Probab. Math. Statist. 9 (1988), 29-44.   CrossRef
  30. G. Zyskind: On canonical forms, non-negative covariance matrices and best and simple least squares estimators in linear models. Ann. Math. Statist. 38 (1967), 1092-1109.   CrossRef