Kybernetika 37 no. 5, 585-603, 2001

A note on the rate of convergence of local polynomial estimators in regression models

Friedrich Liese and Ingo Steinke

Abstract:

Local polynomials are used to construct estimators for the value $m(x_{0})$ of the regression function $m$ and for the values of its derivatives $D_{\gamma }m(x_{0})$ in a general class of nonparametric regression models. The covariables may be random or non-random. Only asymptotic conditions on the average distribution of the covariables are required; they serve as a smoothness condition on the experimental design and are discussed in detail. The optimal stochastic rate of convergence of the estimators is established. The results cover the special cases of regression models with i.i.d. errors and of observations on an equidistant lattice.
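For orientation, the classical local polynomial estimator underlying the paper can be sketched as follows. This is a minimal one-dimensional illustration, not the paper's general multivariate setting: the kernel choice, bandwidth, and function name are assumptions made for the example. A polynomial in $x - x_0$ is fitted by kernel-weighted least squares; the $j$-th coefficient times $j!$ estimates the $j$-th derivative of $m$ at $x_0$.

```python
import math
import numpy as np

def local_poly(x, y, x0, h, degree=1):
    """Local polynomial estimates of m(x0) and its derivatives up to `degree`.

    Fits a polynomial in (x - x0) by weighted least squares with
    Epanechnikov kernel weights of bandwidth h; the j-th fitted
    coefficient times j! estimates the j-th derivative m^(j)(x0).
    (Illustrative sketch; kernel and bandwidth are choices, not
    prescribed by the paper.)
    """
    u = (x - x0) / h
    w = np.where(np.abs(u) < 1, 0.75 * (1 - u**2), 0.0)  # Epanechnikov kernel
    X = np.vander(x - x0, degree + 1, increasing=True)    # columns (x - x0)^j
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)      # weighted LS fit
    facts = np.array([math.factorial(j) for j in range(degree + 1)])
    return beta * facts  # [estimate of m(x0), m'(x0), ...]
```

For instance, with noiseless data from $m(x) = x^2$ a local quadratic fit recovers $m(0.5) = 0.25$, $m'(0.5) = 1$ and $m''(0.5) = 2$ exactly, since the fitted polynomial coincides with $m$ on the kernel window.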