Kybernetika 30 no. 2, 187-198, 1994

Second-order approximation of the entropy in nonlinear least-squares estimation

Luc Pronzato and Andrej Pázman


Measures of variability of the least-squares estimator $\hat{\theta}$ are essential for assessing the quality of the estimation. In nonlinear regression, an accurate approximation of the covariance matrix of $\hat{\theta}$ is difficult to obtain (Clarke, 1980). In this paper, a second-order approximation of the entropy of the distribution of $\hat{\theta}$ is proposed, which is only slightly more complicated than the widely used bias approximation of Box (1971). It is based on the "flat" or "saddle-point" approximation of the density of $\hat{\theta}$. The neglected terms are of order ${\cal O}(\sigma^4)$, whereas the classical first-order approximation neglects terms of order ${\cal O}(\sigma^2)$. Various illustrative examples are presented, including the use of the approximate entropy as a criterion for experimental design.
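For context, the classical first-order approximation mentioned above is the standard asymptotic normal approximation of the least-squares estimator; a sketch in the notation $J$ for the Jacobian of the regression function (this notation is ours, not the paper's) is:

```latex
% First-order (asymptotic normal) approximation of the density of the
% least-squares estimator \hat{\theta} around the true value \bar{\theta}:
\[
  \hat{\theta} \;\stackrel{\mathrm{approx.}}{\sim}\;
  \mathcal{N}\!\bigl(\bar{\theta},\; \sigma^{2} (J^{\top} J)^{-1}\bigr),
  \qquad
  J \;=\; \left.\frac{\partial \eta(\theta)}{\partial \theta^{\top}}\right|_{\theta=\bar{\theta}},
\]
% Entropy of a multivariate normal density, in closed form:
\[
  H \;=\; \tfrac{1}{2} \log \det\!\bigl(2\pi e\, \sigma^{2} (J^{\top} J)^{-1}\bigr).
\]
```

Under this first-order approximation the entropy is available in closed form, and maximizing $\det(J^{\top}J)$ corresponds to the classical D-optimality criterion; the second-order approximation proposed in the paper refines this entropy by correction terms, with a remainder of order ${\cal O}(\sigma^4)$.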


62K05, 62F12, 62E17, 62J02, 62E20