Title: Analytic bias reduction for $k$--sample functionals

Author(s): Christopher S. Withers and Saralees Nadarajah
Issue: Volume 70, Series A, Part 2, Year 2008
Pages: 186--222
Abstract: We give analytic methods for nonparametric bias reduction that remove the need for computationally intensive methods like the bootstrap and the jackknife. We call an estimate $p^{th}$ order if its bias has magnitude $n_0^{-p}$ as $n_0 \to \infty$, where $n_0$ is the sample size (or the minimum sample size if the estimate is a function of more than one sample). Most estimates are only first order and require $O(N)$ calculations, where $N$ is the total sample size. The usual bootstrap and jackknife estimates are second order but computationally intensive, requiring $O(N^2)$ calculations for one sample. By contrast, Jaeckel's infinitesimal jackknife is an analytic second-order one-sample estimate requiring only $O(N)$ calculations. When $p^{th}$ order bootstrap and jackknife estimates are available, they require $O(N^p)$ calculations, and so become even more computationally intensive if one chooses $p > 2$. For general $p$ we provide analytic $p^{th}$ order nonparametric estimates that require only $O(N)$ calculations. Our estimates are given in terms of the von Mises derivatives of the functional being estimated, evaluated at the empirical distribution. For products of moments an unbiased estimate exists: our form for this "polykay" is much simpler than the usual form in terms of power sums.
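The first-order versus second-order distinction can be illustrated with a worked example that is not from the paper itself: for the functional $\theta(F) = (\int x\, dF)^2$, the plug-in estimate $\bar{x}^2$ has bias $\sigma^2/n$. The leave-one-out jackknife removes this bias at $O(N^2)$ cost, while the analytic correction $\bar{x}^2 - s^2/n$ (subtracting an estimate of the bias term directly) achieves the same in $O(N)$ calculations. A minimal sketch, with function names chosen for illustration:

```python
def plugin(xs):
    # First-order (plug-in) estimate of theta(F) = (mean)^2:
    # the functional evaluated at the empirical distribution.
    m = sum(xs) / len(xs)
    return m * m

def jackknife(xs):
    # Second-order jackknife correction: O(N^2) work, since the
    # functional is re-evaluated on N leave-one-out samples.
    n = len(xs)
    theta = plugin(xs)
    loo = [plugin(xs[:i] + xs[i + 1:]) for i in range(n)]
    return n * theta - (n - 1) * sum(loo) / n

def analytic(xs):
    # Analytic second-order correction: O(N) work. The O(1/n) bias
    # of the plug-in estimate is Var(X)/n, which we subtract using
    # the unbiased sample variance.
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / (n - 1)
    return m * m - s2 / n
```

For this quadratic functional the jackknife and analytic corrections agree exactly (both remove the bias entirely, the "unbiased estimate exists" case mentioned above); in general the analytic route trades re-evaluation of the functional for knowledge of its von Mises derivatives.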
AMS (2000) subject classification. Primary 62G03; secondary 62G20, 62G30.
Keywords and phrases: Bias reduction; $k$-samples; nonparametric; unbiased estimate; $U$-statistics; von Mises derivatives.